Experimental Investigations of Instruction
and the Language of Instruction with
Spanish-speaking English-language Learners
David J. Francis, Organizer
University of Houston, Texas Institute for Measurement,
Evaluation, and Statistics
AERA, San Francisco, April 9, 2006
Overview of Symposium
Randomized Trials of Research-based Instruction
Coleen D. Carlson, University of Houston
A Randomized Study of Language of Reading Instruction:
First Year Findings
Robert E. Slavin, Johns Hopkins University
Project ELLA
Rafael Lara-Alecio, Texas A&M University
Discussion
Larry V. Hedges, Northwestern University
Funding for all projects provided by Institute of Education
Sciences, National Center for Education Evaluation
Randomized Trials of Research-based
Instruction for Spanish-speaking ELLs:
Effects under two Language of Instruction
Models
University of Houston
University of Texas at Austin
Center for Applied Linguistics
University of Miami
Research Team and Acknowledgements
University of Houston, TIMES
David Francis, Coleen Carlson, Hector Rivera
Center for Applied Linguistics
Diane August, Debbie Short, Carolyn Ager
University of Texas – Austin
Sylvia Linan-Thompson, Sharon Vaughn
University of Miami
Maria Carlo
Institute of Education Sciences, NCEE
Susan Sanchez, Project Officer
Primary Objectives
Optimize the language, literacy, and academic outcomes of
ELLs
Reframe the current debate around language of instruction for
ELLs
Focus on improving the use of language in instruction in order
to optimize achievement outcomes through all approaches
Primary Objectives
Develop, implement, and test research enhanced versions of
two instruction models
Structured English Immersion (English Only) model
Transitional Bilingual Education model
Use random assignment of teachers to provide a strong test of
the efficacy of the enhanced versions over current approaches
Use quasi-experimental design to compare the two research
enhanced approaches with respect to English and Spanish
language outcomes
Overview of Study Design
Brownsville Independent School District
Two Language of Instruction Models
English only (EO: Non-LEP/PD)
Transitional Bilingual Education (TBE)
School continues to implement the general model(s) that the
school has adopted
Study is implemented in one grade per year
Study Design: Program Models
English Only
TBE
Kindergarten: emphasis
on English language and
literacy development
with some Spanish
language support
K– Grade 1: emphasis on
Spanish language and
literacy development with
English oral language
development
Grades 1–3: emphasis on
English language and
literacy development
Grades 2-3: emphasis on
Spanish and English
language and literacy
development
Study Design: Assignment
In 2004-2005, Kindergarten teachers were randomly assigned either to
(1) implement the research enhanced instruction (TX) or
(2) continue to implement current practice (CO)
In 2004-2005, G1 teachers were also randomly assigned to
TX and CO so that training could begin for G1 teachers in
2004-2005
In 2005-2006, students in the K TX classrooms matriculated
into the G1 TX classrooms, and K CO into G1 CO
classrooms
Teachers assigned to implement current practice have the
option to receive training in the research enhanced methods
the following year
Study Design: Student Sampling
                 2004-2005   2005-2006   2006-2007
Entering K       K           1           2
Entering Gr. 1               1           2
Entering Gr. 2                           2
Overview of Study Design (continued)
Students are assessed in English and in Spanish
Student assessment in the beginning of each school year and
at the end of the year
Assessment coordinated with school, but conducted by
research team
School, district, and state assessments of language and
achievement will be incorporated with permission from
parents
3-Tier Model of Instruction
Tier I (core instruction)
Enhanced Language Enrichment/Esperanza (Grades 1-2)
Enhanced McMillan (Grades 1-2)
SIOP (Grades K-2)
LECTURA (English and Spanish) (Grades K-2)
Tier II - Secondary Intervention (Grades K-2)
Classroom-based supplemental reading instruction
Tier III - Tertiary Intervention (Grade 2)
Intensive small-group pull-out intervention
The SIOP Model
A means for making grade-level academic content
(science, social studies, math) more accessible for
ELLs while at the same time promoting their English
language development.
The practice of highlighting key language features
and incorporating strategies that make the content
comprehensible to students.
The SIOP Model
Preparation
Building Background
Comprehensible Input
Strategies
Interaction
Practice & Application
Lesson Delivery
Review & Assessment
Sheltered Instruction for Academic Achievement (Echevarria, Vogt, & Short, 2004)
Study Questions
Compare outcomes between research enhanced and current
practice classrooms within each language of instruction model
Compare outcomes between language of instruction models
Compare
Language and literacy outcomes
Outcomes in both English and Spanish
Outcomes over time
Study Questions (continued)
Go beyond simple mean comparisons:
examine student growth over time
identify characteristics of students that relate to optimal
growth under each instruction model
identify characteristics of instruction that lead to
optimal growth
examine if/how differences between language
outcomes change over time
Kindergarten Sample
Schools = 14 (* some schools implement both programs)
  English: 8    Spanish: 13

Teachers = 55
  English: Treatment 9, Control 10
  Spanish: Treatment 18, Control 18

Students = 1,192 (by study group and Tier 2 status)
  Group         Tier 2: No   Tier 2: Yes
  English TX    163          33
  English CO    184          38
  Spanish TX    314          69
  Spanish CO    324          67
Overall Attrition (Fall-Spring) 10.2%
Outcome Measures
Letter Names and Sounds
Phonological Awareness (CTOPP and TOPPS)
Woodcock Language Proficiency Battery
Picture Vocabulary
Letter Word ID
Listening Comprehension
Analyses of Kindergarten Student
Outcomes
Mixed model ANCOVA (PROC MIXED) with
Pre-test performance as student-level covariate
Selection into Tier 2 as student-level covariate
Treatment as a teacher-level effect
Interactions of Pre-test and Tier-2 status with Treatment as cross-level
effects
Random effects of teacher within school
Analyses conducted within each Program Model
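A hedged sketch of the model implied by this slide (the symbols below are my own notation, not taken from the study materials): for student i of teacher j in school k,

\[
\text{Post}_{ijk} = \beta_0 + \beta_1\,\text{Pre}_{ijk} + \beta_2\,\text{Tier2}_{ijk} + \beta_3\,\text{TX}_{jk}
+ \beta_4\,\text{Pre}_{ijk}\text{Tier2}_{ijk} + \beta_5\,\text{Pre}_{ijk}\text{TX}_{jk}
+ \beta_6\,\text{Tier2}_{ijk}\text{TX}_{jk} + \beta_7\,\text{Pre}_{ijk}\text{Tier2}_{ijk}\text{TX}_{jk}
+ u_{jk} + e_{ijk},
\]

where \( u_{jk} \) is the random effect of teacher within school and \( e_{ijk} \) is the student-level residual; the model is fit separately within each program model.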
Results for English Only Program
Pre-Test
Students in Tier 2 performing significantly lower than
those not in Tier 2 on all skills.
No Differences between Treatment and Control
Post-Test:
Shown on next slides
English Means and Standard Deviations
Measure                       Group        Tier 2   Pre-Test M (sd)   Post-Test M (sd)
Letter-Name Identification    Comparison   0        20.95 (7.92)      23.32 (5.64)
                              Comparison   1        14.91 (10.21)     19.50 (7.86)
                              Treatment    0        23.12 (5.05)      24.43 (4.27)
                              Treatment    1        19.40 (6.82)      23.35 (4.52)
Letter Sound Identification   Comparison   0        19.99 (7.58)      23.27 (5.02)
                              Comparison   1        11.30 (9.51)      19.53 (8.18)
                              Treatment    0        21.67 (6.01)      24.47 (2.81)
                              Treatment    1        16.55 (7.67)      22.00 (5.23)
Phonological Awareness        Comparison   0        37.92 (13.45)     49.54 (12.38)
                              Comparison   1        24.00 (7.23)      34.78 (8.60)
                              Treatment    0        42.87 (15.21)     54.78 (14.55)
                              Treatment    1        27.64 (8.26)      40.66 (11.19)
WLPB: Letter Word ID          Comparison   0        98.66 (18.06)     102.81 (19.29)
                              Comparison   1        84.53 (15.73)     89.84 (15.71)
                              Treatment    0        102.77 (14.87)    107.84 (14.79)
                              Treatment    1        91.76 (10.03)     99.41 (13.96)
WLPB: Picture Vocabulary      Comparison   0        75.37 (23.52)     83.29 (23.92)
                              Comparison   1        69.55 (24.64)     77.59 (18.92)
                              Treatment    0        82.93 (18.94)     87.92 (19.45)
                              Treatment    1        75.24 (12.82)     78.89 (14.80)
WLPB: Listening Comp.         Comparison   0        60.37 (19.96)     68.98 (22.88)
                              Comparison   1        52.56 (19.75)     58.81 (21.13)
                              Treatment    0        68.68 (17.98)     75.51 (19.22)
                              Treatment    1        53.16 (17.46)     63.62 (14.73)
English Program / English Outcomes
Measure                       Effect                         Num DF   Den DF   F Value   Pr > F
Letter Naming                 Wave 1                         1        346      321.50    <.0001
                              Tier 2                         1        346      0.54      0.4637
                              Treatment                      1        17       10.41     0.0050
                              Wave 1 * Tier 2                1        346      0.18      0.6687
                              Wave 1 * Treatment             1        346      8.82      0.0032
                              Tier 2 * Treatment             1        346      3.68      0.0558
                              Wave 1 * Tier 2 * Treatment    1        346      2.94      0.0874
Letter Sound Identification   Wave 1                         1        346      224.91    <.0001
                              Tier 2                         1        346      1.38      0.2401
                              Treatment                      1        17       5.39      0.0330
                              Wave 1 * Tier 2                1        346      1.48      0.2251
                              Wave 1 * Treatment             1        346      5.04      0.0254
                              Tier 2 * Treatment             1        346      0.12      0.7329
                              Wave 1 * Tier 2 * Treatment    1        346      0.04      0.8363

For all other measures, there were no main effects or interactions involving TX.
Main Effect: Wave 1 (p<.001)
Main Effect: Treatment Group (p<.005)
Interaction: Wave 1 * Treatment Group (p<.003)
EARLY ENGLISH PROGRAM / ENGLISH / LETTER NAMING
[Bar chart: adjusted post-test Letter Naming scores (Total Correct) for Comparison and Treatment classrooms at low, average, and high pre-test levels; the groups perform similarly at average and high pre-test levels, while treatment students exceed comparison students at low pre-test levels.]
Main Effect: Wave 1 (p<.001)
Main Effect: Treatment Group (p<.03)
Interaction: Wave 1 * Treatment Group (p<.02)
EARLY ENGLISH PROGRAM / ENGLISH / LETTER SOUND IDENTIFICATION
[Bar chart: adjusted post-test Letter Sound Identification scores (Total Correct) for Comparison and Treatment classrooms at low, average, and high pre-test levels.]
Variance Components for English-Only
Instruction
                          Covariates Only                  Treatment Plus Covariates
Measure                   Residual   Teacher   ICC         Residual   Teacher   ICC
Letter Name               10.95      0.54      0.05        10.60      0.34      0.03
Letter Sounds             9.87       1.27      0.13        9.71       0.98      0.10
PA                        67.50      7.77      0.12        65.88      7.10      0.11
WLPB: LC                  190.43     92.22     0.48        184.41     90.12     0.49
WLPB: LW                  88.83      18.67     0.21        86.17      15.26     0.18
WLPB: PV                  123.90     16.12     0.13        119.94     18.74     0.16
Results for Transitional Bilingual
Education Program – Spanish Outcomes
Pre-Test:
Students in Tier 2 performing significantly lower than
those not in Tier 2 on all skills.
No Differences Between Treatment and Control
Post-program
Shown on next slides
Spanish Means and Standard Deviations
Measure                       Group        Tier 2   Pre-Test M (sd)   Post-Test M (sd)
Letter-Name Identification    Comparison   0        22.11 (8.10)      26.78 (4.89)
                              Comparison   1        14.18 (9.24)      18.97 (9.72)
                              Treatment    0        21.22 (8.33)      25.64 (5.96)
                              Treatment    1        14.06 (9.42)      20.24 (8.41)
Letter Sound Identification   Comparison   0        22.45 (7.51)      26.87 (5.44)
                              Comparison   1        12.49 (9.07)      18.05 (10.14)
                              Treatment    0        20.56 (8.89)      25.74 (6.12)
                              Treatment    1        11.41 (8.77)      18.66 (9.18)
Phonological Awareness        Comparison   0        36.69 (14.35)     52.40 (17.53)
                              Comparison   1        23.86 (8.13)      32.21 (13.22)
                              Treatment    0        37.31 (15.74)     50.45 (16.95)
                              Treatment    1        24.21 (7.14)      32.89 (11.57)
WLPB: Letter Word ID          Comparison   0        101.26 (20.27)    117.44 (25.68)
                              Comparison   1        84.63 (14.49)     86.71 (17.47)
                              Treatment    0        101.22 (21.36)    112.82 (25.73)
                              Treatment    1        84.46 (14.41)     90.08 (16.61)
WLPB: Picture Vocabulary      Comparison   0        80.32 (24.78)     87.38 (28.29)
                              Comparison   1        64.11 (27.22)     65.45 (26.17)
                              Treatment    0        78.22 (27.39)     85.17 (32.41)
                              Treatment    1        64.02 (17.93)     68.90 (20.35)
WLPB: Listening Comp.         Comparison   0        80.71 (17.57)     84.72 (18.45)
                              Comparison   1        66.96 (24.25)     66.54 (25.44)
                              Treatment    0        79.18 (18.03)     84.82 (19.28)
                              Treatment    1        68.34 (18.54)     73.44 (16.63)
Spanish Program / Spanish Outcomes
Measure                       Effect                         Num DF   Den DF   F Value   Pr > F
Letter Naming                 Wave 1                         1        620      516.79    <.0001
                              Tier 2                         1        620      44.81     <.0001
                              Treatment                      1        34       3.02      0.0915
                              Wave 1 * Tier 2                1        620      37.06     <.0001
                              Wave 1 * Treatment             1        620      2.32      0.1280
                              Tier 2 * Treatment             1        620      8.77      0.0032
                              Wave 1 * Tier 2 * Treatment    1        620      6.30      0.0123
Letter Sound Identification   Wave 1                         1        616      379.68    <.0001
                              Tier 2                         1        616      41.23     <.0001
                              Treatment                      1        34       4.33      0.0450
                              Wave 1 * Tier 2                1        616      31.45     <.0001
                              Wave 1 * Treatment             1        616      10.63     0.0012
                              Tier 2 * Treatment             1        616      0.00      0.9920
                              Wave 1 * Tier 2 * Treatment    1        616      1.59      0.2084

For all other measures, there were no main effects or interactions involving TX.
Main Effect: Wave 1 (p<.0001)
Main Effect: Tier 2 (p<.0001)
Interaction: Wave 1 * Treatment Group (p<.0001)
Interaction: Wave 1 * Tier 2 (p<.003)
Interaction: Wave 1 * Treatment Group * Tier 2 (p<.01)
BILINGUAL PROGRAM / SPANISH / LETTER NAMING
[Bar chart: adjusted post-test Letter Naming scores (Total Correct) for Comparison and Treatment classrooms, shown separately for Tier 2 = 0 and Tier 2 = 1 students at low, average, and high pre-test levels.]
Main Effect: Wave 1 (p<.001)
Main Effect: Tier 2 (p<.001)
Interaction: Wave 1 * Treatment Group (p<.001)
Interaction: Wave 1 * Tier 2 (p<.001)
BILINGUAL PROGRAM / SPANISH / LETTER SOUND IDENTIFICATION
[Bar chart: adjusted post-test Letter Sound Identification scores (Total Correct) for Tier 2 = 0 and Tier 2 = 1 students at low, average, and high pre-test levels.]
Main Effect: Wave 1 (p<.0001)
Main Effect: Tier 2 (p<.0001)
Interaction: Wave 1 * Treatment Group (p<.0001)
Interaction: Wave 1 * Tier 2 (p<.0001)
BILINGUAL PROGRAM / SPANISH / LETTER SOUND IDENTIFICATION
[Bar chart: adjusted post-test Letter Sound Identification scores (Total Correct) for Comparison and Treatment classrooms at low, average, and high pre-test levels.]
Variance Components for Transitional Bilingual
Instruction
                          Covariates Only                  Treatment Plus Covariates
Measure                   Residual   Teacher   ICC         Residual   Teacher   ICC
Letter Name               15.98      7.97      0.50        14.59      7.33      0.50
Letter Sounds             21.02      8.95      0.43        19.33      9.18      0.48
PA                        99.14      31.96     0.32        92.78      26.74     0.29
WLPB: LC                  147.27     21.63     0.15        143.81     19.05     0.13
WLPB: LW                  257.83     85.70     0.33        243.62     71.30     0.29
WLPB: PV                  309.73     20.17     0.07        305.68     18.54     0.06
Percent Observed Language of Instruction (in RD/LA)
Across Language Models and Program Group
[Stacked bar chart summarized below: average percentage of observed reading/language arts time by language of instruction.]

Group                English   Spanish   None/Inaudible
Early English  TX    87%       11%       2%
Early English  CO    69%       30%       1%
Bilingual      TX    17%       81%       2%
Bilingual      CO    12%       86%       2%
Percentage of Observational Time Instructional Language was English
Across Language Models and Program Group (+/- 1 SD)
[Bar chart summarized below; error bars show +/- 1 SD of the average percentage of time.]

Early English: TX 87%, CO 69%
Bilingual:     TX 17%, CO 12%
Discussion
Effects observed so far are small and in the form of
interactions with baseline performance
In general, TX mitigates the effect of baseline
performance
Increased exposure and implementation in Grade 1
Gains
TX and CO gains on PA, LC, PV, and LW were large (.3 to 1.0 sd on WLPB
standard scores; see the worked example below)
Smaller gains were noted for students in Tier 2 instruction
It will be instructive to monitor growth of students
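As a rough worked example of the size of these gains (assuming gains are judged against the WLPB standard-score sd of 15, which is an interpretation of the slide rather than something stated in it), the Treatment, non-Tier 2 Listening Comprehension means reported earlier give

\[
\frac{75.51 - 68.68}{15} \approx 0.46\ \text{sd},
\]

which falls within the .3 to 1.0 sd range noted above.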
Discussion
Important to keep in mind context for intervention
(11 of 13 Schools are RF schools; Texas Reading
Initiative)
ICCs for Spanish outcomes in TBE instruction are
generally large relative to those in English
Exception to the rule is English Listening Comprehension
Suggests significant opportunity to impact English LC
through instruction
Next Steps: Cross-language and cross-program
comparisons
U.S. Department of Education
Grant Performance Report (ED 524B)
Project Status Chart
OMB No. 1890 - 0004
Expiration: 10-31-2007
PR/Award #: R30P030031
SECTION C
Detailed Progress Report Narrative
Report on Student Performance Data
David J. Francis, University of Houston
Coleen D. Carlson, University of Houston
Hector Rivera, University of Houston
Design and Data Collection
Of the fourteen schools participating in the project during the 2004-2005 school year, 13 continued participation in the 2005-2006 school year (one school using only structured English immersion, six
schools using only transitional bilingual education, and six schools using both programs). One school of
the original 14 (using both programs) refused participation in this second year of the study because the
teachers felt the program took too much time and effort.
In the 2004-2005 school year, first grade teachers were randomly assigned to treatment or control
conditions within language programs within each school. These teachers participated in trainings during
the 2004-2005 school year but did not implement the program and were not assigned mentors. In 2005-2006, the first
grade teachers began implementing the program and were assigned mentors. Of the 33 first grade teachers
from the 2004-2005 school year who were assigned to treatment conditions, eight left their teaching positions at the school or were moved to higher grade levels by principal decision. Therefore, 25 teachers who
received training in the 2004-2005 school year remained in the project during the 2005-2006 school year.
Only seven new teachers were hired within the schools to replace the 8 teachers who left the project (one
school had a smaller first grade population than in the previous year and did not replace the teacher who
left). In all, a total of 65 teachers participated in the project in the 2005-2006 school year. In the 7 schools
using structured English immersion, a total of 21 teachers participated (11 in treatment and 10 in control).
In the 12 schools using transitional bilingual education, 44 participated (21 in treatment and 23 in control).
Due to the relatively high number of first grade teachers who were trained in 2004-2005 and lost due to
attrition (8 of 33 or 24%), and the high cost associated with training teachers who leave the school prior to
implementation, it was decided that second grade teachers would not be selected or trained until the summer of 2006.
In the memorandum of understanding signed with the schools, principals agreed to keep the Kindergarten students from the 2004-2005 school year in the same type of classroom in first grade (2005-2006)
to which they had been assigned in Kindergarten (i.e., treatment or control, and English immersion or transitional bilingual education). A total of 1,214 students participated in the project in the 2005-2006 school year. Students in the current year were either (1) those who participated in Kindergarten and were followed into first grade, or (2) those who were selected to replace students lost to attrition. All first grade students in each
participating classroom (returning and new students) were selected to participate in the student assessment
portion of the project. Table 1 below outlines the number of students in the study during the 2005-2006
school year.
Students were administered a battery of language- and literacy-related skill assessments twice during the school year: once in October/November and again in April/May. Data from the second assessment
has been scanned and verified and is currently undergoing quality checks. Student assessment data for the
entire first grade year (fall and spring) will be ready for analyses by August 1, 2006. A summer project
meeting is scheduled for August 21, 2006, at which preliminary analyses from the first grade year will be
presented and discussed. Additional analyses of longitudinal performance from K to grade 1 will be completed in the fall of 2006.
In addition to student assessments, mentors visited with teachers assigned to the treatment groups
on a weekly basis beginning in the fall of 2005. Furthermore, observers visited each treatment teacher’s
classroom three times during the year to code fidelity to treatment. At each of the three time points, fidelity observations were conducted over the course of an entire day, covering (1) SIOP, (2) the Tier II intervention, (3) Language and Literacy classroom intervention instruction (English and/or Spanish), (4) Language Enrichment (English)/Esperanza (Spanish), and (5) McMillan (English and/or Spanish). General observations of
classroom instruction were also completed three times per year, on a separate day, in all treatment and control classrooms during the reading and language arts block.
Analyses
Previous analyses were completed to examine the effectiveness of the randomization; tests of fixed effects (i.e., mean differences between Treatment and Control groups) indicated that we can be confident that the randomization was carried out effectively and that no serious pre-treatment bias existed between treatment and control classrooms in either arm of the study. Analyses of all Kindergarten spring performance data were completed to determine the effect of treatment on student achievement during the kindergarten year. We fit models estimating random effects at the teacher level, with teachers nested within school. We did not estimate separate random effects at the school level, but did estimate separate random effects for teachers in English-Only and teachers in Transitional Bilingual Education classrooms. Specifically, analyses were completed using multilevel mixed models with pre-test performance and Tier 2 designation as student-level covariates and with students nested within classrooms. Treatment was examined as a teacher-level effect, and the interactions of pre-test, Tier 2 status, and treatment as cross-level effects. Tables 2 and 3 present the pre-test and end-of-year means and standard deviations for each performance measure by Treatment group and Tier 2 status: Table 2 presents English performance measures for the English only program model group, while Table 3 presents Spanish performance measures for the Bilingual Education group.
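A minimal PROC MIXED sketch of the model just described follows; the dataset and variable names are hypothetical, since the report does not reproduce its actual code.

proc mixed data=k_spring;
  class school teacher treatment tier2;
  /* Post-test modeled on pre-test (Wave 1), Tier 2 status, treatment,
     and their two- and three-way interactions */
  model post = pre tier2 treatment pre*tier2 pre*treatment
               tier2*treatment pre*tier2*treatment / solution;
  /* Random intercept for teacher nested within school */
  random intercept / subject=teacher(school);
run;

The analysis would be run separately for each outcome within each program model, mirroring the fixed effects reported in Tables 4 and 5.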
Results of the mixed model analyses are presented in Tables 4 and 5 (English and Spanish, respectively). Significant effects involving treatment were found for letter naming and sound identification in
English for the English only group, and in Spanish for the Bilingual Education group. Specifically, for the
English only group, for both letter naming and letter sound identification there were significant interactions
between pre-test performance and treatment group. Figures 1 and 2 illustrate the interactions from these
two models. As can be seen in these figures, treatment and control students who begin the year in the mid
to high performance range tend to be performing similarly at the end of the year. However, of the students
who begin the year performing at lower levels, those in the treatment group tend to be performing at higher
levels at the end of the year than those in the control group. Thus, the impact of treatment on English performance in the English only group appears to center on increasing the alphabetic skills of students who begin the
year with lower levels of performance.
In the Bilingual Education group, there was a significant interaction for letter naming between pretest performance, tier 2 status and treatment. In addition, a significant interaction was found between pretest performance and treatment group for letter sound identification. Figure 3 illustrates the interaction
found for Spanish letter naming performance. As seen in this graph, the impact of treatment appears to
function primarily to decrease the spread of scores within the group of students who are designated as Tier
2, especially those who also begin the year with lower scores within the Tier 2 group. Figure 4 illustrates
the interaction found for Spanish letter sound identification performance. As seen in this figure, the impact
of treatment appears to be primarily with those students who began the year in the mid- to high performance range, who, at the end of the year, are performing slightly higher than students in the control group
who began the year at similar levels.
In addition to providing tables of means and standard deviations and results of statistical comparisons of Treatment and Control classrooms by Program Model, we provide information in Tables 6 and 7
about variability across classrooms. In Tables 6 and 7 we report the estimate of variance at the teacher (i.e., classroom) level as well as an estimate of the pooled within-classroom variance, and the computed intraclass correlation (ICC), for models including only covariates and for models including covariates and treatment group, for EO and TBE classrooms respectively (Table 6 = EO and Table 7 = TBE). The ICC is the ratio of the between-classroom variance to the sum of the between-classroom variance and within-classroom variance (a worked example follows this paragraph). As can be seen in both tables, there is a significant amount of both within- and between-classroom variance in student performance scores at the end of the kindergarten year that is not accounted for by pre-test performance and Tier 2 status alone (Covariates Only). Examination of the variance in the models that also account for treatment group (Treatment Plus Covariates) suggests that within- and between-classroom variance tends to be reduced when treatment group is added to the model. Because treatment was only provided for 12 weeks during the Kindergarten year, we would expect this trend to be stronger when examining performance differences in first grade and across the kindergarten and first grade years.
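As a worked example of the ICC computation (using the Letter Name, Covariates Only estimates in Table 6):

\[
\text{ICC} = \frac{\sigma^2_{\text{teacher}}}{\sigma^2_{\text{teacher}} + \sigma^2_{\text{residual}}}
= \frac{0.54}{0.54 + 10.95} \approx 0.05 .
\]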
It is important to keep in mind that the results reported here are for Kindergarten data only.
Analyses in the summer and fall of 2006 will examine student growth over time, identify characteristics of
students that relate to optimal growth under each instructional model, identify characteristics of instruction
that lead to optimal growth, and examine if/how differences between language outcomes change over time.
Table 1. Sample information for first grade 2005-2006
2005-2006 (First Grade)                                   N
Total number of students in Wave 1                        1214
  Returning students                                      856
  New Grade 1 students                                     358
  Gender:    Male                                          626
             Female                                        586
             Unknown                                       2
  Ethnicity: Caucasian                                     6
             Hispanic                                      1181
             African-American                              0
             Other/Unknown                                 27
Number of students leaving from Wave 1 to Wave 2           157
Total number of students participating in Wave 2           1057
  Gender:    Male                                          545
             Female                                        510
             Unknown                                       2
  Ethnicity: Caucasian                                     5
             Hispanic                                      1052
             African-American                              0
             Other/Unknown                                 0
Table 2. – Means and Standard Deviations for English Language Measures for English Only Students by
Tier 2 status.
Measure                       Group        Tier 2   Pre-Test Mean (sd)   Post-Test Mean (sd)
Letter-Name Identification    Comparison   0        20.95 (7.92)         23.32 (5.64)
                              Comparison   1        14.91 (10.21)        19.50 (7.86)
                              Treatment    0        23.12 (5.05)         24.43 (4.27)
                              Treatment    1        19.40 (6.82)         23.35 (4.52)
Letter Sound Identification   Comparison   0        19.99 (7.58)         23.27 (5.02)
                              Comparison   1        11.30 (9.51)         19.53 (8.18)
                              Treatment    0        21.67 (6.01)         24.47 (2.81)
                              Treatment    1        16.55 (7.67)         22.00 (5.23)
Phonological Awareness        Comparison   0        37.92 (13.45)        49.54 (12.38)
                              Comparison   1        24.00 (7.23)         34.78 (8.60)
                              Treatment    0        42.87 (15.21)        54.78 (14.55)
                              Treatment    1        27.64 (8.26)         40.66 (11.19)
WLPB: Letter Word             Comparison   0        98.66 (18.06)        102.81 (19.29)
Identification                Comparison   1        84.53 (15.73)        89.84 (15.71)
                              Treatment    0        102.77 (14.87)       107.84 (14.79)
                              Treatment    1        91.76 (10.03)        99.41 (13.96)
WLPB: Picture                 Comparison   0        75.37 (23.52)        83.29 (23.92)
Vocabulary                    Comparison   1        69.55 (24.64)        77.59 (18.92)
                              Treatment    0        82.93 (18.94)        87.92 (19.45)
                              Treatment    1        75.24 (12.82)        78.89 (14.80)
WLPB: Listening               Comparison   0        60.37 (19.96)        68.98 (22.88)
Comprehension                 Comparison   1        52.56 (19.75)        58.81 (21.13)
                              Treatment    0        68.68 (17.98)        75.51 (19.22)
                              Treatment    1        53.16 (17.46)        63.62 (14.73)
Table 3. – Means and Standard Deviations for Spanish Language Measures for Transitional Bilingual Students by Tier 2 status.

Measure                       Group        Tier 2   Pre-Test Mean (sd)   Post-Test Mean (sd)
Letter-Name Identification    Comparison   0        22.11 (8.10)         26.78 (4.89)
                              Comparison   1        14.18 (9.24)         18.97 (9.72)
                              Treatment    0        21.22 (8.33)         25.64 (5.96)
                              Treatment    1        14.06 (9.42)         20.24 (8.41)
Letter Sound Identification   Comparison   0        22.45 (7.51)         26.87 (5.44)
                              Comparison   1        12.49 (9.07)         18.05 (10.14)
                              Treatment    0        20.56 (8.89)         25.74 (6.12)
                              Treatment    1        11.41 (8.77)         18.66 (9.18)
Phonological Awareness        Comparison   0        36.69 (14.35)        52.40 (17.53)
                              Comparison   1        23.86 (8.13)         32.21 (13.22)
                              Treatment    0        37.31 (15.74)        50.45 (16.95)
                              Treatment    1        24.21 (7.14)         32.89 (11.57)
WLPB: Letter Word             Comparison   0        101.26 (20.27)       117.44 (25.68)
Identification                Comparison   1        84.63 (14.49)        86.71 (17.47)
                              Treatment    0        101.22 (21.36)       112.82 (25.73)
                              Treatment    1        84.46 (14.41)        90.08 (16.61)
WLPB: Picture                 Comparison   0        80.32 (24.78)        87.38 (28.29)
Vocabulary                    Comparison   1        64.11 (27.22)        65.45 (26.17)
                              Treatment    0        78.22 (27.39)        85.17 (32.41)
                              Treatment    1        64.02 (17.93)        68.90 (20.35)
WLPB: Listening               Comparison   0        80.71 (17.57)        84.72 (18.45)
Comprehension                 Comparison   1        66.96 (24.25)        66.54 (25.44)
                              Treatment    0        79.18 (18.03)        84.82 (19.28)
                              Treatment    1        68.34 (18.54)        73.44 (16.63)
Table 4. – Mixed model results with significant treatment effects for English language measures for English only students
Measure                       Effect                         Num DF   Den DF   F Value   Pr > F
Letter Naming                 Wave 1                         1        346      321.50    <.0001
                              Tier 2                         1        346      0.54      0.4637
                              Treatment                      1        17       10.41     0.0050
                              Wave 1 * Tier 2                1        346      0.18      0.6687
                              Wave 1 * Treatment             1        346      8.82      0.0032
                              Tier 2 * Treatment             1        346      3.68      0.0558
                              Wave 1 * Tier 2 * Treatment    1        346      2.94      0.0874
Letter Sound Identification   Wave 1                         1        346      224.91    <.0001
                              Tier 2                         1        346      1.38      0.2401
                              Treatment                      1        17       5.39      0.0330
                              Wave 1 * Tier 2                1        346      1.48      0.2251
                              Wave 1 * Treatment             1        346      5.04      0.0254
                              Tier 2 * Treatment             1        346      0.12      0.7329
                              Wave 1 * Tier 2 * Treatment    1        346      0.04      0.8363
Phonological Awareness        Wave 1                         1        346      128.82    <.0001
                              Tier 2                         1        346      0.39      0.5331
                              Treatment                      1        17       0.98      0.3371
                              Wave 1 * Tier 2                1        346      5.89      0.0157
                              Wave 1 * Treatment             1        346      0.03      0.8542
                              Tier 2 * Treatment             1        346      0.21      0.6485
                              Wave 1 * Tier 2 * Treatment    1        346      0.06      0.8026
Listening Comprehension       Wave 1                         1        340      108.43    <.0001
                              Tier 2                         1        340      0.03      0.8696
                              Treatment                      1        17       0.84      0.3727
                              Wave 1 * Tier 2                1        340      1.35      0.2468
                              Wave 1 * Treatment             1        340      0.48      0.4871
                              Tier 2 * Treatment             1        340      0.04      0.8404
                              Wave 1 * Tier 2 * Treatment    1        340      0.02      0.9013
Letter Word Identification    Wave 1                         1        345      211.12    <.0001
                              Tier 2                         1        345      0.45      0.5030
                              Treatment                      1        17       1.36      0.2597
                              Wave 1 * Tier 2                1        345      0.65      0.4207
                              Wave 1 * Treatment             1        345      0.58      0.4463
                              Tier 2 * Treatment             1        345      2.05      0.1529
                              Wave 1 * Tier 2 * Treatment    1        345      1.90      0.1694
Picture Vocabulary            Wave 1                         1        346      249.61    <.0001
                              Tier 2                         1        346      0.42      0.5197
                              Treatment                      1        17       2.52      0.1308
                              Wave 1 * Tier 2                1        346      0.99      0.3214
                              Wave 1 * Treatment             1        346      2.41      0.1216
                              Tier 2 * Treatment             1        346      0.52      0.4722
                              Wave 1 * Tier 2 * Treatment    1        346      0.60      0.4402
Table 5. – Mixed model results with significant treatment effects for Spanish language measures for Transitional bilingual students
Measure                       Effect                         Num DF   Den DF   F Value   Pr > F
Letter Naming                 Wave 1                         1        620      516.79    <.0001
                              Tier 2                         1        620      44.81     <.0001
                              Treatment                      1        34       3.02      0.0915
                              Wave 1 * Tier 2                1        620      37.06     <.0001
                              Wave 1 * Treatment             1        620      2.32      0.1280
                              Tier 2 * Treatment             1        620      8.77      0.0032
                              Wave 1 * Tier 2 * Treatment    1        620      6.30      0.0123
Letter Sound Identification   Wave 1                         1        616      379.68    <.0001
                              Tier 2                         1        616      41.23     <.0001
                              Treatment                      1        34       4.33      0.0450
                              Wave 1 * Tier 2                1        616      31.45     <.0001
                              Wave 1 * Treatment             1        616      10.63     0.0012
                              Tier 2 * Treatment             1        616      0.00      0.9920
                              Wave 1 * Tier 2 * Treatment    1        616      1.59      0.2084
Phonological Awareness        Wave 1                         1        621      193.78    <.0001
                              Tier 2                         1        621      14.91     0.0001
                              Treatment                      1        34       0.04      0.8471
                              Wave 1 * Tier 2                1        621      0.03      0.8566
                              Wave 1 * Treatment             1        621      0.24      0.6280
                              Tier 2 * Treatment             1        621      0.10      0.7502
                              Wave 1 * Tier 2 * Treatment    1        621      0.20      0.6522
Listening Comprehension       Wave 1                         1        622      549.78    <.0001
                              Tier 2                         1        622      0.01      0.9028
                              Treatment                      1        34       0.41      0.5258
                              Wave 1 * Tier 2                1        622      1.51      0.2193
                              Wave 1 * Treatment             1        622      0.02      0.8818
                              Tier 2 * Treatment             1        622      1.01      0.3159
                              Wave 1 * Tier 2 * Treatment    1        622      0.67      0.4138
Letter Word Identification    Wave 1                         1        616      209.59    <.0001
                              Tier 2                         1        616      1.08      0.2986
                              Treatment                      1        34       1.24      0.2737
                              Wave 1 * Tier 2                1        616      0.01      0.9364
                              Wave 1 * Treatment             1        616      0.97      0.3255
                              Tier 2 * Treatment             1        616      0.05      0.8224
                              Wave 1 * Tier 2 * Treatment    1        616      0.03      0.8522
Picture Vocabulary            Wave 1                         1        611      412.20    <.0001
                              Tier 2                         1        611      0.58      0.4483
                              Treatment                      1        34       1.63      0.2107
                              Wave 1 * Tier 2                1        611      0.00      0.9947
                              Wave 1 * Treatment             1        611      2.08      0.1495
                              Tier 2 * Treatment             1        611      0.24      0.6266
                              Wave 1 * Tier 2 * Treatment    1        611      0.49      0.4823
Figure 1. Mixed model interaction (pre-test by treatment) effect for letter naming in English - for English only students
Main Effect: Wave 1 (p<.001)
Main Effect: Treatment Group (p<.005)
Interaction: Wave 1 * Treatment Group (p<.003)
EARLY ENGLISH PROGRAM / ENGLISH / LETTER NAMING
[Bar chart: adjusted post-test Letter Naming scores (Total Correct) for Comparison and Treatment students at low, average, and high pre-test levels; the groups perform similarly at average and high pre-test levels, while treatment students exceed comparison students at low pre-test levels.]
Figure 2. Mixed model interaction (pre-test by treatment) effect for letter sound identification in English - for English only students
Main Effect: Wave 1 (p<.001)
Main Effect: Treatment Group (p<.03)
Interaction: Wave 1 * Treatment Group (p<.02)
EARLY ENGLISH PROGRAM / ENGLISH / LETTER SOUND IDENTIFICATION
[Bar chart: adjusted post-test Letter Sound Identification scores (Total Correct) for Comparison and Treatment students at low, average, and high pre-test levels.]
Figure 3. Mixed model interaction (pre-test by treatment) effect for letter naming in Spanish - for Bilingual Education students
Main Effect: Wave 1 (p<.0001)
Main Effect: Tier 2 (p<.0001)
Interaction: Wave 1 * Treatment Group (p<.0001)
Interaction: Wave 1 * Tier 2 (p<.003)
Interaction: Wave 1 * Treatment Group * Tier 2 (p<.01)
BILINGUAL PROGRAM / SPANISH / LETTER NAMING
[Bar chart: adjusted post-test Letter Naming scores (Total Correct) for Comparison and Treatment students, shown separately for Tier 2 = 0 and Tier 2 = 1, at low, average, and high pre-test levels.]
Figure 4. Mixed model interaction (pre-test by treatment) effect for letter sound identification in Spanish - for Bilingual Education students
Main Effect: Wave 1 (p<.0001)
Main Effect: Tier 2 (p<.0001)
Interaction: Wave 1 * Treatment Group (p<.0001)
Interaction: Wave 1 * Tier 2 (p<.0001)
BILINGUAL PROGRAM / SPANISH / LETTER SOUND IDENTIFICATION
[Bar chart: adjusted post-test Letter Sound Identification scores (Total Correct) for Comparison and Treatment students at low, average, and high pre-test levels.]
Table 6. – Variance Components and Intra-Class Correlations for English Outcome Measures for Teachers
in English-Only (EO) Classrooms.
                          Covariates Only                  Treatment Plus Covariates
Measure                   Residual   Teacher   ICC         Residual   Teacher   ICC
Letter Name               10.95      0.54      0.05        10.60      0.34      0.03
Letter Sounds             9.87       1.27      0.13        9.71       0.98      0.10
Phonological Awareness    67.50      7.77      0.12        65.88      7.10      0.11
WLPB: LC                  190.43     92.22     0.48        184.41     90.12     0.49
WLPB: LW                  88.83      18.67     0.21        86.17      15.26     0.18
WLPB: PV                  123.90     16.12     0.13        119.94     18.74     0.16
Table 7. – Variance Components and Intra-Class Correlations for Spanish Outcome Measures for Teachers
in Transitional Bilingual (TBE) Classrooms.
                          Covariates Only                  Treatment Plus Covariates
Measure                   Residual   Teacher   ICC         Residual   Teacher   ICC
Letter Name               15.98      7.97      0.50        14.59      7.33      0.50
Letter Sounds             21.02      8.95      0.43        19.33      9.18      0.48
Phonological Awareness    99.14      31.96     0.32        92.78      26.74     0.29
WLPB: LC                  147.27     21.63     0.15        143.81     19.05     0.13
WLPB: LW                  257.83     85.70     0.33        243.62     71.30     0.29
WLPB: PV                  309.73     20.17     0.07        305.68     18.54     0.06
Progress in Development and Implementation of the Language and Literacy Curriculum
Diane August, Center for Applied Linguistics
Maria Carlo, University of Miami
Elsa Hagan, Texas Institute for Measurement, Evaluation and Statistics
Tier 1: Curriculum and Instruction
For the first grade classrooms, we developed scripted materials that were used during the 90-minute reading block to build phonological awareness, phonics, fluency, vocabulary, and comprehension in English-language learners, as well as 30 minutes of English-as-a-second-language materials for the first grade students to be used during the 60-minute ESL period. The materials for the ESL period included
scripted teacher read-alouds using high quality children’s literature and picture cards to teach vocabulary
aligned with the read-alouds. The materials developed for the bilingual program were in Spanish while
those developed for the structured English immersion program were in English. See Appendix A for a sample week lesson in English.
For both the bilingual program model and the structured English-immersion program model we
have developed curriculum that is closely aligned with state and district standards. We are working closely
with the district Directors of Reading First and Bilingual Education to ensure that this is the case. We also
developed review lessons to help students consolidate what they had learned. In addition, our materials
provided for ongoing assessment to help teachers carefully monitor student progress.
Tier 1: Strategies that Integrate Language Acquisition and Academic Achievement
For the first grade English-as-a-second language program, we have developed prototype curriculum that uses children’s literature to teach content knowledge, as well as language and literacy. Scripting
of actual lessons and development of curriculum materials for grade 2 will be completed during the summer months.
Tier 1: English Proficiency through Peer Interaction
All of our materials used Partner Talk and Partner Reading to give students an opportunity to talk
with each other. Teachers are instructed to pair children so that those strong in English proficiency and literacy skills are paired with students who are still acquiring these skills.
Tier 1: Professional Development
We provided professional development to all first grade treatment teachers throughout the school
year (training was also made available for Kindergarten control teachers from the previous year). We developed a high-quality 30-minute training video that demonstrated the strategies needed to implement the first grade curriculum. We provided two days of professional development prior to the implementation of the curriculum, with three days of follow-up visits to observe teachers implementing the curriculum and to work with teachers to refine their skills. Each teacher was assigned a mentor to assist with the implementation of the curriculum. Classroom observations and feedback sessions were conducted on a biweekly basis. In addition, the principal investigator conducted biweekly telephone conferences with the mentors to provide further guidance on the effective implementation of the curriculum. We are also training second grade teachers this summer in preparation for the upcoming school year.
The SIOP Model Intervention for Grade 1
Debbie Short, Center for Applied Linguistics
During the 2005-2006 school year, one major accomplishment was delivery of the professional
development program on the SIOP Model of sheltered instruction to the Grade 1 teachers (in the bilingual
and ESL programs), to the mentors who provide site-based coaching, and to the observers who collect data
on teacher implementation of the model. The professional development program consisted of 5 days of
workshop training for all teachers by CAL SIOP researchers, distributed throughout the year. The focus
was on using the SIOP Model in mathematics. During these sessions, the teachers learned about and practiced instructional techniques that integrate academic language development with content area instruction,
observed and analyzed videotaped instruction of teachers using the SIOP Model in real classrooms, and
developed lesson plans that they could deliver to their students.
Mentors were trained on the SIOP Model before the school year began and attended most of the
teachers’ sessions as well. The mentors conducted biweekly observations and feedback sessions with first
grade teachers on their SIOP math instruction. In these sessions, mentors used the SIOP observation form
to organize their observation notes; and in follow-up sessions with teachers, the mentors were able to offer
concrete guidance for the teachers’ instructional practices. In addition, a closed listserv was set up so the
mentors and CAL SIOP researchers could communicate regularly. Combined, these professional development activities enabled the first grade teachers to reach a high level of implementation by the year’s end.
Two waves of teacher implementation data were collected this past year so researchers could determine the fidelity of implementation to the model. CAL researchers also made informal observations in
intervention classrooms to assess the teachers’ level of implementation and to determine needs for future
workshops.
Finally, the SIOP researchers presented information about the study and the first year’s efforts at
several professional conferences, including the National Association of Bilingual Education, the International Reading Association, and the Teachers of English to Speakers of Other Languages association. A
Web page describing the project was developed and can be found on the Center for Applied Linguistics’
Web site (www.cal.org).
Development and Implementation of Tier 2 and 3 Intervention
Sharon R. Vaughn, University of Texas at Austin
Sylvia Linan-Thompson, University of Texas at Austin
Summer 2005
The Austin team updated the Spanish Tier 2 intervention for first grade, Lectura Proactiva para
Principiantes: Intensive Small Group Instruction for Spanish Speaking Readers, a 400-page, daily-scripted, supplemental instruction program in Spanish that maps the acquisition of reading in Spanish for bilingual students. In addition, they developed and piloted a two-day professional development session to train tutors, teachers, mentors, and observers.
Fall 2005
Each month, the researchers in Austin participated in weekly conference calls held among all the principal investigators. These calls served as a forum for discussing issues and making decisions pertaining
to the study.
Twenty-one Brownsville teachers, 7 mentors and observers received professional development
from Sharolyn Pollard-Durodola. The Austin team provided all of the materials needed to begin implementing the Spanish Tier 2 intervention for first grade, Lectura Proactiva para Principiantes.
Winter 2006
The Austin team prepared Reading Games/Juegos de lectura materials needed for 30 Spanish and
10 English Brownsville Kindergarten control group teachers. Teachers received a Game Plan book of 75
lessons and all supporting materials necessary for implementation of this kindergarten reading intervention.
Kathryn Prater provided professional development for 40 teachers.
Spring 2006
Twenty-one Brownsville first grade teachers implemented Lectura Proactiva para Principiantes from October through April, and then tests were conducted to gauge the effectiveness of the intervention.
The Austin team worked on the development of Tier 3 instructional materials.
Appendix A – Sample Lesson Plan for Core Reading (MacMillan)
Week 17: Grammar and Writing Chart
Day 1
  Grammar and Usage / Mechanics: Review of Writing Sentences
  Writing: Generative writing to a prompt, using the mechanics reviewed above
  Materials: Transparency: Week 17 Day 1 Photo; Transparencies: Grammar Mechanics
  Practice or Assessment: McGraw-Hill Grammar Practice Book: pp. 29 and 30

Day 2
  Grammar and Usage / Mechanics: Review: Contractions with Not
  Writing: Prewriting/shared writing of narratives
  Materials: Transparency: Week 17 Parts of Speech; Week 17 Day 2 transparency; Transparencies Day 2-a through 2-d; Transparency: Week 17 Day 2 Photo
  Practice or Assessment: McGraw-Hill Grammar Practice Book: p. 89 (optional: p. 90)

Day 3
  Grammar and Usage / Mechanics: Review: Contractions with Not
  Writing: Draft: shared writing of narratives
  Materials: Transparency: Day 3
  Practice or Assessment: McGraw-Hill Grammar Practice Book: p. 91 (optional: p. 92)

Day 4
  Grammar and Usage / Mechanics: Review of Writing Sentences
  Writing: Revise and edit: shared writing of narratives
  Materials: Transparency: Day 4
  Practice or Assessment: McGraw-Hill Grammar Practice Book: pp. 31 and 32

Day 5
  Grammar and Usage / Mechanics: Grammar assessment
  Writing: Publish and shared reading of narratives; Echo Reading of their story
  Practice or Assessment: McGraw-Hill Grammar Practice Book: p. 94
Week 17 DAY 1
Materials:
Erasable Transparency Markers
Week 17 – Day 1: Grammar Mechanics Transparency
McGraw-Hill Grammar Practice Book: pp. 29 and 30
Week 17 Day 1 - Photo
1. Writing Mechanics: Writing Sentences
Introduce a Concept
Remember there are different kinds of sentences. There are
statements, questions and exclamations.
A statement tells us something and ends with a period.
A question asks something and ends with a question mark.
An exclamation is a sentence that shows strong feelings and ends with
an exclamation mark.
When we write sentences we need to begin each sentence with a capital
letter and end it with the correct punctuation mark. Let’s look at this
transparency.
(Direct students’ attention to the transparency: Week 17 Grammar Mechanics Day 1).
Who would like to read the first sentence? (Select a volunteer).
The dog is black and white.
This is a statement because it tells us something. It tells us that the dog
is black and white.
Who would like to read the second sentence? (Select a volunteer).
Is the dog black and white?
This is a question because it is asking something. It is asking if the dog
is black and white.
Who would like to read the third sentence? (Select a volunteer).
That is a big dog!
This is an exclamation because it shows emotion. The person is
surprised that the dog is so big.
Model
Direct children’s attention to the following sentences on the transparency.
4. the baby snake is very little
5. do you know about baby snakes
6. that boy would like to have a snake for a pet
7. what a big snake
Read the first sentence aloud. Say: I know that this is a statement because
this sentence is telling me something. It is telling me that the baby
snake is little. But this sentence is not correct. It does not begin with a
capital letter. It is a statement so it must end with a period. Select a
volunteer to make the corrections.
Read the next sentence aloud. Say: I know that this is a question because
this sentence is asking me something. It is asking me if I know about
baby snakes. But this sentence is not correct. It does not begin with a
capital letter. It is a question so it must end with a question mark.
Select a volunteer to make the corrections.
Follow the same procedure with the remaining two sentences.
Practice
Direct children’s attention to page 29 of the McGraw Hill Grammar Practice
Book. Read the title. Read the instructions. Work with the class to complete the
first question. Allow children to work independently to complete the remainder of
the activity.
Direct children’s attention to page 30 of the McGraw Hill Grammar Practice
Book. Make sure children understand the instructions. Complete question one with
the entire class. Allow children to work independently to complete the remainder of
the activity.
2. Generative Writing
Present Concept
Direct the class’s attention to Writing Photo Week 17-1 My Story Writing and have
students discuss the picture.
We are going to write a sentence about this picture.
Do you remember Chet and Jake? They had backpacks and they
went camping at the lake.
Model Concept
Look at this picture. Who can tell what the man is doing?
Solicit
responses from the children and elaborate on them. Write one response on the board, but
omit all punctuation and capitalization. Guide the students to correct punctuation
and capitalization on the board.
Practice
Have students write their own sentence about the picture in their notebook. Have
them attend to capitalization and punctuation.
Week 17 DAY 2
Materials:
Transparency: Week 17 Day 2
Overhead Transparency- Grammar: Macmillan SAILL - Week 17- Day 2 Photo
McGraw-Hill Grammar Practice Book: pp. 89 and 90
1. Grammar
Lesson Background
The overhead transparency for the grammar concept contains a chart with six
columns. The columns are labeled: nouns, verbs, pronouns, adjectives, adverbs, and
other. Words that are placed in the noun, verb, pronoun, and adjective columns need
to be accompanied with an explanation. The words that belong in the remaining 2
columns do not require any reason. The teacher simply states: I will put this word in
this column.
Introduce a Concept
Direct children’s attention to Transparency Week 17-2.
My mom and dad camp with me. It is fun. We set up
the tent by the lake but we can’t swim. We tramp in the
woods with our backpacks. We can get fish for dinner.
At night we sleep in the tent and dream.
Nouns   Verbs   Pronouns   Adjectives   Adverbs   Other
Have the class read the story on the overhead transparency. Review the description
of nouns, verbs, pronouns and adjectives.
Model
Place the overhead transparency so children can see the headings. Review each of
the parts of speech: noun [words that name a person, place, or thing], verb [words of
being or action words], pronoun [words that take the place of a noun], and adjective
[words that tell about or describe nouns].
First, I’m going to show you what to do. Let’s see, the first word I am
going to talk about is my. I am going to put it in the adjective column. The
next word is mom. Mom names a person so it is a noun. I will place
mom in the noun column. I will put and in this column. (The column
labeled ‘other’). Dad is a noun, too. Camp is a word that does two jobs.
It is a word that is a name of a thing so it goes in this column [Noun].
Camp is also an action word so it goes in the verb column, too.
Practice
Let’s put the rest of the words into the different columns. As children tell
you the words, write them on the overhead transparency. Organize them according
to function: nouns in one column, verbs in another column, etc. (nouns, verbs,
pronouns, adjectives, adverbs, and other). You can elaborate as to why nouns,
action verbs, pronouns and adjectives are being placed in the particular column.
However, you do not need to explain why the words are placed in the remaining
columns at this time.
Have children read all of the words in each column.
The completed chart should look like this.
Nouns:      mom, dad, camp, tent, lake, woods, backpacks, fish, dinner, night, dream
Verbs:      camp, set, is, put, tramp, can, fish, sleep, dream
Pronouns:   me, we, it
Adjectives: my, fun, our
Adverbs:    up
Other:      the, at, by, in, with, for, can’t
Introduce a Concept:
We are going to be talking about two words in the chart- can and
can’t. Remember, we talked about two words that we can squeeze
together and they make one word. These words are called
contractions. But when we squeeze the words together, some letters
get squeezed out. We use an apostrophe to show that these letters
have been taken out.
Model
For example, some people can sing and some people cannot sing. We
can say: The man cannot sing or we can say The man can’t sing. These
sentences mean the same thing.
Write The man cannot sing & The man can’t sing on the board. Can’t
is the short form – the contraction- for cannot. What changes have we
made to cannot to shorten it? [deleted n & o]. Good. What do we do to
show that letters are missing? [use an apostrophe].
Let’s look at these words on the transparency. Read them with me.
are not
is not
These are the long forms of the words. We can squeeze two words
together to make the short forms – the contractions. The first two
words are are not. I can make a new word: aren’t. Write aren’t beside are
not. When we squeeze the two words together, what letters oozed out?
[n o ]. Correct. But I see an apostrophe. It takes the place of those two
letters. Follow the same procedure for is not. Look at these sentences. [Read
the first sentence aloud ].
1. They are not working on the picture.
They ________ working on the picture.
2. The girl is not here.
The girl _______ here.
I know that aren’t means the same thing as are not. So I will write
aren’t in the blank. Write aren’t in the blank. Follow the same procedure for the
next question.
Practice
Direct children’s attention to the last four sentences on the transparency.
3. The boys are not playing ball.
4. The snake isn’t very big.
5. Those children aren’t going to swim in the lake.
6. The dog is not eating.
Read sentence number 3. Does this sentence have the long form of
two words or the short form? [long form].
Proceed in the same manner to complete the last three sentences.
Distribute page 89 of the McGraw-Hill Reading Grammar Practice Book. Discuss the
material in the box. After reviewing the instructions and completing the first question
as an example, allow children to complete this activity.
Distribute page 90 of the McGraw-Hill Reading Grammar Practice Book. Discuss the
material in the box. After reviewing the instructions and completing the first question
as an example, allow children to complete this activity.
2. Generative Writing
Introduce a Concept
This week we are going to write a story together. There are many
steps in writing a story. We have to think about ideas for a story.
Then we have to organize our thoughts and write them down. Next,
we make sure that we have written our sentences correctly. Finally,
we make our story ready to share with others. Today, we are going to
begin the first step in writing a story. The first step is to think about
things we want to write about. In order to write a story
we need to think about and share ideas with each other. We will use
an illustration to guide us in this step. This step is known as the
prewriting stage. Show the picture for Week 17.
Model
What do we see in this picture? Let’s generate ideas. As we discuss
these ideas I will take notes so tomorrow I will be reminded of all the
ideas we generated today.
Show the transparencies Day 2-a through Day 2-d to model the pre-writing phase of
the writing process. Fill it out as you discuss the questions with the students. Use
point form to address the questions so it takes on the appearance that ideas were
brainstormed. The pre-writing stage is not written in sentences.
Today we generated a bunch of ideas. Tomorrow we will write a
draft form of our story. A draft form is when you first write the
story. It is not ready to share with other people because it is not
completely finished. So tomorrow we will continue to work on our
story.
Week 17: DAY 3
Materials:
Week 17: Transparency – Day 3
McGraw-Hill Grammar Practice Book: p. 91
1. Grammar: Contractions with Not
Introduce a Concept
Yesterday we learned about joining two words together to make a
short form or a contraction. When we join two words together some
letters are squeezed out. We use an apostrophe to take the place of
the letters that are left out.
Direct children’s attention to the first sentence on the transparency.
This isn’t a big ship.
Isn’t is a short form of the two words is not. When the two words
were joined together the letter o was squeezed out. The apostrophe
is there to show that this letter is missing. Isn’t is the contraction for
is not.
Model
Let’s look at these sentences on the transparency.
1. The man is not packing to go on a trip.
2. They are not playing ball in the park.
3. Dan and Bob are not making a mask for the school play.
4. The lamp is not lit.
Read each sentence with the class. Each of these sentences has two
underlined words. We are going to write the contraction for these
words. The two underlined words in sentence number 1 are is not. I
know that when we make a contraction we put two words together.
When we join two words some letters are left out. An apostrophe
takes the place of the missing letter. I know that the contraction for
is not is isn’t. When we squeeze together is and not the letter o gets
oozed out. So I will write the contraction and put an apostrophe
where the o was. I will write the two words close together to make
one word. [Write the word on the transparency]. Now the sentence reads
The man isn’t packing to go on a trip.
Follow the same procedure for the remaining sentences. Select volunteers to write
the contraction on the transparency.
Direct children’s attention to the remaining sentences on the transparency.
1. That isnt my dog.
2. They arent sleeping in the tent.
3. The girl isnt jumping up and down.
4. The frogs arent jumping on that branch.
Let’s read the first sentence. I know that isn’t is a contraction. But I
know a contraction must have an apostrophe to show that letters
have been left out. What letter is missing when we join is and not?
[o]. Good. So I must put an apostrophe in the place where the letter
o was. So I will put the apostrophe between the letters n and t.
Follow the same procedure for the remaining sentences. Allow students to place
the apostrophe in the appropriate place on the transparency.
Practice
Distribute page 91 of the McGraw-Hill Reading Grammar Practice Book.
Discuss the material in the box. After reviewing the instructions and completing
the first question as an example, allow children to complete this activity.
2. Generative Writing
Introduce a Concept The Writing Process- Draft
Yesterday we looked at a picture of people camping. We talked about
many different ideas. We talked about the characters and the setting
of the story. We even talked about things that were happening and
things that might happen next. We wrote down all of our ideas.
[Show the completed transparencies from the previous day—
transparencies for Day 2-a through Day 2-d. Review the content of these
transparencies.]
Today, we will use these ideas as we focus on the next phase for
writing the story. Today we are going to write a draft of our story.
This is the part of the writing process where we organize our ideas
and write them on paper. We are going to focus on putting our story
ideas into sentences. We are not going to worry about capital letters
and punctuation. We are just going to try to get our ideas into
sentences so that they make sense.
Practice
After reviewing the transparencies from the previous day, guide children to form a
sequence of events using transparencies e-g. Guide them to generate complete
sentences. Write these sentences on transparencies e-g. Write all words in lower
case letters and omit some of the ending punctuation.
We have created our draft of the story. Tomorrow we will proceed to
the next phase of writing stories. We will edit and revise our story so
that it will be ready to share with others.
Week 17 DAY 4
Materials:
McGraw-Hill Grammar Practice Book: pp. 31 and 32
Transparency Grammar Week 17: Day 4
1. Grammar-Writing Sentences
Introduce a Concept
On Monday we worked with writing sentences. All sentences begin
with a capital letter and must have an ending mark. Some sentences
end with a period, some sentences end with a question mark and
some sentences end with an exclamation mark.
Direct children’s attention to the transparency: Week 17- Day 4.
Look at the sentences on the transparency.
I saw three geese in the pond
what did you see in the woods
that bear is big
What type of sentence is the first one? [a statement] What do we need
to put at the end of this sentence? [a period] What kind of sentence is
what did you see in the woods? [a question] What do we need to put at
the end of this sentence? [a question mark] What kind of sentence is
that bear is big? [an exclamation] What do we need to put at the end
of this sentence? [an exclamation mark]
Model
Here are some sentences. Look at the part with the line under it. Is
there a mistake?
1. The boys went fishing in the lake. Did they get a fish They like to eat fish for
dinner.
2. Where do you want to go? the girl wants to go to the shop. There are lots of toys
in that shop.
3. The duck is in the pond. It likes to swim. It can swim very fast.
I will read the first set. Read sentence 1. I know that the underlined
sentence is asking a question. It is asking if the boys caught any fish
when they went fishing. But there is a mistake. A question must end
with a question mark. I will put a question mark on this sentence to
make it right. Follow the same procedure for the next two sets. Note: there
are no mistakes in set 3.
Practice
Distribute pages 31 and 32 of the McGraw-Hill Reading Grammar Practice Book.
Read the instructions. Work with the class to complete the first question and have
them complete this activity.
2. Generative Writing
Introduce a Concept The Writing Process- Revise and Edit
We have been working on writing a story. On Tuesday we
generated ideas for our story. Yesterday we completed the draft of
our story. Today we will edit and revise our story. Let’s look at our
draft. Refer to transparencies e-g completed the previous day.
Let’s look at our sentences very carefully. Where do we need to put
capital letters? Begin with the first sentence and let children state that this
sentence needs to begin with a capital letter. Does every sentence end with a
period? Select volunteers to report where ending punctuation is needed.
During this phase of the writing process, the teacher will use markers to edit the
sentences on the overhead projector. The teacher should rewrite each sentence
onto the poster paper after the lesson for use on day 5.
Week 17 DAY 5
1. Grammar
Materials:
McGraw-Hill Grammar Practice Book: p. 93
McGraw-Hill Grammar Practice Book: p. 94
1. Grammar Assessment (10 minutes)
Administer assessment for questions and exclamations on page 94 of the McGraw-Hill
Grammar Practice Book. Let the children know that they should do their very
best on this test; it will be used to help the teacher understand how well the children
are progressing.
2. Generative Writing
Introduce a Concept The Writing Process- Publish
Today we are going to complete the last phase of story writing. Now
that we have completed the editing and revising, our story is ready
to share with everyone. We have written a wonderful story and now
we will read it together. Then we will post our story in the classroom
so that we can read it all the time.
Echo Reading [See page 3 of Teacher’s Guide: McMillan-English/SAILL ]
First, read a sentence while the children follow along. Next, have children read the same sentence
along with you. Finally, instruct children to read the sentence with you, but stop
reading after the first two words so children can finish the sentence on their
own. Then proceed to the next sentence and follow the same procedure for each
sentence thereafter. Read with proper intonation.
First I’m going to read a sentence of the story and you are going to
follow what I am reading by looking at the chart. Don’t read it with
me this time; just follow with your eyes. [Read sentence.] Now, I am
going to read this sentence again and you are going to follow with
your eyes and read with me. [Read sentence again with children.] Now, we
are going to read the sentence one more time. But this time I am going
to stop reading after the first few words. You need to keep reading
until the end of the sentence. [Read the line once more and stop after the first
few words. Repeat until the story is read.]
Appendix B – Sample Lesson Plan for Language Enrichment
Language Enrichment
SAILL Project
Day One Hundred Thirty Five
Check LLP
Reading Deck Activity
Let’s begin with a review of these cards from our reading deck.
Tell me the name of the letter, the key word, and the sound.
New Concept Introduction
Students, take out your mirrors so that you can review a concept you
have already learned.
Listen and echo each word after me while you look in your
mirrors.
Sinless, nameless, useless
What do you hear in the final position of each of these words?
Yes, the /less/ sound
Look at the board.
sin + less = sinless,
name + less = nameless,
use + less = useless
Each of these words has /less/ at the end. I have added /less/ to the
base word. If you take off /less/ from each of these words there is a
base word. This is suffix -less. Suffix -less means without. We
code suffix -less with a box. Good Job!
What comes next?
Reading Practice
Now we will apply what you have learned. Take out a sheet of
paper. I will say words with suffix –less. You will write the word.
We will check the word that you have written. We will read the
words you have written. The first word is: restless. The next word
is: spotless, endless, helpless, hapless
Now let’s read the words.
restless, spotless, endless, helpless, hapless
Now I will check your work.
Good job reading and writing the suffix –less.
What comes next?
Review
Now we will review what you have learned.
(Teacher shows the Suffix Deck Card and asks.) What can you tell
me about suffix -less? (Students respond, “Suffix -less means
without.”) (If the students are not able to respond, the teacher
says, “My turn. Suffix -less means without.”)
(Teacher shows the Closed Syllable Concept Card and asks.) What
can you tell me about closed syllables? (Students respond “A
closed syllable ends in at least one consonant. The vowel in a
closed syllable is short; coded with a breve.”) (If the students are
not able to respond, the teacher says.) “My turn.” “A closed
syllable ends in at least one consonant. The vowel in a closed
syllable is short; coded with a breve.”
Good Job!
“What comes next?” The students will respond, “Oral Language
and Reading Comprehension.”
Language Enrichment
SAILL Project
Day One Hundred Thirty Six
Check LLP
Reading Deck Activity
Students, each day our lesson will begin with a review of these
cards from our reading deck. Each day you will tell me the name of
the letter, the key word, and the sound.
“What comes next?”
New Concept Introduction
Students, take out your mirrors so that you can discover a new
sound, a key word to unlock the sound and the letter or letters that
represent that sound.
Listen and echo each word after me while you look in your
mirrors.
Listen for the sound that is the same in all these words.
Boot, soon, smooth, shampoo, monsoon
Tell me the sound that is the same in all these words.
Yes, the sound is /oo/.
Say the sound again while looking in your mirror. Is the sound
open or blocked?
Yes, it is open.
Place your fingers on your vocal cords and say the sound one more
time.
Is the sound voiced or unvoiced?
Yes, it is voiced.
Is this a vowel sound or a consonant sound?
Yes, it is a vowel sound because it is open and voiced.
Listen to this riddle and discover the key word that will unlock this
sound.
The astronauts who blasted off
Made headlines very soon.
They thought they’d gone to heaven
When they landed on the (MOON).
Let me write the words on the board.
Boot, soon, smooth, shampoo, monsoon
What letter or letters are the same in all these words?
(Teacher shows students the reading deck card.)
Yes, the name of the vowel pair syllable is ___. (oo)
The key word is ____. (Moon)
The sound is ____. (/oo/)
Look at the reading deck card. The vowel pair syllable is
pronounced / oo /. The / oo/ sound is open and voiced.
The OO is a vowel pair syllable. It will be placed with the section
of the reading deck named vowel pair syllable.
“What comes next?” The students will respond:
Reading Practice
Now we will apply what you have learned. We will code the first
row together.
1. boot    scoop    rooster
2. goose   root     wood
3. proof   wool     room
4. igloo   gloom    soon
5. moon    shoot    broom
Review
Now we will review what you have learned.
(Show students the IRD card (2.64). Students say the name of the
letter, the key word, and the sound). If the students don’t remember
the information, the teacher says, “My turn. OO, Moon, / oo /,
Your turn” (Students respond).
(Teacher shows the Vowel Concept Card and asks.) What can you
tell me about a vowel? (Students respond, “Vowels are open and
voiced. The vowels are a, e, i, o, u.”) (If students are not able to
respond, the teacher says, “My turn. Vowels are open and voiced.
The vowels are a, e, i, o, u. Your turn.” Students respond.)
(Teacher shows the vowel pair syllable card and asks.) What can you
tell me about vowel pair syllables? (Students respond, “A vowel
pair syllable has two adjacent vowels.”) (If the students are not
able to respond, the teacher says, “My turn. A vowel pair
syllable has two adjacent vowels.”)
Good Job!
“What comes next?” The students will respond, “Oral Language
and Reading Comprehension.”
Reading Practice 136
OO = /oo/
1. boot    scoop    rooster
2. goose   root     wood
3. proof   wool     room
4. igloo   gloom    soon
5. moon    shoot    broom
Language Enrichment
SAILL Project
Day One Hundred Thirty Seven
Check LLP
Reading Deck Activity
Students, each day our lesson will begin with a review of these
cards from our reading deck. Each day you will tell me the name of
the letter, the key word, and the sound.
“What comes next?”
New Concept Introduction
Students, take out your mirrors so that you can review a new sound,
a key word to unlock the sound and the letter or letters that
represent that sound.
Listen and echo each word after me while you look in your
mirrors.
Listen for the sound that is the same in all these words.
Boon, soon, smooth, monsoon
What medial sound is the same?
Yes, the sound is /oo/.
The letters OO are pronounced / oo /. These two letters produce
one sound. This combination is called a digraph. The /oo/ sound is
elongated, so we code it with a macron. Here the macron
shows that the sound is elongated; it does not show that
the vowel is long.
What comes next?
Reading Practice
Now we will apply what you have learned. Take out a sheet of
paper. I will say words with the digraph OO. You will write each
word. We will check the words that you have written. We will read
the words you have written. The first word is: doom. The next
word is: shampoo, roof, stool, Cooper
Now let’s read the words.
doom, shampoo, roof, stool, Cooper
Now let’s check your work.
Good job reading and writing the digraph OO.
What comes next?
Review
Now we will review what you have learned.
(Show students the IRD card (2.64). Students say the name of the
letter, the key word, and the sound). If the students don’t remember
the information, the teacher says, “My turn. OO, Moon, /oo/. Your
turn.” (Students respond.)
(Teacher shows the Vowel Concept Card and asks.) What can you
tell me about a vowel? (Students respond, “Vowels are open and
voiced. The vowels are a, e, i, o, u.”) (If students are not able to
respond, the teacher says, “My turn. Vowels are open and voiced.
The vowels are a, e, i, o, u. Your turn.” Students respond.)
(Teacher shows the vowel pair syllable card and asks.) What can you
tell me about vowel pair syllables? (Students respond, “A vowel
pair syllable has two adjacent vowels.”) (If the students are not
able to respond, the teacher says, “My turn. A vowel pair
syllable has two adjacent vowels.”)
Good Job!
“What comes next?” The students will respond, “Oral Language
and Reading Comprehension.”
Language Enrichment
SAILL Project
Day One Hundred Thirty Eight
Check LLP
Reading Deck Activity
Let’s begin with a review of these cards from our reading deck.
Tell me the name of the letter, the key word, and the sound.
New Concept Introduction
Students, take out your mirrors so that you can review a concept you
have already learned.
Listen and echo each word after me while you look in your
mirrors.
frantic, rabbit, socket
How many vowels can you hear in each of these words?
That’s right, there are two vowels.
When you say a vowel, your mouth opens. A syllable is made with
one opening of the mouth. A syllable has one vowel sound.
How many syllables are there in the word frantic?
That’s right, there are two syllables.
How many syllables are there in the word RABBIT?
That’s right, there are two syllables.
When you pronounce the word “SOCKET” your mouth opens
twice. The word socket is made up of two syllables.
When you pronounce the word “FRANTIC” your mouth opens
twice. The word frantic is made up of two syllables.
Look at the word on the blackboard. (overhead)
When I place my finger under the vowels, there are two consonants
between the vowels. VOWEL, CONSONANT, CONSONANT,
VOWEL. The word is divided between the two consonants. We
place the accent on the first syllable because most English words
are accented on the first syllable.
The first syllable is closed. The vowel in a closed syllable is short.
Code it with a breve. The second syllable is closed. We also code
the short vowel with a breve.
What comes next?
Reading Practice
Now we will apply what you have learned. You will divide the
syllables between the two consonants. Code each syllable and
place the accent on the first syllable.
1. bucket   hopper    tonsil
2. suffix   locket    impact
3. puppet   content   sandal
4. popper   sandal    mantis
What comes next?
Review
Now we will review what you have learned.
(Teacher shows the VCCV Pattern Concept Card and asks.) Students, what
can you tell me about the VCCV pattern? (Students respond, “Words
with the VCCV pattern usually divide between the consonants with
an accent on the first syllable.”) (If the students are not able to
respond, the teacher says, “My turn. Words with the VCCV
pattern usually divide between the consonants with an accent on
the first syllable.”)
(Teacher shows the Closed Syllable Concept Card and asks.) What
can you tell me about closed syllables? (Students respond, “A
closed syllable ends in at least one consonant. The vowel in a
closed syllable is short; code it with a breve.”) (If the students are
not able to respond, the teacher says, “My turn. A closed
syllable ends in at least one consonant. The vowel in a closed
syllable is short; code it with a breve.”)
(Teacher shows the Open Syllable Concept Card and asks.) What
can you tell me about an open syllable? (Students respond, “An
open syllable ends in one vowel. The vowel in an open, accented
syllable is long; code it with a macron.”) (If the students are not
able to respond, the teacher says, “My turn. An open syllable
ends in one vowel. The vowel in an open, accented syllable is long;
code it with a macron.”)
Good Job!
“What comes next?” The students will respond, “Oral Language
and Reading Comprehension.”
Reading Practice 138
VCCV
1. bucket   hopper    tonsil
2. suffix   locket    impact
3. puppet   content   sandal
4. popper   sandal    mantis
Language Enrichment
SAILL Project
Day One Hundred Thirty Nine
Check LLP
Reading Deck Activity
Students, each day our lesson will begin with a review of these
cards from our reading deck. Each day you will tell me the name of
the letter, the key word, and the sound.
“What comes next?”
New Concept Introduction
Students, take out your mirrors so that you can review a new sound,
a key word to unlock the sound and the letter or letters that
represent that sound.
Listen and echo each word after me while you look in your
mirrors.
Listen for the sound that is the same in all these words.
Picnic, racket, basket
What medial sound is the same?
Yes, the sound is /k/.
Look at the words on the blackboard. (overhead)
When I place my finger under the vowels, there are two consonants
between the vowels. VOWEL, CONSONANT, CONSONANT,
VOWEL. The word is divided between the two consonants. We
place the accent on the first syllable because most English words
are accented on the first syllable.
The first syllable is closed. The vowel in a closed syllable is short.
Code it with a breve. The second syllable is closed. We also code
the short vowel with a breve.
What comes next?
Reading Practice
Now we will apply what you have learned. Take out a sheet of
paper. I will say words with the VCCV pattern. You will write
each word. We will check the words that you have written. We will
read the words you have written. The first word is: racket. The next
word is: basket, trumpet, sandal
Now let’s read the words.
racket, basket, trumpet, sandal
Now let’s check your work.
Good job reading and writing syllable division VCCV.
What comes next?
Review
Now we will review what you have learned.
(Teacher shows the VCCV Pattern Concept Card and asks.) Students, what
can you tell me about the VCCV pattern? (Students respond, “Words
with the VCCV pattern usually divide between the consonants with
an accent on the first syllable.”) (If the students are not able to
respond, the teacher says, “My turn. Words with the VCCV
pattern usually divide between the consonants with an accent on
the first syllable.”)
(Teacher shows the Closed Syllable Concept Card and asks.) What
can you tell me about closed syllables? (Students respond, “A
closed syllable ends in at least one consonant. The vowel in a
closed syllable is short; code it with a breve.”) (If the students are
not able to respond, the teacher says, “My turn. A closed
syllable ends in at least one consonant. The vowel in a closed
syllable is short; code it with a breve.”)
(Teacher shows the Open Syllable Concept Card and asks.) What
can you tell me about an open syllable? (Students respond, “An
open syllable ends in one vowel. The vowel in an open, accented
syllable is long; code it with a macron.”) (If the students are not
able to respond, the teacher says, “My turn. An open syllable
ends in one vowel. The vowel in an open, accented syllable is long;
code it with a macron.”)
Good Job!
“What comes next?” The students will respond, “Oral Language
and Reading Comprehension.”
Language Enrichment
SAILL Project
Day One Hundred Forty
Check LLP
Reading Deck Activity
Let’s begin with a review of these cards from our reading deck.
Tell me the name of the letter, the key word, and the sound.
New Concept Introduction
Students, take out your mirrors so that you can review a concept you
have already learned.
Listen and echo each word after me while you look in your
mirrors.
Yelled, seemed
What do you hear in the final position of each of these words?
Yes, the /d/ sound
Look at the board.
yell + ed = yelled, seem + ed = seemed
If you take off the -ed from each of these words, there is still a
base word. The suffix -ed with a final sound of /d/ means
happened in the past. Remember, a base word is a plain word with
nothing added to it. -ed is a suffix that is added to a base word.
Since it begins with a consonant, it is called a consonant suffix. We
will code the suffix -ed with a box.
Tell me the sound that is at the end of these words (yelled,
seemed).
Yes, the sound is /d/.
Say the sound again while looking in your mirror. Is the sound
open or blocked?
Yes, it is blocked by the teeth.
Place your fingers on your vocal cords and say the sound one more
time.
Is the sound voiced or unvoiced?
Yes, it is voiced.
When suffix –ed comes immediately after a voiced sound, it will
say /d/.
Listen and echo each word after me while you look in your
mirrors.
Missed, helped, jumped
What do you hear in the final position of each of these words?
Yes, the /t/ sound
Look at the board.
miss + ed = missed, help + ed = helped, jump + ed = jumped
If you take off the -ed from each of these words, there is still a
base word. The suffix -ed with a final sound of /t/ means
happened in the past. Remember, a base word is a plain word with
nothing added to it. -ed is a suffix that is added to a base word.
Since it begins with a consonant, it is called a consonant suffix. We
will code the suffix -ed with a box.
Tell me the sound that is the same in all these words.
Yes, the sound is /t/.
Say the sound again while looking in your mirror. Is the sound
open or blocked?
Yes, it is blocked
Place your fingers on your vocal cords and say the sound one more
time.
Is the sound voiced or unvoiced?
Yes, it is unvoiced.
When suffix -ed comes after an unvoiced sound, it will say /t/.
What comes next?
Reading Practice
Now we will apply what you have learned. Code suffix -ed with a
box. Remember it will be pronounced as /d/ immediately after a
voiced sound. It will be pronounced as /t/ immediately after a
voiceless sound. We will code the first row together.
Suffix -ed = /d/
1. filmed    banged     filled
2. spilled   thrilled   willed
3. spelled   drilled    billed
4. smelled   yelled     dabbed
5. skilled   filled     nabbed

Suffix -ed = /t/
1. jumped    kissed     stamped
2. bumped    tricked    snacked
3. camped    packed     dumped
4. picked    stacked    blocked
5. helped    kissed     missed
What comes next?
Review
Now we will review what you have learned.
(Teacher shows the Suffix Deck Card and asks.) What can you tell
me about suffix -ed? (Students respond, “Suffix -ed means
happened in the past.”) (If the students are not able to respond, the
teacher says, “My turn. Suffix -ed means happened in the past.”)
Good Job!
“What comes next?” The students will respond, “Oral Language
and Reading Comprehension.”
Reading Practice 140
-ed = /d/
1. filmed    banged     filled
2. spilled   thrilled   willed
3. spelled   drilled    billed
4. smelled   yelled     dabbed
5. skilled   filled     nabbed
Reading Practice 140-A
-ed = /t/
1. jumped    kissed     stamped
2. bumped    tricked    snacked
3. camped    packed     dumped
4. picked    stacked    blocked
5. helped    kissed     missed
National Research and Development Center for English Language Learners
Project 3
The Impact of the SIOP Model on Middle School Science and Language Learning
Year 1 Annual Report (8/1/05 - 7/31/06)
Deborah Short, Center for Applied Linguistics, Co-PI
Jana Echevarria, CSU Long Beach, Co-PI
Abstract
Most English language learners confront an educational landscape where they
must study and be tested on grade-level curricula in a new language at the same time they
are learning that language. This is not only difficult for the students themselves but also
for their teachers. We intend to investigate this critical problem by focusing our research
questions on determining which delivery model is most effective for
instruction, particularly in science.
One approach that has shown promise is the research-validated Sheltered
Instruction Observation Protocol (SIOP) Model (Echevarria, Short & Powers, 2006). The
SIOP Model shares many features recommended for high quality instruction for all
students. However, the SIOP Model adds key features for the academic success of ELLs,
such as the inclusion of language objectives in every content lesson, the acquisition of
content-related vocabulary, and the emphasis on academic literacy practice (Echevarria,
Vogt, & Short, 2000, 2004). The SIOP Model offers a framework for organizing
instruction with required features for each lesson so that teachers can accommodate the
distinct second language development needs of ELLs.
In our research, we investigate the impact of the SIOP Model on student academic
achievement in science, a subject area with high language demands. We have developed
and pilot-tested SIOP lesson plans and assessments that focus on the acquisition of
science concepts and language development among English language learners in middle
school. We will train science teachers in the SIOP Model so that they implement the
lesson plans effectively. Then we will test student performance on the assessments and
compare the results to those of control students.
Theoretical and Conceptual Background of Study
The overall academic performance of ELLs in U.S. schools is problematic, with a
dramatic, lingering divide in achievement in many subject areas between Caucasian
students and those from culturally and linguistically diverse groups (California
Department of Education, 2004; Siegel, 2002; Snow & Biancarosa, 2004). Part of the
reason for the achievement gap is that many teachers are underprepared to make content
comprehensible to English language learners who are not proficient in the language of
instruction (i.e., English). Until recently, they lacked a proven, effective model of
instruction. In addition, ELLs are asked to demonstrate their content area knowledge on
high stakes tests, particularly those required as part of the No Child Left Behind
legislation, while they are still developing proficiency in English, which is usually the
language of the tests. While ELLs have been tested in mathematics and reading to date, in
2007, tests in science will be added to the battery of assessments students must take.
Teachers need instructional interventions that can reduce the achievement gap between
English language learners and native English-speaking students, and we posit that the
Sheltered Instruction Observation Protocol (SIOP) Model, which provides a framework
for teachers to incorporate attention to second language development needs, will offer a
successful approach to teaching science to ELLs.
The SIOP Model is a research-based model of sheltered instruction developed by
researchers at the Center for Applied Linguistics (CAL) and California State University,
Long Beach (CSULB) for the National Center for Research on Education, Diversity &
Excellence (CREDE) (Echevarria, Vogt, & Short, 2000, 2004). It incorporates best
practices for teaching academic English and provides teachers with a coherent, usable
approach for improving the achievement of their students. The model comprises 30 items
grouped into eight components essential for making content comprehensible for English
learners—Preparation, Building Background, Comprehensible Input, Strategies,
Interaction, Practice/Application, Lesson Delivery, and Review/Assessment. Teachers
present curricular content concepts aligned to state standards through strategies and
techniques that make academic content comprehensible to students. While doing so,
teachers develop students’ academic English language skills across the four domains—
reading, writing, listening, and speaking.
The SIOP Model shares many features recommended for high quality instruction
for all students, such as cooperative learning, strategies for reading comprehension,
writers workshop, and differentiated instruction. However, the SIOP Model adds key
features for the academic success of ELLs, such as the inclusion of language objectives in
every content lesson, the development of background knowledge, the acquisition of
content-related vocabulary, and the emphasis on academic literacy practice. The SIOP
Model offers a framework for organizing instruction with required features for each
lesson that accommodate the distinct second language development needs of ELLs. It
allows for some variation in classroom implementation while at the same time providing
teachers with specific lesson features that, when implemented consistently and to a high
degree, lead to improved academic outcomes for English language learners (Echevarria,
Short, & Powers, 2006).
Research Issues
In the CREDE study, researchers worked with middle school teachers in four
large metropolitan school districts—two on the East Coast and two on the West Coast—
to identify key practices for sheltered instruction and develop a professional development
model to enable more teachers to use sheltered instruction effectively in their classes. The
teachers taught mathematics, science, or social studies using the SIOP Model to ELLs.
The students of middle school teachers using the SIOP Model outperformed comparable
students on a standardized test of academic writing (Illinois Measure of Annual Growth
in English). SIOP implementation was verified by in-class observations using the
Sheltered Instruction Observation Protocol. (See Echevarria, Short, & Powers, 2006 for
research results.)
The SIOP Model is currently being implemented in school districts and used in
university teacher preparation programs in all 50 U.S. states. Moreover, many
resources have been developed by the researchers to help support the professional
development of teachers using the model. However, the implementation of the SIOP
Model has outpaced the research on its features. While the preliminary research results
were significant, the student outcomes were focused on academic literacy, in particular
writing, and on attendance data. All core subject areas were combined in that study, so
the effects on any one subject area could not be determined.
In this new research project, we are testing the effects of the SIOP Model on
student academic success in one specific subject area with high language demands—
science. In a series of controlled, randomized studies, we are investigating whether the
SIOP Model has a significant impact on the acquisition of science concepts and scientific
language development among English language learners in middle school at a time when
they are beyond the initial age for literacy development. In order to do this study, we will
provide professional development to teachers on the model, with a focus on identifying
language and content objectives for each lesson, selecting appropriate techniques to
ensure coverage of those objectives, and designing activities that promote science reading
comprehension and student interaction. For one group of teachers, we will provide
researcher-developed curriculum units with SIOP science lessons for them to implement
for part of the academic year. We expect that students’ vocabulary development and
comprehension of scientific concepts will significantly increase in classes with SIOP
trained teachers.
In the final 2 years of the Research Center, we will integrate what we have
learned from our SIOP studies with findings from other Center studies on reading
strategies, text modification, background building, and language development to enhance
the SIOP Model. In a large scale, randomized study in Year 5 at new research sites, we
plan to evaluate the effects of the integrated SIOP Model in the science and social studies
content areas.
Hypothesis
We hypothesize that the students of teachers trained in the SIOP Model will
outperform students of teachers not trained in the model on measures of Grade 7 science
content and scientific language. In addition, teachers who receive training in the model
plus project-developed SIOP science curriculum units will implement the model to a
higher degree than teachers who receive training alone, and the students of teachers with
training + SIOP science lessons will perform better than students of teachers with training
alone.
Research Questions
1. What are the effects of the SIOP Model of sheltered instruction on academic language
and concept comprehension among English language learners in middle school science
classrooms?
2. What are the effects of an integrated SIOP Model of sheltered instruction (that
incorporates findings from other Center studies on reading strategies, language
development and text modification) on academic language and concept comprehension
among English language learners in middle school science classrooms?
Significance of Proposed Study for Research, Policy, or Practice
Across the U.S., school districts are seeking research-based approaches with
proven results to help ELLs develop academic language proficiency and understanding of
science concepts. This study, over 5 years, will investigate the professional development
and curriculum design of SIOP Model lessons and the subsequent effect on student
achievement in multiple sites with diverse student populations. The Year 1 pilot was
designed to develop and refine science curriculum lessons that incorporate the SIOP
Model features and to field-test academic science language assessments. The results of
the pilot study will be used in Years 2 and 3 as the research is scaled up to multiple sites
across the U.S. It is anticipated that the data gathered from Years 1-3 will be combined
with the research findings from other NRDCELL research studies and will ultimately
coalesce into a successful school reform intervention that benefits English language
learners and non-ELLs alike.
Study Design: Sample Selection Criteria, Sample Size, Methods, and Data Analysis
The student population will include seventh grade English language learners in
Life Science classes. They will participate as part of their regular instructional day. The
number of students and classes participating varies according to the research year. During
the pilot in Year 1, approximately 120 students participated in two districts: Arlington,
Virginia, and Long Beach, California. In Year 2, it is anticipated that 150-200 students
will participate in Long Beach, and in Year 3, 250-300 in a district yet to be determined.
The number of participating students in Year 5 will be determined in later years.
Regarding the teacher participants, five teachers field-tested the SIOP science
units and assessments in the Year 1 pilot study in the two districts. In Years 2 and 3, the
teachers will be selected from randomly assigned schools as treatment or control
participants. For Year 2, it is anticipated that 10 teachers will join the study (5 as
treatment and 5 as control), and 15 teachers will join in Year 3 (10 as treatment and 5 as
control). As with the students, the number of teachers for Year 5 will be determined in
the future. In the Year 2 study, schools will be randomly assigned to treatment or control
and teachers within schools will implement the assigned condition in all of their seventh
grade science sections.
We are developing two types of materials for the research study: SIOP science
curriculum units and scientific language assessments. For the pilot study, we created
SIOP lessons for several seventh grade science units with corresponding assessments tied
to state science and English language development standards. These lessons were taught
by seventh grade science teachers in their classes. The teachers consulted on the lesson
design and provided feedback once the lessons had been taught. These lessons will be
revised based on teacher feedback and observation records. The revised lessons will be
used in Years 2 and 3 as the study scales up.
We are also developing unit-specific scientific language assessments. Pre- and
post-assessments are being piloted in the study to ensure that the test items, instructions
and format of the test are working. The items for these language assessments are
informed by the World-class Instructional Design and Assessment (WIDA) English
Language Proficiency Standards for English language learners, based on the TESOL
Standards for English language learners. The test tasks are adapted from the test tasks
used by Butler, et al. (2004) and Bailey, et al. (2005) from the National Center for
Research on Evaluation, Standards, and Student Testing (CRESST). Butler and Bailey
developed a framework for developing tests for ELLs based on analyses of the academic
language demands in science and other subject textbooks. The teachers in this project
administer pre- and post-assessments of lesson-related scientific language and content at
the start and close of each unit. The post assessments serve as the unit test the teachers
would normally give.
Data collection for the pilot in Year 1 included observation notes made during
delivery of the SIOP science lessons, teacher feedback on the lessons, and student
performance on lesson tasks and assessments. In Years 2 and 3, SIOP protocol ratings of
classroom observations and videotapes of classroom instruction will serve as additional
data.
Data analysis in Year 1 includes qualitative review of the feedback and
observation notes to determine necessary revisions to the SIOP lessons. Analysis of
student performance on the assessments, including an item-analysis examination, will be
used to revise the assessments after Year 1. Student knowledge growth will be
determined by comparative analyses of their pre- and post-test scores. Differential
analysis of each test item will help determine areas of weakness and strength among the
students. Test items from student assessments will be combined with teacher observation
scores on the SIOP and field notes from the observations to identify the relationship
between what is taught, how it is taught, and how students perform on the test. In Year 2,
the analyses will include a comparison of the gains achieved by the treatment and control
groups in the pre and post administrations of the content and language assessments. We
will also determine whether students’ gains are larger for certain science units. The
statistical model is a multi-level repeated measures design with students nested within
class, class within teacher, and teacher within school. Treatment effects are measured at
the school-level, but with repeated measures at the teacher level (multiple sections per
teacher), and the measure of learning is post-test performance adjusted for pre-test
performance at the student level. In order to minimize the chances for unhappy
randomization at the school-level given the small number of units to be assigned, we
grouped the schools based on the number of ELLs and randomly assigned them to
conditions within groups in accordance with recommendations from Murray (1998) (see
below for additional details).
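
To make the constrained randomization concrete, the following is a minimal sketch of how schools could be stratified by ELL enrollment and randomly assigned to condition within strata; the school names, ELL counts, and stratum cutoffs are hypothetical placeholders rather than study data.

import random

# Minimal sketch of the constrained (stratified) randomization described above:
# schools are grouped by number of ELLs and assigned to condition within groups.
# All names, counts, and cutoffs below are hypothetical placeholders.
schools = {
    "School A": 420, "School B": 310, "School C": 180, "School D": 260,
    "School E": 160, "School F": 75, "School G": 350, "School H": 140,
    "School I": 55, "School J": 230,
}

def stratify(schools, cutoffs=(100, 250)):
    """Group schools into low/medium/high strata by ELL enrollment."""
    strata = {"low": [], "medium": [], "high": []}
    for name, n_ell in schools.items():
        if n_ell < cutoffs[0]:
            strata["low"].append(name)
        elif n_ell < cutoffs[1]:
            strata["medium"].append(name)
        else:
            strata["high"].append(name)
    return strata

def assign_within_strata(strata, seed=2006):
    """Randomly assign half of each stratum to treatment and half to control."""
    rng = random.Random(seed)
    assignment = {}
    for names in strata.values():
        names = sorted(names)
        rng.shuffle(names)
        half = len(names) // 2
        for name in names[:half]:
            assignment[name] = "treatment"
        for name in names[half:]:
            assignment[name] = "control"
    return assignment

if __name__ == "__main__":
    for school, condition in sorted(assign_within_strata(stratify(schools)).items()):
        print(f"{school}: {condition}")
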
Principal Dependent, Independent, and Control Measures
During the pilot phase of Year 1, we did not have dependent, independent, or
control measures. We field tested a lesson design template, curriculum units, and an
approach to measuring the language of science. We sought to determine, through pre- and
post-assessment, the growth in students’ science content and language skills after
receiving 2-3 weeks of instruction by teachers using lessons designed according to the
SIOP Model.
In Year 2, the independent variables will be the staff development for the
treatment teachers and the SIOP science lesson plans. In addition, teachers who are
determined to be low implementers of the SIOP based on the results of their observation
ratings will receive coaching from the researchers to enhance their implementation. The
dependent variables will be measures of the Grade 7 science curriculum objectives and the SIOP
protocol rating scale.
The schools participating in Year 2 will be randomly assigned as treatment or
control sites and then teachers will be assigned. In Year 2, we will have two conditions:
Treatment teachers receive professional development in the SIOP Model plus the SIOP
science units, and Control teachers provide instruction on the same units in their usual
way. In Year 3, we will have three conditions: Treatment A teachers receive professional
development in the SIOP Model plus the SIOP science units, Treatment B teachers
receive professional development in the SIOP Model without the SIOP science units, and
Control teachers instruct the same units in their usual way.
Findings and Accomplishments Since the Last Performance Report
School Recruitment
On the east coast, in Fall 2005, we contacted several school districts in the
metropolitan Washington, DC area to determine their middle school program models for
English language learners, number of ELLs they serve in Grade 7, and their interest in
participating in the pilot stage of this study. The contacted districts included Arlington
Public Schools, Fairfax County Public Schools, Prince William County Public Schools,
and the City of Manassas Public Schools in Virginia, and Montgomery County Public
Schools, in Maryland. After several discussions with ESL and science coordinators at
these districts, we pursued a collaborative research relationship with Arlington Public
Schools (APS), which provides the High Intensity Language Training (HILT) program
model for their students and which was the site of the original SIOP Model research
under the Center for Research on Education, Diversity & Excellence. We submitted a
research application to APS’s Research and Evaluation office for consideration by their
research committee in January 2006. We also met with the ESOL Science Resource
specialist, the ESOL secondary specialist, and the head of the ESOL Program for the
district. Our application was passed on to the Board of Education and approval was
granted in late March 2006.
The ESOL central office staff identified teachers and schools for the pilot and we
formally invited their participation. After approaching nine teachers, four teachers from
two middle schools agreed to join the study. The teachers received 3 hours of training in
the SIOP Model, an overview of the SIOP science lesson plans, and a discussion of the
science language assessments. The teachers received the lessons and associated teaching
materials (e.g., student handouts). The language and content assessments were provided
according to the lesson implementation schedule. Teachers received the pre-assessments
at the start of each unit and the post assessments at the end.
A similar process occurred on the west coast. Several California districts were
contacted in Fall 2005 for possible participation in the study. Given the time frame, it was
also important to identify a district for the Year 2 research that served a large
population of ELLs. The districts contacted included Garden Grove Unified School
District, ABC Unified School District and Long Beach Unified School District. Los
Angeles Unified was also considered but not pursued because the district was already
committed to another SIOP grant proposal. Long Beach Unified School District seemed
the best choice given the size of the student and teacher population; the large number of
middle schools with ELLs allowed for 5 schools to be randomly assigned to each
condition. After several meetings with district personnel (i.e., the assistant
superintendent, district ELL coordinator, science coordinator, and middle school
coordinator), we received permission from Long Beach Unified School District (LBUSD)
to participate in the pilot study and three Grade 7 science teachers were identified to
support the effort. Researchers held a meeting at the pilot school site with the principal
and science teachers to prepare for the pilot study. In addition, LBUSD gave approval to
conduct the Year 2 study in 10 middle schools. The researchers worked with the district
ELL coordinator to determine the number of ELLs at each school site. After some
schools were eliminated because of year-round status or special programs, the remaining
sites were classified as high, medium or low relative to the number of ELLs in the student
population. Sites were randomly assigned to treatment or control, beginning with the high
and medium sites, until there were 5 schools in each condition.
At the west coast site, in mid March, we conducted a 3-hour training for the
science teachers at the pilot school site. This staff development session included an
overview of the SIOP Model as well as a session for critiquing SIOP lesson plans. At that
point, two teachers informed the researchers they were teaching health rather than life
science in the spring and could not participate in the pilot. One teacher did continue with
the study. The teacher was provided with a binder of all lessons from the cell division and
genetics/heredity units and associated materials for teaching the lessons.
SIOP Science Lesson Plans
We developed a unit design and lesson plan template for the pilot SIOP Science
lessons that aligned with the Grade 7 life science standards and the middle school ELD
standards for Virginia and California. After reviewing the pacing guides and speaking
with administrators from both districts, we selected a unit on genetics and heredity for the
pilot because both districts would teach topics from this unit in the fourth quarter of the
2005-06 school year when we would conduct the research. We then drafted specific
lessons for this unit that corresponded to California’s and Virginia’s content standards for
genetics and investigation and experimentation. Additionally, the lessons supported
specific skills and tasks outlined in the curriculum frameworks for APS and LBUSD.
Most of the same lessons will be taught at both sites, but some will be different because
the scope and sequence of science instruction varies somewhat between the two school
districts. We intended to have teachers work with us to design the SIOP science units, but
the delay in district approvals and teacher selection required that we write the lessons and
then have teachers review them. We wrote lessons on cell division and genetics and
heredity to start the study. By the time the pilot started in the east coast district, however,
three of the four teachers had taught these topics. Therefore, an additional unit on biomes
was developed. The LBUSD teacher piloted the cell division and genetics lessons. All
four APS teachers piloted the biomes lessons, and one piloted the genetics and heredity
unit as well. A sample lesson plan is attached in Appendix P3-A.
Science Language Assessments
In preparation for developing pre- and post-assessments to determine the
effectiveness of the SIOP intervention in our pilot study, the project assessment specialist
reviewed the literature to investigate how other researchers have defined and tested the
construct of academic language in science. She examined the TESOL PreK-12 standards
for English language learners and the national science education standards to guide the
test development process and also reviewed available science tests at the national, state,
and school district levels and researcher-developed tests that assessed academic language
in science, to obtain ideas for possible assessment format and content. To assist in the test
design, the project acquired resources for test and lesson plan development, such as free
online software that analyzes the Lexile level of reading passages and a vocabulary
database that provides the grade and difficulty levels of words.
In collaboration with the project PIs, the assessment specialist defined the
construct of academic language in the context of the project and developed test
specifications for the pre- and post-tests. Each SIOP science unit would have a
corresponding science language assessment. Test passages and questions were drafted to
align to the language skills associated with cell division, genetics/heredity, and biome
knowledge. These sample items involved reading and writing tasks using scientific
language. A scoring rubric, with descriptions of the quality and quantity of vocabulary
and syntactic complexity required for each score, has been developed for the essay
writing portion of the test. Sample responses for each numeric score on the rubric have
also been written for each essay question. The pilot teachers also reviewed these items.
After an initial item analysis of the genetics and heredity language pre-assessments
revealed the students were correctly answering too many of the vocabulary and reading
items, the biome assessment was modified to include items that were more appropriate to
the age and English proficiency level of the students. A sample science language
assessment is attached in Appendix P3-A.
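
To illustrate the kind of item analysis referred to above, the following minimal sketch computes a classical difficulty index (the proportion of students answering each item correctly) and flags items that may be too easy; the response matrix and item labels are invented for illustration and are not pilot data.

# Illustrative classical item analysis: difficulty index (proportion correct)
# per item, flagging items that most students answer correctly. The responses
# below are invented placeholders (1 = correct, 0 = incorrect), not pilot data.
responses = {
    "vocab_1":   [1, 1, 1, 1, 0, 1, 1, 1],
    "vocab_2":   [1, 1, 0, 1, 1, 1, 1, 1],
    "reading_1": [1, 0, 1, 1, 1, 1, 0, 1],
    "reading_2": [0, 1, 0, 0, 1, 0, 0, 1],
}

def item_difficulty(scores):
    """Return the proportion of correct responses for one item."""
    return sum(scores) / len(scores)

for item, scores in responses.items():
    p = item_difficulty(scores)
    flag = "  <- may be too easy" if p > 0.85 else ""
    print(f"{item}: p = {p:.2f}{flag}")
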
For the content assessments, we relied on LBUSD’s end of course exam which
meets the California state testing requirement for science knowledge for the west coast
students. This 42-item multiple-choice test, revised slightly each year, is based on the
California science standards and is given to all students in the Long Beach district at the
conclusion of life science. We modified LBUSD’s end of course test for the east coast so
that it only included test items relevant to the topics covered in our SIOP science units. We
also included questions from the chapter tests associated with the Science Explorer
textbook series, which our assessment specialist reviewed for Lexile levels.
Year 1 Pilot Results
The pilot study took place from late April through early June. We communicated
regularly with the teachers and periodically met with them at the school sites. On the west
coast, the lessons and assessments from three units—photosynthesis and respiration, cell
division and genetics—were pilot tested at one middle school in three different Grade 7
science classes, and on the east coast, four teachers at two schools participated using the
genetics/heredity and biomes curriculum units. For data collection purposes, each teacher
was observed during the pilot (three times on the east coast, twice on the west coast), and
researchers recorded field notes about the lesson delivery and the student participation
and performance on lesson tasks.
As part of the study data collection, each teacher (in both districts) was given a
feedback form to record information after each lesson. Teachers were asked to comment
on what went well during the lesson, any challenges they encountered, and changes they
made to the lesson and why. The teachers were encouraged to offer suggestions for
alternative activities. In addition, as they administered the assessments, the teachers
recorded information about student reaction, time spent on the assessments, and questions
posed. One teacher’s feedback indicated that the lessons were engaging and students
enjoyed many of the hands-on activities that provided practice with the lesson’s concepts.
She reported, “Some students’ ability to take notes and summarize has improved and is
evident in their work.” However, lessons were generally too long, so students were unable
to complete all the activities in one period. Other teachers reported that they enjoyed the
vocabulary building and review activities built into the lessons, but also acknowledged
not doing as many as they would have liked due to a lack of time.
In LBUSD, 102 students participated in the pilot study. Of those students, 38 were
classified as ELL or had been ELL but were redesignated as fluent. Their proficiency
level breakdown was the following, based on California’s five levels: 33 students were
redesignated; 1 student was advanced, and 4 were early advanced students. Forty-three
APS students participated: 28 were at the intermediate level, 9 were at the early advanced
level, and 1 was an advanced level student. Also, 5 former ELLs with monitored
status were in the classes. The beginning level APS students do not use the Grade 7
science curriculum and so were excluded. We collected and scored the student pre- and
post assessments of scientific language and content knowledge on the following unit
topics: genetics/heredity and interactions among living things/biomes.
Much of the data gathered from the pilot was qualitative. Data were collected
from classroom observations, written teacher feedback on the lessons, teacher interviews,
and pre- and post-assessments to measure student content and language achievement.
Each teacher was observed two or three times and notes were recorded on general
information about the class, such as number and gender of students, and classroom
arrangement, and information specific to the lesson plan. Additionally, notes were made
on the posting of content and language objectives, key vocabulary words, and other
features in the room that supported science language and literacy development. During
the observations, we indicated each section of the lesson that the teacher addressed, and
noted any modifications and how long it took the teacher to cover the section.
Our observations and the teacher feedback confirmed that the lessons need to be
modified in scope for subsequent years of the study. For example, the lessons will be
rewritten to instruct teachers to state the objectives explicitly at the beginning and end of
class. Because many of the lessons took longer than anticipated, the revisions will also
shorten the lesson plans. Classroom observations also informed us of areas where
teachers needed more guidance. For example, some teachers only presented and reviewed
information orally. Lessons will be revised to direct teachers to use more sheltered
instruction practices, such as recording student responses during brainstorming activities,
partnering students for more interaction, and completing the note-taking templates.
Classroom observations also revealed that there was a high degree of variance
with regard to adherence to the lesson plan and SIOP-based instruction and activities. Of
the four teachers on the east coast, two were familiar with the SIOP Model and were
more consistent in following the lesson plan. The participating teachers who had had no
exposure to the SIOP Model prior to the pilot workshop were less consistent in following
the plans. It was also apparent that the teachers less familiar with the SIOP Model
avoided, or had difficulty with, activities in the lessons that emphasized building
background, language development, and interaction. For example, less experienced SIOP
teachers instructed students to use the book glossary to define new words or ignored
activities in the lessons designed to emphasize and expand the students’ use of scientific
language.
In contrast, teachers with more familiarity with the SIOP Model spent more time
building background and emphasizing vocabulary than expected according to the lesson
plans. In some instances, the experienced SIOP teachers modified the lessons effectively
so students would be more successful. For example, these teachers displayed and added
on to a list of the new vocabulary so students would have a resource to consult during the
unit. These teachers overall followed the lesson plans in terms of the activities designed
to promote science language development; they explicitly referred to the key vocabulary
throughout the lesson and encouraged the students to replace general vocabulary terms
with more academic ones. Our conclusion from this finding is that the teachers need
significantly more staff development on the SIOP Model prior to implementing the
lessons.
We also piloted the assessments and rubrics for each unit of lessons. Assessments
were scored by one rater using the rubric created by the assessment specialist. We
uncovered a major difficulty with our west coast student data which unfortunately
rendered our analyses for those students invalid. The pilot teacher admitted that she
coached the students during both the pre- and post-assessment administrations in an effort
to make them feel more confident in their performance. As a result, we do not believe we
have an accurate picture of their performance and have not included their scores in the
discussion here.
Initial analysis of data from the genetics/heredity and biome units on the east
coast indicates that ELLs exhibited more improvement on the content assessments than
on the language assessments (see Table 1). Twelve students were present for both the
pre- and post-test administrations. The maximum score varied by assessment. The
genetics/heredity language and content assessments were worth 16 points. The biome
language assessment was worth 18 points and the content assessment was worth 20.
Table 1: APS student means on pre- and post- language and content assessments (N = 12)

                     Language          Language           Content           Content
                     pre-assessment    post-assessment    pre-assessment    post-assessment
Genetics/Heredity    6.0 of 16 pts     7.0 of 16 pts      1.6 of 16 pts     6.9 of 16 pts
Biomes               8.9 of 18 pts     10.1 of 18 pts     4.7 of 20 pts     9.1 of 20 pts
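Because the four assessments carry different maximum scores, the pre-to-post gains in
Table 1 are easiest to compare when expressed as a share of the available points. The
short Python sketch below simply re-computes the Table 1 figures this way; it is an
illustration only and is not part of the project's analysis.

    # Illustrative only: normalize the Table 1 pre/post means by each
    # assessment's maximum score so language and content gains are comparable.
    means = {
        # label: (pre mean, post mean, maximum points)
        "Genetics/Heredity language": (6.0, 7.0, 16),
        "Genetics/Heredity content":  (1.6, 6.9, 16),
        "Biomes language":            (8.9, 10.1, 18),
        "Biomes content":             (4.7, 9.1, 20),
    }

    for label, (pre, post, max_pts) in means.items():
        gain = (post - pre) / max_pts  # gain as a proportion of available points
        print(f"{label}: +{gain:.2f} of the maximum score")

Expressed this way, the content gains (roughly a third and a fifth of the available
points) are clearly larger than the language gains (under a tenth), which is the pattern
described above.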
The scores on the language assessment reveal minimal improvement between pre- and
post-test administration; however, as shown in Tables 2 and 3, writing was one
section where students showed some improvement.
Table 2: Breakdown of APS student scores on test item V (essay writing) on the genetics
language pre- and post-assessment (N = 12)

Writing score    Pre-assessment scores    Post-assessment scores
0                7                        4
1                4                        6
2                1                        2
3                0                        0
Table 3: Breakdown of APS student scores on test item V (essay writing) on the biomes
pre- and post-assessment (N = 12)

Writing score    Pre-assessment scores    Post-assessment scores
0                4                        4
1                7                        3
2                1                        5
3                0                        0
We are concerned that the limited time frame in which a unit is taught,
approximately 3 weeks, may be too brief for measurable language acquisition to occur.
We realized that the directions for the writing prompt item were not explicit enough for
the students and will adjust that in the revision process. The writing rubric and
benchmark papers are also being revised. The existing rubric is too general to distinguish
the features of academic science writing that the prompts were designed to elicit. We are
looking into the possibility of using Latent Semantic Analysis to score the writing
assessments during the Year 2 study and will be piloting its use on the current writing
samples over the summer. Based on that pilot we will make a decision toward the end of
summer on the scoring of the writing assessments. Benchmark papers for each writing
score will be reviewed as well. Further, for Year 2, the pre- and post-assessments will
have items that are similar so that they can be linked, but also items that are different to
ensure that a wide variety of material from each unit is covered. For the reading
comprehension questions, we will also develop items that are more independent of each
other to obtain more information about the goodness of fit of each item.
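As a rough illustration of what LSA-based scoring could look like, the sketch below (in
Python, using scikit-learn) projects benchmark papers and a student response into a
low-dimensional latent space and rates the response by its similarity to each benchmark
score level. The benchmark texts, the component count, and the scoring rule are
assumptions made for illustration; they are not the project's actual materials or
procedure.

    # Hypothetical LSA scoring sketch: TF-IDF + truncated SVD + cosine similarity.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.metrics.pairwise import cosine_similarity

    # Invented benchmark papers, one per writing score level (placeholders only).
    benchmarks = {
        1: "The tundra is cold. The desert is hot.",
        2: "The tundra is cold and dry, but the desert is hot and dry. "
           "Plants in the tundra are small.",
        3: "The tundra and the desert are both dry biomes. In contrast to the "
           "desert, the tundra has permafrost, so only mosses, lichens, and "
           "dwarf trees grow there.",
    }
    student_response = ("The desert and tundra are dry. The tundra has permafrost "
                        "and dwarf plants, but the desert has cacti.")

    docs = list(benchmarks.values()) + [student_response]
    tfidf = TfidfVectorizer().fit_transform(docs)       # term-document matrix
    lsa = TruncatedSVD(n_components=2, random_state=0)  # latent semantic space
    vectors = lsa.fit_transform(tfidf)

    # Score the response as the level of its most similar benchmark paper.
    sims = cosine_similarity(vectors[-1:], vectors[:-1])[0]
    levels = list(benchmarks)
    predicted = levels[sims.argmax()]
    print(dict(zip(levels, sims.round(2))), "->", predicted)

Whether such an automated score agrees well enough with human raters is exactly what the
summer pilot on the current writing samples is intended to determine.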
The teachers also reported a concern about the amount of assessment we required.
They found that two pre-tests, one for language and one for content, along with two
post-tests, were difficult to accomplish in one class period, yet due to tight district pacing
guides, they were unable to devote additional days to assessment.
Products
We have attached a sample lesson plan and sample language assessment
(Appendix P3-A) for the biome unit to this report. These are the versions used during the
pilot study and will be revised in Summer 2006.
Update Since the Last Performance Report
A major goal was to secure a study location for the Year 2 intervention on the
west coast and to identify the treatment and control teachers. Much effort was expended in
Spring 2006 to accomplish this. As noted earlier, LBUSD agreed to be
the district site for the Year 2 study. In February, after district approval was received, the
10 identified schools were categorized by ELL population. Sites were classified as high,
medium or low relative to the number of ELLs in the student population. Sites were
randomly assigned to treatment or control, beginning with the high and medium sites,
until there were 5 schools in each condition.
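For readers unfamiliar with this design step, the short sketch below illustrates
stratified random assignment of this kind; the school labels, the stratum sizes, and the
alternation rule are hypothetical stand-ins, not the actual LBUSD schools or procedure.

    # Hypothetical illustration of stratified random assignment: schools are
    # grouped by ELL enrollment and split between conditions within each stratum,
    # working from the high and medium strata first, until each condition has 5.
    import random

    random.seed(0)  # fixed seed so the example is reproducible

    schools_by_stratum = {           # invented school names and strata
        "high":   ["School A", "School B", "School C", "School D"],
        "medium": ["School E", "School F", "School G", "School H"],
        "low":    ["School I", "School J"],
    }

    treatment, control = [], []
    for stratum in ("high", "medium", "low"):
        schools = schools_by_stratum[stratum][:]
        random.shuffle(schools)
        for i, school in enumerate(schools):
            (treatment if i % 2 == 0 else control).append(school)

    print("Treatment:", treatment)   # 5 schools
    print("Control:  ", control)     # 5 schools

Assigning within strata in this way keeps the two conditions balanced on ELL enrollment
even with only 10 schools.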
The next step was to recruit teachers for Year 2. The west coast researchers
emailed all project principals with introductions and a written overview of the project at
the end of March. The memo requested a meeting with the principal and 7th grade science
teachers in the subsequent 2 weeks. One principal responded and asked whether the teachers
would be informed of the benefits of the project. None of the other principals responded. In
early May, the researchers succeeded in getting appointments with two of the remaining
principals. These two principals received a memo regarding the timeline of the project and a flyer
announcing the teacher orientation meeting. For the other schools, the memo and flyers
were left for the principal in the main offices. Subsequent emails and phone calls to the
teachers served as reminders of a study orientation meeting scheduled for mid-May.
The orientation meeting was attended by Jana Echevarria, Cara Richards, and two
SIOP coaches, along with one teacher from 4 of the 5 schools. The study's benefits and
commitments were explained. Several teachers indicated that they might not be teaching
7th grade science in the fall. Given the low turnout of teachers at the meeting and concern
about the teacher sample size in Fall 2006, one of the researchers also consulted with the
principal of the pilot school site for guidance on recruiting more teachers. The principal
informed her that in urban school settings, most principals do not schedule teachers for
specific courses until the school year begins.
Next Steps
In July, researchers will revise the SIOP curriculum units and language
assessments in collaboration with two teachers from the Long Beach district. Additional
lessons to match the syllabus of LBUSD’s Grade 7 science course will be written as well.
For Year 2, a total of 8 weeks of lessons across three curriculum units will be prepared.
In addition, to assist the Year 2 teachers, the researchers will prepare a calendar that
shows the dates when assessments need to be administered as well as the sequence of lesson
instruction. The intention is to structure the units tightly so that all teachers will adhere to
a similar instructional sequence.
Beginning in August, we will contact the project treatment and control teachers. A
meeting with the control teachers will provide them with an overview of the project and
their role in administering the pre- and post-test assessments with each unit. The project
teachers will participate in a 3-day SIOP Institute wherein they will receive intensive
training in the SIOP Model and will review the binder of lessons they will be teaching
during the intervention. In addition, teachers will be given the calendar that specifies
dates that the assessments are to be administered and a pacing guide for the lessons.
Once the Year 2 study begins, teachers will be observed and rated using the SIOP
protocol to document fidelity to the SIOP Model. Each project teacher will be observed
and rated twice (pre and post), and one of the two observed lessons will be videotaped. Those
teachers who do not meet a minimum level of implementation will receive coaching on
more effective implementation of the SIOP Model. The coaching will be provided by
research staff and will involve:
• Pre-observation discussion with each teacher emphasizing the fidelity checklist
elements
• Possible viewing of the lesson videotapes
• Observation in class
• Post-observation debriefing and suggestions
Each control teacher will be observed and the lessons will be rated twice (pre and post) by
observation only.
Challenges
1. Because we discovered that the pilot teacher on the west coast assisted students with
the pre- and post-tests, the researchers will provide an explicit script for Year 2
teachers that will simulate the procedures of standardized testing to ensure
uncontaminated administration.
2. At the initial teacher orientation, not all schools were represented because, given the
transience of teachers and students in urban schools, few schools are certain of their
7th grade science teacher staffing for Fall 2006. We will work with school and district
administration to ascertain the names of the teachers by mid-August in time for the
SIOP training.
3. Teachers tended to modify the SIOP lessons to suit their own styles. This is to be
expected to some extent, but to ensure that the elements essential to the study
questions, such as language objectives and vocabulary development, are addressed,
we have developed a Fidelity Checklist (see the sketch below). The checklist will
remind teachers that for each lesson they need to: write the objectives on the board;
state the language objectives; state the content objectives; introduce the vocabulary,
write the words, and keep them posted; review the vocabulary at the end of the lesson;
review each language objective and ask if it was met; and review each content objective
and ask if it was met.
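As one possible way to tally the checklist during observations, the sketch below lists
the seven elements and computes the proportion observed in a lesson; the element wording
is paraphrased from the list above, and the scoring rule is an illustrative assumption
rather than the project's fidelity procedure.

    # Illustrative Fidelity Checklist tally (not the project's actual scoring rule).
    FIDELITY_ELEMENTS = [
        "objectives written on board",
        "language objectives stated",
        "content objectives stated",
        "vocabulary introduced, written, and kept posted",
        "vocabulary reviewed at end of lesson",
        "each language objective reviewed and checked",
        "each content objective reviewed and checked",
    ]

    def fidelity_score(observed: dict) -> float:
        """Return the proportion of checklist elements observed in one lesson."""
        return sum(bool(observed.get(e)) for e in FIDELITY_ELEMENTS) / len(FIDELITY_ELEMENTS)

    # Example observation record for a single lesson.
    lesson_record = {element: True for element in FIDELITY_ELEMENTS}
    lesson_record["vocabulary reviewed at end of lesson"] = False

    print(f"Fidelity: {fidelity_score(lesson_record):.0%}")  # prints "Fidelity: 86%"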
Other Activities of Note
We had two chapters published in the National Science Teachers Association
(NSTA) book, Science for English Language Learners:
Echevarria, J., & Coburn, A. (2005). Designing lessons: Inquiry approach to science
using the SIOP model. In A. Fathman & D. Crowther (Eds.), English through science:
A guide for developing skills in English and science, grades K-8 (pp. 95-108).
Arlington, VA: National Science Teachers Association.
Short, D., & Their, M. (2005). Teaching and learning English and science. In A.
Fathman & D. Crowther (Eds.), English through science: A guide for developing skills
in English and science, grades K-8 (pp. 199-219). Arlington, VA: National Science
Teachers Association.
An article for secondary school principals was published by the National
Association of Secondary School Principals:
Echevarria, J. (2006). Helping English language learners succeed. Principal
Leadership, 6(5), 16-21. Reston, VA: National Association of Secondary School
Principals.
We have submitted proposals to present results of the pilot study at the October
2005 Washington Area Teachers of English to Speakers of Other Languages regional
conference, the November NSTA regional conference on science and English language
learners, and the March 2006 national Teachers of English to Speakers of Other
Languages conference.
Attachment A
National Research and Development Center for English Language Learners
Project 3
Appendix P3-A
Part A – SIOP Lesson Plan for Biomes
Part B – Sample Science Language Assessment
SIOP SCIENCE LESSON PLAN
SUBJECT: Life Science
UNIT FOCUS: Biome
Lesson #: 7
Length of lesson: 1 day
STANDARD(S): Virginia Standards of Learning, Life Science 10: The students will
investigate and understand how organisms adapt to biotic and abiotic factors in an
ecosystem.
LESSON TOPIC: Tundra Biomes
OBJECTIVES (write on board):
Language - Students will
• Define and visually represent new vocabulary
• Use new vocabulary in original sentences
• Read and take notes on main ideas
• Describe the tundra and the adaptations of its organisms orally and in writing
• Compare and contrast the characteristics of different biomes using transitional phrases
Content - Students will
• Identify the characteristics of tundra biomes
• Analyze an organism’s ability to adapt to its environment
KEY VOCABULARY: tundra, permafrost, dwarf, in contrast, on the contrary, by
comparison, conversely, lichen
MATERIALS: biome summary template, lesson 7 T-chart, lesson 7 mix and match
cards, index cards, markers, poster or chart paper, world map
--------------------------------------------------------------
MOTIVATION
**Read and explain the content and language objectives of this lesson to the students.
Warm Up - Review key concepts of previous lesson (10 minutes)
• Mix and Match
o Pass out mix and match cards to pairs of students
o Tell the students that the word on their card is either a biome or an organism
in a particular biome. Tell the students they need to find their match (e.g., one
card will say zebra and the matching card will say grassland)
o Pre-teach the language structures you want the students to use (e.g., “I have
gila monster, what do you have?”, “Ok, can we trade?” etc…) for the mixing
part. Demonstrate the card exchange process for the students.
o Students walk around and “mix” their cards while trying to find a match.
o Once the matches are complete, ask the students to sit down as a group and
write a sentence(s) on the Warm-Up sheet explaining how this animal is well
adapted to survive in its biome.
o Groups share answers with the class when everyone finishes.
Building Background (3 minutes)
• Show the students a world map and direct their attention to the area of the Arctic
Circle. Ask the students to brainstorm some ideas they have about this area
including climate and types of organisms that live there. Tell the students that this
area is in the tundra biome and we are going to learn more about it today.
PRESENTATION (15 minutes)
• Students skim the Tundra Biomes and Mountains and Ice sections to self-select any
words they do not know but think will be important in understanding the passage.
Students read and take notes on the T-chart in pairs. Go over the notes as a class
to check and confirm. Record for all the students to see.
• Make vocabulary cards for tundra, permafrost, dwarf, lichen
PRACTICE/APPLICATION (10 minutes)
• Students compare and contrast the tundra with another biome on the biome
summary template. Before students begin writing, pre-teach the grammar forms
and phrases they need to use for their comparison statements, including transition
phrases such as in contrast, on the contrary, by comparison, and conversely
• Ask some students to share their writing with the class.
REVIEW/ASSESSMENT (8 minutes)
• Carousel Review
1. List the biomes on chart paper and post the papers around the room (tropical rain
forest, temperate rain forest, desert, tundra, grassland, deciduous forest, and boreal
forest).
2. Put the students into groups and assign each one a poster and a marker. Ask
the students to write some information that they know about each biome on the
chart paper. If the groups find a piece of information on the chart paper that
they think is wrong or that they have a question about, tell the students to write a
question mark next to it.
3. Give the groups about 60 to 90 seconds at each poster and then tell the
students to move clockwise and repeat the activity.
4. When all groups have visited each poster, debrief the class on the activity and
answer any questions.
• Review objectives.
Tundra Biomes (T-chart: questions with a blank answer column)
1. What is the climate in the tundra?
2. What kinds of plants live in the tundra?
3. When do most plants grow? Why is this the best time?
4. What kinds of animals live in the tundra?
5. How have birds adapted to living in the tundra?
6. How have mammals adapted to living in the tundra?
7. What are some areas on Earth that cannot be classified as part of one biome?
8. What are some organisms that have adapted to living on ice?
© Center for Applied Linguistics 2006
Tundra Biomes - Answer key
1. What is the climate in the tundra?
   Cold and dry
2. What kinds of plants live in the tundra?
   Mosses, grasses, shrubs, and small trees like willows
3. When do most plants grow? Why is this the best time?
   During the summer, because the days are long with lots of sunshine and warmer temperatures
4. What kinds of animals live in the tundra?
   Many insects, birds, caribou, foxes, wolves, and hares
5. How have birds adapted to living in the tundra?
   When winter comes, they migrate south
6. How have mammals adapted to living in the tundra?
   Some have thick fur coats. Some animals scrape snow to find lichens, and animals like
   wolves follow herds of caribou to prey on the weak ones.
7. What are some areas on Earth that cannot be classified as part of one biome?
   Mountain ranges and places with a lot of ice
8. What are some organisms that have adapted to living on ice?
   Penguins, polar bears, and seals
© Center for Applied Linguistics 2006
Lesson 7 Biomes Mix and Match (set 1)
Directions to teacher: Print onto labels and affix one per index card. Card pairs (biome
and its matching organism):
tropical rain forest – tarantula
temperate rain forest – redwoods
desert – saguaro cacti
grassland – bison
deciduous forest – chipmunks
boreal forest – firs
2006 © Center for Applied Linguistics
Lesson 7 Biomes Mix and Match (set 2)
Directions to teacher: Print onto labels and affix one per index card. Card pairs (biome
and its matching organism):
tropical rain forest – birds, like toucans
desert – gila monster
grassland – small trees
deciduous forest – black bears
boreal forest – finches
2006 © Center for Applied Linguistics
Summary Template
Introductory paragraph:
Rain Forest Biomes:
Desert Biomes:
Grassland Biomes:
Deciduous Forest Biomes:
Boreal Forest Biomes:
2006 © Center for Applied Linguistics
Summary Template
Tundra Biomes:
Freshwater Biomes:
Marine Biomes:
2006 © Center for Applied Linguistics
Attachment B
Biomes Language Assessment
Read the following passage and then answer the questions.
An ecosystem is a community of different species living together and the abiotic
elements that affect them. A biome is a group of ecosystems with similar climates and
organisms. The location, temperature, and rainfall define an area’s biomes. On land, some of the
most common biomes are tropical rain forests, deserts, grasslands, deciduous forests, boreal
forests, and tundras.
Tropical rain forests are found in areas near the equator. Deserts can be found in dry
areas such as the Southwestern United States. Grasslands can be found in the Midwestern United
States and Africa. Boreal forests are found in subarctic areas such as Siberia and northern
Canada. Deciduous forests are found in temperate areas such as the Northeastern United
States. Tundras can be found in arctic or subarctic areas such as Alaska and Antarctica.
Tropical rain forests receive more than 300 cm of rain each year. Most grasslands receive
25-75 cm of rain per year, but grasslands located near the equator receive as much as 120 cm per
year. Deciduous forests receive at least 50 cm of rain per year. Boreal forests receive little rain
throughout the year, about 20-75 cm annually. Tundras receive very little rain, only about 15–25
cm each year.
Tropical rainforests are warm and humid. Grasslands are warm to hot. Deserts are hot and
dry. Deciduous forests are temperate. Boreal forests are colder than deciduous ones. Tundras are
extremely cold and dry.
Tropical rainforests have tall and medium-sized trees and vines, and plants that grow well
in the shade. Desert plants include plants that survive with very little water and plants that store
water in their leaves, roots, and stems. Plants in grasslands are mostly grass and non-woody
plants. Deciduous forests include mostly deciduous trees that shed their leaves and grow new
ones each year. Most trees in boreal forests are coniferous. They produce seeds in cones and
have leaves shaped like needles. Plants in tundras include mosses, grasses, shrubs, and dwarf
forms of a few trees.
I. The area around the Amazon River in South America receives more than 400 cm of rain every
year. This area must be a ________.
a) deciduous forest
b) grassland near the equator
c) tundra
d) tropical rain forest
II. Complete each sentence with one of the words in the list. Each word can only be used once.
ecosystem
biome
abiotic
community
1. ______ factors are the nonliving parts of an ecosystem.
2. A/an _______ contains different species living together.
3. The community and abiotic factors together form a/an __________.
4. A/an __________ is a group of ecosystems with similar climates and organisms.
III. Match the plants and their biomes. Write the letter of the biome next to the plant or plants
that live in it.
A. desert
B. deciduous forest
C. boreal forest
D. tropical rain forest
___ 1) Maple trees shed their leaves during autumn.
___ 2) Many of the trees have leaves with a “drip tip” that enables rain drops to fall off quickly.
___ 3) Pine trees have needle-shaped leaves.
___ 4) The saguaro cactus stores water in its body.
IV. Part of northern Senegal is changing from grassland to desert because of a sudden population
growth of humans, overgrazing by cattle, firewood gathering, and severe droughts. Scientists call
this process “desertification”. Predict what will happen to the organisms of northern Senegal
because of desertification. Write a short explanation.
V. Choose two biomes to compare. Using your own words, compare and contrast the two biomes
in terms of their rainfall, temperature, and types of plants. Write as much as you can. Use
scientific terms.
2006 © Center for Applied Linguistics