CONTRACT NO. ED-06-CO-0014
September 29, 2006
Submitted by:
PART A. JUSTIFICATION
This information collection is being conducted as one of the Task 2 Studies (Rigorous Applied Research and Development) of the 2005-2010 Regional Educational Laboratories Program. The current authorization for the Regional Educational Laboratories program is under the Education Sciences Reform Act of 2002, Part D, Section 174 (20 U.S.C. 9564), administered by the Institute of Education Sciences’ National Center for Education Evaluation and Regional Assistance.
Importance of Study
Partially in response to unacceptably high levels of student misbehavior and concern about low levels of endorsement of values consistent with good character — honesty, responsibility, and respect for self and others — character education has become one of the fastest-growing reform movements in K–12 education today (Williams, 2000). The majority of states mandate or recommend some aspect of character education, and such programs have high levels of support from parents, teachers, and administrators. Relatively few prospective randomized trials, however, have been conducted to examine the impact of character education programs on student behavioral and academic outcomes.
The goal of this proposed study is to evaluate the impact of a promising English Language Arts-based character education program — the Lessons in Character (LIC) program — on student academic performance, school behavior and motivation, and endorsement of values consistent with character education among elementary students. Since 1995, the LIC program has been implemented in over 15,000 schools in every state except Alaska. The core of the LIC program consists of literature-based supplementary curricular material designed to integrate easily into the existing English Language Arts (ELA) curriculum, with lessons aligned with state ELA standards. The infusion of LIC lessons into the ELA curriculum, and the resulting ease of implementation, distinguishes LIC from other character education programs.
The primary rationale for character education is the promotion of the ethical, social, and personal integrity of students. Proponents of character education argue that the nation benefits when its citizens subscribe to the ideals of respect for others, fairness and justice, honesty, responsibility, and civic participation. Character education programs are also promoted as a partial solution to the growing problem of student misbehavior at school, and the effect of such misbehavior on student learning. Correlational evidence drawn from years of research has shown that adolescent substance use, violence, crime, and antisocial behavior are closely connected with academic success and other school-related factors including reduced attention spans, lower investment in homework, more negative attitudes toward school, lower motivation, and increased absenteeism (see Hanson, Austin, & Bayha, 2004). Such factors may also adversely affect academic performance by influencing teaching and learning processes in the classroom. For example, Lochman, Lampron, Gemmer, and Harris (1987) found that students who were disruptive and aggressive in the classroom had a negative impact on their classmates’ education by diverting teachers’ attention and reducing instruction time (c.f. Bowen & Bowen, 1999).
Character education may also enhance student learning and achievement through its effects on the skills and habits necessary for academic achievement. Systematic character education programs typically focus on traits and behaviors – such as responsibility, accountability, perseverance, self-respect, and problem-solving resources – that have been logically linked to academic performance. By influencing the skills necessary for students to achieve in school, character education may improve academic achievement. Character education programs also typically aim to foster greater attachment to school and connectedness with teachers. Numerous studies suggest that school connectedness and caring relations with teachers are related to higher levels of school engagement, educational aspirations, achievement motivation, and academic achievement (Anderman, 1999; Connel & Halpern-Felsher, 1997; Murdock, Anderman, & Hodge, 2000; Resnick et al., 1997; Ryan & Patrick, 2001). Wentzel (1997) found that students who reported that their teachers care about them increased their work effort over the following year.
Because character education in general and the LIC curriculum in particular are logically and conceptually linked to a) knowledge, attitudes, and values related to good character; b) pro-social and anti-social behavior; and c) academic engagement and performance, we focus on program effects in each of these dimensions in the impact evaluation. More specifically, this study is guided by the following research questions:
Is participation in LIC effective at raising student achievement, improving attendance, and reducing disciplinary referrals?
Do students who participate in LIC demonstrate more positive character traits and behaviors, and greater social skills, compared to control group students?
To answer the above research questions, this study will implement a randomized controlled trial (RCT) to examine the impact of LIC on student academic performance, attendance, school motivation, and endorsement of universal values consistent with character education. The detailed research design, data collection procedures and timeline, and data analysis plan are presented below.
Research Design
Design Overview and Timeline. Exhibit 1 below and Appendix A show general and detailed timelines for the study. The LIC impact evaluation is a three-year study scheduled to begin in Fall 2006, when final recruitment of the sites, refinement of the design, and finalization of instrumentation are scheduled. Implementation will take place in the 2007/08 and 2008/09 academic years, with teacher professional development and coaching scheduled for early Fall 2007. The study population will consist of approximately 15,000 2nd-5th grade students in 50 schools in California and Arizona. No character education professional development activities or coaching will take place during the second year of implementation. The final six months of the study will be devoted to data analyses, manuscript preparation, and dissemination activities.
Exhibit 1. Overview of Study Timeline

| Period | Activities |
| July 06 – Sept 06 | Revise Study Design, Develop Instruments, Revise Protocol, OMB & IRB Submission |
| Oct 06 – March 07 | School Recruitment, Informed Consent |
| April 07 – May 07 | Baseline Student and Teacher Data Collection, Random Assignment |
| Aug 07 – June 08 | 1st Year Implementation, Process Data Collection, Post-intervention Data Collection |
| Aug 08 – June 09 | Interim Analyses/Report, 2nd Year Implementation, Post-intervention Data Collection |
| Aug 09 – Dec 09 | Final Analyses/Report |
To maximize the student sample size available for estimating multi-year exposure impacts, all 2nd-5th grade teachers will be recruited in each of the 50 participating schools. All participating schools will have to agree to the data collection activities described below and to make available routinely collected student data on standardized test scores, attendance, and disciplinary referrals. In addition, teachers randomly assigned to the control group must agree to refrain from implementing character education interventions.
Exhibit 2 below depicts the research design. As mentioned above, 50 schools will be randomly assigned to treatment or control conditions, with schools serving as the unit of randomization. Although the bulk of the LIC program is tightly infused into the English Language Arts curriculum, school-wide aspects of the program, such as the integration of character education into the school discipline policy, will likely be fully reflected in the impact estimates because schools serve as the unit of random assignment.
The bottom panel of Exhibit 2 depicts the design with respect to students. The design enables estimation of single-year program impacts for each grade, and estimation of multi-year LIC exposure impacts for students who were in the 2nd, 3rd, or 4th grade during 2007/08.
The control group represents the treatment-as-usual conditions. Control group members will be exposed to the regular ELA curriculum in their schools, and will be barred from implementing LIC in their classrooms. It is possible that schools or teachers may change practices because they were assigned to the control condition. A monitoring system will be put in place to assess professional development activities, curriculum practices, and/or other “intervention-like” activities in both treatment and control conditions to better interpret observed program impacts, or lack thereof, as well as to document the treatment contrast.
As discussed in more detail in Part B of this document, the study is powered to detect small grade- and cohort-specific program impacts. The study relies on mixed-modeling procedures (see Impact Analysis section below) to detect treatment effects on student outcomes. Exhibit 3 summarizes the study characteristics.
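The minimum detectable effect size (MDES) targets cited here follow from the standard approximation for two-level cluster-randomized designs. The sketch below is illustrative only: the intraclass correlation, covariate R², and per-school sample size are assumed values for demonstration, not the study's actual design parameters.

```python
from math import sqrt

def mdes_cluster(J, n, rho, R2, M=2.8, p=0.5):
    """Approximate MDES for a two-level cluster-randomized trial.

    J   -- number of schools randomized
    n   -- students per school in the analysis
    rho -- intraclass correlation (between-school share of variance)
    R2  -- between-school variance explained by baseline covariates
    M   -- multiplier, roughly 2.8 for alpha = .05 (two-tailed), 80% power
    p   -- proportion of schools assigned to treatment
    """
    return M * sqrt(rho * (1 - R2) / (p * (1 - p) * J)
                    + (1 - rho) / (p * (1 - p) * J * n))

# Assumed values: 50 schools, ~75 students per grade per school,
# rho = 0.15, a pretest explaining half the between-school variance
print(round(mdes_cluster(J=50, n=75, rho=0.15, R2=0.5), 3))  # -> 0.233
```

With stronger pretest adjustment (higher R²) or more randomized schools, the MDES shrinks, which is how cluster designs of this size can reach targets in the 0.17-0.23 range.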
Exhibit 2. Lessons in Character (LIC) Experimental Design

Teachers

| Grade / Group | Spring 07 | Fall 07 | 2007/08 Condition | Spring 08 | Fall 08 | 2008/09 Condition | Spring 09 |
| 1st Grade (Groups #1 & #2) | O | | | O | | | |
| 2nd Grade, Group #1 | O | | PD+CharEd | O | | CharEd | O |
| 2nd Grade, Group #2 | O | | TxU | O | | TxU | O |
| 3rd Grade, Group #1 | O | | PD+CharEd | O | | CharEd | O |
| 3rd Grade, Group #2 | O | | TxU | O | | TxU | O |
| 4th Grade, Group #1 | O | | PD+CharEd | O | | CharEd | O |
| 4th Grade, Group #2 | O | | TxU | O | | TxU | O |
| 5th Grade, Group #1 | O | | PD+CharEd | O | | CharEd | O |
| 5th Grade, Group #2 | O | | TxU | O | | TxU | O |

Students

| Grade / Group | Spring 07 | Fall 07 | 2007/08 Condition | Spring 08 | Fall 08 | 2008/09 Condition | Spring 09 |
| 1st Grade (Groups #1 & #2) | O | | | O | | | |
| 2nd Grade, Group #1 | O | | CharEd | O | | CharEd | O |
| 2nd Grade, Group #2 | O | | TxU | O | | TxU | O |
| 3rd Grade, Group #1 | O | | CharEd | O | | CharEd | O |
| 3rd Grade, Group #2 | O | | TxU | O | | TxU | O |
| 4th Grade, Group #1 | O | O | CharEd | O | O | CharEd | O |
| 4th Grade, Group #2 | O | O | TxU | O | O | TxU | O |
| 5th Grade, Group #1 | | O | CharEd | O | O | CharEd | O |
| 5th Grade, Group #2 | | O | TxU | O | O | TxU | O |

O = Observations or measurement points
PD = Professional Development/Coaching Condition in Lessons in Character
CharEd = Lessons in Character implementation
TxU = Treatment as Usual Condition
Shaded areas correspond to student cohorts tracked/analyzed across two years of program exposure/non-exposure to Lessons in Character
Exhibit 3. LIC Study Characteristics

| Study Design | Cluster-randomized trial |
| Unit of Assignment | Schools |
| Sample Characteristics | 15,000 students / 50 schools; schools assigned to the intervention condition will implement LIC in grades 2-5 |
| Statistical Power Estimates | For Type I error = .05, 80% or higher power to detect MDES of 0.23 for academic outcomes and 0.17 for behavioral outcomes within each school grade; MDES of 0.23-0.28 for subgroup analyses |
| Implementation Begins | Winter 2007 |
Study Outcomes. The evaluation relies on school archival data, student surveys of 4th and 5th graders, parent surveys, teacher surveys, and teacher interviews to measure student outcomes and implementation fidelity. Exhibit 4 describes the outcome measures used in the analysis. With the exception of survey items asking specific questions about LIC implementation, all measures will be collected in both treatment and control sites.
Standardized Achievement Tests. Students’ standardized ELA and Mathematics achievement data from district-administered (or state-mandated) standardized tests will be collected for the years before and during exposure to the study teachers. Unfortunately, Arizona and California administer different tests to 2nd-5th grade students. Arizona administers the Terra Nova and Arizona’s Instrument to Measure Standards (AIMS) to its elementary school students, while California administers the California Standards Test (CST). Although these tests measure the same general constructs, they differ in content emphasis, item sampling, and item difficulty. To convert the scores to a common metric so that test score data from all sites can be analyzed together, all test scores will be standardized by subtracting the state mean from each student’s score and dividing by the state standard deviation (at baseline). This is analogous to techniques used in meta-analysis to pool the results of studies using alternative measures of similar constructs. However, this technique is far from ideal.
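As a minimal sketch of this standardization step (the raw scores below are made up; in practice the state mean and standard deviation would come from the baseline statewide distribution, not from the study sample):

```python
from statistics import mean, pstdev

def standardize(scores, state_mean, state_sd):
    """Convert raw test scores to state z-scores so Arizona and
    California results can be placed on a common metric."""
    return [(s - state_mean) / state_sd for s in scores]

# Hypothetical raw scale scores from one state
raw = [310, 350, 290, 400, 360]
z = standardize(raw, state_mean=mean(raw), state_sd=pstdev(raw))
print([round(v, 2) for v in z])
```

Because each state's scores are centered and scaled separately, a z of 1.0 means "one state standard deviation above the state mean" regardless of which test produced it.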
Course Grades, Attendance, and Disciplinary Referrals. School and district records on course grades, student attendance, and disciplinary referrals will be gathered across all sites. Prior to collection of data from school and district records, a form designed to inventory student information kept in school records will be distributed to appropriate staff. Data from this form will be used to determine which records are available for all schools in the sample.
Character Traits and Behavior Survey. A 35-minute survey assessing behaviors, attitudes, and values consistent with the goals of character education will be administered to all 4th and 5th graders during the spring semester of the year prior to implementation and at the end of both implementation years. Using items and subscales from existing validated instruments, the survey will assess student empathy (Funk et al. 2003), altruism (Characterplus 2002), school engagement (Furrer & Skinner 2003), aggression (Opinas & Frankowski 2001), delinquent behavior (Kisker et al. 2004), autonomy and influence, and competence (Characterplus 2002).
Social Skills Rating System (SSRS). Teacher and parent reports on the SSRS (Gresham & Elliott, 1990) will be used to assess student social skills, problem behaviors, and academic competence. The SSRS-Teacher and SSRS-Parent are 57-item multidimensional instruments assessing student social and academic functioning. The instruments have good psychometric properties, good change sensitivity, and the SSRS-Teacher has shown positive results in an evaluation of the Responsive Classroom character education program (Elliott, 1995). The SSRS assesses the sub-domains of cooperation, responsibility, empathy, self-control, externalizing problems, and internalizing problems. Teacher and parent SSRS reports will be obtained from a random sample of 10 students in each classroom.
Teacher Surveys. We also plan to survey all intervention and control group teachers prior to random assignment during the spring of 2007 (pre-test) and in the spring of 2008 and 2009 (post-test). The pre-test survey will assess teacher education, professional development experiences in language arts and character education, and the language arts curriculum used. The post-test survey will also contain questions about additional professional development that the teachers participate in during the implementation year. Teachers in the intervention condition will be asked how they prepared and followed up with their lessons, and the frequency with which lessons were delivered.
Teacher Implementation Logs. Teachers assigned to the treatment group will fill out weekly logs that assess the frequency with which LIC, DOL, and WWC lessons were delivered.
Teacher Interviews. For a sample of 10-14 teachers in the treatment group, a semi-structured interview (by phone) will be conducted during the spring of each implementation year to assess implementation fidelity, including changes made and barriers encountered, and descriptions of the ways they and their students may have benefited (or not) from the program. For this sample of teachers, we will develop a set of case studies to document changes in practices resulting from program participation.
Parent Surveys. In addition to the SSRS-Parent questionnaire, a brief baseline survey will be administered to the primary caregivers of students to assess demographic and socioeconomic characteristics.
Administrator Interviews. Using a procedure analogous to that used in the Social and Character Development Impact Evaluation being conducted by Mathematica Policy Research, associate principals or others most knowledgeable about school-wide programs in both treatment and control schools will be interviewed to gather information about school activities closely aligned with character education, including the implementation of other character education programs, substance use and violence prevention programs, life skills programs, civics and/or citizenship programs, conflict resolution programs, and related activities.
Exhibit 4. Measurement Matrix of Outcome Variables used in the Evaluation

Student Outcome Measures
| Construct | Items | Source | Alpha Reliability | Reference |
| Standardized Achievement Tests | NA | School Records | NA | NA |
| Course Grades | NA | School Records | NA | NA |
| Attendance | NA | School Records | NA | NA |
| Disciplinary Referrals | NA | School Records | NA | NA |
| Empathy (reactive & anticipatory) | 16 | Student | 0.72 | Funk et al. (2003) |
| Altruism | 5 | Student | 0.86 | Characterplus (2002) |
| School Engagement (two scales) | 10 | Student | 0.75, 0.86 | Furrer & Skinner (2003) |
| Aggression | 11 | Student | 0.88 | Opinas & Frankowski (2001) |
| Delinquent Behavior | 13 | Student | 0.71 | Kisker et al. (2004) |
| Students’ Feelings of Belonging | 12 | Student | 0.87 | Characterplus (2002) |
| Students’ Feelings of Belonging | 12 | Teacher | 0.89 | Characterplus (2002) |
| Autonomy & Influence | 5 | Student | 0.79 | Characterplus (2002) |
| Competence | 9 | Student | 0.76 | Characterplus (2002) |
| Academic Competence | 9 | Teacher | 0.95 | Gresham & Elliott (1990) |
| Assertion | 10 | Teacher/Parent | 0.86 / 0.74 | Gresham & Elliott (1990) |
| Cooperation | 10 | Teacher/Parent | 0.92 / 0.77 | Gresham & Elliott (1990) |
| Responsibility | 10 | Parent | 0.65 | Gresham & Elliott (1990) |
| Self-control | 10 | Teacher/Parent | 0.91 / 0.80 | Gresham & Elliott (1990) |
| Externalizing | 6 | Teacher/Parent | 0.88 / 0.75 | Gresham & Elliott (1990) |
| Internalizing | 6 | Teacher/Parent | 0.78 / 0.71 | Gresham & Elliott (1990) |

Classroom/Teacher Measures
| Construct | Items | Source | Alpha Reliability | Reference |
| School Expectations | 5 | Student | 0.87 | Characterplus (2002) |
| School Expectations | 5 | Teacher | 0.94 | Characterplus (2002) |
| Parent & Staff Relations | 7 | Teacher | 0.88 | Characterplus (2002) |
| Staff Culture of Belonging | 10 | Teacher | 0.91 | Characterplus (2002) |
|
Data Collection Procedure and Timeline
This evaluation study relies on three sources of outcome data: 1) routinely collected district and school archival standardized test scores, attendance, and disciplinary referrals data; 2) Character Traits and Behavior surveys administered to all 4th and 5th graders during both implementation years; and 3) teacher- and parent-reported SSRS data collected from 12 randomly sampled students per class in all 1st-4th grade classes. To enable longitudinal tracking, 1st-4th grade students with positive parental consent will be sampled for the SSRS during the 2007 spring semester, with the exception of a new cohort of 1st grade students that will be tracked beginning in spring 2008. We expect to obtain valid pre-intervention and post-intervention data for more than 75% of SSRS focal students sampled. We will also collect data from teachers to monitor the fidelity of implementation and to monitor conditions in control-group classes. Exhibit 5 depicts the data collection schedule.
Collection of Student Archived Data. Course grades, attendance, and disciplinary referrals will be collected for each student participating in the study. A memorandum of agreement will be obtained by WestEd in which the districts agree to provide archival student records to WestEd for the purposes of research. District data specialists will be contacted at the beginning of the study to alert them to the data elements needed. It is anticipated that the districts will be able to provide an electronic database containing attendance, course grades, and disciplinary infractions for students identified for the study. WestEd will request copies of the standardized test score data disks provided to the districts by the state in the late summer/early fall of each year.
Collection of Student Survey Data. Recruitment of 4th-5th grade students will begin approximately 45 days prior to survey administration. Active parent and student consent will be required for participation. For the spring 2007 survey administration (and subsequent administrations), parental consent forms will be distributed to on-site coordinators to distribute to sampled students. Students will be instructed to return parental consent forms to coordinators, who, in turn, will send accumulated consent forms to WestEd via express mail at selected intervals prior to survey administration. Return of consent forms by students in their classrooms has been found to yield higher rates of consent form return and agreement to participate than mailing consent forms to parents’ homes and having parents mail them back (McMorris et al. 2004). Such a strategy also results in fewer IRB and district policy complications brought about by obtaining student address information. All consent forms will be immediately input into WestEd’s online Consent Manager database program using bar code technology. Developed to support WestEd’s many survey research and evaluation projects using written parent consent, this program reduces the labor required to monitor consent, helps improve the positive consent rate, and reduces the error rate. For students not returning parental consent forms, follow-up distribution will occur at weekly intervals. Phone calls and direct mailing will be used if there is no response by the 20th day prior to scheduled survey administration. To ensure on-site cooperation, each school and coordinator will receive a nominal stipend for their assistance with the consent forms. School staff will not be involved with survey administration.
Survey administration will proceed as follows. Two or more staff will administer the surveys. Follow-up administration for absent students will occur within 2 days, and at periodic intervals thereafter if necessary. Each student (with parental consent) will be provided with an assent letter prior to filling out the survey. By signing the assent letter, students affirm their willingness to participate in the survey. Survey staff will retrieve student assent forms before survey administration, and students will place the completed questionnaire in an unmarked envelope collected by survey staff.
Given the longitudinal nature of the study, it is necessary to link data across survey administrations. To track students, a unique, arbitrary numeric code will be placed on the student consent form and on the survey in the form of an infrared barcode. Following survey administration, survey assistants will use participant rosters from the Consent Manager database, which include student identification numbers, and the signed student assent letters to match the survey data to student identification numbers. Non-respondents will be identified immediately for follow-up scheduling. Participants will be assured of the confidentiality of their responses. This procedure has been used successfully in other projects at WestEd using school-based surveys.
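A minimal sketch of the barcode-based linkage described above; Consent Manager is an internal WestEd system, so the record layout and field names here are hypothetical:

```python
# Hypothetical roster exported from the consent-tracking database:
# survey barcode -> arbitrary study ID assigned at consent
roster = {"BC1001": "S-042", "BC1002": "S-077", "BC1003": "S-113"}

# Hypothetical scanned survey records keyed by the survey's barcode
surveys = [
    {"barcode": "BC1001", "empathy": 3.4},
    {"barcode": "BC1003", "empathy": 2.9},
]

# Attach study IDs so responses can be linked across administrations;
# barcodes with no returned survey are flagged for follow-up scheduling
linked = [{"student_id": roster[s["barcode"]], **s} for s in surveys]
nonrespondents = sorted(set(roster) - {s["barcode"] for s in surveys})

print([r["student_id"] for r in linked])  # matched study IDs
print(nonrespondents)                     # barcodes needing follow-up
```

Keeping the barcode-to-ID map separate from the survey data is what allows responses to remain confidential while still being linkable longitudinally.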
Collection of Parent Survey Data. On-site coordinators will also distribute parent questionnaires and SSRS-P forms (combined in one survey) in sealed envelopes to students in the SSRS sample, who will deliver forms to parents. Parents will mail completed questionnaire and SSRS-P forms to WestEd using a postage paid envelope. Follow-up distribution will occur at bi-weekly intervals. Phone calls and direct mailing will be performed if there is no response after one month of the original distribution. Telephone interviews will be conducted with survey non-respondents to maximize response rates.
Collection of Teacher SSRS and Survey Data. Teachers will fill out surveys and SSRS-T forms for sampled students during the three-day period in which student surveys are being administered. We estimate that an average of 3 hours will be needed to complete both the surveys and SSRS-T forms. Teachers will be provided with postage-paid express mail envelopes to deliver SSRS-T forms if they have been unable to complete the forms during the scheduled 3-day period. Follow-up will occur at weekly intervals.
Collection of Teacher Implementation Logs. Teachers assigned to the treatment group will fill out checklists that assess the frequency with which LIC, DOL, and WWC lessons were delivered. The first set of completed data forms will be returned to WestEd via express mail two weeks after classroom implementation has begun, and every two months thereafter. The two-week data collection will be used to ensure that teachers are filling out the forms appropriately from the start.
Collection of Teacher Interview Data. In treatment schools, a sample of 10-14 teachers implementing LIC will be invited to participate in an interview designed to collect qualitative data about their experience with the program. These interviews will be scheduled through the on-site coordinator at the end of each school year to coincide with student survey data collection efforts. Two data collectors will conduct the telephone interviews at times the teachers have identified as convenient. One data collector will ask the questions while the second takes notes. Additionally, teachers will be asked for permission to audiotape the phone interviews. The audiotapes will be transcribed by the data collector who conducted the interviews and entered into an electronic data file.
Collection of Principal/Administrator Interview Data. In both treatment and control schools, in-person interviews with school principals/administrators will be conducted during the school visits made by data collection staff. These interviews will gather information about school activities related to character education, including the implementation of the LIC program itself.
Exhibit 5. Data Collection Schedule

| Measure | 2006/07 | 2007/08 | 2008/09 |
| Student Outcome Measures | | | |
| Standardized Achievement Tests | Spring | Spring | Spring |
| Course Grades (ELA, Mathematics) | Spring | Fall/Spring | Fall/Spring |
| Attendance/Disciplinary Referrals | Spring | Spring | Spring |
| Character Traits Survey | | Fall & Spring | Fall & Spring |
| Teacher SSRS | Spring | Spring | Spring |
| Parent SSRS (including parent demographic information) | Spring | Spring | Spring |
| Teacher Practice/Fidelity Measures | | | |
| Teacher Surveys | Spring | Spring | Spring |
| Teacher Implementation Logs | | Spring | Spring |
| Teacher Interviews | | Spring | Spring |
| Administrator Interviews | | Spring | Spring |
Impact Analysis
The analysis of program impacts will rely on the random assignment research design as the primary source of inference. Adjusted post-intervention outcomes for students in the treatment group will be compared to the outcomes for their counterparts in the control group. The primary hypothesis-testing analyses will involve fitting conditional multilevel regression models (i.e., HLM – hierarchical linear modeling), with additional terms to account for the nesting of individuals within higher units of aggregation (see Goldstein, 1987; Raudenbush & Bryk, 2002; Murray, 1998). The study involves school-level random assignment and delivery of training courses to teachers within treatment schools, who in turn incorporate 24 supplementary lessons into their classroom instruction during the academic year. The design thus involves clustering at the school and classroom levels, as students are nested within teachers and teachers are nested within schools. A random effect of school site and fixed effects for teachers will be included in the model to account for the nesting of observations within schools and teachers, respectively. Other fixed effects include treatment group, baseline (pre-test) measures of outcome variables, strata, and other individual and aggregate school-level covariates. The purpose of including statistical controls is to minimize random error and to increase the precision of the impact estimates.
As an illustrative example, consider the following two-level HLM for a continuous outcome:
Charact_i:j = β0 + β1 Pre_i:j + β2 Tx_j + Σ βI I_i:j + Σ βT T_i:j + Σ βS S_j + μ_j + ε_i:j   [1]
where subscripts i and j denote student and school, respectively; the nesting is reflected by the colons (:); Charact represents the student outcome variable; Pre represents the baseline measure of the outcome variable; Tx is a dichotomous variable indicating attendance at a school assigned to the treatment condition; I represents a vector of student-level control variables measured prior to random assignment; T represents a set of dichotomous variables representing fixed effects for teachers; and S is a set of school-level control variables (e.g., strata). Lastly, ε_i:j and μ_j are error terms for individual sample members and schools, respectively. In this model, the intervention effect is represented by β2, which captures treatment/control school differences in the outcome variable, adjusted for the pretest.
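To make the interpretation of β2 concrete, the fixed-effects portion of model [1] can be evaluated for two hypothetical students; all coefficient values below are invented purely for illustration:

```python
def predicted_outcome(pre, tx, b0=2.0, b1=0.6, b2=0.25):
    """Fixed-effects part of model [1] for one student, with the
    teacher/school covariate sums and the error terms set to zero.
    pre -- baseline (pretest) score; tx -- 1 if in a treatment school
    """
    return b0 + b1 * pre + b2 * tx

# Two students with identical pretests, differing only in assignment
control = predicted_outcome(pre=3.0, tx=0)
treated = predicted_outcome(pre=3.0, tx=1)
print(treated - control)  # the adjusted treatment effect, beta_2
```

Holding everything else fixed, the difference between the two predictions is exactly β2, which is why that single coefficient carries the impact estimate.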
Simple extensions to model [1] allow us to examine differential effectiveness across subgroups by including interactions between treatment status and one of the variables in I or S. Model [2], for example, shows how we can estimate separate program effects for boys and girls:
Charact_i:j = β0 + β1 Pre_i:j + β2B Tx_j Boy_i:j + β2G Tx_j Girl_i:j + Σ βI I_i:j + Σ βT T_i:j + Σ βS S_j + μ_j + ε_i:j   [2]
The only difference between this model and [1] is that the term β2 Tx_j is replaced by two terms that interact the program variable Tx_j with the dichotomous indicators Boy and Girl. Program impacts on boys and girls are captured by the coefficients β2B and β2G, respectively. By statistically testing the hypothesis β2B = β2G, we can then establish whether program impacts are statistically different for boys and girls. Similar subgroup analyses will be possible across school-level variables, although the statistical power of such higher-level subgroup analyses is very limited. While we have no a priori hypotheses regarding differential impacts, such subgroup analyses will be important for yielding information on program effectiveness for students of varying SES levels, ethnicity, and gender. Moreover, interactions between pretest scores and the program variable will allow us to capture where in the distribution of student outcomes program-related gains come from.
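Testing β2B = β2G reduces to a Wald-type test on the difference between two estimated coefficients. A sketch with hypothetical estimates and standard errors, assuming for simplicity that the two estimates are independent (the fitted model would supply their covariance):

```python
from math import sqrt, erf

def wald_z(b1, se1, b2, se2):
    """z statistic for H0: the two coefficients are equal, treating the
    estimates as independent (a simplification of the full Wald test)."""
    return (b1 - b2) / sqrt(se1**2 + se2**2)

def two_sided_p(z):
    """Two-sided p-value from the standard normal, via the error function."""
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical subgroup impacts: 0.20 SD for boys, 0.10 SD for girls
z = wald_z(0.20, 0.06, 0.10, 0.06)
print(round(z, 2), round(two_sided_p(z), 3))
```

With these made-up values the subgroup gap is not statistically significant, illustrating the point above that subgroup contrasts demand considerably more power than the overall impact estimate.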
Technology will be used in a variety of ways during the data collection process. First, basic contact information about the schools in which teachers work will be maintained in an electronic database created by the WestEd evaluation team. The evaluation team will use this database to keep track of school/teacher contact information and other information used to manage the study. Technology (e.g., Consent Manager) will also be used to link student data from surveys directly to analytic datasets without re-keying data, which saves time and reduces the chance of errors during data input.
Second, communication between the evaluation team and selected school officials and/or teachers will occur through email, fax, and conference calls that take advantage of information technology and reduce burdens associated with paperwork. The communication will cover initial inquiries, the exchange of preliminary information, the scheduling and planning of site visits, and the review of draft reports.
Throughout the study, a toll-free number and email addresses will be available to respondents to allow them to contact the evaluation team with any questions or requests for assistance. This information, along with the names of contact persons on the evaluation team at WestEd will be printed on all data collection instruments.
Each instrument will be carefully reviewed to ensure that we collect only the information necessary for this study. Secondary information, such as student standardized test scores, will be accessed through existing electronic databases at the school (or district) level.
The evaluation team will collect data from only a few small entities, as most data will come from teachers and students. The few small entities involved are likely to be the external technical assistants and consultants who may assist with data entry and instrument scoring. Only minimal information will be needed from these entities, so no significant impact on small entities is expected.
The data collection efforts in this study will allow researchers to examine the impact of LIC on student academic performance as well as on student character traits and behaviors. As indicated earlier, character education has become one of the most rapidly growing reform movements in K–12 education, in response to unacceptably high levels of student misbehavior and concern about low levels of endorsement of values consistent with good character. Character education programs have been implemented in the majority of states, and such programs have high levels of support from parents, teachers, and school administrators. However, relatively few prospective randomized controlled trials (RCTs) have been conducted to gather evidence about the effectiveness of the LIC program. An experimental design is considered the strongest design when the goal of a study is to establish a causal relationship (Trochim, 2001, p. 189). The current study, an RCT, aims to establish such a relationship between the LIC program and various student outcomes. Failure to collect data under this experimental design would greatly limit our ability to draw cause-and-effect inferences about LIC implementation.
This information collection fully complies with 5 CFR 1320.5(d)(2).
A notice about the study will be published in the Federal Register when the final OMB package is submitted.
The evaluation team will seek the expertise of persons outside the agency through the creation of a Technical Working Group (TWG). The TWG will provide consultation on the design, implementation, and analysis of this study, as well as on the entire portfolio of Regional Educational Laboratory West (REL West) studies. Members are expected to consult with REL West for five days per year through a combination of in-person and teleconferenced meetings, and an honorarium of $1,200 will be paid to each TWG member. The TWG will play an important role in providing insight and guidance in support of a successful evaluation. The TWG members are listed below:
• Professor Jamal Abedi, CRESST, University of California, Davis
• Dr. Lloyd Bond, Carnegie Foundation for the Advancement of Teaching
• Professor Geoffrey Borman, University of Wisconsin
• Professor Brian Flay, Oregon State University
• Professor Tom Good, University of Arizona
• Dr. Corinne Herlihy, Manpower Demonstration Research Corporation (MDRC)
• Dr. Joan Herman, CRESST, University of California, Los Angeles
• Professor Heather Hill, University of Michigan
• Dr. Roger Levine, American Institutes for Research (AIR)
• Dr. Jason Snipes, Council of the Great City Schools
Teachers will be compensated $100 for the time it takes (approximately 3 hours) to complete the child assessments each year. Parents will be compensated $10 for each of three rounds of data collection.
WestEd staff will comply with the Privacy Act for all individual and school/teacher/parent data collected in the study. All data will be carefully handled in a responsible manner so they are not seen by or released to anyone not working on the project. Data will be reported in a summary fashion so no specific individuals or schools/teachers/parents may be identified. Finally, all data will be maintained in secure and protected files that do not include personally identifying data.
No information will be collected that would identify individual participants. Participants will not be referenced by either their name or their position title. An explicit statement regarding confidentiality will be communicated to any and all participants.
The data security protections for this study receive continuing review by WestEd’s Institutional Review Board. Below is an overview of WestEd’s data security policy.
Policies for Class 1 Data (Confidential data with identifying information)
(1) Can never leave WestEd premises.
(2) Always kept in a secure place.
(3) Only authorized persons can access and use.
(4) Must be properly disposed of or transferred.
Exhibit 6. Procedures for Handling Class 1 Data

| Procedure | Electronic Data | Paper Data |
| Receipt and tracking of Class 1 materials | | |
| Can never leave WestEd premises | | |
| Create separate working analysis file | | |
| Always kept in a secure place | | |
| Only authorized persons can access and use | | |
| Must be properly disposed of or transferred | | |
Policies for Class 2 Data (proprietary data and documents that are not Class 1) are:
(1) Only authorized persons can access and use.
(2) Must be used and stored under responsible person's oversight. Must not be left in public view (e.g. sitting out on a desk, open on computer monitor).
No questions will be asked that are of a sensitive nature.
The estimated total response burden is about 46,703 person-hours, assuming no data attrition across the three years of program implementation (the burden estimates presented here therefore represent the maximum burden). This total represents the sum of the estimated burden for all portions of the study. Exhibit 7 aggregates the estimated total hours and costs to participants of this study.
Exhibit 7. Aggregate Respondents and Hour Burden

| Task | Number of Respondents | Hour Burden | Monetary Burden |
| Sampling/Gaining Cooperation | 49,420 | 15,325 | $313,950 |
| Student Data Collection | 30,000 | 17,500 | $0 |
| Teacher Data Collection | 1,950 | 6,028 | $194,340 |
| Parent Data Collection | 23,400 | 7,800 | $156,000 |
| Administrator Data Collection | 100 | 50 | $1,800 |
| TOTAL | 105,020 | 46,703 | $666,090 |
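The hour and cost totals in Exhibit 7 follow directly from the component rows; a minimal Python sketch verifying those two sums:

```python
# Rows from Exhibit 7: (task, hour burden, monetary burden in dollars).
rows = [
    ("Sampling/Gaining Cooperation",  15_325, 313_950),
    ("Student Data Collection",       17_500,       0),
    ("Teacher Data Collection",        6_028, 194_340),
    ("Parent Data Collection",         7_800, 156_000),
    ("Administrator Data Collection",     50,   1_800),
]

total_hours = sum(hours for _, hours, _ in rows)   # 46,703 person-hours
total_cost = sum(cost for _, _, cost in rows)      # $666,090
```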
Sampling and Gaining Cooperation. At the outset of the study, data collection will be initiated by sending a listserv message to all school districts in California and Arizona serving students in grades 1 through 5, and to all Title IV coordinators, notifying them about the study and requesting participation from schools within their respective districts and counties. We believe that this approach will increase efficiency during the study and reduce the overall burden during the recruitment process. Responses to the listserv message will be reviewed to ensure that schools meet the criteria for selection into the study, as described later in B1, “Respondent Universe / Sampling Methods.” The evaluation team will contact the principals of qualifying schools to ensure that sufficient numbers of teachers within each school are willing to participate in the study and that student-level data on school grades, attendance, and state achievement tests will be made available to the evaluation team. In addition, the evaluation team will provide specific information about the process of obtaining parental consent (so that the evaluation team will be able to gather student-level data through various surveys).
Exhibit 8 lists the estimated time and cost of burden associated with each task during the sampling and gaining cooperation process. The number of principals/teachers/parents corresponds to the number of schools that will be recruited based on the study design and power estimates.
Exhibit 8. Estimated Burden for Sampling and Gaining Cooperation

| Task | Type of Respondent | Number | Time Estimate (in hours) | Total Hours | Hourly Rate | Estimated Cost of Burden |
| Sampling Tasks | District administrators | 10 | 1 | 10 | $45 | $450 |
| Gaining Cooperation | School Principals | 50 | 1 | 50 | $36 | $1,800 |
| Gaining Cooperation | Teachers | 600 | 1 | 600 | $30 | $18,000 |
| Obtaining Consent from Parents | Parents | 15,000² (year 1); 18,750³ (year 2); 15,000² (year 3) | 0.3 | 14,625 | $20 | $292,500 |
| Gaining Cooperation | District Data Specialists | 10 | 4 | 40 | $30 | $1,200 |
| TOTAL | – | 49,420 | – | 15,325 | – | $313,950 |
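The parental-consent row aggregates three annual consent rounds at 0.3 hours (18 minutes) per parent; its 14,625-hour and $292,500 figures can be reproduced as follows:

```python
# Parents asked for consent each year, from the sampling/cooperation table.
parents_per_year = [15_000, 18_750, 15_000]  # years 1-3
minutes_each = 18                            # 0.3 hours per consent form
hourly_rate = 20                             # parent hourly rate, dollars

total_hours = sum(parents_per_year) * minutes_each / 60  # 14,625 hours
total_cost = total_hours * hourly_rate                   # $292,500
```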
| Task | Type of Respondent | Number | Time Estimate (in minutes) | Total Hours | Hourly Rate | Estimated Cost of Burden |
| Student Data Collection in the Fall of 2007 | Students | 7,500 | 35 | 4,375 | $0 | $0 |
| Student Data Collection in the Spring of 2008 | Students | 7,500 | 35 | 4,375 | $0 | $0 |
| Student Data Collection in the Fall of 2008 | Students | 7,500 | 35 | 4,375 | $0 | $0 |
| Student Data Collection in the Spring of 2009 | Students | 7,500 | 35 | 4,375 | $0 | $0 |
| TOTAL | – | 30,000 | – | 17,500 | – | $0 |
| Task | Type of Respondent | Number | Time Estimate (in minutes) | Total Hours | Hourly Rate | Estimated Cost of Burden |
| Teacher Data Collection in the Spring of 2007 | Teachers | 750 | 180 | 2,250 | $30 | $67,500 |
| Teacher Data Collection in the Spring of 2008 (1st Grade only) | Teachers | 150 | 180 | 450 | $30 | $13,500 |
| Teacher Data Collection in the Spring of 2008 (2nd–5th Grade) | Teachers | 286 / 14 / 300 | 195 (treatment)⁵ / 255 (treatment)⁶ / 180 (control) | 1,889 | $30 | $56,670 |
| Teacher Data Collection in the Spring of 2009 | Teachers | 286 / 14 / 300 | 195 (treatment)⁵ / 255 (treatment)⁶ / 180 (control) | 1,889 | $30 | $56,670 |
| TOTAL | – | 1,950 | – | 6,028 | – | $194,340 |
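The Spring 2008 and Spring 2009 rows each combine three teacher groups with different time estimates; the 1,889-hour and $56,670 figures can be reproduced from the per-group counts and minutes:

```python
# Teacher groups for one collection round (2nd-5th grade): (count, minutes).
groups = [
    (286, 195),  # treatment teachers without interview data (footnote 5)
    (14, 255),   # treatment teachers with interview data (footnote 6)
    (300, 180),  # control teachers
]
hourly_rate = 30  # teacher hourly rate, dollars

total_minutes = sum(n * minutes for n, minutes in groups)
total_hours = total_minutes / 60      # 1,889 hours
cost = total_hours * hourly_rate      # $56,670
```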
| Task | Type of Respondent | Number | Time Estimate (in minutes) | Total Hours | Hourly Rate | Estimated Cost of Burden |
| Parent Data Collection in the Spring of 2007 | Parents | 7,200 | 20 | 2,400 | $20 | $48,000 |
| Parent Data Collection in the Spring of 2008 | Parents | 9,000 | 20 | 3,000 | $20 | $60,000 |
| Parent Data Collection in the Spring of 2009 | Parents | 7,200 | 20 | 2,400 | $20 | $48,000 |
| TOTAL | – | 23,400 | – | 7,800 | – | $156,000 |
| Task | Type of Respondent | Number | Time Estimate (in minutes) | Total Hours | Hourly Rate | Estimated Cost of Burden |
| Administrator Data Collection in the Spring of 2008 | School Administrators | 50 | 30 | 25 | $36 | $900 |
| Administrator Data Collection in the Spring of 2009 | School Administrators | 50 | 30 | 25 | $36 | $900 |
| TOTAL | – | 100 | – | 50 | – | $1,800 |
Respondents will mainly be students, teachers, and parents. The hourly rate for each respondent type is outlined in section A12. There are no additional respondent costs aside from those outlined in section A12.
The total cost for the study is $1,348,367 over five years. The average yearly cost is about $269,673. Most of the costs for the study are incurred in years 2007 through 2009 as data collection efforts are under way.
This request is for new information collection.
We plan to produce two technical reports in which evaluation results will be presented: (1) an interim evaluation report based on data collected during the first implementation year, and (2) a final evaluation report based on all the collected data. These reports are scheduled to be completed in November 2008 and December 2009, respectively.
No request is being made for exemption from displaying the expiration date.
This collection of information involves no exceptions to the Certification for Paperwork Reduction Act Submissions.
1 For binary outcomes, a conditional mixed-effects logistic model will be estimated.
2 For students in grades 1-4.
3 For students in grades 1-5.
4 There will be no new student data collection in the spring of 2007; for 4th and 5th grade students only.
5 Treatment teachers without interview data collected.
6 Treatment teachers with interview data collected.