Supporting Justification for OMB Clearance of an Evaluation of the Early Warning and Intervention Monitoring System Under the Regional Educational Laboratory Program (REL)
OMB Clearance Request, Part A
March 2014
Submitted to:
U.S. Department of Education
Institute of Education Sciences
555 New Jersey Ave. NW, Rm. 308
Washington, DC 20208

Submitted by:
American Institutes for Research
1000 Thomas Jefferson Street NW, Suite 200
Washington, DC 20007-3835

1120 East Diehl Road, Suite 200
Naperville, IL 60563-1486
866-730-6735
www.relmidwest.org
This publication was prepared for the Institute of Education Sciences (IES) under contract ED-IES-12-C-0004 by Regional Educational Laboratory Midwest, administered by American Institutes for Research. The content of the publication does not necessarily reflect the views or policies of IES or the U.S. Department of Education, nor does mention of trade names, commercial products, or organizations imply endorsement by the U.S. government. The publication is in the public domain. Authorization to reproduce in whole or in part for educational purposes is granted.
Contents

1. Circumstances Necessitating Collection of Information
2. How, by Whom, and for What Purpose Information Is to Be Used
3. Use of Automated, Electronic, Mechanical, or Other Technological Collection Techniques
4. Efforts to Avoid Duplication of Effort
5. Sensitivity to Burden on Small Entities
6. Consequences of Not Collecting the Information
7. Special Circumstances
8. Federal Register Announcement and Consultation
9. Payment or Gift to Respondents
10. Confidentiality of the Data
11. Additional Justification for Sensitive Questions
12. Estimates of Hourly Burden
13. Estimate of Total Annual Cost Burden to Respondents or Record-Keepers
14. Estimates of Annualized Cost to the Federal Government
15. Reasons for Program Changes or Adjustments
16. Plan for Tabulation and Publication and Schedule for Project
17. Approval Not to Display the Expiration Date for OMB Approval
18. Exception to the Certification Statement
Attachment A-1. Impact Study: Data Elements, Sources, Access, and Periodicity
Attachment A-2. Implementation Study: Data Elements, Sources, Access, and Periodicity
Attachment A-3. Education Sciences Reform Act (ESRA)
Attachment A-4. Federal Register Notices
Attachment A-5. REL Midwest’s Technical Working Group (TWG) Recommendations
Attachment A-6. Informed Consent Forms
Attachment A-7. Confidentiality Forms & Affidavits
Attachment A-8. Institutional Review Board (IRB) Approval

TABLES AND FIGURES

Tables
Table 1. Total Estimated Hourly Burden
Table 2. Total Cost to Respondents
Table 3. Schedule of Activities
Table A.1: Data elements, sources, access, and periodicity for the impact study data collection
Table A.2: Data elements, sources, access, and periodicity for the implementation study data collection

Figures
Figure 1. Early Warning Intervention Monitoring System Implementation Process
Figure 2. Early Warning and Intervention Monitoring System (EWIMS) Theory of Action
Supporting Justification for OMB Clearance of an Evaluation of the Early Warning and Intervention Monitoring System Under the Regional Educational Laboratory Program (REL)
OMB Clearance Request, Part A
The U.S. Department of Education (ED) requests clearance for the recruitment materials and data collection protocols under the OMB generic clearance agreement (OMB Number [IES to complete]) for activities related to the Regional Educational Laboratory Program (REL). ED, in consultation with American Institutes for Research (AIR), is planning a two-part evaluation of the Early Warning and Intervention Monitoring System (EWIMS), consisting of an impact study and an implementation study. OMB approval is being requested for a multimode data collection and analysis of a group of schools, students, and staff members in public schools in Ohio, Michigan, and Indiana. The impact study consists of data collection from the state education agencies (SEAs) in Ohio, Michigan, and Indiana and participating districts and schools. The implementation component consists of data collection from participating schools. Specifically, in this OMB clearance package, ED is requesting clearance for the following data collection approaches:
Recruitment materials for all participating districts and schools
Extant administrative records data collections from SEAs, districts, and schools within Ohio, Michigan, and Indiana
The transfer of data from treatment schools to the evaluation team via populated Early Warning System (EWS) tools
Pilot testing of survey and interview protocols
A Web-based survey of school leaders in treatment and control schools
An interview with one school administrator
The implementation study will include additional forms of data collection (a satisfaction survey and monthly logs of EWIMS data team meetings); however, ED is not seeking approval for these measures of implementation because they are part of the typical EWIMS intervention and present no burden to participants.
As detailed more fully in the project description that follows, this impact study (designed as a cluster randomized controlled trial) will focus on student outcomes spanning multiple domains of school success (student risk status, scores on graduation tests, persistence and progress in school, and being on track at the end of ninth grade) and will examine whether the EWIMS model has an impact on intermediate outcomes in schools, including the schools’ data culture and data-informed allocation of dropout prevention interventions. The implementation study will focus on schools’ experience with implementation, the extent to which schools faithfully implement the EWIMS model, and the interventions provided to students identified as at risk by the EWS tool.
1. Circumstances Necessitating Collection of Information

The purpose of the project is to assess the implementation and impact of EWIMS, a data tool and process for implementing a system of data-driven decision making. Developed by the National High School Center, EWIMS provides a means of systematically and reliably identifying students at risk for dropping out of high school. The proposed study is a two-year school-level randomized controlled trial (RCT) to examine the impact of implementing EWIMS on school processes and student outcomes.
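To make the school-level random assignment concrete, the sketch below shows one simple way a 36/36 assignment could be drawn. It is illustrative only: the school identifiers and seed are placeholders, and any blocking or stratification the study team may use (e.g., by state or district) is not specified in this statement.

```python
import random

# Hypothetical illustration of assigning 72 recruited schools to two equal arms.
schools = [f"school_{i:02d}" for i in range(1, 73)]  # placeholder identifiers

rng = random.Random(2014)                  # fixed seed so the draw is reproducible
treatment = set(rng.sample(schools, 36))   # 36 schools receive EWIMS immediately

assignment = {s: ("treatment" if s in treatment else "control") for s in schools}
assert sum(group == "treatment" for group in assignment.values()) == 36
```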
Impact Study
The focus of the impact study will be to assess the effectiveness of EWIMS on student and school outcomes. For that reason, the study will address the following research questions:
What is the impact of the EWIMS model on outcomes for students in schools, including:
indicators of student risk?
scores on graduation tests?
persistence and progress in school?
predicted probability of on-time graduation?
In addition, REL Midwest will conduct exploratory research analyses for both students and schools.
What is the impact of EWIMS on outcomes for subgroups of students, including:
students who receive free or reduced-price lunch (FRPL)?
students who are English language learners (ELLs)?
students with Individualized Education Plans (IEPs)?
students with baseline indicators of risk for attendance, course failures, and/or behavior?
What is the impact of EWIMS on other school outcomes, including:
data-informed allocation of dropout prevention interventions for students?
school data culture (including the context, supports for data use, working with data, and responses to data)?
The exploratory student subgroup analyses will assess, for each of the student outcomes in research questions 1a–1d, whether the impact of EWIMS differs for key subgroups of students: students who receive free or reduced-price lunch (FRPL), students who are English language learners (ELLs), students with Individualized Education Plans (IEPs), and students with initial risk (one or multiple risk factors) in the fall 2013 semester, preceding random assignment and implementation. We acknowledge that some students may be included in more than one of these analyses because they may be members of more than one subgroup.
The exploratory school-level impact questions are intended to understand how early adoption of EWIMS may cause initial changes in how schools use data to identify at-risk students and respond with interventions.
To minimize burden on participating districts and schools, this study draws heavily on extant data to address the study’s research questions outlined earlier in this statement. Student- and school-level baseline and outcome data for the impact study will be obtained from multiple sources, including the SEAs, the EWS tool, and school and district administrative data.
Implementation Study
REL Midwest also will conduct an implementation study to describe treatment schools’ experiences with adoption and early implementation of EWIMS, and the extent to which there is a difference between the intervention and business as usual in control schools. The implementation study will address the following research questions:
To what extent do treatment schools faithfully implement the EWIMS model?
To what extent does business-as-usual practice in control schools include the use of data for identifying at-risk students (treatment contrast)?
What are the specific interventions provided to students identified as “at risk” by the EWIMS model in treatment schools?
Improving current high school graduation rates is a point of focus in all three states, as detailed in Ohio’s Comprehensive Continuous Improvement Plan, Michigan’s Dropout Challenge, and Indiana’s Elementary and Secondary Education Act (ESEA) waiver request. Ohio has committed, through adoption of the Ohio Improvement Process, to promoting student success through data-driven decision making, targeted programming, ongoing student monitoring, and evaluation of improvement process effectiveness. Michigan has committed to improving its graduation rate and preventing dropout by encouraging all schools to participate in the superintendent’s Dropout Challenge, which requires participating schools to identify at least 10–15 students exhibiting multiple dropout risk factors in or near a transition year. Use of an early warning system is one strategy that schools can implement within the Dropout Challenge. Finally, as part of its ESEA waiver request, Indiana committed to raising graduation rates to at least 90 percent across the state.
The graduation rates in all three states show room for improvement, particularly among key subgroups of students. The graduation rate in Ohio is 79 percent, and this rate is lower for students who are migrant (39 percent); black, non-Hispanic (56 percent); limited English proficient (57 percent); Hispanic (60 percent); American Indian or Alaskan Native (63 percent); economically disadvantaged (63 percent); or receiving special education services (65 percent). Similarly, the overall graduation rate in Michigan is 76 percent but is lower for students who are migrant (68 percent); black, non-Hispanic (60 percent); limited English proficient (63 percent); Hispanic (64 percent); American Indian or Alaskan Native (66 percent); or economically disadvantaged (64 percent); and for students who have disabilities (54 percent). Finally, Indiana’s overall graduation rate is somewhat higher (89 percent) than Ohio’s and Michigan’s but is lower for students who are black, non-Hispanic (78 percent); limited English proficient (81 percent); Hispanic (84 percent); American Indian (82 percent); economically disadvantaged (86 percent); or receiving special education services (73 percent).
The efforts of these states to improve graduation rates by identifying and supporting students at risk of dropping out are supported by substantial research on dropout prevention. A Practice Guide on dropout prevention, developed by the U.S. Department of Education’s Institute of Education Sciences (Dynarski et al., 2008), provides six recommendations based on an expert panel review of relevant research. The guide recommends, as Recommendation 1, using data systems as a diagnostic tool to understand dropout trends and identify individual students at risk of dropping out. As a result, districts and schools in these three states are increasingly interested in using an early warning system to identify students who are off track for graduation as early as possible.
Informed by research on the academic and behavioral predictors of dropping out (Allensworth & Easton, 2005, 2007; Balfanz, Herzog, & Mac Iver, 2007; Neild & Balfanz, 2007; Silver, Saunders, & Zarate, 2008), early warning systems are a promising approach—or may even be a necessary prerequisite—to effective dropout prevention (Dynarski et al., 2008). The intent of an early warning system is to systematically use data to identify students who are at risk of dropping out of high school; identified students can then be matched with interventions to help them get on track for graduation (Heppen & Therriault, 2008; Jerald, 2006; Kennelly & Monrad, 2007; Neild, Balfanz, & Herzog, 2007; Pinkus, 2008). Furthermore, a robust early warning system can be used to monitor student progress in these interventions (O’Cummings, Heppen, Therriault, Johnson, & Fryer, 2010; O’Cummings, Therriault, Heppen, Yerhot, & Hauenstein, 2011).
Despite the strong foundational research on the use of early indicators to identify students who are at risk of not graduating and the increasingly widespread implementation of early warning systems by states, districts, and schools, to date there have been no rigorous studies testing the impact of these systems on student outcomes such as staying in school, progressing in school, and graduating. There also is very little information on the impact of adopting an early warning system on school-level processes, such as how schools allocate their limited resources to prevent dropout and how early warning systems may affect school data culture—contextual factors (e.g., the assessment and instructional context), supports for data use (e.g., professional development or structured time to review data), working with data (e.g., frequency and depth of data use), and responses to data (e.g., assignment of interventions to students). Although EWIMS is in use in 67 districts across six states and the National High School Center’s free early warning system tool has been downloaded more than 20,000 times, the developers themselves acknowledge that information about the effectiveness of implementation for improving student outcomes is lacking. The REL Midwest Dropout Prevention Research Alliance has therefore requested an evaluation of the EWIMS model in order to obtain rigorous evidence of effectiveness. The districts represented in the alliance and other districts and schools in the region and around the country will be able to use the results to inform their own decisions about the effects of implementing an early warning system on student and school outcomes.
EWIMS includes (1) an early warning data tool that flags students as at risk on the basis of attendance, course performance, and behavior indicators and (2) an implementation process. The tool enables users to identify students who are at risk of dropping out of school, to record assignment to available interventions, and to monitor students’ response to those interventions. Beyond the development of the data tools, the National High School Center has devised a seven-step EWIMS implementation process to support implementation, as shown in Figure 1. The process guides users to make informed decisions about how to use data to support at-risk students and how to continue to monitor their progress over time. In addition to focusing on individual students, the process guides users to examine the success of specific supports or interventions and to examine possible systemic issues (e.g., school climate) that may relate to dropout trends. Treatment schools will begin receiving technical assistance on the EWIMS process and EWS tool in March 2014, before we begin collecting implementation data in May 2014.
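The indicator logic at the heart of such a tool can be illustrated with a short sketch. The attendance and course-performance thresholds below mirror the indicator examples cited in this statement (missing 10 percent or more of instructional time; one or more course Fs); the behavior threshold is a placeholder, since behavior indicators are locally validated, and the actual EWS tool’s rules are more detailed than this.

```python
from dataclasses import dataclass

@dataclass
class StudentRecord:
    attendance_rate: float   # share of instructional time attended
    course_fs: int           # failing course grades this grading period
    behavior_incidents: int  # locally defined incident count

def risk_flags(s: StudentRecord) -> list[str]:
    """Return the indicators on which a student is flagged as at risk."""
    flags = []
    if s.attendance_rate < 0.90:   # missed 10 percent or more of instructional time
        flags.append("attendance")
    if s.course_fs >= 1:           # one or more course Fs
        flags.append("course performance")
    if s.behavior_incidents >= 1:  # placeholder: behavior cut points are locally validated
        flags.append("behavior")
    return flags

print(risk_flags(StudentRecord(attendance_rate=0.85, course_fs=0, behavior_incidents=2)))
# ['attendance', 'behavior']
```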
Figure 1. Early Warning Intervention Monitoring System Implementation Process
As shown in Figure 1, the steps are intended to be cyclical. At the core of this data-driven decision-making process, the steps focus users on key indicators that identify which students are showing signs of risk of dropping out of high school and guide users to draw on the indicator data and other relevant information to connect at-risk students to dropout prevention or academic support interventions. Ideally, the EWIMS model allows users to identify students accurately and provide supports to students through interventions, resulting in improved graduation outcomes for students.
Figure 2, the study’s Theory of Action, describes how implementation of the EWIMS intervention at the school level is theoretically linked with the primary outcomes of interest (change in students’ attendance, behavior, and academic achievement).
Figure 2. Early Warning and Intervention Monitoring System (EWIMS) Theory of Action
The Theory of Action flows from left to right, with adoption of EWIMS at the upper left and the ultimate goal of improvement in student outcomes at the bottom right. The theory posits intermediate outcomes at both the school and the student levels: in general, implementation of EWIMS changes the ways that schools allocate dropout prevention resources, which in turn has a general effect on school data culture. The data-driven allocation of dropout prevention resources also changes the likelihood that students in need of additional support participate in school and student interventions. This, in turn, leads to improvements in student outcomes related to academic progress and performance.
The top path in the Theory of Action reflects the school-level changes that hypothetically occur during schools’ full adoption of the EWIMS process. Prior to adoption, in many schools, dropout prevention resources and interventions are available but not systematically applied, and their use is not well coordinated. The first step in school-level EWIMS implementation is to adopt the seven-step process. As schools implement the seven-step EWIMS process, school-based teams systematically allocate the limited dropout prevention programs and interventions to those students who will most likely benefit, on the basis of the data available in the tool rather than on intuition. This step is captured in Figure 2 in the box labeled “data-informed allocation of dropout prevention programs and interventions.” As a result of the new data-based allocation of resources, schools then are able to realize positive results for individual students and groups of students. Thus, staff gain confidence in data-based decision-making processes that are subsequently applied in other areas (e.g., identification of students for acceleration and enrichment) and at other levels (e.g., individual teachers using data to inform lesson planning or administrators using it to guide school improvement efforts). This increase in confidence and the application of data-driven decision making leads to an improvement in school data culture (represented in the third box in the top row of the Theory of Action).
We also believe that school-level processes may be related to student-level processes, as depicted by the arrows connecting the blue school-level boxes with the green student-level boxes. The two school-level processes of (1) data-informed allocation of dropout prevention programs and interventions and (2) improved school data culture are connected by a double-sided arrow, suggesting that as schools engage in more data-informed allocation of dropout prevention programming, they may also develop an improved school data culture, and vice versa. Therefore, the relationship between school data culture and improved student outcomes is captured through this reciprocal relationship, which involves the data-driven allocation of personalized supports, including specific interventions and progress monitoring at both the school and student levels.
Although the most proximal outcomes of the EWIMS model may well be how schools conduct the process of allocating their limited resources for dropout prevention and student supports, and the degree to which this process becomes more systematic and routine, the ultimate goal of any early warning system is to have an impact on students’ outcomes. Schools are encouraged to use the EWIMS model to evaluate the effectiveness of student micro-interventions for different subgroups of students and to allocate their limited resources accordingly in future years, but the main goal of any early warning system is still to improve student outcomes and ultimately prevent students from dropping out of school. Therefore, student-level changes, reflected in the bottom path of the Theory of Action graphic, occur in concert with the school-level changes and ultimately lead to improvements in student outcomes. The EWIMS Theory of Action explicitly acknowledges that the model is a school-level macro intervention, through which schools can use data to efficiently allocate dropout prevention and academic support interventions to targeted students. Importantly, the process does not prescribe specific interventions to students but rather relies on local decision making to match students with interventions. Thus, the overall impact on student outcomes is driven at least in part by the effectiveness of the specific interventions assigned to students and the appropriateness of the fit between student needs and assigned interventions.
Primary data collection unique to this study will include a Web-based survey on data-informed allocation of dropout prevention interventions for students, school data culture, and the treatment contrast (the presence or absence of an early warning system) administered to school leaders in treatment and control schools.
See Attachment A-1 for details on each data element collected, by whom, and when the data element will be collected for the impact study and Attachment A-2 for details on each element for the implementation study. REL Midwest is not seeking approval for measures of implementation that are part of the typical EWIMS intervention. These include the satisfaction survey and the interview protocols typically conducted with EWIMS data teams in treatment schools.
This data collection is authorized by the Education Sciences Reform Act (ESRA) of 2002. Please see Attachment A-3 for the ESRA.
2. How, by Whom, and for What Purpose Information Is to Be Used

The REL Midwest Dropout Prevention Research Alliance is the primary audience for this project. The data will be used by the REL Midwest Dropout Prevention Research Alliance for assessing the impact of the EWIMS model on school and student outcomes related to risk and dropout prevention. The alliance has requested an evaluation of the EWIMS model to obtain rigorous evidence on effectiveness. This project builds upon initial work currently being conducted to validate indicators of student risk of failure to graduate in Ohio districts. Although an increasing amount of attention has focused on using data to identify students as at risk as early as possible, no rigorous evidence exists about the overall effects of establishing an early warning system on student outcomes. This project is designed to specifically address an alliance need for information about impact and implementation of early warning systems, which reflects the concern of states, districts, and schools around the country about high school dropout. Aggregate results will be provided in a report to the Dropout Prevention Research Alliance to assist in their efforts in program improvement. In addition, the districts represented in the alliance and other districts and schools in the region and around the country will be able to use the results to inform their own decisions about implementing early warning systems.
Other educational stakeholders also will benefit: those in Ohio, Michigan, and Indiana; those in the four other states in the REL Midwest region (Illinois, Iowa, Minnesota, and Wisconsin); and the education research community broadly.
3. Use of Automated, Electronic, Mechanical, or Other Technological Collection Techniques

ED’s contractor plans to make use of technology for both recruitment activities and study data collection.
The data collection plan reflects sensitivity to issues of efficiency, accuracy, and respondent burden. During recruitment, the district and school screening interviews will be conducted by telephone. For purposes of gathering the information needed to determine district eligibility for the study, telephone interviews have many advantages over mail surveys. First, a telephone interview is less burdensome for respondents, who can provide oral answers. Consequently, a telephone interview is likely to yield a better response rate than a paper survey. Second, telephone interviews can generate responses within minutes once the interviewer reaches the respondent, which helps to maximize the efficiency of our district screening and recruitment process. Third, the interviewer can immediately probe for further information to clarify ambiguous or conditional responses.
ED’s contractor will use a file transfer protocol (FTP) to collect extant student records data from the states, districts, and schools rather than requesting these data in printed form. The data will be transmitted electronically through secure FTP sites established by AIR for use in partnership with the SEAs, as well as participating districts and schools when necessary. ED’s contractor will provide clear instructions on the data requested and the methods for transmitting the data securely.
ED’s contractor also will use technology for the school surveys, which will be administered online using a Web platform to measure the treatment contrast, data-informed use of dropout prevention interventions for students, and school data culture. The Web-based platform is preferable because it typically places a lower burden on respondents than a paper-and-pencil survey does.
4. Efforts to Avoid Duplication of Effort

To avoid duplication of effort, this study will maximize the use of existing administrative records (e.g., student attendance data, standardized test scores, grades) to understand the impact of EWIMS. The only data collected that will be unique to this study are the Web-based school survey and qualitative implementation data.
5. Sensitivity to Burden on Small Entities

ED’s contractor will collect extant administrative data from the state and district to reduce the burden on small entities (schools), relying on school-provided data only when the data are not available from the state or district. In addition, the use of administrative records will reduce the burden on schools by ensuring that only the minimum amount of extant and original data needed to meet the objectives of this study is requested from schools.
6. Consequences of Not Collecting the Information

The Education Sciences Reform Act of 2002 states that the central mission and primary function of the regional educational laboratories is to support applied research and provide technical assistance to state and local education agencies within their region (ESRA, Part D, section 174[f]). If the proposed data collections for the EWIMS impact study were not conducted, REL Midwest would not be fulfilling its central mission to serve the states in the region and provide support for evidence-based research. Additionally, the REL Midwest Dropout Prevention Research Alliance, districts represented in the alliance, and other districts and schools in the region and country would not have rigorous evidence of the impact of EWIMS.
This is a one-time study (i.e., not recurring) and therefore periodicity is not addressed.
7. Special Circumstances

There are no special circumstances.
8. Federal Register Announcement and Consultation

A 60-day notice was published in the Federal Register, providing an opportunity for public comments. To date, there are no public comments to be addressed from the 60-day period. A 30-day notice will be published to further solicit comments. ED will respond to both public and OMB questions, if any, and summarize the responses under 8a. A placeholder (Attachment A-4) for the notices is attached.
The following individuals were consulted on the statistical, data collection, and analytic aspects of the EWIMS evaluation study via REL Midwest’s Technical Working Group (TWG). Major recommendations from the TWG are included in Attachment A-5.
Margaret Burchinal, Ph.D.
Senior Scientist, Frank Porter Graham Child Development Inst.
Research Professor, Department of Psychology
Adjunct Professor, Department of Biostatistics
University of North Carolina
515 Oakcrest Drive, Chapel Hill, NC 27516-9638
Ph: 919-966-5059; Fax: 919-962-5771
E-mail: burchinal@unc.edu

Thomas Cook, Ph.D.
Joan and Sarepta Harrison Chair in Ethics and Justice
Professor of Sociology, Psychology, Education and Social Policy
Faculty Fellow, Institute for Policy Research
Northwestern University
2040 Sheridan Road, Evanston, IL 60208
Ph: 847-491-3776
E-mail: t-cook@northwestern.edu

Sara Goldrick-Rab, Ph.D.
Associate Professor of Educational Policy Studies and Sociology
Senior Scholar, Wisconsin Center for the Advancement of Postsecondary Education
University of Wisconsin
211 Education Bldg., 1000 Bascom Mall, Madison, WI 53706
Ph: 608-265-2141
E-mail: srab@education.wisc.edu

Larry Hedges, Ph.D.
Board of Trustees Professor of Statistics and Social Policy
Faculty Fellow, Institute for Policy Research
Northwestern University
2006 Sheridan Road, EV 4070, Evanston, IL 60208
Ph: 847-491-8899
E-mail: l-hedges@northwestern.edu

James J. Kemple, Ed.D.
Executive Director, Research Alliance for New York City Schools
Research Professor, Steinhardt School of Culture, Education, and Human Development
New York University
726 Broadway, 756, New York, NY 10003
Ph: 212-998-5463; Fax: 212-995-4049
E-mail: james.kemple@nyu.edu

Brian Rowan, Ph.D.
Burke A. Hinsdale Collegiate Professor, School of Education
Research Professor, Institute for Social Research
University of Michigan
610 E. University Ave., Room 4112, Ann Arbor, MI 48109-1259
Ph: 734-615-0286
E-mail: brow@umich.edu

Barbara Schneider, Ph.D.
John A. Hannah Chair and University Distinguished Professor
College of Education and Department of Sociology
Michigan State University
Erickson Hall, 620 Farm Lane, Room 516, East Lansing, MI 48824
Ph: 517-432-0188
E-mail: bschneid@msu.edu
ED’s contractor has also consulted with the REL Midwest Dropout Prevention Alliance members to gather feedback on the design and measures to be used in the study:
Teresa Brown, Assistant Superintendent at Indiana Department of Education (317-232-0524)
Leisa Gallagher, Director of the Reaching & Teaching Struggling Learners Initiative at the Michigan Department of Education (517- 908-3921)
Jeremy Herr, Principal, McComb High School, McComb Local Schools (419-293-3286)
Laurie Kruszynski, Data Coordinator, Scott High School, Toledo Public Schools (419-671-4000)
Cherie Mourlam, Assistant Superintendent, Washington Local Schools (419-473-8222)
Mike O’Shea, Springfield High School (419-867-5633)
Melissa Ramirez, Assistant Principal, Findlay City Schools (419-425-8257)
Jay Wollenburg, Principal, Ohio Virtual Academy (866-339-9071)
Sue Zake, Executive Director, Ohio State Department of Education (419-720-8999)
None.
9. Payment or Gift to Respondents

An incentive will be provided to each school assigned to the control group (receipt of the intervention after the study period). The detailed rationale is as follows: ED’s contractor plans to recruit 72 schools, half (36) of which will be assigned to a treatment group to receive EWIMS during the 2013–14 school year, and half (36) of which will be assigned to a control group that will have a 20-month delay in receiving EWIMS. Because schools assigned to the control group may be reluctant to wait 20 months to receive the EWIMS macro intervention, ED’s contractor proposes incentives that include support for post-study implementation costs.
School administrators who participate in the annual Web-based survey also will receive a $30 stipend in the form of a gift card, which is aligned with Analytical and Technical Support for Advancing Education Evaluation: How to Put Together an OMB Supporting Statement, Appendix E (Sloan, Ingels, & Burghardt, 2012). This $30 incentive applies to responses on one survey that measures three key constructs, two of which are outcomes for the impact study (data-informed use of dropout prevention programs and interventions and school data culture) and one of which is for the implementation study (treatment contrast). Respondents have the opportunity to receive two $30 gift cards and incentives will be distributed after respondents complete the surveys—once in May 2014 and a second time in May 2015.
Interview participants also will receive a $30 stipend in the form of a gift card, which is likewise aligned with Analytical and Technical Support for Advancing Education Evaluation: How to Put Together an OMB Supporting Statement, Appendix E (Sloan, Ingels, & Burghardt, 2012). This $30 incentive applies to the annual interview with an EWIMS team member at each of the 36 treatment schools in May 2014 and May 2015. Gift cards will be given to participants after each interview.
Prior to the administration of the Web-based survey and interviews, the protocols will be piloted with five school leaders who are currently implementing EWIMS in their schools. ED’s contractor will recruit these five participants from the network of schools currently implementing EWIMS.
10. Confidentiality of the Data

No confidential data will be sought during the recruitment phase of the study.
The following statement applies to procedures to take place during the data collection phase of the study:
A consistent and cautious approach will be taken to protect all information collected during the data collection phase of the study, in accordance with all relevant regulations and requirements. REL Midwest will follow the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, which requires that “all collection, maintenance, use, and wide dissemination of data by the Institute” “conform with the requirements of section 552 of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provision Act (20 U.S.C. 1232g, 1232h).” These citations refer to the Privacy Act, the Family Educational Rights and Privacy Act, and the Protection of Pupil Rights Amendment.
In addition, for student information, “The Director shall ensure that all individually identifiable information about students, their academic achievements, their families, and information with respect to individual schools, shall remain confidential in accordance with section 552a of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provision Act.”
Subsection (c) of section 183 referenced above requires the director of IES to “develop and enforce standards designed to protect the confidentiality of persons in the collection, reporting, and publication of data.”
Subsection (d) of section 183 prohibits disclosure of individually identifiable information and makes the publishing or communicating of such information by employees or staff a felony.
REL Midwest will protect the confidentiality of all information collected for the study and will use it for research purposes only. No information that identifies any study participant will be released. Information from participating institutions and respondents will be presented at aggregate levels in reports. Information on respondents will be linked to their institution but not to any individually identifiable information. No individually identifiable information will be maintained by the study team.
All members of the study team have obtained certification in the protection of human subjects in research, and REL Midwest staff also will obtain federal security clearances. All institution-level identifiable information will be kept in secured locations, and identifiers will be destroyed as soon as they are no longer required. The following safeguards are routinely employed by AIR to carry out privacy assurances during the study:
All AIR employees sign a confidentiality pledge emphasizing its importance and describing their obligations under it (please see Attachment A-7 for the confidentiality pledge).
Identifying information is maintained on separate forms and files, which are linked only by sample identification number.
Access to printed documents is strictly limited. Documents are stored in locked files and cabinets. Discarded materials are shredded.
Computer data files are protected with passwords and access is limited to specific users.
Especially sensitive data are maintained on removable storage devices that are kept physically secure when not in use.
Also, the REL study team will submit to the NCEE security officer a list of the names of all people who will have access to respondents and data. The contractor, on behalf of ED, will track new staff and staff who have left the study and ensure that signatures will be obtained or clearances revoked, as necessary.
The Privacy Act of 1974 applies to this data collection. AIR will make certain that all data are held in strict confidentiality, as just described, and that in no instance will responses or data be made available except in aggregate statistical form. The following statement will appear on all data collection letters to respondents:
Per the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific district or individual. The contractor will not provide information that identifies you or your district to anyone outside the study team, except as required by law. Any willful disclosure of such information for nonstatistical purposes, without the informed consent of the respondent, is a class E felony.
Confidentiality pledges and affidavits of non-disclosure are included in Attachment A-7.
11. Additional Justification for Sensitive Questions

No questions of a sensitive nature will be included in the recruitment screening protocols or other data collection instruments. The project has been approved by AIR’s Institutional Review Board (IRB00000436), which has conducted expedited and full-board reviews of research involving human subjects for more than 21 years. AIR is registered with the Office for Human Research Protections (OHRP) as a research institution (IORG0000260) and conducts research under its own Federalwide Assurance (FWA00003952); please see Attachment A-8.
12. Estimates of Hourly Burden

There are three components for which ED’s contractor has calculated hours of burden for this clearance package: recruitment activities (in districts and schools), data collection activities for the impact study, and data collection activities for the implementation study. Table 1 shows the hourly burden overall and for each component. The total burden associated with this study, across three study years, is 5,313 hours: 864 hours for recruitment, 3,978 hours for impact study data collection, and 472 hours for the implementation study.
Table 1. Total Estimated Hourly Burden

| Instrument | Person Incurring Burden | State | Number of Respondents | Responses per Respondent | Total Responses | Hours per Response | Total Burden Hours |
|---|---|---|---|---|---|---|---|
| Recruitment | | | | | | | |
| District first contact (e‑mail or telephone) | District Administrator | IN | 68 | 1 | 68 | 0.05 | 3.42 |
| | | MI | 114 | 1 | 114 | 0.05 | 5.70 |
| | | OH | 150 | 1 | 150 | 0.05 | 7.50 |
| Follow-up for nonresponders (districts) | District Administrator | IN | 27 | 1 | 27 | 0.167 | 4.57 |
| | | MI | 46 | 1 | 46 | 0.167 | 7.62 |
| | | OH | 60 | 1 | 60 | 0.167 | 10.02 |
| District screening | District Administrator | IN | 58 | 1 | 58 | 0.5 | 29.07 |
| | | MI | 97 | 1 | 97 | 0.5 | 48.45 |
| | | OH | 128 | 1 | 128 | 0.5 | 63.75 |
| First school contact (e‑mail or telephone) | School Administrator | IN | 85 | 1 | 85 | 0.167 | 14.23 |
| | | MI | 135 | 1 | 135 | 0.167 | 22.55 |
| | | OH | 193 | 1 | 193 | 0.167 | 32.16 |
| Follow-up for nonresponders (schools) | School Administrator | IN | 34 | 1 | 34 | 0.167 | 5.69 |
| | | MI | 54 | 1 | 54 | 0.167 | 9.02 |
| | | OH | 77 | 1 | 77 | 0.167 | 12.87 |
| School screening and interview | School Administrator | IN | 57 | 1 | 57 | 1 | 56.80 |
| | | MI | 90 | 1 | 90 | 1 | 90.00 |
| | | OH | 128 | 1 | 128 | 1 | 128.40 |
| School visit (face-to-face or virtual) | School Administrator | IN | 27 | 1 | 27 | 2 | 54.00 |
| | | MI | 36 | 1 | 36 | 2 | 72.00 |
| | | OH | 27 | 1 | 27 | 2 | 54.00 |
| Negotiating final agreements (district MOUs) | District Administrator | IN | 20 | 1 | 20 | 1 | 20.00 |
| | | MI | 25 | 1 | 25 | 1 | 25.00 |
| | | OH | 15 | 1 | 15 | 1 | 15.00 |
| Negotiating final agreements (school MOUs) | School Administrator | IN | 20 | 1 | 20 | 1 | 20.00 |
| | | MI | 26 | 1 | 26 | 1 | 26.00 |
| | | OH | 26 | 1 | 26 | 1 | 26.00 |
| Subtotal | --- | | 1,823 | --- | 1,823 | 18.153 | 863.81 |
| Impact Study | | | | | | | |
| Student and school baseline data (from the state) | State data manager | IN | 1 | 1 | 1 | 16 | 16 |
| | | MI | 1 | 1 | 1 | 16 | 16 |
| | | OH | 1 | 1 | 1 | 16 | 16 |
| Student administrative data (from the state) | State data manager | IN | 1 | 3 | 3 | 10 | 30 |
| | | MI | 1 | 3 | 3 | 10 | 30 |
| | | OH | 1 | 3 | 3 | 10 | 30 |
| Student baseline data (from the district) | District data manager | IN | 20 | 1 | 20 | 16 | 320 |
| | | MI | 25 | 1 | 25 | 16 | 400 |
| | | OH | 15 | 1 | 15 | 16 | 240 |
| Student administrative data (from the district) | District data manager | IN | 20 | 3 | 60 | 16 | 960 |
| | | MI | 25 | 3 | 75 | 16 | 1,200 |
| | | OH | 15 | 3 | 45 | 16 | 720 |
| Subtotal | --- | | 126 | --- | 252 | 174 | 3,978 |
| Implementation Study | | | | | | | |
| Pilot testing of the survey | EWIMS users | IN | 3 | 1 | 3 | 1 | 3 |
| | | MI | 3 | 1 | 3 | 1 | 3 |
| | | OH | 3 | 1 | 3 | 1 | 3 |
| Pilot testing of the interview | EWIMS users | IN | 3 | 1 | 3 | 1 | 3 |
| | | MI | 3 | 1 | 3 | 1 | 3 |
| | | OH | 3 | 1 | 3 | 1 | 3 |
| Transferring EWS tool from schools to research team | School administrator | IN | 10 | 7 | 70 | 0.5 | 35 |
| | | MI | 16 | 7 | 112 | 0.5 | 56 |
| | | OH | 10 | 7 | 70 | 0.5 | 35 |
| Web-based survey for school-level administrators | School administrator | IN | 19 | 2 | 38 | 2 | 76 |
| | | MI | 30 | 2 | 61 | 2 | 122 |
| | | OH | 19 | 2 | 38 | 2 | 76 |
| Interview with school administrator | School administrator | IN | 10 | 1 | 10 | 1.5 | 15 |
| | | MI | 16 | 1 | 16 | 1.5 | 24 |
| | | OH | 10 | 1 | 10 | 1.5 | 15 |
| Subtotal | --- | --- | 158.4 | --- | 442.8 | 18 | 471.6 |
| Totals | --- | --- | 2,107 | --- | 2,518 | 210.15 | 5,313.41 |
Our recruitment strategy will focus on both districts and schools. ED’s contractor will conduct recruitment both in a top–down approach (district then school) and in a bottom–up approach (school then district). Information gathered from schools will inform district recruitment, and information gathered from districts will inform school recruitment. Our burden table is a conservative estimate of the highest potential burden under this recruitment approach; ED’s contractor anticipates that fewer districts and schools will actually be involved in recruitment activities (for instance, if ED’s contractor screens Cleveland Metropolitan School District and it is determined to be ineligible, ED’s contractor would not recruit the 22 high schools in that district).
Districts will be contacted via e‑mail; the estimated burden is 3 minutes to read and respond to the e‑mail (0.05 hour). The target sample size for initial contact with districts is 554 districts with an estimated response rate of 60 percent (332 districts). District respondents will then be contacted to complete a 30-minute district screener, with a target response rate of 85 percent (283 districts).
The study team also will reach out to the pool of eligible high schools, approximately 688 schools, with an estimated response rate of 60 percent (413 schools). Schools will be contacted via e‑mail, with an estimated burden of 10 minutes to read and respond to the e‑mail (0.167 hours). The study team will contact nonrespondent schools a second time via telephone or e‑mail (another 10-minute, 0.167-hour burden). School respondents then will be contacted to complete a 60-minute school screener, with a target response rate of 80 percent. Next, up to 100 schools will be selected for a site visit, with a target response rate of 90 percent; the site visit is anticipated to take approximately 2 hours. Finally, the study team will negotiate final agreements, including district and school memoranda of understanding, with 72 schools. ED’s contractor estimates that there will be 60 districts encompassing the 72 participating high schools.
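To make the recruitment arithmetic concrete, the sketch below recomputes the expected respondent funnel from the contact counts and response rates stated above; per-state splits in Table 1 may round slightly differently.

```python
# Recruitment funnel arithmetic from the stated response-rate assumptions.
districts_contacted = 554
districts_responding = districts_contacted * 0.60   # ~332 districts respond to e-mail
districts_screened = districts_responding * 0.85    # ~283 complete the 30-minute screener

schools_contacted = 688
schools_responding = schools_contacted * 0.60       # ~413 schools respond

# Initial district e-mail burden: 3 minutes (0.05 hour) per responding district.
district_first_contact_hours = districts_responding * 0.05

print(round(districts_responding), round(districts_screened),
      round(schools_responding), round(district_first_contact_hours, 2))
# 332 283 413 16.62  -> matches the Table 1 figures for these stages
```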
The total estimated hourly burden for the data collection for the impact of EWIMS is 3,978 hours. To reduce the data collection burden, the study team has identified the organization (i.e., state education agency, district, or school) that can most efficiently provide us with the data required for analysis.
From each SEA, ED’s contractor will collect baseline data in year 2 and student administrative data in years 2 and 3 (138 total hours). Additional student administrative data that are not available from the SEA, such as student grades, will be collected from districts with participating high schools in years 2 and 3 (3,840 total hours).
The total estimated hourly burden for the data collection for the implementation study is 472 hours. The study team will collect copies of the EWS tools used by the 36 treatment schools four times in year 2 and three times in year 3 (126 hours). The study team also will administer a Web-based survey to a school administrator in each of the participating high schools (both treatment and control schools) in years 2 and 3 (274 hours), with a target response rate of 95 percent. Last, the study team will conduct interviews with one member of the EWIMS team at each treatment school (54 hours). Pilot testing of the survey and interview protocols accounts for the remaining 18 hours.
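As a cross-check, the burden components can be re-added directly from the Table 1 subtotals; the sketch below does only that arithmetic, with small differences from the table totals due to rounding.

```python
# Re-add the burden components reported in Table 1 (hours).
impact_state = 3 * (16 + 30)          # three SEAs: baseline (16 h) + administrative (3 x 10 h)
impact_district = (320 + 400 + 240) + (960 + 1200 + 720)  # baseline + administrative
impact_total = impact_state + impact_district              # 138 + 3,840 = 3,978

implementation_total = 18 + 126 + 274 + 54  # pilots, EWS tool transfers, surveys, interviews
recruitment_total = 864                      # rounded subtotal from Table 1

grand_total = recruitment_total + impact_total + implementation_total
print(grand_total)  # 5,314 with rounded subtotals; Table 1 reports 5,313.41 unrounded
```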
The total cost to respondents for the three components of this study—recruitment activities (in districts and schools), data collection activities for the impact study, and data collection activities for the implementation study—is provided in Table 2.
The total respondent cost associated with this study is approximately $227,928. The annualized cost for each year of the four-year study is $56,982. The recruitment cost is $32,412, the respondent cost for the data collection for the impact study is $179,010, and the respondent cost associated with the implementation study is $16,506.
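Respondent costs are simply burden hours multiplied by the hourly wage rates shown in Table 2, summed and annualized over the four-year study. A minimal sketch of that arithmetic, using one Table 2 row and the component subtotals:

```python
# One row of Table 2: Ohio school screening and interview (128.40 hours at $35/hour).
oh_school_screening_cost = 128.40 * 35              # $4,494, matching Table 2

# Component subtotals from Table 2, summed and annualized over four years.
total_cost = 32_412 + 179_010 + 16_506              # recruitment + impact + implementation
annualized_cost = total_cost / 4

print(round(oh_school_screening_cost), total_cost, round(annualized_cost))
# 4494 227928 56982
```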
Table 2. Total Cost to Respondents

| Instrument | Person Incurring Burden | State | Total Burden Hours | Hourly Wage Rate | Total Respondent Cost |
|---|---|---|---|---|---|
| Recruitment | | | | | |
| District first contact (e‑mail or telephone) | District Administrator | IN | 3 | $45 | $154 |
| | | MI | 6 | $45 | $257 |
| | | OH | 8 | $45 | $338 |
| Follow-up for nonresponders (districts) | District Administrator | IN | 5 | $35 | $160 |
| | | MI | 8 | $35 | $267 |
| | | OH | 10 | $35 | $351 |
| District screening | District Administrator | IN | 29 | $45 | $1,308 |
| | | MI | 48 | $45 | $2,180 |
| | | OH | 64 | $45 | $2,869 |
| First school contact (e‑mail or telephone) | School Administrator | IN | 14 | $35 | $498 |
| | | MI | 23 | $35 | $789 |
| | | OH | 32 | $35 | $1,126 |
| Follow-up for nonresponders (schools) | School Administrator | IN | 6 | $35 | $199 |
| | | MI | 9 | $35 | $316 |
| | | OH | 13 | $35 | $450 |
| School screening and interview | School Administrator | IN | 57 | $35 | $1,988 |
| | | MI | 90 | $35 | $3,150 |
| | | OH | 128 | $35 | $4,494 |
| School visit (face-to-face or virtual) | School Administrator | IN | 54 | $35 | $1,890 |
| | | MI | 72 | $35 | $2,520 |
| | | OH | 54 | $35 | $1,890 |
| Negotiating final agreements (district MOUs) | District Administrator | IN | 20 | $45 | $900 |
| | | MI | 25 | $45 | $1,125 |
| | | OH | 15 | $45 | $675 |
| Negotiating final agreements (school MOUs) | School Administrator | IN | 20 | $35 | $700 |
| | | MI | 26 | $35 | $910 |
| | | OH | 26 | $35 | $910 |
| Subtotal | --- | | 864 | --- | $32,412 |
| Impact Study | | | | | |
| Student and school baseline data (from the state) | State data manager | IN | 16 | $45 | $720 |
| | | MI | 16 | $45 | $720 |
| | | OH | 16 | $45 | $720 |
| Student administrative data (from the state) | State data manager | IN | 30 | $45 | $1,350 |
| | | MI | 30 | $45 | $1,350 |
| | | OH | 30 | $45 | $1,350 |
| Student baseline data (from the district) | District data manager | IN | 320 | $45 | $14,400 |
| | | MI | 400 | $45 | $18,000 |
| | | OH | 240 | $45 | $10,800 |
| Student administrative data (from the district) | District data manager | IN | 960 | $45 | $43,200 |
| | | MI | 1,200 | $45 | $54,000 |
| | | OH | 720 | $45 | $32,400 |
| Subtotal | --- | | 3,978 | --- | $179,010 |
| Implementation Study | | | | | |
| Pilot testing of the survey | EWIMS users | IN | 3 | $35 | $105 |
| | | MI | 3 | $35 | $105 |
| | | OH | 3 | $35 | $105 |
| Pilot testing of the interview | EWIMS users | IN | 3 | $35 | $105 |
| | | MI | 3 | $35 | $105 |
| | | OH | 3 | $35 | $105 |
| Transferring EWS tool from schools to research team | School administrator | IN | 35 | $35 | $1,225 |
| | | MI | 56 | $35 | $1,960 |
| | | OH | 35 | $35 | $1,225 |
| Web-based survey for school-level administrators | School administrator | IN | 76 | $35 | $2,660 |
| | | MI | 122 | $35 | $4,256 |
| | | OH | 76 | $35 | $2,660 |
| Interview with school administrator | School administrator | IN | 15 | $35 | $525 |
| | | MI | 24 | $35 | $840 |
| | | OH | 15 | $35 | $525 |
| Subtotal | --- | --- | 472 | --- | $16,506 |
| Totals | --- | --- | 5,313 | --- | $227,928 |
In summary, for a 3-year clearance, the estimated burden is 1,771 hours annually (5,313 total hours ÷ 3 years), with approximately 839 responses per year (2,518 total responses ÷ 3 years).
13. Estimate of Total Annual Cost Burden to Respondents or Record-Keepers

There are no start-up costs for this collection.
14. Estimates of Annualized Cost to the Federal Government

The total cost to the federal government for work conducted over all four years is $3,756,351, and the estimated annualized cost to the federal government for each year of the study is $939,088.
15. Reasons for Program Changes or Adjustments

This is a new study.
16. Plan for Tabulation and Publication and Schedule for Project

After the study report is finalized, ED’s contractor will prepare restricted-use data files in accordance with NCES standards. These files will contain all the data collected for the study with all personal identifiers removed. Thorough documentation will be provided for each data file, including a detailed codebook and explanations of the unit of observation, weights, and methods for handling missing data. These data will become IES restricted-use data sets requiring a user’s license that is applied for through the same process as NCES restricted-use data sets. Even the REL contractor would be required to obtain a restricted-use license to conduct any work with the data beyond the original evaluation.
The Making an Impact report is scheduled to be drafted in December 2015, following the completion of the one-and-a-half years of implementation and the subsequent availability of outcome data. The key objectives of this report are to summarize analyses of data collected during the two years of implementation of the intervention, to address each of the research questions described earlier in this statement, and to provide scientific findings on the implementation of EWIMS and its impact on student outcomes. Analytic techniques will range from descriptive statistics to two-level nested regression models of student-level measures. All results for REL rigorous studies will be made available to the public through peer-reviewed evaluation reports published by IES on its website (http://ies.ed.gov/pubsearch).
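As an illustration of the two-level approach, the impact on a student outcome could be estimated with a model of the following general form, with students (level 1) nested within schools (level 2). This is a sketch of the generic specification only; the covariates shown are placeholders, not the study’s final analytic model.

$$\text{Level 1 (students):}\quad Y_{ij} = \beta_{0j} + \beta_{1j} X_{ij} + \epsilon_{ij}, \qquad \epsilon_{ij} \sim N(0,\sigma^{2})$$

$$\text{Level 2 (schools):}\quad \beta_{0j} = \gamma_{00} + \gamma_{01} T_{j} + u_{0j}, \qquad u_{0j} \sim N(0,\tau^{2})$$

Here $Y_{ij}$ is the outcome for student $i$ in school $j$, $X_{ij}$ is a student-level baseline covariate (e.g., prior achievement), $T_{j}$ indicates a school assigned to EWIMS, and $\gamma_{01}$ is the school-level impact estimate of interest.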
The report will follow NCEE’s guidance on report writing and will focus on ease of interpretation for practitioners and policymakers. The REL Writers and Style Guide will be used as a framework for drafting the final report. The final report will be submitted to a rigorous review process, the REL Peer Review (RPR) process. ED’s contractor also will draft an IES Newsflash to be disseminated to a wide audience of researchers, practitioners, and policymakers through IES’s e-mail subscription lists. The contractor also will host a publicly available webinar to discuss the findings and implications upon completion of the final report.
The timeline for data collection, analysis, and reporting is in Table 3.
Table 3. Schedule of Activities

| Activity | Expected Date |
|---|---|
| Draft Office of Management and Budget (OMB) package | July 2013 |
| Documentation of institutional review board approval | April 2013 |
| Submit 60-day FRN | July 2013 |
| Submit 30-day FRN | October 2013 |
| Draft proposal accepted by ED | March 2013 |
| Final proposal approved by ED | October 2013 |
| Expected OMB clearance date | February/March 2014 |
| Complete school recruitment | March 2014 |
| Obtain signed district/school memoranda of understanding from all participating schools | March 2014 |
| Complete random assignment of participating schools | March 2014 |
| Academic year (AY) 1, 2013–14 | |
| Collect baseline data from participating districts | March 2014 |
| Pilot the Web-based survey and interview protocols | June 2014 |
| Treatment schools implement EWIMS with Grades 9 and 10 | March 2014–June 2014 |
| Collect EWS tool data from treatment schools (AY1, quarter 3) | March/April 2014 |
| Collect EWS tool data from treatment schools (AY1, quarter 4) | June 2014 |
| Collect end-of-year student-level data from participating districts | June 2014 |
| Conduct interviews with EWIMS teams at treatment schools | May 2014 |
| Administer survey measure of treatment contrast at participating schools | May 2014 |
| Academic year (AY) 2, 2014–15 | |
| Treatment schools implement EWIMS with Grades 9, 10, and 11 | August 2014–June 2015 |
| Collect EWS tool data from treatment schools (AY2, quarter 1) | November 2014 |
| Collect midyear student-level data from participating districts | January 2015 |
| Collect EWS tool data from treatment schools (AY2, quarter 2) | January 2015 |
| Collect EWS tool data from treatment schools (AY2, quarter 3) | March 2015 |
| Collect EWS tool data from treatment schools (AY2, quarter 4) | June 2015 |
| Collect end-of-year student-level data from participating districts | June 2015 |
| Conduct interviews with EWIMS teams at treatment schools | May 2015 |
| Administer survey measure of treatment contrast at participating schools | May 2015 |
| Academic year (AY) 3, 2015–16 | |
| Control schools implement EWIMS | August 2015–June 2016 |
| Submit Making an Impact report (first draft) | December 2015 |
| Making an Impact report accepted by ED | TBD |
17. Approval Not to Display the Expiration Date for OMB Approval

Approval not to display the expiration date for OMB approval is not requested.
18. Exception to the Certification Statement

No exceptions to the certification statement are being sought.
References

Allensworth, E. M., & Easton, J. Q. (2005). The on-track indicator as a predictor of high school graduation. Chicago: Consortium on Chicago School Research. Retrieved from http://ccsr.uchicago.edu/publications/p78.pdf
Allensworth, E. M., & Easton, J. Q. (2007). What matters for staying on-track and graduating in Chicago public high schools: A close look at course grades, failures, and attendance in the freshman year. Chicago: Consortium on Chicago School Research. Retrieved from http://ccsr.uchicago.edu/publications/07%20What%20Matters%20Final.pdf
Balfanz, R., Herzog, L., & Mac Iver, D. J. (2007). Preventing student disengagement and keeping students on the graduation path in urban middle-grades schools: Early identification and effective interventions. Educational Psychologist, 42(4), 223–235.
Dynarski, M., Clarke, L., Cobb, B., Finn, J., Rumberger, R., & Smink, J. (2008). Dropout prevention: A practice guide (NCEE 2008-4025). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance. Retrieved from http://ies.ed.gov/ncee/wwc/pdf/practiceguides/dp_pg_090308.pdf
Heppen, J. B., & Therriault, S. B. (2008). Developing early warning systems to identify potential high school dropouts (Issue Brief). Washington, DC: American Institutes for Research, National High School Center. Retrieved from http://www.dropoutprevention.org/sites/default/files/NationalHighSchoolCenterIssueBrief_20101109.pdf
Jerald, C. (2006). Identifying potential dropouts: Key lessons for building an early warning data system. Washington, DC: Achieve.
Kennelly, L., & Monrad, M. (2007). Approaches to dropout prevention: Heeding early warning signs with appropriate interventions. Washington, DC: National High School Center. Retrieved from http://www.betterhighschools.org/docs/nhsc_approachestodropoutprevention.pdf
Neild, R., & Balfanz, R. (2006). Unfulfilled promise: The dimensions and characteristics of Philadelphia’s dropout crisis, 2000–2005. Philadelphia: Philadelphia Youth Network, Johns Hopkins University, & University of Pennsylvania.
Neild, R., Balfanz, R., & Herzog, L. (2007). An early warning system. Educational Leadership, 65(2), 28–33.
O’Cummings, M., Heppen, J. B., Therriault, S., Johnson, A., & Fryer, L. (2010). Early warning system high school tool. Washington, DC: National High School Center.
O’Cummings, M., Therriault, S., Heppen, J. B., Yerhot, L., & Hauenstein, M. (2011). Early warning system middle grades tool. Washington, DC: National High School Center.
Pinkus, L. (2008). Using early-warning data to improve graduation rates: Closing cracks in the education system (Policy Brief). Washington, DC: Alliance for Excellent Education.
Silver, D., Saunders, M., & Zarate, E. (2008). What factors predict high school graduation in the Los Angeles Unified School District? (Policy Brief 14). Santa Barbara, CA: California Dropout Research Project, University of California.
Sloan, M., Ingels, J., & Burghardt, J. (2012). Analytical and technical support for advancing education evaluation: How to put together an OMB supporting statement. Retrieved from http://relpacific.wikispaces.com/file/view/OMB%20guidance%20compiled%205-30-12.pdf/350956206/OMB%20guidance%20compiled%205-30-12.pdf
Table A.1: Data elements, sources, access, and periodicity for the impact study data collection.

Data Element | Treatment | Control | Data Source | Data Access: Evaluation Team | Data Access: Implementation Team (Treatment Schools Only) | Who Is Collecting Data? | When Will Data Be Collected?

Student Baseline
Demographics (e.g., race/ethnicity, gender, FRPL, IEP, and ELL status, and parents’ education) | X | X | SEA | X | | Evaluation team | Baseline–Mar. 2014
Prior academic achievement (e.g., GPA, state mathematics and reading scores) | X | X | SEA | X | | Evaluation team | Baseline–Mar. 2014
Prior attendance rates | X | X | SEA | X | | Evaluation team | Baseline–Mar. 2014

School Baseline Data
High school graduation rates | X | X | SEA | X | | Evaluation team | Baseline–Mar. 2014
Average state achievement scores in mathematics and reading | X | X | SEA | X | | Evaluation team | Baseline–Mar. 2014
Percentage of FRPL students | X | X | SEA | X | | Evaluation team | Baseline–Mar. 2014

Student Risk Status¹
Attendance (e.g., missing 10 percent or more of instructional time) | X | X | EWS tool/SEA | X | X | Evaluation team | Mar. 2014, Jan. 2015,
Course performance (e.g., one or more course Fs) | X | X | EWS tool/school/district administrative data | X | X | Evaluation team | Mar. 2014, Jan. 2015,
GPA (e.g., 2.0 or lower) | X | X | EWS tool/school/district administrative data | X | X | Evaluation team | Mar. 2014, Jan. 2015,
Behavior incidents (locally validated) | X | X | EWS tool/school/district administrative data | X | X | Evaluation team | Mar. 2014, Jan. 2015,
On track at the end of ninth grade | X | X | EWS tool/school/district administrative data | X | X | Evaluation team | Mar. 2014, Jan. 2015,

Assessment Scores
Standardized test scores on the Ohio Graduation Test (available only in Grade 10) for schools in Ohio | X | X | SEA | X | | Evaluation team | June 2014
Standardized test scores on the PLAN (available only in Grade 10) for schools in Michigan | X | X | SEA | X | | Evaluation team | June 2014
Standardized test scores on the Algebra I and English 10 Acuity Assessments (end-of-course assessments typically taken in Grade 9 or 10) for schools in Indiana | X | X | SEA | X | | Evaluation team | June 2014
Standardized test scores on the SBAC (taken in Grade 11) for schools in Michigan | X | X | SEA | X | | Evaluation team | June 2015
Standardized test scores on the PARCC (end-of-course assessments) for schools in Ohio and Michigan | X | X | SEA | X | | Evaluation team | June 2015

Persistence and Progress in School
Whether students are enrolled or have left school for reasons other than transfer to another district, including dropping out | X | X | SEA | X | | Evaluation team | Jan. 2014, Jan. 2015,
Grade promotion | X | X | School/district administrative data | X | | Evaluation team | June 2014
Credits | X | X | School/district administrative data | X | | Evaluation team | Mar. 2014, Jan. 2015,

Predicted Probability of On-Time Graduation
Student-level calculation of predicted probability of on-time graduation (a composite, model-based value) | X | X | School/district administrative data | X | | Evaluation team | June 2015

Treatment Contrast
Presence of an Early Warning System tool (or use of student data to identify students at risk) | X | X | Annual Web-based survey | X | | Evaluation team | May 2014, May 2015

Data-Informed Allocation of Dropout Prevention Interventions for Students
How schools use student data to allocate dropout prevention programs | X | X | Annual Web-based survey | X | | Evaluation team | May 2014, May 2015

1. These risk indicators will be finalized on the basis of ongoing work with REL Midwest and verified by the evaluation team using administrative data.
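To make the risk-status indicators in Table A.1 concrete, the following is a minimal Python sketch of how a school’s early warning flags might be computed from administrative records. The thresholds (missing 10 percent or more of instructional time, one or more course Fs, a GPA of 2.0 or lower) are taken from the table; all field names, function names, and the decision rule itself are illustrative assumptions, not the EWS tool’s actual schema or logic.

# Illustrative sketch only: record layout and names are assumed,
# not taken from the EWS tool.
from dataclasses import dataclass

@dataclass
class StudentRecord:
    days_enrolled: int    # instructional days the student was enrolled
    days_absent: int      # instructional days missed
    course_f_count: int   # number of course Fs this term
    gpa: float            # current grade point average

def risk_flags(s: StudentRecord) -> dict:
    """Compute the indicator flags listed under Student Risk Status in Table A.1."""
    return {
        "attendance": s.days_absent >= 0.10 * s.days_enrolled,  # missed 10% or more
        "course_performance": s.course_f_count >= 1,            # one or more course Fs
        "gpa": s.gpa <= 2.0,                                    # GPA of 2.0 or lower
    }

def is_flagged(s: StudentRecord) -> bool:
    # One simple decision rule (an assumption for this sketch):
    # identify a student if any single indicator trips.
    return any(risk_flags(s).values())

# Example: missing 12 of 90 days (13 percent) trips the attendance flag.
print(risk_flags(StudentRecord(days_enrolled=90, days_absent=12,
                               course_f_count=0, gpa=2.8)))

The “predicted probability of on-time graduation” row in the same table is a composite, model-based value; a logistic regression of on-time graduation on indicators like these is one standard way such a probability could be produced, although the table does not specify the model.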
Table A.2: Data elements, sources, access, and periodicity for the implementation study data collection.

Data Element | Treatment | Control | Data Source | Data Access: Evaluation Team | Data Access: Implementation Team (Treatment Schools Only) | Who Is Collecting Data? | When Will Data Be Collected?

Fidelity of Implementation
Percentage of students with data uploaded into the EWS tool | X | | EWS tool | X | X | Evaluation team | June 2014, Aug. 2014, Jan. 2015, Mar. 2015
Measures of the extent to which identified students have documented interventions recorded in the tool | X | | EWS tool | X | X | Evaluation team | Mar./Apr. 2014, June 2014, Aug. 2014, Jan. 2015, Mar. 2015,
Participation in EWIMS trainings or convenings | X | | Attendance at trainings | X | X | Implementation team | Mar./Apr. 2014, June 2014, Aug. 2014, Jan. 2015, Mar. 2015,
Satisfaction with EWIMS trainings or convenings | X | | Satisfaction survey | X | X | Implementation team | Mar./Apr. 2014, June 2014, Aug. 2014, Jan. 2015, Mar. 2015,
Ways in which each treatment school implements the intervention | X | | Interview with EWIMS team member at each school² | X | X | Evaluation team | May 2014, May 2015
EWIMS data team monthly logs | X | | Online logs of meeting frequency and content | X | | Evaluation team | Monthly, Mar. 2014–June 2014 and Aug. 2014–June 2015

Specific Intervention Information³
Intervention strategies used at each school, including schoolwide and student-level programs assigned to students identified as at risk | X | | EWS tool | X | | Evaluation team | May 2014,

2. Items adapted from the California Comprehensive Assistance Center (CCAC).
3. Information on specific interventions assigned to students will not be collected from control schools because collecting this information would require schools to engage in a process of data tracking and reflection that is similar to the EWIMS model and would raise concerns about treatment contrast.
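As an illustration of the first fidelity measure in Table A.2, the short Python sketch below computes the percentage of enrolled students whose data appear in the EWS tool. The inputs are assumed to be simple sets of student identifiers, an illustrative simplification rather than the tool’s actual export format.

def pct_students_with_data(enrolled_ids: set, tool_ids: set) -> float:
    """Percentage of enrolled students with at least one record in the EWS tool.

    A hypothetical rendering of the Table A.2 fidelity measure; the ID sets
    are assumed inputs, not the tool's real data structures.
    """
    if not enrolled_ids:
        return 0.0
    return 100.0 * len(enrolled_ids & tool_ids) / len(enrolled_ids)

# Example: 3 of 4 enrolled students have uploaded data -> 75.0 percent.
print(pct_students_with_data({"s01", "s02", "s03", "s04"}, {"s01", "s02", "s03"}))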
Part D, Section 174(f)(2), of ESRA states that, as part of its central mission and primary function, each regional educational laboratory “shall support applied research, development, wide dissemination, and technical assistance activities by…developing and widely disseminating…scientifically valid research, information, reports, and publications that are usable for improving academic achievement, closing achievement gaps, and encouraging and sustaining school improvement.”
The EWIMS impact study was listed in the 60-day Federal Register notice on August 12, 2013.
There were no public comments. The notice is available at http://www.regulations.gov/#!documentDetail;D=ED-2013-ICCD-0106-0001.
Recommendations from the TWG meeting held on October 23, 2012.
The major recommendations from this TWG meeting included providing more information on the program/intervention; clarifying the focus of the study regarding the “macro-intervention” (the EWS tool and process) and the “micro-interventions” (the supports and interventions schools select for students at risk); and describing the firewall between the implementation and evaluation teams in more detail.
TWG Recommendation: Provide more clarity on the expected treatment contrast and identify what outcomes treatment schools will be accountable for when implementing EWIMS.
ED’s Contractor Response: This information will be included in the proposal. Based on the work of the National High School Center and others working in this area, ED’s contractor anticipates that full implementation of EWIMS will represent a substantial change from business as usual in participating schools. In particular, the intervention is expected to guide schools to form teams of staff responsible for reviewing early indicator data, considering student-level intervention options for students identified as at risk, and monitoring student data over time to determine whether students respond to the interventions provided. In many schools, the EWIMS team is an extension of an existing school improvement team; in others, it is an entirely new structure. The structured, guided process of systematically using data, rather than intuition alone, to determine which students are falling off track is a departure from usual practice in most schools.
TWG Recommendation: Address specific questions about how the data tool would be used by a school, such as: What level of data detail is entered into the tool? How does the tool allow for variation in districts? Who enters the data? How long does the tool track students? Is there potential for students being missed?
ED’s Contractor Response: Information related to these questions will be summarized in the proposal, and recruitment and implementation materials will be produced that communicate this information to prospective/participating schools in detail.
TWG Recommendation: Address specific questions focused on the process by which schools would use and interpret the data, such as: Who serves on the team looking at the data? What role would intuition and existing knowledge of students play? How much of this process is a direct part of the intervention being studied?
ED’s Contractor Response: The National High School Center developed an implementation guide for EWIMS that addresses these questions; excerpts will be included in the proposal and recruitment and implementation materials. In brief, the EWIMS team is composed of a mix of school staff including, ideally, the principal and/or assistant principal, guidance counselor, special education specialist, dropout prevention coordinator (if applicable), teachers, and where feasible, district representatives. The use of early indicator data is intended to replace the use of intuition as the sole guide for decisions about allocation of dropout prevention and school support resources. However, intuition and existing knowledge still play an important role in the decision-making process after students are flagged as potentially at-risk based on their administrative data alone.
The seven-step EWIMS process includes a specific step focused on gathering additional information about identified students (e.g., by talking with their mathematics teacher or directly with the student) to help guide decisions about appropriate interventions. This process of interpreting early indicator data, gathering other relevant information about individual students identified as at risk, and making decisions about appropriate “micro-interventions” is a key aspect of the “macro-intervention” to be studied. The study’s approach to understanding this step in EWIMS will include a series of descriptive analyses to document the process in each treatment school.
TWG Recommendation: Provide justification for focusing on the macro-intervention.
ED’s Contractor Response: ED’s contractor is confident that it can make a strong case for focusing on the macro-intervention, which is well developed, well defined, and widely used, in this impact study. The most proximal outcomes of the macro-intervention may be how well schools conduct the process of allocating their (limited) resources for dropout prevention and student supports, and the degree to which this process becomes more systematic and routinized. Moreover, we recognize that the overall efficacy of the macro-intervention depends in part on the aggregate effectiveness of the “micro-interventions” assigned to at-risk students. Nevertheless, a test of the macro-intervention’s impact on interim student outcomes (e.g., attendance, course performance, behavior) is of high policy relevance and directly meets the stated needs of the research alliance.
The suggestion to design a study with multiple conditions based on planned variation within treatment schools is an interesting option to explore. However, the preference is to maintain the design as a straightforward school-level treatment–control design with the macro-intervention as the independent variable, for both theoretical and practical reasons. First, the spirit of the tool is to encourage local educators to make decisions based on available interventions, and this process is interesting and important to study in its own right. Second, we are concerned about the feasibility of recruiting schools under terms stipulating that they could not make these decisions (which are a key part of the EWIMS process) and that some of their students identified as at risk might not receive any interventions at all. We will continue to work toward clarifying our case for focusing on the macro-intervention and toward devising a specific plan to capture detailed information about the micro-interventions used in the treatment schools in future iterations of the full proposal.
TWG Recommendation: Describe the proposed firewall in more detail.
ED’s Contractor Response: ED’s contractor has considered this issue carefully and is confident that it can create a secure firewall between the evaluation and the implementation of the intervention. ED’s contractor has implemented similar firewalls in its work with California districts implementing EWIMS (as part of the National High School Center) and in large-scale RCTs. Given prior success in ensuring integrity in these efforts, ED’s contractor is well poised to (a) oversee the implementation of EWIMS in participating treatment schools, (b) document the fidelity of implementation and conduct an implementation evaluation, and (c) conduct the impact evaluation component of the study.
To establish a firewall, the study will have two separate study teams: (1) the implementation team, which will oversee implementation of the EWIMS intervention in participating treatment schools, and (2) the evaluation team, which will oversee the evaluation of implementation and impact. The project teams will be structured such that no staff member works on both teams; the evaluation team will have its own project leadership and support staff, independent of the implementation team, which will likewise have its own leadership and staff.
At a minimum, the implementation team will have no access to data from participating schools in the control group throughout the delivery of the intervention (2013–14 and 2014–15 academic years). However, because the implementation team will have access to the EWS tool, it will have access to some implementation and outcome data for treatment schools.
Following random assignment, the evaluation and implementation teams will be prohibited from meeting together during the implementation of the intervention (2013–14 and 2014–15 academic years) and from communicating, verbally or otherwise, about any study-related matters. In addition, documentation and files accessible to the evaluation team only will be stored separately on secure servers to which implementation team members have no access. Finally, all project staff on the evaluation and implementation teams will be required to sign nondisclosure agreements that prevent them from discussing any aspect of the study internally with other AIR personnel not staffed on the project or externally with non-AIR personnel.
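Because the firewall takes effect following random assignment, the assignment itself marks the point at which the two teams separate. The Python sketch below illustrates simple 1:1 school-level random assignment with a fixed seed for reproducibility; it is a generic illustration, and the study’s actual procedure (for example, any blocking or stratification, and the seed value) is not specified here.

import random

def assign_schools(school_ids, seed=20140301):
    """Randomly assign half of the schools to treatment and half to control (1:1).

    A generic sketch: the study's actual randomization procedure may differ.
    """
    rng = random.Random(seed)   # fixed seed so the assignment is reproducible
    shuffled = list(school_ids)
    rng.shuffle(shuffled)
    cut = len(shuffled) // 2
    return {sid: ("treatment" if i < cut else "control")
            for i, sid in enumerate(shuffled)}

# Example with hypothetical school IDs.
print(assign_schools(["OH-01", "OH-02", "MI-01", "MI-02", "IN-01", "IN-02"]))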
The On-Time Graduation Study
School Survey Consent Form
PURPOSE
The On-Time Graduation Study is being conducted by the Midwest Regional Educational Laboratory at American Institutes for Research (AIR). It is funded by the U.S. Department of Education, Institute of Education Sciences and will run from March 2014 through the spring of 2016. The purpose of this survey is to ask school administrators about their school data culture and how they use student data to identify and intervene with students who may be at-risk for dropping out of high school.
PROCEDURES
Participation involves an online survey that will take approximately 60 minutes to complete. You can stop and resume the survey at any point; your answers will be saved in the online system.
RISKS AND DISCOMFORTS
There are no known risks for participation. Your participation in this study is voluntary. You will not be penalized in any way for not participating. If you decide to participate, you may discontinue at any time without penalty.
CONFIDENTIALITY
You are not required to answer questions that you do not want to answer. Responses will be used only for research purposes and will be kept strictly confidential. Each participant is assigned a study identification number that is used in place of his or her name. The reports prepared for this study will summarize findings and will not associate responses with a specific school or individual. We will not provide information that identifies you or your school to anyone outside the study team, except as required by law.
BENEFITS
Your participation will help the U.S. Department of Education better understand how schools use data to identify and intervene with students who may be at-risk for dropping out of high school.
MORE INFORMATION
Upon completion of the survey, you will receive a $30 gift card. If you have any questions about the gift cards, please contact Suzanne Stachel at sstachel@air.org or 202-403-5584. For more information about the study in general, you may contact Dr. Ann-Marie Faria, Project Director, at afaria@air.org or 202-403-5356. If you have concerns or questions about your rights as a participant, contact AIR’s Institutional Review Board (which is responsible for the protection of project participants) at IRB@air.org, toll free at 1-800-634-0797, or c/o IRB, 1000 Thomas Jefferson Street, NW, Washington, DC, 20007.
INFORMED CONSENT
YES, I have read and understand the information above, and I agree to participate in this online survey.
NO, I have read and understand the information above, and I do not agree to participate in this online survey.
The On-Time Graduation Study
Interview Consent Form
DESCRIPTION
Thank you for taking the time to participate in the On-Time Graduation Study data-team interview. The On-Time Graduation Study is being conducted by the Midwest Regional Educational Laboratory at American Institutes for Research (AIR). It is funded by the U.S. Department of Education, Institute of Education Sciences and will run from March 2014 through the spring of 2016. The purpose of this interview is to better understand how schools work with the EWIMS tool and seven-step process to identify and intervene with students who may be at-risk for dropping out of high school.
PROCEDURES
Participation involves a telephone interview of approximately 90 minutes with a project team member and a facilitator. To facilitate the interview process, conversations will be digitally recorded and transcribed.
RISKS, DISCOMFORTS & INCONVENIENCES
There are no known risks for participation. Your participation in this study is voluntary. You will not be penalized in any way for not participating. If you decide to participate, you may discontinue at any time without penalty and you may choose not to answer specific questions.
CONFIDENTIALITY
The information we obtain from the interviews will be kept in strict confidence to the fullest extent permitted by law. Information that could identify individuals will not be shared with anyone outside of the study team. All digitally recorded files and summaries will be given codes and stored separately from any names or other direct identification of participants. All reports prepared for this study will summarize findings across the sample and will not associate responses with a specific individual. The digital files will be destroyed once all reporting activities are completed.
BENEFITS
Your participation will help the U.S. Department of Education better understand how schools use data to identify and intervene with students who may be at-risk for dropping out of high school.
MORE INFORMATION
If you have any questions about the gift cards, please contact Suzanne Stachel at sstachel@air.org or 202-403-5584. If you have any questions or concerns about this interview, please contact Dr. Ann-Marie Faria, Project Director, at afaria@air.org or 202-403-5356.
If you have concerns or questions about your rights as a participant, contact AIR’s Institutional Review Board (which is responsible for the protection of project participants) at IRB@air.org, toll free at 1-800-634-0797, or c/o IRB, 1000 Thomas Jefferson Street, NW, Washington, DC 20007.
INFORMED CONSENT
YES, I have read and understand the information above, and I give my consent to participate in the On-Time Graduation Project interview.
NO, I have read and understand the information above, and I do not give my consent to participate in the On-Time Graduation Project interview.
CONFIDENTIALITY AGREEMENT
On-Time Graduation Project
(American Institutes for Research under Contract No. ED-IES-12-C-0004)
Safeguards for Individuals Against Invasion of Privacy: In accordance with the Privacy Act of 1974 (5 United States Code 552a), the Education Sciences Reform Act of 2002 (Public Law 107-279), the Federal Statistical Confidentiality Order of 1997, the E-Government Act of 2002 (Public Law 107-347), and the Computer Security Act of 1987, American Institutes for Research (AIR) and all its subcontractors are required to comply with the applicable provisions of the legislation, regulations, and guidelines and to undertake all necessary safeguards for individuals against invasions of privacy.
To provide this assurance and these safeguards in performance of work on this project, all staff, consultants, and agents of AIR, and its subcontractors who have any access to study data, shall be bound by the following assurance.
Assurance of Confidentiality
In accordance with all applicable legislation, regulations, and guidelines, AIR assures all respondents that their responses may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002 (ESRA 2002), 20 U.S. Code, § 9573].
The following safeguards will be implemented to assure that confidentiality is protected as allowable by law (20 U.S.C. § 9573) by all employees, consultants, agents, and representatives of AIR and all subcontractors and that physical security of the records is provided:
All staff with access to data will take an oath of nondisclosure and sign an affidavit to that effect.
At each site where these items are processed or maintained, all confidential records that will permit identification of individuals shall be kept in a safe, locked room when not in use or personally attended by project staff.
When confidential records are not locked, admittance to the room or area in which they reside shall be restricted to staff sworn to confidentiality on this project.
All electronic data shall be maintained in secure and protected data files, and personally identifying information shall be maintained in files separate from the statistical data collected under this contract (an illustrative sketch of this separation appears after this list of safeguards).
All data files on network or multi-user systems shall be under strict control of a database manager with access restricted to project staff sworn to confidentiality, and then only on a need-to-know basis.
All data files on single-user computers shall be password protected and all such machines will be locked and maintained in a locked room when not attended by project staff sworn to confidentiality.
External electronically stored data files (e.g., tapes or diskettes) shall be maintained in a locked storage device in a locked room when not attended by project staff sworn to confidentiality.
Any data released to the general public shall be appropriately masked so that individuals cannot be identified in the disclosed data.
Data or copies of data may not leave the authorized site for any reason.
Staff, consultants, and agents of AIR and all its subcontractors will take all necessary steps to ensure that the letter and intent of all applicable legislation, regulations, and guidelines are enforced at all times, through appropriate qualification standards for all personnel working on this project and through adequate training and periodic follow-up procedures.
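To illustrate the safeguard above that keeps personally identifying information in files separate from statistical data, the following is a minimal Python sketch in which names are replaced by study identification numbers and the crosswalk linking IDs back to names is held apart from the analysis records. The record layout, field names, and ID scheme are assumptions for illustration only, not the project’s actual data-handling code.

import itertools

def split_pii(records):
    """Separate PII from analysis data, as the safeguards above require:
    names are replaced with study IDs, and the ID-to-name crosswalk is
    kept in a separate, secured file (an assumed scheme for this sketch)."""
    counter = itertools.count(1)
    crosswalk = {}      # study_id -> name; stored apart from the statistical data
    deidentified = []   # analysis records carrying only the study ID
    for rec in records:
        study_id = "S{:04d}".format(next(counter))
        crosswalk[study_id] = rec["name"]
        deidentified.append({"study_id": study_id,
                             **{k: v for k, v in rec.items() if k != "name"}})
    return crosswalk, deidentified

# Example: the analysis record retains the attendance rate but not the name.
xwalk, data = split_pii([{"name": "Jane Doe", "attendance_rate": 0.93}])
print(data)  # [{'study_id': 'S0001', 'attendance_rate': 0.93}]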
By my signature affixed below, I hereby swear and affirm that I have carefully read this statement and fully understand it, as well as the legislative and regulatory assurances that pertain to the confidential nature of all records to be handled in regard to this project, and that I will adhere to all safeguards that have been developed to provide such confidentiality. As an employee, consultant, agent, or representative of AIR or one of its subcontractors, I understand that I am prohibited by law from disclosing any such confidential information to anyone other than staff, consultants, agents, or representatives of AIR, its subcontractors or agents, and the Institute of Education Sciences. I understand that any willful and knowing disclosure, or allowance of disclosure, in violation of the applicable legislation, regulations, and guidelines is punishable by law and would subject the violator to possible fine or imprisonment.
Signature | Date
Ann-Marie Faria | July 16, 2013
Mindee O’Cummings | July 16, 2013
Nicholas Sorensen | July 16, 2013
Suzanne Stachel | July 16, 2013
Amy Szymanski | July 16, 2013
Laura Checovich | July 16, 2013
Ryan Eisner | March 7, 2014
1 Data culture is a school’s general approach toward using data to inform educational decision making and includes contextual factors (e.g., the assessment and instructional context), supports for data use (e.g., professional development or structured time to review data), working with data (e.g., frequency and depth of data use), and responses to data (e.g., assignment of interventions to students).
2 Based on Ohio’s 2010–11 State Report Card
3 Based on Michigan’s 2012 Four-Year Graduation and Dropout Rate Report
4 Based on Indiana’s 2012 Statutory Graduation Rate Data (2012 State, Corporation, and School Disaggregated Graduation Rates [Public and Non-Public])
5 The Dropout Prevention Research Alliance members include stakeholders at the state, district, and local level. Example roles include state staff at the Office for Exceptional Children at the Ohio Department of Education, the members of the State Support Teams (SSTs), the Director of Reaching and Teaching Struggling Learners and Coordinator of the Superintendent’s Dropout Challenge at the Michigan Department of Education, the Assistant Superintendent of Outreach at the Indiana Department of Education, district data coordinators, district superintendents and assistant superintendents, principals, and assistant principals.
6 The hourly wage rates used to calculate total respondent cost are based on previous OMB packages and are consistent with ED’s contractor’s knowledge of SEA wage rates.