Contract No.: ED-01-CO-0038-0009
Feasibility and Conduct of an Impact Evaluation of Title I Supplemental Educational Services:
Revision to Currently Approved Collection (ICR Reference Number 200805-1850-003)
Part A: Supporting Statement for Request for OMB Approval of Data Collection Instruments
Submitted to:

Institute of Education Sciences
U.S. Department of Education
80 F Street NW
Washington, DC 20208

Project Officer:

Submitted by:

Mathematica Policy Research, Inc.
P.O. Box 2393
Princeton, NJ 08543-2393
Telephone: (609) 799-3535
Facsimile: (609) 799-0005

Project Director:
CONTENTS
SUPPORTING STATEMENT: REQUEST FOR REVISION OF A CURRENTLY APPROVED COLLECTION

A. JUSTIFICATION

1. Circumstances Necessitating Collection of Information
2. How, by Whom, and for What Purpose Information Is To Be Used
3. Use of Automated, Electronic, Mechanical, or Other Technological Collection Techniques
4. Efforts to Avoid Duplication of Effort
5. Sensitivity to Burden on Small Entities
6. Consequences to Federal Program or Policy Activities if the Collection Is Not Conducted or Is Conducted Less Frequently than Proposed
7. Special Circumstances
8. Federal Register Announcement and Consultation
9. Payment or Gift to Respondents
10. Confidentiality of the Data
11. Additional Justification for Sensitive Questions
12. Estimates of Hour Burden
13. Estimate of Total Annual Cost Burden to Respondents or Recordkeepers
14. Estimates of Annualized Cost to the Federal Government
15. Reasons for Program Changes or Adjustments
16. Tabulation, Publication Plans, and Time Schedules
17. Approval Not to Display the Expiration Date for OMB Approval
18. Exception to the Certification Statement

REFERENCES
APPENDIX A: SES PROVIDER QUESTIONNAIRE AND ACCOMPANYING MATERIALS
APPENDIX B: STUDENT PARTICIPATION SPREADSHEET
APPENDIX C: SCHOOL RECORDS
APPENDIX D: CONFIDENTIALITY PLEDGE
SUPPORTING STATEMENT: REQUEST FOR REVISION OF A CURRENTLY APPROVED COLLECTION

This is a second-stage request for approval to carry out outcome data collection activities for the Feasibility and Conduct of an Impact Evaluation of Title I Supplemental Educational Services. In this second clearance request, the Institute of Education Sciences (IES) of the U.S. Department of Education requests OMB approval to extend district recruitment for an additional year. During the baseline year, we were able to recruit 8 districts; however, these districts were concentrated geographically, with a disproportionate number in a single state.1 In order to achieve the goal of 50,000 participating students and increase the geographic diversity of the sample, we will recruit a second cohort of districts during the 2009-2010 school year. The total number of districts to be recruited remains the same as previously approved, 12 districts. In addition, we are requesting approval at this time for the outcome data collection phase, which will occur in spring 2009 for the original districts (cohort 1) and in spring 2010 for the new districts (cohort 2). The data collection phase includes (1) an SES provider survey (which will allow the contractor to assess provider characteristics that can then be linked to impacts), (2) SES student participation data, and (3) student records, including state and/or district test scores, which are the main outcome for the evaluation.
The Feasibility and Conduct of an Impact Evaluation of Title I Supplemental Educational Services received OMB approval (ICR reference number 200805-1850-003, ICR tracking number 3634) on August 21, 2008. OMB approved the first clearance request describing the study, the regression discontinuity design (discussed below), and baseline data collection activities. Feasibility was determined based on the recruitment and baseline data collection efforts, and in August 2008, the Institute of Education Sciences (IES) approved the option to proceed with the conduct of the full study.
A. JUSTIFICATION

1. Circumstances Necessitating Collection of Information

As noted in the original submission, the collection of information is needed to support a rigorous evaluation of supplemental educational services (SES) for the U.S. Department of Education (ED). The No Child Left Behind Act (NCLB) requires school districts to offer SES to students who attend schools that have failed to make adequate yearly progress (AYP) for three years. SES are tutoring or other academic support services offered outside the regular school day by state-approved providers, free of charge to eligible students. Parents can choose the specific SES provider from among a list of providers approved to serve their area. This evaluation is authorized under the No Child Left Behind Act of 2001, Section 1501 (P.L. 107-110).
Mathematica Policy Research (MPR) is working with IES to design and conduct a rigorous evaluation of SES based on a regression discontinuity (RD) design in up to 12 districts. The primary research questions to be answered by the evaluation are (1) What is the effect of SES on student achievement? and (2) How does the effect of SES vary by provider characteristics? MPR will assess the impact of SES by comparing treatment and control groups of students formed purposefully based on a measure of prior achievement (such as a test score or grade point average). Valid estimates of the effect of SES can be obtained by comparing the average reading and math scores of students who were accepted into SES to the average scores of students who were not, after regression adjusting for the measure of prior achievement used to determine acceptance (this is the defining feature of an RD design). MPR will assess how impacts vary by provider characteristics by calculating provider-specific impacts and then relating those impacts to provider characteristics, as measured using a survey of SES providers.
Below we describe the need for a rigorous evaluation of SES, the research questions that an impact evaluation would answer, the rationale for using an RD design, and the data collection activities of the study.
a. Need for a Rigorous Evaluation of SES

The No Child Left Behind Act (NCLB) requires districts with Title I schools that fall short of state standards for three sequential years to offer SES to their students from low-income families. Hundreds of thousands of students participate in SES, but so far little systematic information is available on the effectiveness of SES in promoting student achievement or on the operational characteristics of effective SES providers. During the 2006–2007 school year, 529,627 students participated in SES nationwide. The potential market for SES is substantially larger, as the participating students constituted only 14.5 percent of the 3,645,665 students eligible to receive SES in 2006–2007 (U.S. Department of Education, Annual Performance Reports). The few studies that have examined the relationship between participation in supplemental services and student achievement have relied on non-experimental designs (for example, Zimmer et al. 2007). Additional research using more rigorous methods that permit more definitive causal inference is needed to assess the achievement impacts of SES.
As noted above, we will assess the impact of SES by comparing treatment and control groups of students formed purposefully based on prior achievement, estimating the effect of SES as the difference in average reading and math scores between students who were and were not accepted into SES, after regression adjusting for the measure of prior achievement used to determine acceptance.
A randomized experimental evaluation of SES is precluded by NCLB, which requires that all eligible students who request services receive them as long as resources are available. Although a randomized design is precluded by statute, NCLB’s rules about the allocation of services when resources are constrained create the opportunity for an RD analysis that will allow causal inferences with rigor approaching that of a randomized experiment. Under the RD design, we will evaluate SES in a small number of districts that, because of funding constraints, cannot serve all students eligible for SES and must therefore ration services on the basis of a quantifiable, continuous score (such as an achievement test score). Eligible applicants with scores below a preselected, fixed cutoff score will be offered SES (treatment students), whereas eligible applicants with scores above the cutoff value (control students) will not. Unbiased estimates of the impacts of SES can then be obtained by comparing the outcomes of eligible applicants below and above the cutoff value, after adjusting for baseline assignment scores.2 RD is generally considered one of the strongest quasi-experimental designs available to researchers for purposes of causal inference (see, for example, Shadish et al. 2002), and it is the only methodology other than random assignment to fully meet the standards of ED’s What Works Clearinghouse (http://ies.ed.gov/ncee/wwc/overview/review.asp?ag=pi). A detailed description of the RD design is provided in Section A.16.
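To make the estimation strategy concrete, the sketch below shows how a sharp RD impact estimate of this kind could be computed with ordinary least squares. It is a minimal illustration under assumed variable names (assignment, outcome, cutoff) and simulated data, not the contractor's analysis code.

import numpy as np
import statsmodels.api as sm

def sharp_rd_impact(assignment, outcome, cutoff):
    """Estimate the impact of the SES offer at the cutoff using a linear
    functional form with separate slopes on each side (illustrative only)."""
    treated = (assignment <= cutoff).astype(float)  # offered SES: score at or below cutoff
    centered = assignment - cutoff                  # center the assignment score at the cutoff
    X = sm.add_constant(np.column_stack([treated, centered, treated * centered]))
    fit = sm.OLS(outcome, X).fit()
    return fit.params[1], fit.bse[1]                # impact estimate and its standard error

# Hypothetical example mirroring the cutoff of 50 used in Figure A.1:
rng = np.random.default_rng(0)
scores = rng.uniform(0, 100, 5000)
math = 0.5 * scores + 8.0 * (scores <= 50) + rng.normal(0, 5, 5000)  # true impact = 8
print(sharp_rd_impact(scores, math, cutoff=50.0))

The interaction term allows the regression slope to differ on either side of the cutoff, matching the two separately fitted lines shown in Figure A.1.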
Evaluation of Title I programs and services is authorized in the No Child Left Behind Act, Title I, Part E, Section 1501. Although there is no federal requirement or legislation specifically requiring an evaluation of the SES program, findings of the current study will not only inform national policy discussions about SES but also provide direct feedback to participating districts about the effectiveness of the SES offered in their districts.
b. Research Questions for the Full Evaluation
The contractor will examine the following research questions in conducting a rigorous evaluation of SES:
1. What is the impact of participation in Title I Supplemental Educational Services on student achievement in reading and mathematics? SES represents a considerable investment of resources with the specific purpose of improving the academic achievement of students attending schools that are failing to make AYP. This question focuses on whether SES as a whole are achieving that purpose.
2. Are district characteristics and practices, SES provider characteristics and services, and student characteristics related to the impact on student achievement? Of particular interest to ED is whether specific provider types are more effective than others. Dimensions along which providers might vary include substantive focus (for example, math or reading), intensity (for example, frequency of student attendance), and method of delivery (for example, small group activities, one-on-one tutoring, or the use of computer technology). Relevant student subgroups include prior student achievement and whether or not students are served by their parents’ first choice of providers.
c. Overview of the Design and Feasibility of the Study
The feasibility of the evaluation was assessed through informal conversations with school district officials in 9 of the districts originally identified by ED’s Office of Innovation and Improvement (OII). During the recruitment and baseline data collection, the contractor was able to determine that there was a sufficient number of oversubscribed districts allocating services based on quantifiable measures of prior student achievement (or similar assignment variables), and that it is possible to evaluate SES using an RD design. At this point, 8 districts have agreed to participate in the study. The power calculations for an evaluation based on an RD design indicate that a total sample of 50,000 students will be sufficient to answer the study’s research questions.3 Only a very small number of districts across the country are sufficiently oversubscribed to permit the use of an RD design. During the design and baseline data collection, we identified 8 districts, drawn from 24 prospects provided by OII. Five of the eight currently identified districts are in the state of Florida. To make the study’s findings relevant to a broader audience by including more districts from other parts of the country, OII has provided the names of additional potential districts and states to contact this year for recruitment into a second cohort (see Table A.1).
TABLE A.1
SCHOOL DISTRICTS CURRENTLY RECRUITED AND NEW POTENTIAL DISTRICTS
Currently Recruited District Name (Cohort 1) | State
Albuquerque Public Schools | New Mexico
Bridgeport School District | Connecticut
Cincinnati School District | Ohio
Collier County School District | Florida
Dade County School District | Florida
Gadsden County School District | Florida
Palm Beach County School District | Florida
Pinellas County School District | Florida

Potential Districts Added Fall 2009 (Cohort 2) | State
Akron City School District | Ohio
Anchorage School District | Alaska
Boston Public School District | Massachusetts
Fall River Public School District | Massachusetts
Hartford School District | Connecticut
Long Beach Unified School District | California
Malden School District | Massachusetts
Pomona Unified School District | California
Providence School District | Rhode Island
San Diego Unified School District | California
d. Structure of the Data Collection Effort
To help ED address the study research questions, the contractor will collect and analyze data from several sources. Clearance was previously given for the collection of baseline data from parents’ applications for their children to participate in the SES program; these data will be used to link applicants to providers in order to answer the second research question. Recruiting the second cohort will require collecting the same baseline data from up to four additional districts.
In terms of new data collection activities, clearance is currently being requested to collect outcome data, including (1) an SES provider survey (which will allow the contractor to assess provider characteristics that can then be linked to impacts), (2) SES student participation data, and (3) student records, including state and/or district test scores, the main outcome for the evaluation. Table A.2 shows the schedule of these data collection activities.
e. Data Collection Activities for which OMB Clearance Is Being Requested
SES Provider Survey
In spring 2009, the contractor will collect information from SES providers (cohort 1 providers) through a self-administered questionnaire. The questionnaire (a mail survey with telephone followup) will focus on provider characteristics (for example, type and size of organization, years in existence); staff characteristics (gender, ethnicity, prior teaching experience, current certification, employment in study district); services provided (type, frequency, delivery methods); and characteristics of all the students they serve, not just those in the study. The contractor pretested the instrument during the winter of 2009 and confirmed that, on average, the instrument took 30 minutes to complete, including the time respondents needed to look up information. Issues with the overall questionnaire design, question wording, and question order have all been addressed, and the changes are reflected in the final version of the questionnaire (attached). Copies of the Provider Survey and examples of accompanying letters are included in Appendix A.
TABLE A.2

DATA COLLECTION SCHEDULE

Activity | Respondent | Clearance Previously Approved | Clearance Requested in Current Package
Baseline Data Collection, Fall 2008: Collect SES application data (35,000 records from 8 districts, cohort 1) | Parent/guardian via school districts | X |
Outcome Data Collection, Spring 2009: SES provider survey, cohort 1 (384 providers) | SES provider | | X
Outcome Data Collection, Spring/Summer 2009: Obtain SES student participation data from districts (35,000 records, cohort 1) | District/school staff/SES providers | | X
Outcome Data Collection, Summer 2009: Obtain student records/district test scores, cohort 1 (35,000 records) | District/school staff | | X
Additional Baseline Data Collection, Fall 2009: Collect SES application data from up to 4 additional districts, cohort 2 (15,000 records) | Parent/guardian via school districts | X |
Outcome Data Collection, Spring 2010: SES provider survey, cohort 2 (66 providers) | SES provider | | X
Outcome Data Collection, Spring/Summer 2010: Obtain SES student participation data from districts (15,000 records, cohort 2) | District/school staff/SES providers | | X
Outcome Data Collection, Summer 2010: Obtain student records/district test scores, cohort 2 (15,000 records) | District/school staff | | X
SES Student Participation Data
A second source of information about services will be SES student participation data, including the type and amount of services provided to each student served in the 2008-2009 school year (or 2009-2010 for cohort 2). We plan to gather the student participation data from districts and have confirmed during our baseline data collection efforts that the districts can provide the sufficiently detailed attendance information needed for the study. The SES attendance and participation information will be requested during the school records data collection effort, although the information may be kept in different systems at the district. A copy of the student participation spreadsheet is included in Appendix B.
Collection of Demographic Data and Student Achievement Scores
During the summer/early fall of 2009, the contractor will collect scores from tests administered by the state or district in school years 2006–2007, 2007–2008, and 2008–2009 for the 2008–2009 SES applicants included in the study. The demographic and other student-level information we will collect from the districts includes grade level, month and year of birth, race/ethnicity, English proficiency, disability status, eligibility for free or reduced-price school lunch, student grades, and school attendance. We anticipate that we will be able to obtain school records for 35,000 students across the eight cohort 1 districts. The same school records information will be requested from the four cohort 2 districts in the summer/early fall of 2010 for an estimated 15,000 2009–2010 SES applicants (covering school years 2007–2008, 2008–2009, and 2009–2010). (See Appendix C.)
2. How, by Whom, and for What Purpose Information Is To Be Used
The information collected will inform an impact evaluation of Title I SES on the reading and mathematics achievement of third- to eighth-grade students. The data will also be useful for state and local policymakers, districts and schools, and parents. The information will also inform policy decisions about the approval and funding of SES providers. Specifically, data collection efforts will be used in the following ways:
SES Application Data (previously approved). The contractor will use data from parents’ SES applications to identify parents’ preferred SES providers. School districts will collect this information from parents as part of the SES application form. Districts will then record the data and submit information in an electronic file to the contractor. By identifying parents’ preferred SES providers before the RD cutoff is determined, we can assess whether SES has a greater impact for students who get their first choice provider.
SES Provider Survey. When combined with information from SES application forms, data from the SES provider survey will allow ED to relate provider effectiveness to provider characteristics and practices. Providers in each district will be asked to participate in the provider survey.
SES Student Participation Data. The contractor will also collect participation and attendance records for students who attend SES or other district-provided after school programs. This information will allow ED to describe the counterfactual experiences of students in the control group and calculate the effect of participating in SES (as opposed to the effect of offering SES—see Section A.16). This information is expected to come directly from the districts.
Student Records and District Test Scores. The main outcome for this study will be students’ scores on districts’ existing tests. These tests are most relevant to districts because they are the tests used for accountability purposes. Earlier test scores or other continuous measures from student records (GPA, attendance) can also be used as the RD cutoff variable, and to improve the precision of impact estimates. These records will be collected for all 50,000 students in the study (35,000 from cohort 1 and 15,000 from cohort 2).
3. Use of Automated, Electronic, Mechanical, or Other Technological Collection Techniques

The data collection plan reflects sensitivity to issues of efficiency, accuracy, and respondent burden. Where feasible, information will be gathered from existing electronic data sources, such as program and school records. Since some school districts and SES providers may not use electronic means of collecting and storing data, the contractor expects to receive some data from reporting forms or preexisting documents. To avoid burdening the school districts and SES providers, the contractor will offer them the option of delivering participation data electronically, filling out a straightforward reporting form manually, or submitting hard-copy documents that already exist. Minimizing evaluation costs and reducing respondent burden were key considerations in the decision to collect preferred provider information via existing application forms as opposed to administering new parent surveys.
In the case of school records and participation data, confidential information in electronic form for this project will be collected, stored, and processed centrally on the contractor’s password-protected local area network (LAN) directories. Access rights to confidential project files and other materials are granted by the project director or the project task leader responsible for data collection and processing on a need-to-know basis. The contractor will work with the districts and providers on how to password protect the files being sent to the contractor. Data stored on network drives are protected using the security mechanisms available through the networking operating systems used on the contractor’s primary network servers, Novell NetWare 6.0 and 6.5, which are compliant with the C2/E2 Red Book security specifications. IntraNetware is certified at the National Computer Security Center’s Trusted Network Interpretation Class C2 level of security at the network level. All servers containing confidential information reside in a controlled-access area. The network is protected from unauthorized external access through a firewall from Cisco Systems, Inc.; this firewall resides between the T1 line over which internet traffic flows and the remainder of the network.
4. Efforts to Avoid Duplication of Effort

This effort will yield unique data to evaluate the impact of Title I SES. There are no similar evaluations being conducted, and there is no alternative source for the information to be collected. Moreover, the data collection plan reflects careful attention to the potential sources of information for this study, particularly the reliability of the information and the efficiency of gathering it. In sites where districts’ SES attendance records are adequate for the evaluation, the contractor will refrain from asking providers to supply this information independently, thereby avoiding unnecessary collection of information from multiple sources.
5. Sensitivity to Burden on Small Entities

All data collection will be coordinated by the evaluation contractor so as to minimize burden on school and district staff and SES providers. The primary entities for this study are schools and the districts to which they belong, along with providers of SES. Burden has been minimized for all respondents by requesting only the minimum information required to meet the study objectives, carefully specifying information needs, restricting questions to generally available information, and designing the data collection strategy accordingly.
6. Consequences to Federal Program or Policy Activities if the Collection Is Not Conducted or Is Conducted Less Frequently than Proposed

In the absence of the impact evaluation, ED will not be able to assess the impacts of SES on student achievement. The data collection plan calls for the minimum amount of data needed to measure differences in student achievement based on SES provider. The collection of SES provider and school record data will be a one-time collection.
7. Special Circumstances

There are no special circumstances involved with this data collection.
8. Federal Register Announcement and Consultation

Notices soliciting public comment have been published, and no comments have been received.
a. Comments
No comments received.
b. Consultations Outside the Agency
During preparation of the study design and data collection plan for this evaluation, professional counsel was sought from a number of people. Input was solicited from a broad range of researchers, most of whom are members of the Technical Working Group convened under the contract to design the impact evaluation. These individuals are:
Ron Zimmer, RAND
Steven Ross, University of Memphis
Drew Gitomer, Educational Testing Service
Jeffrey Smith, University of Michigan
Thomas Cook, Northwestern University
Robert Linn, University of Colorado
Erica Harris, Chicago School District
Tim Silva, Mathematica Policy Research
Peter Schochet, Mathematica Policy Research
No additional people were consulted regarding the outcome data collection effort for which we are now seeking approval.
c. Unresolved Issues
None.
9. Payment or Gift to Respondents

The provider survey is expected to take 30 minutes and is considered a high burden for respondents, since some of the information being requested may require providers to consult organizational records. We believe that we will need to provide an incentive to help motivate participation and improve the response rate. We plan to give providers an incentive of $30, which is consistent with the incentives proposed in the NCEE memo Guidelines for Incentives for NCEE Evaluation Studies, dated March 22, 2005.
For SES providers, there is no real incentive to cooperate with this study, since they receive their funds from the districts and not directly from the U.S. Department of Education, the study sponsor. We gauged how much of an incentive might be needed for the provider survey from experience with Upward Bound grantees. Unlike SES providers, Upward Bound grantees receive funds directly from the Department and are required to cooperate with an evaluation as a condition of receiving a grant, as indicated under EDGAR requirements. Despite these requirements, an incentive of $25, approved by OMB, was required in order to obtain adequate response rates in the national evaluation of Upward Bound in 2004 (the $10 incentive initially approved by OMB was found to be insufficient to yield acceptable response rates).
The goal of the incentive in the Provider Survey is to boost the response rate to the mail survey: the more providers who respond at that stage, the fewer who will require telephone followup, a significantly more expensive mode of data collection. A greater up-front investment in incentives is therefore likely to produce cost savings in later stages.
10. Confidentiality of the Data

All data collection activities will be conducted in full compliance with ED regulations. The contractor will follow the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, which requires “[a]ll collection, maintenance, use, and wide dissemination of data by the Institute” to “conform with the requirements of Section 552a of Title 5, United States Code, the confidentiality standards of Subsection (c) of this section, and Sections 444 and 445 of the General Education Provisions Act (20 U.S.C. 1232g, 1232h).” These citations refer to the Privacy Act, the Family Educational Rights and Privacy Act, and the Protection of Pupil Rights Amendment.
In addition, the contractor will ensure that no individually identifiable information about students, their academic achievements, or their families, and no information about individual schools, is disclosed, in accordance with Section 552a of Title 5, United States Code, the confidentiality standards of Subsection (c) of that section, and Sections 444 and 445 of the General Education Provisions Act.
The contractor will protect the privacy of all information collected for the study and will use it for research purposes only. For the provider survey, the completed questionnaires and check request forms will be kept in a locked storage cabinet accessible only to project staff. The data collected from the districts, including student names, demographic characteristics, and test scores, will all be in electronic formats. The contractor employs several levels of security for electronic data, including access limited to the personnel directly working on the project. In addition, the contractor has set up a secure, password-protected web site for the transfer of these electronic files from the districts. Original data files will be destroyed upon completion of the project. The contractor will protect all electronic data from unintended and unauthorized access, and all software systems proposed for the project guarantee the security of the data they hold as well as the transmission of data between them.
No information that identifies any study participant will be released. Information from participating institutions and respondents will be presented at aggregate levels in reports. Information on respondents will be linked to their program, district, and school but not to any individually identifiable information. Once linking of the various files is complete, all identifying information will be removed from the analysis files and replaced with unique identifiers. No individually identifiable information will be maintained by the study team. All staff who have access to respondents, schools, or data will (1) sign a notarized Confidentiality Pledge (Appendix D) and (2) obtain security clearance through NCEE’s security clearance officer.
In addition, the following verbatim language will appear on all letters, brochures, and other study materials:
Your responses to this data collection are protected from disclosure per the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183. Responses to this data collection will be used only for statistical purposes. Mathematica Policy Research (MPR) will present the information collected as part of this survey in an aggregate form, and will not associate responses to any of the individual SES providers who participate. MPR will not provide information that identifies you and your organization to anyone outside the study team, except as required by law. Any willful disclosure of such information for nonstatistical purposes, without the informed consent of the respondent, is a class E felony.
The System of Records Notice was published in the Federal Register on January 16, 2009.
11. Additional Justification for Sensitive Questions

There are no questions of a sensitive nature included in the data collection instruments or procedures. Participation in the study is voluntary, and all data collection activities will be conducted in full compliance with ED regulations.
12. Estimates of Hour Burden

Table A.3 provides an estimate of the time burden for the additional data collection activities for which approval is being sought. The annualized burden for which we are seeking approval in this package is 157 hours, based on a total of 471 burden hours over the three years during which data collection takes place. The annualized hours are based on a 3-year period for the two cohorts’ data collection to account for additional time required due to potential delays in the availability of school records data (for example, student test scores for cohort 2 may not be available until fall/winter 2010). The burden associated with collecting baseline data from cohort 2 districts is covered by the prior submission, in which we had anticipated reaching the 50,000-student goal with the original eight districts, and thus is not included in the burden estimates for this submission. The burden estimates do, however, include the time burden for recruiting cohort 2.
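The annualization itself is simple arithmetic over the three-year clearance period:

\[ \text{annualized burden} = \frac{471 \text{ total burden hours}}{3 \text{ years}} = 157 \text{ hours per year}. \]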
All SES providers (384 in cohort 1 and 66 in cohort 2) will complete an SES Provider Survey. All school districts (8 in cohort 1 and 4 in cohort 2) will provide student demographic and academic achievement data (school records) for all SES applicants in an electronic file submitted to the contractor. In addition, all districts will supply SES participation data to the contractor.
Table A.3

AVERAGE ANNUAL BURDEN TO RESPONDENTS IN HOURS

Data Collection Activity | Average Annual Number of Respondents | Total Number of Responses | Average Burden Hours per Response | Total Average Annual Burden Hours | Average Hourly Rate | Estimated Monetary Cost Burden to Respondents

School Year 2008-09, Cohort 1
SES Provider Survey | 384 | 384 | 0.5 | 192 | $40 | $7,680
Subtotal for Providers | 384 | 384 | 0.5 | 192 | | $7,680
SES Student Participation | 8 | 8 | 8 | 64 | $30 | $1,920
School Records | 8 | 8 | 8 | 64 | $30 | $1,920
Subtotal for Districts | 8 | 8 | 16 | 128 | | $3,840
Estimated 2008-09 Total | 392 | 392 | | 320 | | $11,520

School Year 2009-10, Cohort 2
SES Provider Survey | 66 | 66 | 0.5 | 33 | $40 | $1,320
Subtotal for Providers | 66 | 66 | 0.5 | 33 | | $1,320
Calls to determine eligibility* | 22 | 22 | 1 | 22 | $30 | $660
Recruiting of 4 eligible districts | 4 | 4 | 8 | 32 | $30 | $960
SES Student Participation | 4 | 4 | 8 | 32 | $30 | $960
School Records | 4 | 4 | 8 | 32 | $30 | $960
Subtotal for Districts | 26 | 26 | 16 | 118 | | $3,540
Estimated 2009-10 Total | 92 | 92 | | 151 | | $4,860

Estimated Total | | | | 471 | | $16,380
* MPR will contact all potentially eligible districts not already participating and not in Florida.
13. Estimate of Total Annual Cost Burden to Respondents or Recordkeepers

There are no additional respondent costs associated with this data collection other than the hour burden accounted for in Item 12.
14. Estimates of Annualized Cost to the Federal Government

The estimated cost to the federal government to carry out the Impact Evaluation of Title I Supplemental Educational Services is $2,147,060. The study will be carried out over roughly four years (from fall 2007 to spring 2011). The annual cost of the data collection covered in this Request for OMB Approval of Data Collection Instruments, including analysis of the data, is $536,765.
15. Reasons for Program Changes or Adjustments

Previously approved burden is recorded as 2,000 annual hours. This request is for a newly revised collection of an additional 157 annual hours. Since the burden hours from the previously approved data collection activities are still in use, we are carrying over the 2,000 annual hours and adding 157 annual hours, for a total of 2,157 annual hours. As a result, the program change reflects an overall increase of 157 hours.
16. Tabulation, Publication Plans, and Time Schedules

a. Tabulation Plans

As detailed in the original OMB submission, using an RD design, valid estimates of the effect of SES can be determined by comparing the average reading and math scores of students who were accepted into SES to the average scores of students who were not, after regression adjusting for the measure of prior achievement used to determine acceptance. Figure A.1 illustrates the RD design graphically, using a hypothetical example. In this example, students with an assignment score of 50 or less receive SES (the treatment group), and students with a score over 50 do not (the control group). The figure plots student math test scores against assignment scores and displays the fitted regression lines for the treatment and control groups. The estimated impact on math test scores is the vertical distance between the two regression lines at the cutoff value of 50. In this example, all data are used to calculate the impact, including data from students who are far from the RD cutoff. Making use of all available data increases the statistical precision of the impact estimate because it improves our ability to regression adjust for the measure of prior achievement.4 An important consideration in calculating impacts using an RD design is the functional form used to regression adjust for prior achievement. In Figure A.1, the functional form is linear. In practice, we will also calculate impacts using non-parametric regression techniques that allow for a more flexible functional form.
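In equation form, the linear specification depicted in Figure A.1 can be written as follows; the notation is ours, offered as an illustrative reading of the design rather than the study's final specification:

\[ y_i = \alpha + \beta T_i + \gamma_1 (s_i - c) + \gamma_2 T_i (s_i - c) + \varepsilon_i, \qquad T_i = \mathbb{1}\{ s_i \le c \}, \]

where \( s_i \) is student \( i \)'s assignment score, \( c \) is the cutoff (50 in the example), and \( \hat{\beta} \) estimates the impact of the SES offer at the cutoff. The non-parametric alternative replaces the linear terms with a flexible function \( f(s_i - c) \) estimated separately on each side of the cutoff.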
Because the assignment score will be defined differently across districts (we anticipate that in most cases it will be based on a prior year’s test score) and because each district will use a different cutoff for allocating services, we will estimate separate impacts for each district in the sample and then compute a weighted average of these estimates to obtain an overall estimate of the impact of SES among the districts in our sample.5 We will weight district-specific estimates according to the number of eligible students in each district, which will provide an estimate of the impact of SES on the average student under study.6
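As a sketch of this aggregation step (our notation, consistent with the weighting described above):

\[ \hat{\beta}_{\text{overall}} = \sum_{d=1}^{D} w_d\, \hat{\beta}_d, \qquad w_d = \frac{n_d}{\sum_{d'=1}^{D} n_{d'}}, \]

where \( \hat{\beta}_d \) is the impact estimate for district \( d \) and \( n_d \) is its number of eligible students; the sensitivity analysis in footnote 6 instead sets \( w_d = 1/D \), weighting each district equally.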
We will also use the RD design to explore the relationship between SES provider characteristics and effectiveness. In the SES application materials, we will ask parents to name their preferred SES provider. Because we will identify the preferred providers prior to determining the RD cutoff, we will be able to calculate provider-specific impacts for districts served by the largest providers (for example, by estimating a separate impact regression for each provider). However, very few providers will have a large enough sample size for sufficient statistical power to produce reliable provider-specific estimates. In most cases, provider-specific estimates will be aggregated across providers based on provider characteristics and practices. Dimensions along which interventions might vary include substantive focus (for example, math or reading), intensity (for example, frequency of student attendance), and method of delivery (for example, small group activities, one-on-one tutoring, or the use of computer technology).
FIGURE A.1

HYPOTHETICAL EXAMPLE OF THE RD METHOD

[Figure: plot of student math test scores against assignment scores, with fitted regression lines for the treatment group (scores at or below the cutoff of 50) and the control group (scores above 50); the impact estimate is the vertical gap between the lines at the cutoff.]
One additional consideration is that some students offered SES might not receive the services, and some students whose assignment score exceeds the cutoff might nonetheless manage to receive SES.7 If this is the case, the impact estimates will represent the impact of offering students SES rather than the effect of receiving SES. We propose to collect data on whether students received SES from the provider survey and from district administrative records. If many students who were offered SES choose not to receive the services, or if students who should not receive SES according to their assignment score do in fact receive them, we can compute an additional estimate reflecting the impact on students of receiving SES using what is known as a “fuzzy” RD design (Trochim 1984; Hahn et al. 2001). This approach is similar to calculating the impact of the treatment on the treated in a randomized controlled trial using a Bloom (1984) adjustment; it essentially uses the discontinuity in SES receipt at the assignment score cutoff as an instrumental variable for SES receipt, holding constant a function of the assignment score.
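The “fuzzy” RD estimate can be expressed as a Wald-type ratio, a standard formulation following Hahn et al. (2001); the notation below is illustrative rather than quoted from the study plan:

\[ \hat{\beta}_{\text{fuzzy}} = \frac{\displaystyle \lim_{s \uparrow c} \mathrm{E}[\,Y \mid s\,] - \lim_{s \downarrow c} \mathrm{E}[\,Y \mid s\,]}{\displaystyle \lim_{s \uparrow c} \mathrm{E}[\,D \mid s\,] - \lim_{s \downarrow c} \mathrm{E}[\,D \mid s\,]}, \]

where \( Y \) is the achievement outcome and \( D \) indicates actual receipt of SES. The numerator is the discontinuity in outcomes at the cutoff (the impact of the offer) and the denominator is the discontinuity in the probability of receiving SES, paralleling the Bloom (1984) no-show adjustment.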
b. Publication Plans
The evaluation report will be completed after all data from the 2008–2009 school year (cohort 1) and the 2009-2010 school year (cohort 2) have been collected and analyzed. A draft report will be completed by January 2, 2011, and the final report will be completed by the end of September 2011.
c. Time Schedule
The full timeline for the evaluation is shown in Table A.4. The timeline calls for design and district recruiting (cohort 1) in summer and fall 2008, data collection for cohort 1 between fall 2008 and summer 2009, recruiting of cohort 2 districts between winter 2008 and fall 2009, data collection for cohort 2 between fall 2009 and summer 2010, and analysis and report writing between summer 2010 and spring 2011.
17. Approval Not to Display the Expiration Date for OMB Approval

Approval not to display the expiration date for OMB approval is not requested.
18. Exception to the Certification Statement

No exceptions to the certification statement are requested or required.
TABLE A.4
STUDY ACTIVITIES TIMELINE
Time Period | Activity
Spring 2008 | Contractor contacts districts to assess feasibility of study.
Summer 2008 | Contractor recruits districts to be in study (cohort 1).
Fall 2008 | Districts enroll students in SES (September-October 2008). Contractor provides technical assistance to cohort 1 districts during enrollment process (September-October 2008). Cohort 1 districts provide contractor with application data.
Winter 2008-Summer 2009 | Contractor identifies and recruits additional oversubscribed districts (cohort 2).
Spring 2009 | Contractor conducts SES provider survey (cohort 1).
Summer 2009 | Districts provide contractor with student-level data files on spring 2009 test results, demographics, and level of participation in SES (cohort 1).
Fall 2009 | Districts enroll students in SES, cohort 2 (September-October 2009). Contractor provides technical assistance to cohort 2 districts during enrollment process (September-October 2009). Cohort 2 districts provide contractor with application data.
Spring 2010 | Contractor conducts SES provider survey (cohort 2).
Summer 2010 | Districts provide contractor with student-level data files on spring 2010 test results, demographics, and level of participation in SES (cohort 2).
Summer 2010-Spring 2011 | Contractor conducts analyses and writes report.
REFERENCES

Bloom, H.S. “Accounting for No-Shows in Experimental Evaluation Designs.” Evaluation Review, vol. 8, 1984, pp. 225-246.
Center on Education Policy. From the Capital to the Classroom: Year 4 of the No Child Left Behind Act. Washington, DC: Center on Education Policy, March 2006.
Hahn, Jinyong, Petra Todd, and Wilbert Van der Klaauw. “Identification and Estimation of Treatment Effects with a Regression-Discontinuity Design.” Econometrica, vol. 69, no. 1, January 2001.
Shadish, W.R., Thomas D. Cook, and Donald T. Campbell. Experimental and Quasi-Experimental Designs for Generalized Causal Inference. Boston: Houghton Mifflin, 2002.
Stullich, Stephanie, Elizabeth Eisner, Joseph McCrary, and Collette Roney. National Assessment of Title I Final Report. Volume 1: Implementation of Title I. Washington, DC: U.S. Department of Education, February 2007.
Trochim, W. Research Design for Program Evaluation: the Regression-Discontinuity Approach. Beverly Hills: Sage Publications, 1984.
Zimmer, Ron, Brian Gill, Paula Razquin, Kevin Booker, and J.R. Lockwood III. State and Local Implementation of the No Child Left Behind Act. Volume I: Title I School Choice, Supplemental Educational Services, and Student Achievement. Washington, DC: U.S. Department of Education, 2007.
1 The eight districts have a total of approximately 55,000 eligible SES applicants in 2008-09, with approximately 80 percent of the sample from Florida.
2 This analysis would examine whether there is a discontinuity in the relationship between the “assignment score” variable (prior achievement) and the outcome (subsequent achievement) at the prior achievement level that is used as the cutoff for assignment to services.
3 This study requires more students than some other education evaluations for two reasons. First, a key research question is how the effects of SES vary by provider type, which requires a large overall sample in order to support smaller subgroup analyses. Second, the RD design is not as statistically powerful as an experimental design, requiring more students in order to detect effects.
4 We plan to calculate impacts using all available data, but we will also calculate impacts using only students who are close to the RD cutoff as a sensitivity analysis.
5 In some districts, assignment score and/or cutoff might also differ by grade, in which case we will estimate district/grade-specific impacts.
6 As a sensitivity analysis, we will also calculate the impact on the average district by giving an equal weight to each district-level impact.
7 This second concern is known as comparison group “crossover,” which might occur if the district erroneously provides the student SES or does not have a systematic approach for allocating available services from a waiting list when students initially offered SES decline them.