U.S. Department of Education
Impact Evaluation of the DC Opportunity Scholarship Program
Office of Management and Budget
Statement for Paperwork Reduction Act Submission
Part A: Justification
Contract ED-04-CO-0126
October 3, 2007
TABLE OF CONTENTS
Part A. Justification
A.1 Explanation of Circumstances That Make Collection of Data Necessary
A.2 How the Information Will Be Collected, by Whom, and for What Purpose
A.3 Use of Improved Information Technology to Reduce Burden
A.4 Efforts to Identify and Avoid Duplication
A.5 Efforts to Minimize Burden on Small Businesses or Other Entities
A.6 Consequences of Less-Frequent Data Collection
A.7 Special Circumstances Regarding Collection of Information in a Manner Inconsistent with Section 1320.5(d)(2) of the Code of Federal Regulations
A.8 Federal Register Comments and Persons Consulted Outside the Agency
A.9 Payments to Respondents
A.10 Assurances of Confidentiality
A.11 Questions of a Sensitive Nature
A.12 Estimates of Respondent Burden
A.13 Estimates of the Cost Burden to Respondents
A.14 Estimate of Annualized Government Costs
A.15 Change in Hour Burden
A.16 Time Schedule, Publication, and Analysis Plan
A.17 Display of Expiration Date for OMB Approval
A.18 Exceptions to Certification Statement
This package represents a request for a short extension of an evaluation design and set of data collection instruments previously approved by OMB (OMB No. 1850-0800, approval notice dated 4/15/05). Based on earlier rounds of data collection, we expect that completing the final round with reasonable response rates will require extending assessment and survey administration several months beyond the April 2008 expiration date of the original package. Because the design for and burden of the final round of data collection were included in the original package, this current package is identical in content to the package approved by OMB with one exception: the incentive section (A.9) has been updated to reflect increases in the incentive amounts and approach that were approved by OMB on 1/10/06 and 4/24/07. (Minor changes in wording have been made to the section headings to reflect the current OMB headings.)
Introduction
In early 2004, the U.S. Congress passed the DC School Choice Incentive Act, Title III of the District of Columbia Appropriations Act of 2004, Division C of HR 2673 (PL 108-199). The legislation established a new, five-year school choice program for low-income residents of Washington, DC, and provided for a program operator to design and oversee parent outreach efforts, school recruitment, the student application process, and the distribution of scholarships.1 The program provides scholarships of up to $7,500 per student per year to enable low-income elementary and secondary students to attend private schools in lieu of the public schools already available to them. It is anticipated that, given annual appropriations of $13 million, up to 2,000 students could be supported by scholarships each year, since most private schools in DC charge less than the ceiling amount for tuition and fees. The law requires that students be assigned scholarships by lottery if there are more eligible applicants than can be accommodated by the appropriation or the availability of seats in participating private schools.
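The capacity figure can be checked with simple arithmetic. The sketch below uses only the statutory numbers cited above; the $6,500 average award is a hypothetical illustration, not a program statistic:

```python
# Rough capacity arithmetic for the scholarship program (illustrative only).
appropriation = 13_000_000   # annual appropriation in dollars
ceiling = 7_500              # statutory maximum scholarship per student per year

print(appropriation // ceiling)   # 1733 scholarships if every award hit the ceiling

# Because most DC private schools charge less than the ceiling for tuition and
# fees, the same appropriation stretches further. At a hypothetical average
# award of $6,500, for example:
print(appropriation // 6_500)     # 2000 scholarships
```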
The law also requires an evaluation of the program “using the strongest possible research design for determining the effectiveness” of the program (Section 309, see Attachment 1). The U.S. Department of Education (ED) awarded a contract to Westat and its research partners, the University of Arkansas (formerly Georgetown University) and Chesapeake Research Associates (CRA), to (1) provide technical assistance to the program operator, particularly with respect to the design and conduct of the random assignment of participants during the baseline year of 2004, and (2) perform a 5-year impact evaluation of the program.
This document represents the Supporting Statement for the data collection and analysis to be conducted under the Impact Evaluation of the DC Opportunity Scholarship Program. In particular, we are requesting approval for: (1) parent, student, and principal surveys, (2) ongoing testing of student applicants, and (3) records abstraction from DC Public Schools (DCPS) administrative files.
Study Design
The foundation of the DC Opportunity Scholarship Program evaluation will be a randomized controlled trial (RCT) comparing outcomes of eligible applicants (students and their parents) assigned by lottery to receive or not receive a scholarship. This design is consistent with the requirement for a rigorous evaluation as well as the need to fairly allocate the scholarships if the program is oversubscribed. At the same time, the law specified other kinds of comparisons and analyses, resulting in a planned evaluation study that includes both quantitative and qualitative components, and both performance (progress) reporting and measures of impact.
Research Questions
The study is designed to address the following key questions:
What is the impact of the program on student academic achievement? The law places high priority on examining whether the program—the availability and offer of scholarships—improves the academic achievement of eligible students. This question can be addressed most rigorously by comparing the academic achievement of student applicants randomly assigned to receive and not receive scholarships. However, the law also asks for a comparison of the academic achievement of students who participate in the program with their grade-level counterparts in DCPS.
What is the impact of attending private versus public schools? Because it is likely that some students offered scholarships will choose not to use them, the evaluation will also use accepted econometric methods to examine the effects for students who take the scholarship offer and enroll in a private school.
What is the impact of the program on other student measures? The law calls for examining other indicators of student school success, including persistence, retention, graduation, and, if possible, college enrollment. In addition, Congress required the evaluation to assess the school safety of students who receive scholarships relative to those who do not.
What effect does the program have on student and parent satisfaction with the educational options available in DC and with children’s actual school experiences? A key desired outcome of scholarship programs is an increase in both the school choices possible and parents’ and students’ satisfaction with the choices they have made. These issues will be examined by comparing the satisfaction and reasons for applying to the DC Opportunity Scholarship Program among applicants assigned by lottery to receive scholarships and those assigned to not receive scholarships.
To what extent is the program having an impact on schools in Washington, DC? Scholarship programs have been hypothesized to affect not only the students who receive the scholarships but also the broader population of public schools and students. Theory suggests that these broader outcomes could occur when public school systems respond to a fear of losing students, and therefore revenues, to private schools. These competitive effects might include changing curricula, adopting new themes or missions, and other modifications to existing policies and practices to make the public schools more attractive. Choice programs might also affect the larger population of private schools, beyond those in which the programs’ participants are currently enrolled; if choice programs are successful, additional private schools may choose to participate or new schools may be established to meet enrollment demand. However, exploring these potential systemic effects of the DC Opportunity Scholarship Program will be challenging, given the existing design of the program and the limited resources available to address this question.
Data Collection
Evaluation data will be collected for two cohorts of program applicants, using a variety of data collection methodologies. To achieve the sample sizes necessary for statistical power, the evaluation will track the progress and experiences of applicants from spring 2004 and spring 2005. The evaluation team is collecting pre-program (“baseline”) measures of family background and student achievement and is planning to collect annual “in program” measures in order to conduct a rigorous evaluation of program impacts. These measures will be collected from the data sources described in Table 1.
Table 1. Data Measures for the Evaluation of the DC Opportunity Scholarship Program

Student assessments: The study will administer the SAT-9 reading and mathematics assessments to treatment and control group students in each of the four years of data collection, most likely when families attend the annual program renewal events.

School records: Administrative records will be collected from DCPS and charter school authorizers to obtain data on attendance, persistence, disciplinary actions, and grades for members of the treatment and control groups at baseline. In addition, the study will seek to obtain these data for all public school students, including those in charter schools, so that the program applicants can be compared to nonapplicant DCPS students in the relevant grade levels, as required by the DC Choice Act.

Parent surveys: The study will conduct surveys of parents (of students in the treatment and control groups) in all four years of data collection for the evaluation. These surveys will examine such issues as reasons for applying, satisfaction with school choices, and perceptions of school safety, educational climate, and offerings. It is likely that these surveys will be administered during the annual program renewal events, with telephone follow-up as necessary.

Student surveys: Each year, the study will conduct surveys of treatment and control group students in grades four and above to collect information about students’ satisfaction with their schools, perceptions of safety, and other characteristics of their school program and environment. The surveys will be administered each year of the program and are likely to occur at the same time (and place) as the student assessments.

Principal surveys: The study design calls for a spring survey each year of (1) the principals of all 109 private schools and (2) the principals of all 160 regular public and charter schools in DCPS. The surveys will be administered each year of the program and will collect information about school conditions and the school environment that might affect student achievement, as well as awareness of and response to the DC Opportunity Scholarship Program.

DC Opportunity Scholarship Program Operator Records: As the administrator of the DC Opportunity Scholarship Program, the operator is responsible for confirming ongoing eligibility for the program and continuing participation for scholarship recipients. Although surveys of parents and students will also be conducted, Westat will collect annual data from the program operator about individual student program participation.
Information on the DC Opportunity Scholarship Program and the outcomes of program applicants will be collected primarily by Westat, with data analyzed by Westat and its research partners, the University of Arkansas (formerly Georgetown University) and Chesapeake Research Associates. This work will be conducted under Contract Number ED-04-CO-0126. The data to be collected will be obtained from student assessments, school records, and surveys of parents, students, and principals, and will be used to address the research questions and topics identified in the authorizing legislation. The legislation also specifies that the evaluation report annually on the performance of the program and the students; thus, annual data collection is necessary and cannot be reduced to a lesser frequency. The student, parent, and principal surveys will all include the universe of respondents. In no case do we anticipate any unusual problems requiring specialized sampling procedures. Table 2 shows how each of the sources of data relates to the study questions, followed by detailed descriptions of the data sources.
Table 2. Relationship Between the Study Questions and Proposed Sources of Data

The table maps each study question (rows) against the proposed data sources (columns): student assessments, school records, parent surveys, student surveys, principal surveys, and DC Opportunity Scholarship Program Operator Records.

Study questions:
What is the impact of the program on student academic achievement?
What is the impact of attending private versus public schools?
What is the impact of the program on other student measures?
What effect does the program have on student and parent satisfaction with the educational options available in DC and with children’s actual school experiences?
To what extent is the program having an impact on schools in Washington, DC?
Student Assessments
Based on the language of the statute, the key outcome measure for judging the effectiveness of the program is student achievement. Moreover, the law requires the independent evaluator to measure student achievement each year. For the purposes of the evaluation, we have interpreted “student achievement” as students’ skills in reading and mathematics (not science or history).
There are several key considerations that must be taken into account in order to ensure that the measurement of student achievement is a valid indicator of program impacts. Most importantly, to the extent possible, the same administration and testing environments must be maintained for both scholarship recipients (treatment group) and those who applied for but did not receive scholarships (control group). This is easy in the case of the “baseline” measurement of achievement. DCPS annually administers the SAT-9 in April in all of its schools, following a consistent test administration guide for each grade level; we plan to abstract these data for all public school applicants to the DC Opportunity Scholarship Program.
However, going beyond the baseline year poses some challenges. Only the control group and those members of the treatment group who have declined to use their scholarships or who have left the program will be attending DCPS schools and participating in DCPS testing. Most treatment group members will be dispersed throughout a set of participating private schools. Private schools are unlikely to allow us to pull members of the treatment group out of their school day in order to administer the DCPS test to them. Moreover, comparing test results in those circumstances to results for students in the public schools who took the DCPS test along with all students at their schools would introduce a substantial bias. For the DCPS students, the DCPS test is likely to be more consequential, with teachers planning and preparing for it for at least several weeks. In contrast, students in the private schools would have little warning or preparation, placing them at a serious disadvantage in the comparison of achievement with public school students (the control group). This option, although requiring less burden on the control group, would lay the evaluation open to serious criticism in estimating and interpreting the key program impacts.
Instead, at the current time, we plan to administer the SAT-9 math and reading assessments when the treatment and control group families come in to renew their eligibility for the Program, so that the test administration will be similar across all types of evaluation members. The scholarship users will clearly be the most motivated to attend and we will be conscious of the need to take steps to encourage the scholarship non-users (decliners) and control group members to fulfill the requirements to participate in the evaluation’s data collection. These assessments will be administered in early April of each year, for the four years of the evaluation’s data collection.
School Records
Administrative records will be collected from DCPS and charter school authorizers to obtain data on attendance, persistence, disciplinary actions, and grades for members of the treatment and control groups at baseline. In addition, Westat will seek to obtain these data for all public school students, including those in charter schools, so that the program applicants can be compared to other students in the relevant grade levels, as required by the DC Choice Act.
Parent Surveys
The legislation requires the evaluation to examine the impact of the program on parents. The study will conduct surveys of parents (of students in the treatment and control groups) in all four years of data collection for the evaluation. These surveys will examine such issues as reasons for applying to and remaining with the program, satisfaction with school choices, and perceptions of school safety, educational climate, and offerings. These surveys will be administered to the parents when they come in to renew their child’s program eligibility, with telephone follow up as necessary.
Student Surveys
Each year, the study will conduct surveys of treatment and control group students who are in grades four and above, to collect information about students’ satisfaction with their schools, perceptions of safety, reports of behavior both within and outside of school, and other characteristics of their school program and environment. The surveys will be administered each year of the program and are likely to occur at the same time (and place) as the student assessments – the family events where the parents come in to renew program eligibility.
Principal Surveys
The study design calls for two separate principal surveys: (1) principals of all 109 private schools in DC, administered toward the end of each of the four years and (2) principals of all of the 160 regular public and charter schools in DCPS, administered toward the end of each of the four years.
The private school principal survey will focus on knowledge of the DC Opportunity Scholarship Program and ask specific questions about perceptions of the program, why the school does (or does not) participate, and how the program is integrated within their school. The public school principal survey will collect information about school characteristics, climate, how much they know about the DC Opportunity Scholarship Program, and whether they are changing anything in response to the program.
The data collection plan has been designed to maximize efficiency and accuracy, and to minimize respondent burden. A key consideration in the decision to abstract baseline student achievement data from DCPS records (rather than administer our own evaluation assessment) was to minimize evaluation costs and reduce respondent burden. We will ask parents to complete a paper survey form at the time they come in to renew their eligibility, and we will follow up with telephone interviewing to offer parents the opportunity to provide the information in the format most convenient to them.
As an examination of a new program, serving students at least half of whom will be outside public school district records, the evaluation must collect much of its own data. We are using existing data to the extent possible—for example, relying on the DCPS assessment for the baseline measures of student achievement. However, other information collected as part of the evaluation — the ongoing student assessments and the surveys of parents, students, and principals — is not available elsewhere.
There is no anticipated impact on small businesses or other small entities. The primary respondents for this study are students and parents, although some data will be collected from principals in public and private schools. Burden is reduced for all respondents by requesting only the minimum information required to meet the study objectives. The burden on schools has also been minimized through the careful specification of information needs, restricting questions to generally available information where possible, and designing the data collection strategy—particularly the survey methods—to minimize burden on respondents. For example, we will obtain some descriptive information on public and private schools from the Common Core of Data (CCD) available from the National Center for Education Statistics. We will also administer the surveys to students and parents when they are attending events to re-establish their eligibility for the program.
This data collection is necessary in order to evaluate the DC Opportunity Scholarship Program and comply with the evaluation mandate in the DC School Choice Incentive Act to report annually to Congress. Virtually all of the data collection activities—respondents, topics, and the need for annual collection—stem directly from the legislative requirements.
There are no special circumstances associated with this data collection.
Consultations on the research design, sample design, data sources and needs, and study reports have occurred during the study’s design phase and will continue to take place throughout the study. The purpose of such consultations is to ensure the technical soundness of the study and the relevance of its findings, and to verify the importance, relevance, and accessibility of the information sought in the study.
Westat and its subcontractors, University of Arkansas (formerly Georgetown University) and CRA, have provided substantial input to ED for the study. Senior technical staff from these organizations who are conducting the study are listed below:
Westat Babette Gutmann, Project Director (301) 738-3626
Alex Ratnofsky, Vice President (301) 251-8249
Juanita Lucas-McLean, Senior Analyst (301) 294-2866
University of Arkansas Patrick Wolf, Principal Investigator (479) 575-2084
Nada Eissa, Senior Analyst (Georgetown) (202) 687-0626
CRA Michael Puma, Senior Analyst (410) 897-4968
The Department has also consulted with an Expert Advisory Panel, a group that includes both eminent school choice experts and evaluation methodologists. This advisory panel includes:
Professor Julian Betts, University of California, San Diego
Professor Thomas Cook, Northwestern University
Professor Jeff Henig, Columbia University
Professor William Howell, University of Chicago
Professor Guido Imbens, Harvard University
Dr. Larry Orr, Abt Associates (retired)
Professor Rebecca Maynard, University of Pennsylvania.
The notice for this data collection was published in the Federal Register on (date to be added).
We realize that participation in the evaluation of the DC Opportunity Scholarship Program will place demands on each of the respondents. Specifically, it is critical to the study design that parents, students, and principals participate in the assessments and complete the survey forms each year, as we will be following each cohort and their parents for three years. The last year of data collection, for which an extension is currently being sought, is particularly important to policymakers, because it will allow the evaluation to estimate the impact of having a scholarship and attending a private school for a cumulative three years, a point at which the impacts of the scholarship program should be stable.
However, unlike other studies of educational programs, we cannot depend on testing students while they are at school, because neither the private schools nor the public schools will allow those evaluation activities to take place on campus.2 Instead, we must encourage parents and students (both treatment group and control group) to attend testing events on Saturdays and evenings throughout the spring of each year in schools and community locations around Washington, DC. While appeals can be made to the treatment group through the program, it is particularly difficult to obtain responses from the control group, and other experimental studies of voucher programs have faced substantial differential response rates for treatment versus control groups, raising the possibility of bias in the analysis. Because a study of a voucher program is controversial, there are similar issues in collecting data from principals in treatment and control schools.
To mitigate that problem, we originally proposed—and OMB approved—a set of payments that we expected would generate high rates of response among the evaluation sample. Based on the first year of data collection completed, we requested and OMB approved a set of increased incentives for parents and principals (NOC 1/10/06). Subsequently, we requested and OMB approved (NOA 4/24/07) the option to split the parent payment between the parent and older students in order to incentivize the older students (for whom we had particularly low rates of response), since when they attended the testing events they generally did so on their own and had to forgo social and athletic activities to participate in the data collection.
The current approved incentive payments are shown below (the incentives relevant to the last year of data collection, for which this extension is being sought, are the Year 3 follow-up payment for parents and the principal payment):

Instrument | Payment
Parents, Baseline/Year 1 Follow Up Data Collection | $50 ($25 in original OMB submission)
Parents, Year 2 Follow Up Data Collection | $100
Parents, Year 3 Follow Up Data Collection | $150
Principal | $20 ($10 in original OMB submission)
All data collection activities will be conducted in full compliance with Department of Education regulations concerning the confidentiality of data obtained on private persons and the protection of the rights and welfare of human research subjects, as well as with other applicable federal regulations. Research participants will be informed about the nature of the information that will be requested and the confidentiality protections, and they will be assured that information will be reported only in aggregate, statistical form in reports and public use data files. Respondents will also be informed that their names will not be associated with their answers and that no one will have access to this information except as may be required by law, regulation, or subpoena, or unless permission is given by both the parent and participating child.
In particular, it is very important that parents or legal guardians of sample members understand that information is being collected regarding their children, and that this information is being held confidential. When parents apply to the DC Opportunity Scholarship Program on behalf of their child(ren), they receive an oral presentation on evaluation activities and requirements and a written statement of the same; they are asked to sign the consent form and only those who sign are part of the program and the evaluation (see the consent form, Attachment 2).
All parent and principal surveys will also contain a statement regarding the confidentiality of responses, as follows:
“Responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific district or individual. We will not provide information that identifies you or your district to anyone outside the study team, except as required by law.”
Specific Procedures to Maintain Confidentiality
Westat, in the conduct of the Evaluation of the DC Opportunity Scholarship Program, will follow procedures for ensuring and maintaining participant privacy, consistent with the Education Sciences Reform Act of 2002. Title I, Part E, Section 183 of this Act requires “all collection, maintenance, use, and wide dissemination of data by the Institute” to “conform with the requirements of section 552 of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provisions Act (20 U.S.C. 1232g, 1232h).” These citations refer to the Privacy Act, the Family Educational Rights and Privacy Act, and the Protection of Pupil Rights Amendment.
The Evaluation of the DC Opportunity Scholarship Program will be conducted in accordance with all relevant regulations and requirements, including the Privacy Act of 1974 (5 U.S.C. 552a), the Family Educational Rights and Privacy Act of 1974 (20 U.S.C. 1232g), the Freedom of Information Act (5 U.S.C. 552), the Protection of Pupil Rights Amendment (20 U.S.C. 1232h), the confidentiality provisions of the Education Sciences Reform Act (20 U.S.C. 9573), related regulations (41 CFR Part 1-1 and 45 CFR Part 5b), and, as appropriate, other federal or ED regulations on the protection of human subjects.
In addition, Section 309 of the DC Choice Act includes a particular specification that no personally identifiable information can be disclosed as part of the evaluation. As a result of this provision, in publishing the Privacy Act Notice for the System of Records for this evaluation, ED has eliminated all possible routine disclosures to which any data collected or obtained for the evaluation might be subjected. Under the notice, personal information (names, addresses, student ID numbers) may only be disclosed to Westat and in the unlikely case of a terrorist threat.
Westat, as ED’s “authorized representative” for the collection and maintenance of data for the evaluation, takes the confidentiality requirements very seriously. Employees of Westat are required to sign Westat’s “employee or contractor’s assurance of confidentiality of data” (see Attachment 3). This document outlines the general requirements and responsibilities of employees and contractors with regard to maintaining the confidentiality and privacy of data. In addition, each project at Westat is required, upon inception, to develop a customized confidentiality plan. The Westat project director develops the confidentiality plan for the evaluation, taking into account assurances made to respondents, what project information is confidential, who is authorized to have access to it, and how access can be controlled. This plan will be shared with all project staff, who will then be expected to implement it. Some of the components of the plan include:
Keeping hard-copy confidential information under lock and key.
Storing confidential electronic information in a secure location.
Communicating about cases via email without violating confidentiality and privacy.
Clearly labeling documents containing confidential information “confidential.”
Limiting the number of copies of confidential documents.
Arranging for security when sending confidential jobs to a network printer.
Ensuring that only authorized personnel see faxes containing confidential information.
Adhering to the telephone research center’s (TRC) protocols for transporting confidential data to and from the TRC.
Adhering to data entry’s protocols for transporting confidential data to and from data entry.
Using mail and delivery services appropriate for the sensitivity level of the confidential data.
Not bringing confidential data home.
Disposing of confidential information properly when it is no longer needed.
Institutional Review Board (IRB)
Westat has sought clearance from its Institutional Review Board (IRB) for the DC Opportunity Scholarship Program application and consent form, and for all other protocols associated with students’ participation in the study. In the case of the DC Choice Act, the Congress specified a requirement that all applicants, even those who ultimately do not receive a scholarship through the lottery process, participate in the evaluation’s data collection in order to be eligible for a scholarship in succeeding years. The Congress considered such support for data collection critical to ensure that comprehensive and comparable data were collected from both the treatment and control group members. The IRB provided guidance on how to clarify these requirements on the application and consent forms that all applicants must sign.
There are no questions of a sensitive nature on the data collection instruments.
The study calls for surveys of students, parents, and principals, as well as records abstraction and test administration. The instruments were developed to maximize respondent completion of the surveys and to minimize respondent burden. All survey instruments are brief and focus on collecting only information essential to the study.
The research team will administer the student surveys as part of the student assessment that will be administered to students at the family renewal events. The parent survey will be administered at the family renewal events, with telephone follow up. These surveys are designed to be completed in paper and pencil format and will collect information on the respondents’ perception of the school program and environment.
The two principal surveys will be administered as a mail survey with telephone follow up. The surveys will be mailed to principals with instructions to complete the survey and mail or fax it back to the research team. Principals who do not respond by the stated deadline will be contacted by telephone in an attempt to obtain a completed response.
The research team will administer the assessment to the treatment and control groups each spring. In addition, they will collect administrative records from DCPS and charter school authorizers to obtain data on attendance, persistence, disciplinary actions, and grades for members of the treatment and control groups at baseline. Table 4 shows the estimated burden for each of the data sources.
Table 4. Annual Burden Estimates, by Data Source

Data Source | Respondents | Estimated Number of Responses | Estimated Annual Burden per Response (in Hours) | Total Estimated Annual Burden (in Hours)
Student Assessments | Eligible applicants in grades K-12 | 2,705 | 2.5 | 6,762.50
School Records | DCPS staff and charter school authorizers | 2 | 40.0 | 80.00
Student Survey | Eligible applicants in grades 4-12 | 2,705 | 0.25 | 676.25
Parent Survey | Parents of eligible applicants | 2,705 | 0.25 | 676.25
Private School Principal Survey | Private school principals of participating and non-participating schools | 109 | 0.17 | 18.53
Public School Principal Survey | Principals of DC public schools | 150 | 0.17 | 25.50
DC Opportunity Scholarship Program Operator Records | Program operator | 1 | 40.0 | 40.00
Total | | 8,377 | | 8,279.03
Notes: The information in this table describes the surveys and burden for one (annual) cycle of data collection. This cycle will be repeated for a total of 4 data collection years.
The original package mistakenly showed 8,662 responses and 8,564 hours. During the approval process, the adjusted numbers were 8,377 responses and 8,279 hours, reflected on the 83C and noted in the memo to OMB dated 1/3/05.
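The adjusted totals follow directly from the per-source figures in Table 4, as the following arithmetic sketch verifies (row labels abbreviated from the table):

```python
# Reproducing the Table 4 totals: responses, and responses x hours per response.
burden_rows = {
    "Student Assessments":             (2705, 2.5),
    "School Records":                  (2,    40.0),
    "Student Survey":                  (2705, 0.25),
    "Parent Survey":                   (2705, 0.25),
    "Private School Principal Survey": (109,  0.17),
    "Public School Principal Survey":  (150,  0.17),
    "Program Operator Records":        (1,    40.0),
}

total_responses = sum(n for n, _ in burden_rows.values())
total_hours = sum(n * hours for n, hours in burden_rows.values())
print(total_responses)        # 8377 responses per annual cycle
print(round(total_hours, 2))  # 8279.03 hours per annual cycle
```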
There are no additional respondent costs associated with this data collection other than the hour burden estimated in item A12.
The estimated cost to the federal government of conducting the Impact Evaluation of the DC Opportunity Scholarship Program is based on the government's contracted cost of the data collection and related study activities along with personnel cost of government employees involved in oversight and/or analysis. For the data collection activities for which OMB approval is currently being requested, the overall cost to the government is $2,356,073. This includes:
$491,869 for the first year of data collection, including instrument development
$603,838 for the second year of data collection
$621,177 for the third year of data collection
$639,189 for the fourth year of data collection
The overall costs to the government of the full range of evaluation activities over the entire study period will be $5,489,394 over a five-year period. When annualized, this cost amounts to $1,097,879 per year. This estimate is based on the evaluation contractor's previous experience managing other research and data collection activities of this type.
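These cost figures can be cross-checked with simple arithmetic, using only the numbers cited above:

```python
# Cross-checking the government cost figures cited in this section.
yearly_data_collection = [491_869, 603_838, 621_177, 639_189]
print(sum(yearly_data_collection))       # 2356073: total for the four data collection years

total_evaluation_cost = 5_489_394        # full range of evaluation activities, five years
print(round(total_evaluation_cost / 5))  # 1097879: annualized cost per year
```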
This is a request for an extension in the time needed to complete the final year of data collection for an existing data collection and therefore does not require any changes in hour burden.
All data will be analyzed according to rigorous technical standards and woven together to provide a complete assessment of whether the DC Opportunity Scholarship Program achieved its goals. The focus of the analysis and reports will be evidence regarding: (1) who applies for and uses a scholarship; (2) what impacts the offer and use of a scholarship have on student test scores, parental satisfaction, school safety, and other participant outcomes; and (3) whether principals at DC public schools and private schools plan to manage their educational institutions differently in response to the establishment of the DC Opportunity Scholarship Program.
General Analytic Strategy
It is well known that the independent effects of school choice on student outcomes are difficult to estimate. Perhaps the most significant difficulty faced by researchers is selection bias: the self-selection of families who seek out a new school choice for their child, and the mutual student/school decision process that sorts students into different types of schools. Because this bias generally results from unmeasured factors, most researchers have preferred a randomized experiment to a dependence on non-experimental statistical methods. Because the DC Opportunity Scholarship Program provides for the random distribution of scholarships using a lottery, we will, under certain conditions and within certain parameters, use experimental methods to the extent possible to estimate most programmatic impacts.
To motivate the discussion of how we identify the effect of the scholarship program on test scores, it is useful to begin with a simple representation of the selection problem as a missing data problem, using the potential outcomes approach. This approach defines causal effects in terms of potential outcomes or counterfactuals. Conceptually, the causal effect of treatment is defined as the difference between the outcome for individuals assigned to the treatment group and the outcome for the treatment group if it had not received the treatment, or:

(E.1) E(Y_i | X_i, T_i = 1) − E(Y_i | X_i, T_i = 0)

In the case of scholarships, the treatment effect–the effect of the scholarships on academic achievement–would be defined as the difference between test scores for program students and test scores for program students if they had not received a scholarship. The fundamental problem is that a student is never observed simultaneously in both states of the world. What is observed is a student in the treatment group (T_i = 1) or in the control group (T_i = 0). The outcome in the absence of treatment, E(Y_i | X_i, T_i = 0), is then the counterfactual—what would have occurred to those students receiving the scholarships if they had not received them.
If students receiving scholarships were identical to other students in both observable and unobservable characteristics, the counterfactual could be generated directly from an appropriately selected comparison group. Valid comparison groups are rarely found in practice, however. The random assignment of students into the program generates the counterfactual from the control group – eligible applicants who did not receive a scholarship.3 If correctly implemented, random assignment yields statistically equivalent groups, and allows estimation of the program impact through differences in mean outcomes between the two groups.
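To illustrate the experimental estimator, the following sketch computes the difference in mean outcomes on simulated data standing in for the evaluation file; the variable names and effect sizes are ours for illustration, not study estimates:

```python
# Difference-in-means impact estimate under random assignment (illustrative).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 2_000
offered = rng.integers(0, 2, size=n)    # 1 = assigned a scholarship by lottery
# Simulated test scores with an assumed true treatment effect of +5 scale points.
score = 600 + 5 * offered + rng.normal(0, 40, size=n)

impact = score[offered == 1].mean() - score[offered == 0].mean()
t_stat, p_value = stats.ttest_ind(score[offered == 1], score[offered == 0])
print(f"impact = {impact:.2f}, t = {t_stat:.2f}, p = {p_value:.4f}")
```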
Consistent with this approach is the following basic analytic model of the effects of school choice scholarships on outcomes. Consider first the outcome equation for the test score of student i in year t. It is reasonable to assume that test scores (Y_{it}) are determined as follows:

(E.2) Y_{it} = α + τ T_{it} + X_i γ + ε_{it},  for t > k (periods after the program takes effect)

In equation (E.2), T_{it} is equal to one if the student has the opportunity to participate in the voucher program (i.e., the award rather than the actual use of the voucher) and equal to zero otherwise. X_i is a vector of student characteristics (measured at baseline) known to influence future academic achievement, such as prior test scores, mother’s level of education, and family income. In this model, τ represents the effect of vouchers on test scores for students in the program, conditional on X_i. With a properly designed experiment, using a concise and judiciously chosen set of statistical controls for characteristics that predict future achievement should improve the precision of the estimated impact; the estimated treatment effect, τ, should be essentially identical to the difference in mean outcomes between the treatment and control groups.
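A minimal sketch of estimating equation (E.2) on simulated data illustrates the precision gain from baseline controls; all names and magnitudes are illustrative:

```python
# Equation E.2 on simulated data: Y = alpha + tau*T + X*gamma + error.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2_000
treat = rng.integers(0, 2, size=n)       # lottery assignment
baseline = rng.normal(600, 40, size=n)   # prior test score (an element of X)
score = 0.7 * baseline + 5 * treat + rng.normal(0, 25, size=n)

# Without covariates: unbiased but noisier.
fit0 = sm.OLS(score, sm.add_constant(treat)).fit()
# With the baseline covariate: same target, smaller standard error on tau.
fit1 = sm.OLS(score, sm.add_constant(np.column_stack([treat, baseline]))).fit()
print(fit0.params[1], fit0.bse[1])   # tau-hat and SE, no controls
print(fit1.params[1], fit1.bse[1])   # tau-hat and SE, with controls
```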
Customization of General Analytic Strategy
Since the initial applicants were randomized within certain relevant subgroups, we propose a randomized block design for analyzing scholarship program impacts. The randomized block design divides the applicant pool into relatively homogeneous groups (called blocks); vouchers are then randomly assigned within each block. We are interested in how academic achievement (Y) is affected by assignment into the voucher program. Suppose we can identify b blocks -- based on grade and scholarship priority status -- each of size n. Consider then the following statistical model for this randomized block design:
(E.3) Y_{ikt} = μ + τ T_{ikt} + Σ_{j=2}^{b} ρ_j B_{ij} + X_{ik} γ + ε_{ikt}

where

i = 1, …, n students and k = 1, …, b blocks (defined by grade and priority status);
Y_{ikt} is the outcome (e.g., test score) for student i in block k at time t;
μ is the overall mean outcome;
τ is the treatment (scholarship program) effect;
ρ_j is the jth block effect;
T_{ikt} is assignment into the voucher program;
B_{ij} equals one if student i is in block j and zero otherwise;
X_{ik} represents observable characteristics, measured at baseline; and
ε_{ikt} is the random error; independent, N(0, σ_ε²).
This analytical framework follows naturally from the randomization scheme and is easily implemented and interpreted. Y can be measured in several different dimensions, including test scores, school satisfaction, parental satisfaction, grade completion, and, where appropriate, high school graduation. μ is the average outcome for all program members; ρ_j is the average block effect; and τ is the effect of vouchers on academic achievement. The remainder of this section addresses econometric concerns and associated empirical methods.
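A sketch of the randomized block specification (E.3) on simulated data follows; the block structure and effect sizes are illustrative only. The formula interface's C(block) term expands the block factor into the B_{ij} dummies:

```python
# Equation E.3 on simulated data: outcome on treatment plus block dummies.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2_000
df = pd.DataFrame({
    "block": rng.integers(0, 6, size=n),   # e.g., grade band x priority status
    "treat": rng.integers(0, 2, size=n),   # lottery within block (simplified)
})
# Block means differ; the assumed true treatment effect is +5.
df["score"] = 600 + 5 * df["treat"] + 4.0 * df["block"] + rng.normal(0, 30, size=n)

fit = smf.ols("score ~ treat + C(block)", data=df).fit()
print(fit.params["treat"], fit.bse["treat"])   # tau-hat and its standard error
```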
Take-Up of Scholarships
Even with a properly implemented experiment, we may expect slippage between random assignment into the experiment and use of the scholarship at a private school. This occurrence has been observed in very different experimental settings, including medical trials and job training and health insurance experiments. More relevant to our exercise is the slippage observed in previous school voucher experiments, such as the Milwaukee Parental Choice Program. Such slippage has important implications for the estimators of the effect of the scholarship program. Generally, we define two broad estimators of interest. The first, commonly referred to as the "Intent to Treat" (ITT) effect, is the effect of the offer of a scholarship on student outcomes. All students randomized into the study make up the experimental sample, regardless of whether they use the scholarship to attend a private school.
Policymakers are typically also interested in the effect of scholarship use on student achievement. This estimator, commonly referred to as the "Impact on the Treated" (IOT), is based on the sample of scholarship users. Instrumental variable analysis provides a well-established method to generate an estimate of the scholarship impact on the treated from the ITT estimator.4
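In its simplest form (a single binary instrument and no covariates), the IV estimate reduces to the Wald ratio: the ITT estimate divided by the treatment-control difference in scholarship take-up. A sketch on simulated data, with an assumed 75 percent take-up rate among lottery winners:

```python
# ITT and IV (Wald) estimates with imperfect take-up (illustrative numbers).
import numpy as np

rng = np.random.default_rng(3)
n = 2_000
offered = rng.integers(0, 2, size=n)                # lottery offer (the instrument)
used = (offered == 1) & (rng.random(n) < 0.75)      # 75% of winners use the scholarship
score = 600 + 8 * used + rng.normal(0, 40, size=n)  # assumed true effect of use: +8

itt = score[offered == 1].mean() - score[offered == 0].mean()
take_up = used[offered == 1].mean() - used[offered == 0].mean()
iot = itt / take_up   # Wald/IV estimator: impact for scholarship users
print(f"ITT = {itt:.2f}, take-up = {take_up:.2f}, IOT = {iot:.2f}")
```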
Using only the sample of scholarship users could introduce a form of selection bias, in that the sample of students using the voucher to attend private schools is self-selected (from the randomized-in sample). Self-selection bias arises when family characteristics (observable and unobservable) that affect student outcomes also affect the decision to use the voucher. For example, families who care more about education and are better able to gather and analyze relevant information about schools are also the families whose children are more likely to make use of the voucher, all else equal. Students in such families are also more likely to do better once in a private school setting than their randomized-in counterparts who do not use the voucher. To see the point, consider the following models of actual use of the voucher and of student test scores.5
(E.4) V_{it} = σ_0 + σ_1 T_{it} + X_i σ_2 + ε_{it}

where

i indexes students and t indexes time;
V_{it} represents use of the voucher;
T_{it} represents treatment status (= 1 if selected in the lottery); and
X_i represents observable characteristics.
Note that when participating schools randomly select from applicants because they are over-subscribed, that selection is random conditional on the school and grade of the applicant. Such effects would be controlled for in the randomized block design proposed in equation E.3.
We also recognize that a model of student outcomes would be based on actual voucher use/attendance at a private institution:
(E.5) Y_{it} = π_0 + π_1 V_{it} + X_i π_2 + ν_{it}
Combining equations (E.4) and (E.5), we get
(E.6) Y_{it} = ψ_0 + ψ_1 T_{it} + X_i ψ_2 + ξ_{it}
What these equations show is that the estimated treatment effect ψ_1 combines the effect of selection into the program on voucher use and the effect of private school attendance on student outcomes (ψ_1 = π_1 σ_1). Note that ψ_1 corresponds to the treatment effect τ in the empirical models E.2 and E.3. What we estimate in the ITT model is therefore the reduced-form effect of both margins of response: student learning in private schools and family take-up of scholarship dollars. It is important to note that ψ_1 is in some respects the policy parameter of interest, since families cannot be compelled to use available scholarships. Its decomposition is nonetheless quite useful for learning about the effectiveness of different types of schools on educational attainment and about the success of, in this case, publicly funded scholarships. Our empirical analysis will examine, among other margins, family choices regarding take-up of the scholarships as well as the types of schools selected.
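The decomposition ψ_1 = π_1 σ_1 can be illustrated with back-of-the-envelope numbers (ours, not study estimates):

```python
# Decomposing the reduced-form effect psi_1 = pi_1 * sigma_1 (illustrative).
sigma_1 = 0.75   # assumed effect of lottery assignment on voucher use (E.4)
pi_1 = 8.0       # assumed effect of voucher use on test scores (E.5)
psi_1 = pi_1 * sigma_1
print(psi_1)     # 6.0: reduced-form (ITT) effect of assignment on scores (E.6)
```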
These types of analyses will be performed for the various outcome measures called for in the law, including academic achievement, safety, satisfaction, and other student outcomes.
Reports
The DC Choice Act requires annual reporting to Congress. The first report will describe who applied to the DC Opportunity Scholarship Program, largely by comparing the demographic characteristics and achievement of program applicants with those of other DCPS students. Subsequent reports will focus on the impact of the program, using the experimental and multivariate regression techniques described above to estimate differences in outcomes between the treatment and control group members. Based on the guidance in the legislation, the reports will focus primarily on conditions and outcomes involving student academic performance, parental satisfaction, school safety, and the process by which parents select schools. A schedule for the reports and data files is provided in Table 5.
Table 5. Deliverable Schedule

Deliverable | Schedule
Descriptive Report | Spring 2006
First Impact Year Report | Spring 2007
Second Impact Year Report | Spring 2008
Third Impact Year Report (and Final Report) | Spring 2009
Data Files with Documentation | July 2009
All data collection instruments will include the OMB expiration date.
No exceptions are requested.
1 In March 2004, a grant to run the program was awarded to the Washington Scholarship Fund, a non-profit organization that operates a privately-funded scholarship program for students in the DC area.
2 Participating private schools do not want government intrusion into their campus activities and view the evaluation testing as one source of that intrusion; the public school system did not want to appear to endorse the voucher program being studied by allowing testing in the public schools.
3 See the following studies, which all use the same data from an evaluation of a New York City privately funded scholarship program: Howell, William G., Patrick J. Wolf, David E. Campbell, and Paul E. Peterson, “School Vouchers and Academic Performance: Results from Three Randomized Field Trials,” Journal of Policy Analysis and Management, 21:2, 2000; Barnard, John, Constantine E. Frangakis, Jennifer L. Hill, and Donald B. Rubin, “Principal Stratification Approach to Broken Randomized Experiments: A Case Study of School Choice Vouchers in New York City,” Journal of the American Statistical Association, 98:462, 2003; Krueger, Alan B., and Pei Zhu, “Another Look at the New York City School Voucher Experiment,” Working Paper Series, Education Research Section, Princeton University, March 2003.
4 For an extended discussion of the use of this technique under such circumstances, see Howell et al., The Education Gap, pp. 49-51.
5 Cecilia Elena Rouse, “Private School Vouchers and Student Achievement: An Evaluation of the Milwaukee Parental Choice Program,” The Quarterly Journal of Economics, May 1998, pp. 553-602.