
U.S. Department of Education




Evaluation of the Effectiveness of the Scholarships for Opportunity and Results (SOAR) Act Program: Supporting Statement for OMB Clearance


PART A: SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION







October 22, 2012



TABLE OF CONTENTS



INTRODUCTION


A. JUSTIFICATION

A.1 Circumstances Making the Collection of Information Necessary

A.2 Purposes and Uses of the Data

A.3 Use of Technology to Reduce Burden

A.4 Efforts to Identify Duplication

A.5 Methods to Minimize Burden on Small Entities

A.6 Consequences of Not Collecting Data

A.7 Special Circumstances

A.8 Federal Register Comments and Persons Consulted Outside the Agency

A.9 Payments or Gifts

A.10 Assurances of Confidentiality

A.11 Justification of Sensitive Questions

A.12 Estimates of Hour Burden

A.13 Estimates of Cost Burden to Respondents

A.14 Estimate of Annual Cost to the Federal Government

A.15 Program Changes or Adjustments

A.16 Plans for Tabulation and Publication of Results

A.17 Approval to Not Display the OMB Expiration Date

A.18 Explanation of Exceptions



Appendix A: The Authorizing Legislation

Appendix B: Public/Charter School Form

Appendix C: Private School Form

Appendix D: Letters to Parents

Appendix E: Parent Survey

Appendix F: Elementary Student Survey

Appendix G: Middle School Student Survey

Appendix H: High School Student Survey

Appendix I: Public/Charter School Principal Letter

Appendix J: Private School Principal Letter

Appendix K: Public/Charter School Principal Survey

Appendix L: Private School Principal Survey

Appendix M: Consent Form

Appendix N: Confidentiality Statement

EVALUATION OF THE EFFECTIVENESS OF THE SCHOLARSHIPS FOR OPPORTUNITY AND RESULTS (SOAR) ACT PROGRAM


SUPPORTING STATEMENT

FOR PAPERWORK REDUCTION ACT SUBMISSION



INTRODUCTION


This document requests forms clearance approval from the Office of Management and Budget (OMB) for the collection of data under the Evaluation of the Effectiveness of the Scholarships for Opportunity and Results (SOAR) Act Program. In particular, we are requesting approval for: (1) parent, student, and principal surveys, and (2) records abstraction from DC Public Schools (DCPS), District of Columbia Public Charter School Board, and private school administrative files. For context, we also describe other aspects of the evaluation plan that do not contribute to burden. This is a reinstatement of the #1850-0800 collection that was discontinued on 6/23/2011. The study design, data collection plan, instruments, and levels of burden are consistent with the forms clearance packages approved by OMB for the previous evaluation of this program (#1850-0800).


Overview of the Program


The Scholarships for Opportunity and Results (SOAR) Act, H.R. 1473 (P.L. 112-10), signed into law on April 15, 2011, reauthorized the DC School Choice Incentive Act and provided for a five-year continuation of a school choice program for low-income residents of Washington, DC. The program, still titled the Opportunity Scholarship Program (OSP), now provides scholarships of up to $12,000 per student per year to enable low-income elementary and secondary students to attend private schools in the District of Columbia in lieu of the public schools already available to them. The statute specifies that certain students be given priority in the award of scholarships, including students who have siblings already participating in the program, students who were previously awarded a scholarship under the earlier program but did not use it, and students from public schools designated as in need of improvement (SINI) or in corrective action under the federal Elementary and Secondary Education Act.


The OSP is operated under a grant from the U.S. Department of Education (ED) to the DC Children and Youth Investment Trust Corporation (the Trust).1 The Trust awarded approximately 1,000 new scholarships in summer 2011 (soon after the program was reauthorized) to all eligible applicants at that time, and 316 scholarships in summer 2012 through a lottery of eligible applicants.


Overview of the Evaluation


The reauthorization once again stipulated that an evaluation of the program be conducted “using the strongest possible research design for determining the effectiveness” of the program (Section 3009; see Appendix A). ED awarded a contract to Westat and its research partners, Pemberton Research and the University of California, San Diego, to: (1) provide technical assistance to the program operator, particularly with respect to the design and conduct of the lotteries of applicants, and (2) conduct an evaluation of the impacts of the program.


The foundation of the evaluation will be a randomized controlled trial (RCT) comparing outcomes of eligible applicants (students and their parents) assigned by lottery to receive or not receive a scholarship. This design is consistent with the requirement for a rigorous evaluation as well as the need to allocate the scholarships fairly if the program is oversubscribed. Because the law also specified other kinds of comparisons and analyses, the planned evaluation includes both quantitative and qualitative components.


Research Questions


The study is designed to address the following key questions:


  • What is the impact of the program on student achievement? As described in the statute, the purpose of the program is to allow low-income parents to enroll their children in schools other than DC public schools, because test scores in the public schools remain below the national average. The law therefore placed a priority on examining whether the program improves the academic achievement levels and growth of eligible students who would otherwise be in a public school setting. The evaluation will calculate the impacts on achievement (as well as other outcomes) of the offer of a scholarship (the “Intent to Treat” estimate) and of the use of a scholarship (the “Treatment on Treated” estimate).


  • What is the impact on other measures of student success? The law calls for examining other indicators of school success, including persistence, grade retention, high school graduation and, if possible, college enrollment. Measures of student engagement, such as school attendance, will also be examined.


  • Does the program affect parent and student reports of school satisfaction and safety, or parent involvement in their child’s education? A key desired outcome of school choice is an increase both in the school choices available and in parents’ and students’ satisfaction with the choices they have made. The SOAR statute extends the outcomes to be studied to include how parents and students view the safety of the child’s school and the success of the program in increasing parent involvement.


  • Why do parents choose to participate in the program? Previous studies of school choice suggest that parents weigh a variety of factors in deciding whether to pursue private schooling for their children, and that certain school characteristics most affect their specific school selections. The statute specifies that the evaluation examine these issues for parents who apply to the OSP.


  • Does the program change students’ instructional environment and opportunities? Whatever the effects of the OSP on key outcomes, researchers and policymakers have long been interested in the mechanisms by which voucher programs might be expected to benefit students. Among the hypotheses that will be explored in the evaluation are whether participating students are exposed to more motivated or better performing peers and the extent to which school organization, instruction, or services are different in public vs. private schools.


Sample


In order for the evaluation to have sufficient statistical power to detect policy-relevant impacts, the sample will consist of approximately 1,800 eligible program applicants across spring 2012 (cohort 1) and spring 2013 (cohort 2) (see Part B of this submission). To be included in the evaluation sample, an applicant must be eligible for the program, be a rising kindergartener (K) or already attending a public school, and participate in a lottery to determine whether he or she will receive a scholarship award.2

Data Collection


Evaluation data will be collected for the two cohorts of program applicants from a variety of sources, as summarized in Table 1. Each cohort will have baseline data3 as well as three years of follow-up (post-lottery) data collection: 2013-2015 for cohort 1 and 2014-2016 for cohort 2. In addition to estimating program impacts, we will use this experimental study to conduct research on interim outcomes and mediators.



Table 1. Data Measures for the Evaluation of the DC Opportunity Scholarship Program

Data Source

Description

Student assessments

The Terra Nova assessment will be administered to the eligible sample before the lotteries are conducted and each spring following the lotteries for 3 years (spring 2013-2015 for the 2012 cohort and spring 2014-2016 for the 2013 cohort). The follow-up assessments will be administered in students’ schools and will provide the primary outcome measure for the impact evaluation.

School records

Administrative records will be collected from DCPS, the District of Columbia Public Charter School Board and participating private schools in the fall of each year to obtain data on prior year attendance, persistence, disciplinary actions, and grades for members of the treatment and control groups.

Parent surveys

The evaluation will include annual surveys of evaluation sample members’ parents in each follow-up year. These surveys will examine such issues as reasons for continued participation or withdrawal, involvement in school, satisfaction with school choices, and perceptions of school safety, leadership, and offerings. The survey will be mixed mode (web, with phone or paper follow-up).

Student surveys

The study will conduct surveys of evaluation sample members in grades four and above to collect information about students’ satisfaction with their schools, perceptions of safety, and other characteristics of their school program and environment. The surveys will be administered in each of the follow-up years at the same time (and place) as the student assessments.

Principal surveys

The study design calls for annual surveys to be administered to principals in the DC traditional public school, charter school, and private school systems in 2013-2016. Data from principals of students in the treatment and control groups will provide information about school organization and offerings for descriptive analyses of students’ school environments and for use as mediators in the impact analysis. The web-based principal surveys will also be used to examine how aware public and private schools are of the DC Opportunity Scholarship Program and whether they are making any changes in response to it.

DC Opportunity Scholarship Program Operator Records

As the administrator of the DC Opportunity Scholarship Program, the operator is responsible for confirming ongoing eligibility for the program and continuing participation for scholarship recipients. Westat will obtain application data for all sample members as well as annual participation information for individual students from the program operator.


A. JUSTIFICATION


A.1 Circumstances Making the Collection of Information Necessary


As described in the introduction, the SOAR Act mandates the conduct of an independent evaluation of the effectiveness of the program. The legislation also lays out a series of topics and issues that the evaluation must address, to which the study’s research questions are directed. The information collected through this study will be used as the basis for this mandated evaluation.


A.2 Purposes and Uses of the Data


Information on the DC Opportunity Scholarship Program and the outcomes of program applicants will be collected by Westat, with data analyzed by Westat and its research partners, Pemberton Research and the University of California, San Diego. This work will be conducted under Contract Number ED-IES-12-C-0018. The data to be collected will be obtained from student assessments, school records, and surveys of parents, students, and principals, and will be used to address the research questions and topics identified in the authorizing legislation. The legislation also specifies that the evaluation report annually on the performance of the program and the students; thus, annual data collection is necessary.



Table 2 shows how each source of data relates to the study questions; detailed descriptions of the data sources follow.


Table 2. Relationship Between the Study Questions and Proposed Sources of Data

Data sources (table columns): Student assessments; School records; Parent surveys; Student surveys; Principal surveys; DC Opportunity Scholarship Program Operator Records

Study questions (table rows):

  • What is the impact of the program on student academic achievement?

  • What is the impact on other measures of student success?

  • Does the program affect parent and student reports of school satisfaction and safety, or parent involvement in their child’s education?

  • Why do parents choose to participate in the program?

  • Does the program change students’ instructional environment and opportunities?


A.2.1 Student Assessments


Based on the legislated language, the key outcome measure for judging the effectiveness of the program is student achievement. The authorizing statute includes an expectation that the mandated evaluation, in collaboration with the program operator, “ensure that the parents of each student who applies for a scholarship under this division… and the parents of each student participating in the scholarship program under this division, agree that the student will participate in the measurements given annually by the Institute of Education Sciences…” (SOAR Act Sec. 3009(a)(3)(C)). The program operator has determined that, to meet this requirement, participants are obligated to participate in the measurements (data collection) or they could lose their scholarship award.


For the purposes of the evaluation, we have interpreted “student achievement” as students’ skills in reading and mathematics (not science or history). The law also requires that the evaluation use a nationally norm-referenced test and administer it each year.4


The assessments will be administered differently than under the previous evaluation of the OSP, in order to reduce burden on families and alleviate the need to offer substantial respondent payments to ensure adequate response rates. Rather than requiring treatment and control group families to attend testing events at locations around DC on Saturdays or weeknight evenings, the assessments will be administered in students’ schools. The Memoranda of Understanding (MOUs) with DCPS and with the Office of the State Superintendent of Education (OSSE), which oversees funds for charter schools under the SOAR Act, both specify this activity, and the Trust has inserted this requirement into the agreement forms for participating private schools. We anticipate testing each evaluation sample member at baseline and each spring for three years after they participate in a lottery for an OSP scholarship. These assessments are exempt from federal burden reporting requirements.



A.2.2 School Records


Administrative records will be collected from DCPS, the District of Columbia Public Charter School Board and, to the extent possible, the Office of the State Superintendent of Education (OSSE), as well as from individual private schools with evaluation sample members, in order to obtain data on attendance, persistence (including graduation), disciplinary actions, and grades for members of the treatment and control groups (see Appendix B). Private schools without electronic records will be sent a worksheet listing each sample member attending their school. A copy of the form is included in Appendix C.


A.2.3 Parent Surveys


The legislation requires the evaluation to examine the impact of the program on parents. The study will conduct web-based surveys of parents (of students in the treatment and control groups) in each year of the evaluation. These surveys will examine such issues as reasons for continued participation in or withdrawal from the program, factors used to select a school, satisfaction with school choices, and perceptions of school safety, leadership, and offerings. A copy of the letter that will be sent to parents and a paper version of the survey are included in Appendices D and E, respectively.


A.2.4 Student Surveys


Each year, the study will conduct surveys of treatment and control group students in grades four and above to collect information about students’ satisfaction with their schools, perceptions of safety, reports of behavior both within and outside of school (including peer effects), and other characteristics of their school program and environment. The surveys will be administered in the spring of each year of the program and will occur at the same time (and place) as the in-school administration of the student assessments. Copies of the Grade 4-5, 6-8, and 9-12 Student Surveys are included in Appendices F, G, and H, respectively.


A.2.5 Principal Surveys


The study design calls for annual administration of a web-based survey to principals in the DC traditional public school, charter school, and private school systems. Data from principals of students in the treatment and control groups will provide information about their school environments for the impact analysis. Responses from other principals will help assess the extent to which the sending (public) and receiving (private) schools are similar to or different from other schools in their sector. The principal surveys will also be used to examine how aware public and private schools are of the DC Opportunity Scholarship Program and whether they are making any changes in response to it. Copies of the Public/Charter School Principal Letter, Private School Principal Letter, Public/Charter School Principal Survey, and Private School Principal Survey are included in Appendices I, J, K, and L, respectively.


Parents and principals will have the option of completing their surveys on the web, by phone, or on a hard-copy paper version. Respondents who do not complete their surveys within two weeks of the initial mailing will be sent weekly reminder emails and postcards. In addition, trained interviewers from our Telephone Research Center (TRC) will call non-responders to encourage use of the web survey, to administer the survey by telephone, or to provide a hard copy of the survey by postal mail, if requested. The TRC will call parents on varied days and times during the week and on weekends and will offer them the option of scheduling a callback at a time most convenient for them. When we are unable to reach a parent at the telephone numbers provided, TRC interviewers will immediately begin tracing steps to locate them. Principals will be contacted during traditional working hours. Our success rate using this approach is very high.


A.3 Use of Technology to Reduce Burden


The data collection plan has been designed to maximize efficiency, accuracy, and convenience for respondents and to minimize their burden. The surveys of parents and principals are designed to be web-based, with paper and telephone follow up.



A.4 Efforts to Identify Duplication


We will use existing data to the extent possible—for example, relying on the program operator and the schools to provide data about attendance, persistence, disciplinary actions, and grades for members of the treatment and control groups. However, the other information collected as part of the evaluation (student assessments and the surveys of parents, students, and principals) is not available elsewhere.



A.5 Methods to Minimize Burden on Small Entities


There is no anticipated impact on small businesses or other small entities (as stated in Item 5 of OMB Form 83-I).


The primary entities for this study are students and parents, although some data will be collected from principals in public and private schools. Burden is reduced for all respondents by requesting only the minimum information required to meet the study objectives. The burden on schools has also been minimized through the careful specification of information needs, restricting questions to generally available information where possible, and designing the data collection strategy—particularly the survey methods—to minimize burden on respondents. For example, we will obtain some descriptive information on public and private schools from the Common Core of Data (CCD) available from the National Center for Education Statistics. Rather than requiring families to attend testing events on Saturdays or weeknight evenings, surveys will be administered to students at the same time (and place) as the in-school achievement assessments.


A web-based survey will be the primary mode of data collection for the parent and principal surveys. We have found that this not only saves money on postage, coding, keying, and cleaning of survey data, but is also a preferred method of survey completion among many respondents. Burden will be further reduced through the use of skip patterns where appropriate, and the web format will allow respondents to complete the survey at a location and time of their choice. Notification of participation and login credentials will be sent via email whenever possible. As alternatives, respondents will be offered the opportunity to complete the survey through telephone follow-up calls or a hard-copy version. All of these formats allow respondents to complete the survey at their convenience and accommodate individual preferences.


A.6 Consequences of Not Collecting the Data


This data collection is necessary in order to evaluate the DC Opportunity Scholarship Program and comply with the evaluation mandated in the SOAR Act. Virtually all of the data collection activities—respondents, topics, and the need for annual collection—stem directly from the legislative requirements.



A.7 Special Circumstances


None of the special circumstances listed apply to this data collection.



A.8 Federal Register Comments and Persons Consulted Outside the Agency


Public comments from the previous evaluation of the OSP (concluded in 2010) were taken into account in developing plans for the current collection. We also received public comments from nine organizations, including the American Association of University Women (AAUW), the Archdiocese of Washington (AOW), the Association of Christian Schools International (ACSI), the National Catholic Educational Association (NCEA), the National Coalition for Public Education (NCPE), the Council for American Private Education (CAPE), and the Secretariat of Catholic Education (SCE). These comments and our responses are provided in a separate file, and they were also taken into consideration in developing the current collection.



Consultations on the research design, sample design, data sources and needs have occurred during the study’s design phase and will continue to take place throughout the study. The purpose of such consultations is to ensure the technical soundness of the study and the relevance of its findings, and to verify the importance, relevance, and accessibility of the information sought in the study.


Westat and its subcontractors, Pemberton Research and the University of California, San Diego, have provided substantial input to ED on the study. Senior technical staff from these organizations who are conducting the study are listed below:


Westat: Ms. Babette Gutmann, Vice President, (301) 738-3626

Westat: Ms. Juanita Lucas-McLean, Project Director, (301) 294-2866

Westat: Dr. Louis Rizzo, Senior Statistician, (301) 294-4486

Pemberton Research: Dr. Mark Dynarski, Co-Principal Investigator, (609) 443-1981

University of California, San Diego: Dr. Julian Betts, Co-Principal Investigator, (858) 534-7040



We are in the process of establishing a Technical Working Group (TWG) that includes both eminent school choice experts and evaluation methodologists.


The 60-day notice for this data collection was published in the Federal Register (Vol. 77, p. 69812) on 11/21/12.


A.9 Payments or Gifts


We propose giving parents a $20 incentive to complete the annual parent survey; $5 in advance and $15 upon completion. The advance incentive will be mailed to parents along with the informational letter that includes instructions on how to access the web-based survey. Parents will be offered the option of completing the web-based version or alternatively, completing a telephone interview or paper version. The informational letter will inform parents that they will receive their second $15 incentive payment by mail within 3 weeks after completing the survey. The letter will also reference their agreement to participate in the evaluation and describe the topics to be covered.


Principals will receive an advance incentive payment of $10 for completing the principal survey. The incentive payment to principals will be mailed along with the informational letter that includes instructions on how to access the web-based survey.



A.10 Assurances of Confidentiality


All data collection activities will be conducted in full compliance with Department of Education regulations on maintaining the confidentiality of data obtained about private persons and on protecting the rights and welfare of human research subjects. These activities will also be conducted in compliance with other applicable federal regulations. Research participants have been or will be informed about the nature of the information that will be requested and the confidentiality protections, and they will be assured that information will be reported only in aggregate, statistical form in reports and public-use data files. Respondents will also be informed that their names will not be associated with their answers and that no one will have access to this information except as may be required by law, regulation, or subpoena, or unless permission is given by both the parent and the participating child.

In particular, it is very important that parents or legal guardians of sample members understand that information is being collected regarding their children, and that this information is being held confidential. When parents apply to the DC Opportunity Scholarship Program on behalf of their child(ren), they are asked to sign the consent form and only those who sign are part of the program and the evaluation (see the consent form approved by OMB as part of the baseline collection, attached in Appendix M). All parent, student and principal surveys will also contain a statement regarding the confidentiality of their responses (see survey instruments in the appendices).


Specific Procedures to Maintain Confidentiality


The Evaluation of the DC Opportunity Scholarship Program will be conducted in accordance with all relevant regulations and requirements, including the Privacy Act of 1974 (5 U.S.C. 552a), the Family Educational Rights and Privacy Act of 1974 (20 U.S.C. 1232g), the Protection of Pupil Rights Amendment (20 U.S.C. 1232h), the confidentiality provisions of the Education Sciences Reform Act (20 U.S.C. 9573), related regulations (41 CFR Part 1-1 and 45 CFR Part 5b), and, as appropriate, other federal or ED regulations on the protection of human subjects.


In addition, Section 9(a)(1)(A)(5) of the SOAR Act includes a particular specification that no personally identifiable information can be disclosed as part of the evaluation. As a result of this provision, in publishing the Privacy Act Notice for the System of Records for this evaluation, ED has eliminated all possible routine disclosures to which any data collected or obtained for the evaluation might be subjected. The Privacy Act System of Records for this collection will be published shortly. Under the notice, personal information (names, addresses, student ID numbers) may only be disclosed to Westat and in the unlikely case of a terrorist threat.


Westat, as ED’s “authorized representative” for the collection and maintenance of data for the evaluation, takes these confidentiality requirements very seriously. Employees of Westat are required to sign Westat’s “Employee or Contractor’s Assurance of Confidentiality of Data” (see Appendix N). This document outlines the general requirements and responsibilities of employees and contractors with regard to maintaining the confidentiality and privacy of data. In addition, each project at Westat is required, upon inception, to develop a customized confidentiality plan. The Westat project director develops the confidentiality plan for the evaluation, which takes into account the assurances made to respondents, what project information is confidential, who is authorized to have access to it, and how access can be controlled. This plan will be shared with all project staff, who will then be expected to implement it. Components of the plan include:


  • Keeping hard-copy confidential information under lock and key.

  • Storing confidential electronic information in a secure location.

  • Communicating about cases via email without violating confidentiality and privacy.

  • Clearly labeling documents containing confidential information “confidential.”

  • Limiting the number of copies of confidential documents.

  • Arranging for security when sending confidential jobs to a network printer.

  • Ensuring that only authorized personnel see faxes containing confidential information.

  • Adhering to the telephone research center’s (TRC) protocols for transporting confidential data to and from the TRC.

  • Adhering to data entry’s protocols for transporting confidential data to and from data entry.

  • Using mail and delivery services appropriate for the sensitivity level of the confidential data.

  • Not bringing confidential data home.

  • Disposing of confidential information properly when it is no longer needed.


Institutional Review Board (IRB)


Westat received an exemption from its Institutional Review Board (IRB) for this evaluation on August 1, 2012. Westat's IRB will review all survey instruments, letters, and the consent forms that OSP applicants signed when they applied to the program.



A.11 Justification of Sensitive Questions


The surveys do not include sensitive questions.


A.12 Estimates of Hour Burden


The data collection plan has been designed to maximize efficiency, accuracy, and convenience for respondents and to minimize their burden. The study calls for surveys of students, parents, and principals, as well as records abstraction and test administration. This request proposes the use of revised versions of instruments previously approved by OMB (#1850-0800). All survey instruments are brief and focus on collecting only information essential to the study.


Table 4 shows the estimated burden for each of the data sources. Specific assumptions follow:


Student surveys

A total of 911 students across cohort 1 and cohort 2 are in grades 4-12:

  • 266 of the 536 eligible cohort 1 students (49.6%) are in grades 4-12 and will complete a student survey.

  • 645 of the 1,300 eligible cohort 2 students (49.6%) are in grades 4-12 and will complete a student survey.

Parent surveys

  • One parent survey will be completed for each eligible student in each cohort. (Parents are asked to complete a survey for each child, so even if a parent has more than one child in the program, the number of parent surveys equals the number of students.)

Principal surveys

    • A principal survey will be administered to principals in all 87 private schools in the District of Columbia.

    • A principal survey will be administered to principals in all 125 public and 52 charter schools in the District of Columbia (N = 177).

Administrative Records

  • The Trust will provide DC OSP administrative records.

  • The Office of the State Superintendent of Education (OSSE) will provide administrative records for both public and charter school students.

  • Forty-five participating private schools will complete a form to provide data for an average of 37 students per school.




Table 4. Annual Burden Estimates, by Data Source, for Cohorts 1 and 2 a/

Data Source and Respondents (Appendix) | Estimated Number of Responses | Estimated Annual Burden per Response (Hours) | Total Estimated Annual Burden (Hours) | Total Estimated Annual Burden (Dollars) b/

Student Surveys: students in the impact sample in grades 4-12 (dollar burden: N/A)

Cohort 1, Elementary (Appendix F) | 85 | 0.25 | 21.25 | N/A
Cohort 1, Middle (Appendix G) | 116 | 0.25 | 29.00 | N/A
Cohort 1, High School (Appendix H) | 65 | 0.25 | 16.25 | N/A
Cohort 1 TOTAL | 266 | 0.25 | 66.50 | N/A
Cohort 2, Elementary | 206 | 0.25 | 51.53 | N/A
Cohort 2, Middle | 281 | 0.25 | 70.32 | N/A
Cohort 2, High School | 158 | 0.25 | 39.40 | N/A
Cohort 2 TOTAL | 645 | 0.25 | 161.25 | N/A
Cohort 1 and Cohort 2 TOTAL | 911 | | 227.75 | N/A

Parent Survey: parents of students in the impact sample (Appendix E)

Cohort 1 | 536 | 0.33 c/ | 176.88 | $2,989.27
Cohort 2 | 1,300 | 0.33 | 429.00 | $7,250.10
Cohort 1 and Cohort 2 TOTAL | 1,836 | | 605.88 | $10,239.37

Parent Letter: parents of students in the impact sample (Appendix D)

Cohort 1 | 536 | 0.05 | 26.80 | $452.92
Cohort 2 | 1,300 | 0.05 | 65.00 | $1,098.50
Cohort 1 and Cohort 2 TOTAL | 1,836 | 0.05 | 91.80 | $1,551.42

Private School Principal Survey: private school principals of participating and non-participating schools (Appendix L) | 87 | 0.33 | 28.71 | $1,247.16

Private School Principal Letter: private school principals of participating and non-participating schools (Appendix J) | 87 | 0.05 | 4.35 | $188.96

Public/Charter School Principal Survey: principals of DC public and charter schools (Appendix K) | 177 | 0.33 | 58.41 | $2,537.33

Public/Charter School Principal Letter: principals of DC public and charter schools (Appendix I) | 177 | 0.05 | 8.85 | $384.44

Records from DCPS/Charter Schools: OSSE contact (Appendix B) | 1 | 40 | 40.00 | $1,737.60

Records from participating private schools: participating private school administrators (Appendix C) | 45 | 2 | 90.00 | $3,909.60

Total | 3,057 | | 1,155.75 | $21,795.88

a/ Cohort 1 includes impact sample members who applied in spring 2012; follow-up data will be collected on cohort 1 in spring 2013, 2014, and 2015. Cohort 2 includes impact sample members who apply in spring 2013; follow-up data will be collected on cohort 2 in spring 2014, 2015, and 2016. The information in this table describes the surveys and burden for one (annual) cycle of data collection.

b/ Assumes an hourly rate of $43.44 for principals and administrators (derived from the Bureau of Labor Statistics' Occupational Employment and Wages for educational administrators and teachers, May 2011) and $16.90 per hour for parents in households with 4.5 family members meeting eligibility requirements for free lunch (derived from http://www.gpo.gov/fdsys/pkg/FR-2012-03-23/pdf/2012-7036.pdf).

c/ In response to recommendations from the Technical Working Group (TWG) after the sixty-day comment period, three questions were added to the Parent Survey, increasing the estimated burden per response from 0.25 to 0.33 hours.
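
The dollar figures in Table 4 follow mechanically from responses x hours per response x the footnote b/ wage rates. As a quick illustration (a minimal sketch; the choice of rows and the use of Python are ours, purely to make the arithmetic explicit), the following reproduces several of the table's entries:

```python
# Reproducing Table 4's arithmetic: responses x hours per response gives the
# total annual hours, and hours x the footnote b/ wage rate gives the dollars.
PARENT_RATE = 16.90      # $/hour for parents (footnote b/)
PRINCIPAL_RATE = 43.44   # $/hour for principals and administrators (footnote b/)

rows = [
    # (label, responses, hours per response, hourly rate)
    ("Parent Survey, cohort 1",   536, 0.33, PARENT_RATE),
    ("Parent Survey, cohort 2",  1300, 0.33, PARENT_RATE),
    ("Private Principal Survey",   87, 0.33, PRINCIPAL_RATE),
    ("Private school records",     45, 2.00, PRINCIPAL_RATE),
]
for label, n, hours, rate in rows:
    total_hours = n * hours
    print(f"{label}: {total_hours:.2f} hours, ${total_hours * rate:,.2f}")

# First line of output matches Table 4:
# Parent Survey, cohort 1: 176.88 hours, $2,989.27
```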



A.13 Estimate of Cost Burden to Respondents


There are no additional respondent costs associated with this data collection other than the hour burden estimated in Item A.12.





A.14 Estimate of Annual Cost to the Federal Government


The estimated cost to the federal government of conducting the Impact Evaluation of the DC Opportunity Scholarship Program is based on the government's contracted cost of the data collection and related study activities, along with the personnel costs of government employees involved in oversight and/or analysis. For the data collection activities for which OMB approval is currently being requested, the overall cost to the government is $2,658,654. This includes:


  • $560,813 for the first year of data collection, including instrument development

  • $648,840 for the second year of data collection

  • $695,716 for the third year of data collection

  • $753,285 for the fourth year of data collection


The overall cost to the government of the full range of evaluation activities will be $6,068,728 over the five-year study period. When annualized, this cost amounts to $1,213,746 per year. This estimate is based on the evaluation contractor's previous experience managing other research and data collection activities of this type.


A.15 Program Changes or Adjustments


This is a reinstatement of the #1850-0800 collection that was discontinued on 6/23/2011. The previous evaluation tested students only on Saturdays, when they were not in school; this evaluation includes both in-school and Saturday testing.


A.16 Plans for Tabulation and Publication of Results


The focus of the analyses and reports will be evidence regarding: (1) who applies for and uses a scholarship; (2) what impacts the offer and use of a scholarship have on student test scores, student and parental satisfaction and perceptions of safety, parental engagement, and other participant outcomes; and (3) whether principals at DC public and private schools modify how they manage their schools in response to the DC Opportunity Scholarship Program. Rigorous technical standards will be applied in analyzing the data.


Analytic Strategy


The centerpiece of the analytic strategy is the experiment created by the use of a lottery to choose among applicants to the program. The lottery serves as a randomization device exactly as would be done for an experimental study. Because of this equivalence, the text refers to a “treatment group” and a “control group,” which should be understood as “applicants selected by the lottery to receive a scholarship” and “applicants not selected to receive a scholarship.”


The lottery enables the study to estimate effects of school choice on student outcomes that are free of “selection bias,” which can arise when families exercise their ability to choose schools, or to choose neighborhoods based on their local schools. For example, families that value education outcomes highly may appear similar in outward respects to families that value these outcomes less, but the former may prefer private schools. If we simply compare outcomes for students from similar families that do and do not attend private schools, or do and do not apply for an OSP scholarship, the differences will conflate the effect of the OSP with pre-existing differences in family attributes. Using chance – random assignment through a lottery – to determine which applicants receive a scholarship eliminates the effects of family selection into the program and allows us to isolate the effects of the program itself.


To motivate the discussion of how we identify the effect of the scholarship program on outcomes, it is useful to begin with a simple representation of the selection problem using the potential outcomes approach. This approach defines causal effects in terms of potential outcomes, or counterfactuals. Conceptually, the causal effect of treatment is defined as the difference between the outcome for an individual who is assigned to the treatment group and the outcome for that same individual when he or she is not assigned to the treatment group, or the following difference in expected outcomes:


(E.1) E(Yi | Xi, Ti = 1) − E(Yi | Xi, Ti = 0)


In the case of scholarships, the treatment effect – the effect of the scholarships on academic achievement – would be defined as the difference between “test scores for program students” and “test scores for program students if they had not received a scholarship.” The outcome in the absence of treatment, E(Yi | Xi, Ti = 0), is termed the counterfactual: what would have happened to the students receiving scholarships if they had not received them.


Of course a student cannot be observed simultaneously both assigned to treatment and not assigned to treatment, or, in the context of the DC OSP evaluation, chosen to receive a scholarship and not chosen to receive a scholarship. What can be observed are a student in the treatment group (Ti =1) and another student in the control group (Ti =0). By the logic of the randomized lottery, the average student receiving scholarships is the same as the average student not receiving scholarships in terms of both observable and unobservable characteristics. The lottery needs to be correctly implemented, of course, meaning that no systematic bias is present in assigning random numbers to individual applicants, but algorithms for doing so have been well established for years.


Consistent with the lottery approach is a basic analytic model of the effects of school choice scholarships on outcomes. It is reasonable to assume that a student’s test score (Yit) is related to his or her characteristics, which we label X, factors that are not observed by the study, which we label ε, and, crucially, whether they received a scholarship offer, which we label T. Receiving a scholarship offer is assumed to affect the school attended and therefore a student’s learning environment. (We return below to issues arising when students do not exercise their scholarship offers.)


(E.2) Yit = μ + τTit + Xiβ + εit


In this model, τ represents the “treatment effect,” the effect of a scholarship offer on test scores for students. It should be identical to the difference in average outcomes between the treatment and the control groups. However, including characteristics that predict future achievement (the X characteristics) will improve precision of the estimated impact by reducing the amount of natural variability arising in test scores.


The simple model needs to be modified slightly because the statute specified that certain groups of students be given priority in the lottery. To implement these requirements, the lottery included three groups of students: students with no priority, students with a priority because they attended a school in need of improvement or because they previously were offered a scholarship but never used it, and the highest priority, students who had a sibling already in the program. The probability of receiving a scholarship was highest for siblings and lowest for students with no priority. Students were categorized into their priority status and the lottery was conducted within that “block.”
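
To make the within-block lottery concrete, the sketch below shows one way a blocked drawing can be implemented. It is illustrative only: the function name, block labels, and award counts are hypothetical and do not represent the program operator's actual algorithm.

```python
import random

def run_blocked_lottery(applicants, awards_by_block, seed=2012):
    """Randomly select scholarship winners separately within each priority block.

    applicants: dict mapping block label -> list of applicant IDs
    awards_by_block: dict mapping block label -> number of awards for that block
    Returns a dict mapping applicant ID -> 1 (offered) or 0 (not offered).
    """
    rng = random.Random(seed)   # fixed seed makes the drawing reproducible and auditable
    assignment = {}
    for block, ids in applicants.items():
        order = ids[:]                  # copy so the caller's list is untouched
        rng.shuffle(order)              # random order within the block
        n_awards = min(awards_by_block[block], len(order))
        for rank, applicant in enumerate(order):
            assignment[applicant] = 1 if rank < n_awards else 0
    return assignment

# Hypothetical example: award probability rises with priority status
applicants = {
    "sibling":        [f"S{i}" for i in range(20)],
    "SINI_or_unused": [f"P{i}" for i in range(50)],
    "no_priority":    [f"N{i}" for i in range(100)],
}
awards = {"sibling": 18, "SINI_or_unused": 30, "no_priority": 40}
offers = run_blocked_lottery(applicants, awards)
```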


To accommodate this feature, the statistical model uses a “randomized block design”:


(E.3) Yikt = μ + τTikt + Σj ρjBij + Xikβ + εikt


where

i = 1,…,n students and k = 1,…,b blocks defined by priority status;

Yikt is the outcome (e.g., test score) for student i in block k at time t;

μ is the overall mean outcome;

τ is the treatment effect;

ρj is the shift of the constant for the j-th block (j = 1,…,b);

Bij is an indicator variable equal to “1” if applicant i is in the j-th block;

Tikt is an indicator variable for receiving a scholarship offer;

Xik is a set of student and family characteristics measured at baseline, with coefficient vector β; and

εikt is the random error, distributed N(0, σ2).


This analytical framework follows naturally from the lottery and is easily interpreted. Y can be a range of outcomes such as test scores, student satisfaction, parental satisfaction, grade completion, high school graduation, and so on.


Further, effects for particular subgroups of students can be estimated straightforwardly by interacting an indicator of subgroup status with the treatment indicator, so that effects can be estimated for older students versus younger students, low-scorers at baseline versus higher scorers, and so on.
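
To illustrate how the randomized block model in E.3 can be estimated in practice, the sketch below fits the regression on simulated data. This is a minimal sketch, assuming the pandas and statsmodels libraries; the block labels, award probabilities, and effect sizes are invented for the example and are not study values.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1800  # roughly the size of the evaluation sample

# Simulated data: three priority blocks with different award probabilities,
# a baseline covariate x (e.g., a pretest score), and an outcome y.
block = rng.choice(["sibling", "SINI", "none"], size=n, p=[0.1, 0.3, 0.6])
p_offer = pd.Series(block).map({"sibling": 0.9, "SINI": 0.6, "none": 0.4})
T = rng.binomial(1, p_offer)                      # lottery offer within block
x = rng.normal(size=n)
tau = 0.15                                        # invented "true" ITT effect
rho = pd.Series(block).map({"sibling": 0.2, "SINI": -0.1, "none": 0.0})
y = 0.5 + tau * T + 0.6 * x + rho + rng.normal(size=n)

df = pd.DataFrame({"y": y, "T": T, "x": x, "block": block})

# E.3 as a regression: outcome on the treatment indicator, block indicators
# (the B terms), and baseline covariates (the X terms).
fit = smf.ols("y ~ T + C(block) + x", data=df).fit()
print(fit.params["T"], fit.bse["T"])  # estimated ITT effect and its standard error
```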


Take-Up of Scholarships


The offer of a scholarship does not carry any obligation to use it and we expect some scholarship winners to forgo it. For example, the previous study of the DC Opportunity Scholarship Program found that 25 percent of scholarship winners did not use their scholarship.


It is common in experimental settings for an applicant to be offered treatment but to refuse it. Analysts have structured two kinds of estimates to account for this. The first, commonly referred to as “Intent to Treat” (ITT), is the effect of the offer of a scholarship on student outcomes. Equation E.2 above estimates an ITT effect because it uses only information about whether students were awarded the scholarship, not information about whether they used it.


The second, commonly referred to as the effect of “treatment on treated” (TOT), estimates the effect of using a scholarship. An important difference between TOT and ITT estimates is that, by design, ITT estimates are unbiased (the estimate equals the true effect plus variance introduced by sampling), whereas TOT estimates may be biased to the extent that students who use their scholarships differ from those who are offered but do not use them (the estimate equals the true effect plus sampling variance plus a component related to these systematic differences). The source of the bias is that the lottery ensures that the group receiving an offer of a scholarship and the group not receiving an offer have the same characteristics on average, but it does not ensure that the group using the scholarship has the same characteristics as the group not using it or not receiving an offer. For example, families who value education and/or are more able to gather and analyze information about schools may be more likely to use the scholarship and to have better education outcomes than treatment group families who choose not to use it.


The study currently plans to use two methods to estimate TOT effects. The first and simpler is the “Bloom” adjustment, which essentially is the ITT effect divided by the difference between the proportion of the treatment group that uses its scholarships and the proportion of the control group that chooses to attend private schools in the absence of a scholarship (not receiving an offer of a scholarship does not preclude a family from sending their child to a private school using their own resources).5 Both groups are “treated” by attending private schools.
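
A worked illustration of the Bloom adjustment, with hypothetical rates (the 75 percent usage rate echoes the prior study's 25 percent non-use; the 10 percent control-group private school attendance rate is assumed purely for the example):

```python
def bloom_tot(itt, p_treatment_private, p_control_private):
    """Bloom (1984) adjustment: divide the ITT estimate by the difference in
    private school attendance rates between the treatment and control groups.
    Undefined when the two rates are equal (the denominator is zero)."""
    denominator = p_treatment_private - p_control_private
    if denominator == 0:
        raise ValueError("attendance rates are equal; Bloom estimator undefined")
    return itt / denominator

# Hypothetical numbers: a 0.10 ITT effect, 75% of the treatment group using the
# scholarship, and 10% of the control group attending private school anyway.
print(bloom_tot(0.10, 0.75, 0.10))  # 0.1538..., the implied TOT effect
```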


Two assumptions underpin the interpretation of effects estimated using the Bloom adjustment: (i) students who receive a scholarship offer but elect not to use it to attend private school would not have attended private school had they not received an offer, and (ii) students who do not use their scholarships do not experience an effect of the offer. The first assumption is innocuous: it seems highly likely that a student who receives a money offer to attend a private school and chooses not to attend would also choose not to attend without the money offer. The second assumption is also plausible: it says that the mere offer, if unused, has no effect on the student’s outcomes.6


Another, more technically sophisticated approach to estimating TOT effects is to estimate a two-equation model that explicitly considers the choice to attend a private school along with student outcomes. This approach is called “instrumental variables” (IV):7


(E.4) Pi = λ0 + λ1Ti + Ziλ2 + εi (Attending private school)


(E.5) Yi = π0 + π1Pi + Xiπ2 + νi (Student outcome)


where i represents the student, P represents an indicator variable equal to “1” for attending private school, T represents treatment status (“1” if selected in the lottery), and Z and X represent student and family characteristics.


For technical reasons, some variables in the attendance equation are not also in the outcome equation. The ideal variables for this stage are ones that predict take-up of the offer but which are not related to education outcomes except through their effect on attending private school, so-called “instrumental variables.” The treatment indicator (lottery assignment) is the best such variable because being offered a scholarship is by design uncorrelated with outcomes, but it will be correlated with attending a private school.8
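
The two-stage logic of E.4 and E.5 can be sketched with ordinary least squares applied stage by stage. The simulation below is illustrative only: the data are invented, the "true" attendance effect of 0.25 is arbitrary, and a production analysis would use a dedicated IV routine (among other things, the second-stage standard errors from this naive two-step procedure require adjustment). The instrument is the lottery offer T.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1800

# Simulated data mirroring E.4 and E.5: T is the lottery offer (the instrument),
# P is private school attendance, Y is the outcome. An unobserved family factor
# u raises both attendance and outcomes, which would bias a naive OLS of Y on P.
T = rng.binomial(1, 0.5, size=n).astype(float)
u = rng.normal(size=n)
P = ((0.8 * T + 0.5 * u + rng.normal(size=n)) > 0.7).astype(float)
Y = 0.25 * P + 0.4 * u + rng.normal(size=n)       # true effect of attendance: 0.25

# First stage (E.4): regress attendance on the instrument; keep fitted values.
X1 = np.column_stack([np.ones(n), T])
lam = np.linalg.lstsq(X1, P, rcond=None)[0]
P_hat = X1 @ lam

# Second stage (E.5): regress the outcome on predicted attendance.
X2 = np.column_stack([np.ones(n), P_hat])
pi = np.linalg.lstsq(X2, Y, rcond=None)[0]
print(pi[1])  # IV estimate of the effect of attending private school (~0.25)
```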


ITT and TOT effects do not represent the same concept, and the study’s reports will note this. The ITT effect is the change in outcomes caused by an offer of a scholarship. The TOT effect is the change in outcomes caused by attending a private school. As Rouse9 notes, the first effect is relevant for policy because programs can only be created to offer scholarships; individuals cannot be required to use them. The overall effect of such a program on outcomes is a combination of the proportion of families that use the scholarship and the effect of attending a private school for those that do. The second piece is the TOT effect, which is relevant for families and also for policymakers because it represents how attending a private school can improve outcomes, which presumably is one of the rationales for the program in the first place. (If private and public schools were equally effective, offering scholarships to attend private schools would transfer resources but not improve outcomes.) Both effects contribute to what is known about voucher programs.


Reports


Because of the annual reporting requirements for the evaluation, we will prepare two descriptive reports and three impact reports. The descriptive reports will provide comprehensive descriptions of the eligible applicants who went through the lottery (e.g., percentage attending SINI schools), comparisons of the treatment and control group students (e.g., race/ethnicity, gender, annual family income, and average years of mother’s education), and characteristics of the private schools attended by the treatment group students (e.g., average tuition, average size, average percentage of minorities, average student/teacher ratio). The first descriptive report will focus on the evaluation’s cohort 1, while the second will also include cohort 2.


The impact reports will provide information to answer the five key research questions, as well as describe private school and student participation in the program. Each report will begin with an introduction to the DC Opportunity Scholarship Program; a description of the lottery process that produced the pool of eligible applicants and the impact sample; the characteristics of the private schools participating in the program (e.g., religious affiliation, tuition, enrollment); context about the environment in which the program is operating; and contextual information about the participating students (e.g., usage of the scholarships and movement into and out of public and private schools), which provides some signal of interest in the program. Remaining chapters will provide findings on the impact of the program on student achievement; the impact of the program on other measures of student success (if data availability permits); the impact of the program on parent and student reports of school satisfaction and safety, and on parent involvement; an examination of the reasons why parents choose to participate in the program; and an examination of the patterns in the instructional environment and opportunities afforded to students in the treatment and control groups (e.g., types of instructional programs offered, physical aspects of the school’s facilities). Appendices will provide technical and more comprehensive discussions of the analytic techniques, construction of measures, and response rates.


A schedule for the reports is provided in Table 5.

Table 5. Deliverable Schedule

Report 1. Descriptive report for lottery applicants in spring 2012
Contents: For the lottery applicants in spring 2012, description of the eligible applicants who went through the lottery; comparisons of treatment and control students on demographic information from the applicants; usage by the treatment group students in their first year in the program; characteristics of the private schools attended by the treatment group students, from publicly available data.
Expected release date: June 30, 2013

Report 2. Descriptive report for lottery applicants in spring 2013
Contents: For the lottery applicants in spring 2013, description of the eligible applicants who went through the lottery; comparisons of treatment and control students on demographic information from the applicants; usage by the treatment group students in their first year; characteristics of the private schools attended by the treatment group students, from publicly available data. For the lottery applicants in spring 2012, usage in their second year in the program.
Expected release date: June 30, 2014

Report 3. Impact report for lottery applicants after 1 year (interim report)
Contents: Impacts after one year in the program on student achievement, parent and student satisfaction, parent and student perceptions of safety, and parent involvement; patterns in the instructional environment and opportunities of students in the treatment group vs. students in the control group; why parents choose to participate in the program; characteristics of the participating private schools; usage by the treatment group in the program.
Expected release date: June 30, 2015

Report 4. Impact report for lottery applicants after 2 years (interim report)
Contents: Impacts after two years in the program on student achievement, parent and student satisfaction, parent and student perceptions of safety, and parent involvement; patterns in the instructional environment and opportunities of students in the treatment group vs. students in the control group; why parents choose to participate in the program; characteristics of the participating private schools; usage by the treatment group in the program.
Expected release date: June 30, 2016

Report 5. Impact report for lottery applicants after 3 years (final report)
Contents: Impacts after three years in the program on student achievement, parent and student satisfaction, parent and student perceptions of safety, and parent involvement; patterns in the instructional environment and opportunities of students in the treatment group vs. students in the control group; why parents choose to participate in the program; characteristics of the participating private schools; usage by the treatment group in the program.
Expected release date: July 30, 2017



A.17 Approval to Not Display the OMB Expiration Date


All data collection instruments will include the OMB expiration date.



A.18 Explanation of Exceptions


No exceptions are requested.














APPENDIX A


The Authorizing Legislation













APPENDIX B


Public/Charter School Form




























APPENDIX C


Private School Form









APPENDIX D


Letters to Parents












APPENDIX E


Parent Survey













APPENDIX F


Elementary Student Survey











APPENDIX G


Middle School Student Survey












APPENDIX H


High School Student Survey












APPENDIX I


Public/Charter School Principal Letter




























APPENDIX J


Private School Principal Letter












APPENDIX K


Public/Charter School Principal Survey











APPENDIX L


Private School Principal Survey












APPENDIX M


Consent Form












APPENDIX N


Confidentiality Statement














1 In May 2012, a grant to run the program was awarded to the DC Children and Youth Investment Trust Corporation (“Trust”), a non-profit organization that operates a privately-funded scholarship program for students in the DC area.

2 Although students who attend a private school when they apply to the OSP are eligible for a scholarship and may be awarded one through a lottery, these students are not included in the evaluation because the “treatment” for them differs significantly from the OSP treatment for students from public schools. For students already attending a private school when they apply, the lottery determines only who pays their private school tuition – the federal OSP program vs. other scholarship programs or the families themselves. We have no hypothesis that this difference could result in improved achievement, although it could affect family resources. In contrast, the lottery of public school applicants in most cases determines whether a student attends a private school or a public school, and there is a body of evidence suggesting that such differences in school settings could lead to differences in achievement.

3 The baseline data collection was approved on November 3, 2011 (#1855-0015).

4 Of the two most common assessments that fit the statute’s criteria, we have chosen the Terra Nova 3 over the Stanford Achievement Test 10 (SAT 10) because of its ease of administration, the shorter completion time for students in most grades, and the test publisher’s commitment to return test score data to the evaluation team much more quickly than was the case with the other publisher. The latter will help us meet the tight deadlines for reporting results contained in the statute.

5 Howard S. Bloom, “Accounting for No-Shows in Experimental Evaluation Designs,” Evaluation Review, 8 (1984): 225-246. This estimator is undefined if the proportion of the treatment group that uses its scholarships equals the proportion of the control group that attends private schools, because the denominator is then zero.

6 This assumption is less plausible in other settings, such as when workers receiving unemployment insurance are randomly assigned to participate in employment training. In that context, workers may opt to return to work sooner rather than participate in training, which is an effect of the training offer itself.

7 The standard reference for estimating treatment effects using instrumental variables, and for the assumptions underlying the technique, is Joshua D. Angrist, Guido W. Imbens, and Donald B. Rubin, “Identification of Causal Effects Using Instrumental Variables,” Journal of the American Statistical Association, Vol. 91, No. 434 (June 1996), pp. 444-455.

8 If only T is included in the attendance equation, the estimator is equivalent to the Bloom estimator described above.

9 Cecilia Elena Rouse, “Private School Vouchers and Student Achievement: An Evaluation of the Milwaukee Parental Choice Program,” The Quarterly Journal of Economics, May 1998, pp. 553-602.

