Responses to Comments from Public Notice
Evaluation of the D.C. Opportunity Scholarship Program (OSP)
FR Docket # 2012-28317
ED-2012-ICCD-0055-0001
A total of seven entities commented during the 60-day notification period for the Evaluation of the D.C. Opportunity Scholarship Program (OSP). These organizations include:
American Association of University Women (AAUW)
Archdiocese of Washington (AOW)
Association of Christian Schools International (ACSI)
National Catholic Educational Association (NCEA)
The National Coalition for Public Education (NCPE)
Council for American Private Education (CAPE)
Secretariat of Catholic Education (SCE)
ED is pleased to be able to hear from the public about this collection. We appreciate your taking the time to respond and to provide your insights and recommendations about this important study. We have carefully reviewed your comments and listened to your suggestions. Below we provide a response to each of the comments.
The information collection is important because of the mandate to conduct the evaluation and because it will provide data of interest to policy officials and the public.
We appreciate the strong support for the evaluation that was articulated in all of the submissions.
Students should be assessed at their schools, during the school day, to minimize burden on them and their families and to help speed production of reports.
ED agrees that students should be assessed during the school day at the school site. As noted on p. 3 and elsewhere in Part A of the information collection package, the data collection plan specifies that all students, both those in the treatment group and those in the control group, will be tested at their school during the school day. Our contractor will work closely with each school to arrange the testing logistics for each student in a way that minimizes loss of instructional time and disruption to the school. An informational meeting has already taken place with several private school administrators to obtain their recommendations about how best to arrange in-school testing. Similar conversations have taken place with key officials in DC Public Schools.
We appreciate the expressed concern about timely determination of where students attend school and timely publication of reports. Considerable thought has been given to each of these issues. Our plan is to collect and process the data efficiently so that required reports are produced on schedule. A detailed timeline for data collection and report writing has been developed and will be monitored closely throughout the study.
Items about the application process should be added to the survey instruments.
Although such information could be useful to improve program operations, this information collection is designed to address the congressional mandate to evaluate the effects of the program on students’ educational outcomes and student and parent perceptions of school experiences, satisfaction and safety. Given the limitations in evaluation resources and the feasible length of any survey, proposed questions about the application process would have to displace other questions that are more germane to the study. We do not plan to make such a change at this time.
Data other than parent reports should be used to obtain high school graduation information.
As noted on p. 5 of Part A of this information collection, school records will be obtained for students, whether in the public or private schools, so that persistence and graduation can be calculated from these sources.
The evaluation should collect information about whether the schools of participants are single-sex or sex-segregated.
We appreciate this suggestion and have added two questions to the principal survey requesting this information.
The evaluation should collect quality control or monitoring information about the private schools, such as accreditation, financial sustainability, fiscal controls, individual teacher credentials, religious content, anti-discrimination policies, etc.
Although such information could be useful to improve program operations and monitoring, this information collection is designed to address the congressional mandate to evaluate the effects of the program on students’ educational outcomes and student and parent perceptions of school experiences, satisfaction and safety. Given the limitations in evaluation resources and the feasible length of any survey, these proposed topics and survey items would have to displace other questions that are more germane to the study. We do not plan to make such a change at this time.
The evaluation should collect more information about the characteristics of private schools.
Given the limitations in evaluation resources and the feasible length of any survey, additional questions on the characteristics of the private schools would have to displace other questions that are more germane to the study. We do not plan to make such a change at this time.
However, we do note that there are other sources of this information, beyond the surveys and data collection discussed in this package, that the evaluation will draw on. For example, the National Center for Education Statistics (NCES) collects descriptive institutional data on all private schools in the country every few years. In addition, the Trust, the operator of the OSP, conducts its own survey of private schools in DC and produces a directory of schools for parents that contains some of the information the commenter is interested in. The evaluation will obtain these data and use them in analyses describing the private schools.
The evaluation should collect information about applications, awards, and use of scholarships, as well as about students who leave the program and why.
The parent and student surveys included in this information collection package contain questions about why families apply to, participate in, and leave the program. Data on applications and awards come from the program operator and are used in the evaluation’s analyses.
The parent survey should be available in different forms and outreach should be similarly diverse to obtain a high rate of response.
The parent survey has been designed to be “multi-mode.” It will be administered primarily as a web survey, with options to complete it by telephone or on paper. There will be many letters, phone calls, and emails to encourage survey response. We are planning to translate the survey into different languages based on parent need.
The evaluation should consider increasing the number of students included to about 2,000.
The evaluation used statistical analyses and data from past voucher studies to determine the sample size needed to have a reasonable degree of confidence that any differences between students offered scholarships (treatment group) and those not offered scholarships (control group) are real and not due to chance. The 1,800 OSP applicants who participate in a lottery should be considered closer to a minimum than a maximum sample required for the evaluation. The final number of students in the evaluation will depend on interest in the program (i.e., the number of eligible applications) and the funds available for new scholarships, neither of which is determined by the evaluation.
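To illustrate the kind of statistical power reasoning described in the response above, the sketch below shows a generic sample size calculation for a two-group comparison. The effect size, significance level, power, and the use of the statsmodels library are illustrative assumptions for this sketch only; they are not the evaluation's actual design parameters or tools.

```python
# Illustrative sample size calculation for comparing a treatment group
# (offered a scholarship) with a control group (not offered one).
# All parameters here are assumptions for demonstration only.
from statsmodels.stats.power import TTestIndPower

power_analysis = TTestIndPower()
n_per_group = power_analysis.solve_power(
    effect_size=0.15,  # assumed minimum detectable effect, in standard deviations
    alpha=0.05,        # two-sided significance level
    power=0.80,        # desired probability of detecting a true effect of that size
    ratio=1.0,         # equal treatment and control group sizes
)

# Under these assumptions, roughly 700 students per group (about 1,400 total)
# are needed, before any upward adjustment for attrition and survey nonresponse.
print(round(n_per_group))
```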
The assessment data for individual students should be provided to participating schools to aid in instruction.
Various federal statutes prohibit the evaluation from disclosing any data about individual students to anyone except their parents.
Data analysis for the evaluation should address subgroups of students, such as those based on type of school attended. The analysis should also include descriptive data, such as how many students applied and enrolled, what kinds of schools they came from (e.g., those in need of improvement), whether a student has been diagnosed with a disability, and retention and graduation rates.
The evaluation plans to provide descriptive information on applications and awards, participating schools, and students. However, we will only calculate program impacts for groups of students based on their characteristics prior to applying to the program (e.g., the type of school they came from). We cannot examine the impacts of the program for students who entered certain types of private schools after being awarded a scholarship because, while we will know which students in the treatment group made that choice, we cannot know which students in the control group would have made that choice if given the opportunity through the lottery.
The evaluation should be designed to provide impacts not only for the offer of an OSP scholarship but also for the use of a scholarship.
We are uncertain why this comment was included, since Part A of this package describes analyzing the impacts of both the offer and the use of a scholarship, and the reports from the prior evaluation of the OSP conducted by ED included several ways of analyzing the effects of using a scholarship.
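For context on the distinction between the offer and the use of a scholarship, one standard way to convert an estimated impact of the offer into an estimated impact of use, when some awardees never use their scholarships, is the no-show (Bloom) adjustment sketched below. This is a generic illustration of that relationship, not a statement of the evaluation's analysis plan.

$$\widehat{\mathrm{TOT}} \;=\; \frac{\widehat{\mathrm{ITT}}}{\bar{u}}$$

Here ITT (intent to treat) is the difference in mean outcomes between students offered and not offered a scholarship, TOT (treatment on the treated) is the implied impact on scholarship users, and \(\bar{u}\) is the share of the treatment group that actually used a scholarship, under the assumption that control group members cannot use one.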