U.S. DEPARTMENT OF EDUCATION
INSTITUTE OF EDUCATION SCIENCES
NATIONAL CENTER FOR EDUCATION STATISTICS

Date: May 20, 2010

To: Rochelle W. Martinez, OMB

Through: Kashka Kubzdela, NCES

From: Peter Tice, NCES

Subject: Fast Response Survey System (FRSS) 99: District Survey of Dropout Prevention Services and Programs
Justification
The National Center for Education Statistics (NCES), U.S. Department of Education (ED) proposes to employ the Fast Response Survey System (FRSS) to conduct a district survey about dropout prevention services and programs. This survey will provide the first nationally representative data on this topic by capturing a current snapshot of the dropout prevention services and programs available within the nation's public school districts. In addition to the services and programs themselves, the proposed survey will cover the factors and methods used to identify students at risk of dropping out; mentoring and transition support services used by the district; the entities with which districts work in their dropout prevention efforts; information provided to students who appear highly likely to drop out; follow-up efforts when a student drops out; and the information used by the district in determining whether to implement additional dropout prevention efforts district-wide.
The FRSS survey, under OMB clearance #1850-0733, is authorized under the Education Sciences Reform Act of 2002 (20 U.S.C. 9543), which authorizes NCES to collect and report statistical data related to education in the United States.
Design
Overview of Survey Development
Westat will collect the information for the Early Childhood, International, and Crosscutting Studies Division, NCES, U.S. Department of Education, using the FRSS. Westat is responsible for the questionnaire development; sample design and selection; data collection by mail and web; telephone follow up; editing, coding, keying, and verification of the data; and production of tabulations and the report detailing the results of the survey.
The development work for this survey included a literature review on dropout prevention programs and services and four rounds of feasibility calls that informed the survey design. The first two rounds of feasibility calls were conducted to identify topics that could be addressed in a short questionnaire, and they informed the initial draft of the survey. The third and fourth rounds were conducted to assess the clarity and relevance of the developed survey items and to gauge whether respondents thought they could answer the questions without undue burden. In the third round of feasibility calls, respondents reviewed the draft survey and provided feedback over the phone for all survey items. Based on their feedback, the survey was revised and, in the fourth round, respondents provided feedback only about the items and definitions that had changed since the third round of calls. The four rounds of feasibility calls were conducted between March 2009 and February 2010. The resulting draft of the survey was then reviewed by the NCES Quality Review Board (QRB).
Based on feedback from the QRB, the survey was revised, and a pretest of the questionnaire was conducted to identify problems respondents might have in providing the requested information. The purpose of the pretest was to verify that all questions and corresponding instructions were clear and unambiguous, to determine whether the information would be readily available to respondents, and to determine whether the burden on respondents could be further reduced. Responses and comments on the pretest questionnaire were collected by fax and telephone. Changes to the questionnaire were made based on the feedback received from the pretest and documented in a memorandum summarizing the pretest results.
Description of Sample and Burden
The proposed sample design is a nationally representative sample of approximately 1,200 public school districts drawn from the NCES Common Core of Data (CCD) 2007-08 Local Education Agency (School District) Universe File.1 The data collection will be accomplished by means of a self-administered survey. Respondents will have the option of completing the survey on a traditional paper-and-pencil questionnaire or on a Web version of the questionnaire accessed through the Internet. The questionnaire is limited to three pages of items covering information readily available to respondents and can be completed by most respondents in 20 minutes or less. These procedures are typical for FRSS surveys and result in minimal burden on respondents.
Questionnaires and information needed to access the Web survey will be mailed in September 2010 to the superintendent of each sampled school district. Follow-up for nonresponse will be conducted by both mail and telephone and will begin about 3 weeks after the questionnaires have been mailed to the districts. Experienced telephone interviewers will be trained to conduct the nonresponse follow-up and will be monitored by Westat supervisory personnel. Telephone nonresponse follow-up is used to prompt respondents to complete the survey by web or mail and is expected to take about 5 minutes.
The response rates for FRSS surveys of districts typically have been 90 percent or greater. At a response rate of 90 percent, the initial sample of approximately 1,200 districts will yield about 1,080 completed questionnaires. Based on a response burden of approximately 20 minutes per completed questionnaire, the burden to complete the questionnaire is estimated to be about 360 hours (table 1). It is anticipated that about 25 percent of the sample will have returned the completed survey before nonresponse follow-up begins, and that about 75 percent of the sample (i.e., 900 districts) will receive a nonresponse follow-up call taking about 5 minutes. The total estimated burden for nonresponse follow-up is about 75 hours, and the total burden for data collection and nonresponse follow-up is about 435 hours. (A short arithmetic sketch reproducing these figures appears after table 1.)
Table 1. Estimated burden for data collection and nonresponse follow-up

Type of collection | Sample size | Estimated response rate (percent) | Estimated number of respondents | Estimated number of responses | Total burden hours per respondent | Respondent burden hours
District questionnaire | 1,200 | 90 | 1,080 | 1,080 | .333 | 360
District nonresponse follow-up call | 1,200 | 75 | 900 | 900 | .083 | 75
Total burden | | | 1,080 | 1,980 | | 435
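For readers who want to trace the arithmetic, the following minimal Python sketch (illustrative only; the variable names are not part of the survey materials) reproduces the burden figures in table 1:

    # Illustrative recomputation of the table 1 burden estimates.
    sample_size = 1200
    response_rate = 0.90          # expected response rate for FRSS district surveys
    questionnaire_minutes = 20    # time per completed questionnaire
    followup_share = 0.75         # share of sample expected to receive a follow-up call
    followup_minutes = 5          # time per follow-up call

    respondents = round(sample_size * response_rate)                # 1,080 completes
    questionnaire_hours = respondents * questionnaire_minutes / 60  # 360 hours
    followup_calls = round(sample_size * followup_share)            # 900 calls
    followup_hours = followup_calls * followup_minutes / 60         # 75 hours
    total_hours = questionnaire_hours + followup_hours              # 435 hours
    print(respondents, questionnaire_hours, followup_calls, followup_hours, total_hours)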
Procedures and Data Collection Instrument
A cover letter (Attachment 1), questionnaire (Attachment 2), and web information sheet (Attachment 3) will be mailed to each sampled district. The cover letter requests the participation of the district and introduces the purpose and content of the survey. It also notes that the survey should be completed by the person most knowledgeable about dropout prevention services and programs in the school district. The cover letter also includes instructions on how to complete and return the survey, as well as contact information in case of queries. A Web information sheet will also be included in the mailing which will provide information about the option to complete a Web version of the survey. On the cover of the survey, respondents are assured that their participation is voluntary and their answers may not be disclosed or used in identifiable form for any other purpose unless compelled by law (Education Sciences Reform Act of 2002, 20 U.S.C. 9573). The public law is cited on the front page of the survey (Attachment 2). All sampled districts that do not complete a survey within 3 weeks after the initial mailing of the survey will also receive a nonresponse follow-up letter (Attachment 4), another copy of the Web information sheet (Attachment 3), and a brief, scripted telephone call (Attachment 5) prompting the respondent to return a completed survey via the Web or mail.
The survey is designed to collect basic information on dropout prevention services and programs offered for students enrolled in a public school district. The first two questions ask about the services and programs that are available in the district to address the needs of students who are at risk of dropping out. Some of the services and programs presented are those that may be offered at elementary school, middle/junior high school, and high school instructional levels (e.g., tutoring, summer school to prevent grade retention, and remediation classes), whereas others are typically only offered to students at the secondary level (e.g., General Education Development (GED) preparation courses, and early graduation options). Because the levels at which services and programs may be offered vary, only the first question on the survey asks about services and programs offered at multiple instructional levels. Response options for question 2 are not tied to an instructional level because it is expected, based on the survey development work, that these options are largely available only at the high school level.
The literature and district respondents in feasibility calls indicated that an important component of dropout prevention is offering educational options that are relevant to students' life or career goals. To capture this in the survey, question 3 asks about various educational options available in the district (e.g., career/technical high school, dual enrollment in postsecondary courses with a career/technical focus) and whether students who are at risk of dropping out participate in those educational options. When pretesting this question, districts reported a wide range of reasons why no, few, or only some at-risk students participate in various educational options. Some of the reasons included lack of student interest or motivation, availability only to very high academic performers (such as gifted and talented students), and limited availability of the educational option in the school district (e.g., a very small program available to only a limited number of at-risk students around the district).
Another service that respondents in feasibility calls discussed as helping to prevent students from dropping out is child care while a teen parent is attending classes. During feasibility calls, some respondents indicated that their district provided child care for teen parents, while others indicated that their district subsidized child care for those students. Question 4 asks whether the district provides or subsidizes child care for teen parents while they are attending classes.
The literature and respondents in feasibility calls also indicated that transitions from a school of one instructional level to a school at a higher instructional level can be particularly difficult for students who are at risk of dropping out. Questions 5 and 6 ask about the processes and supports used by the district to help students in such transitions. Transitional supports may include assigning each student a student or adult mentor or offering an advisement class during the first year at the new school.
Question 7 asks about different types of mentors used in the district specifically to address the needs of students at risk of dropping out. The list of the types of mentors was developed based on the literature review and feedback from district respondents. Examples of mentors include student mentors, school counselors, teachers, or school administrators who formally mentor students; adult mentors employed by the district whose only job is to mentor students; and community volunteers who mentor.
One type of program that has been discussed in the literature as effective in reducing the dropout rate in schools and districts is a school-wide or classroom-wide program to reduce behavioral problems. Question 8 asks whether any of the schools in the district use formal programs designed to reduce behavioral problems in schools or classrooms. Because districts may employ these programs at one or multiple instructional levels, respondents are asked to report by instructional level.
Question 9 asks if the district has a standardized method of identifying students who are at risk of dropping out (e.g., a standardized checklist of at-risk behaviors or an electronic warning system). Question 10 asks about the factors used in the district to identify students who are at risk of dropping out. Again, these were identified based on a review of the literature and through discussions with district respondents during feasibility calls about the factors commonly used to identify at-risk students. Among some of the factors that may indicate that a student is at risk are truancy or excessive absences, academic failure, behaviors that warrant suspension or expulsion, and substance abuse.
Because districts often work with other entities to address the needs of students who are at risk of dropping out, question 11 asks about those entities. The list of entities included in the survey was developed based on a review of the literature and discussions with district respondents in feasibility calls. Some of the entities listed include child protective services, local businesses, community mental health agencies, and churches or community organizations (e.g., Boys and Girls Clubs, United Way, and Lions Clubs).
Questions 12 and 13 ask about information provided to students who appear highly likely to drop out, including information about the employment or financial consequences of dropping out and the education and training options available to them (e.g., alternative schools and programs, job training/GED combination programs, GED or adult education programs, and job training programs). For questions 12 and 13, response options include "Yes, this is standard procedure with all students highly likely to drop out", "Yes, with some students", and "No." During feasibility calls and the pretest, some districts indicated that providing this information is standard procedure with all students who appear highly likely to drop out, whereas respondents in other districts indicated that providing the information covered in these questions depended on factors such as an individual student's need or situation, and thus the information is provided to only some students.
Questions 14 and 15 ask about dropout recovery efforts in the district, including whether the district tries to determine the status of students who were expected to return in the fall but did not return as expected, and whether the district follows up with students who have dropped out before the next school year to encourage them to return. Response options for both questions allow districts to indicate that they follow up with all students, some students, or no students. For question 14, some feasibility respondents indicated that follow-up was limited to only some students, such as those within the compulsory school attendance age range. For question 15, respondents provided a range of reasons why the district followed up with only some students who dropped out, including lack of manpower and inability to locate many of the students.
Question 16 asks about information the district uses to determine whether to implement additional district-wide dropout prevention efforts. Some examples of types of information that may be used include dropout rates, graduation rates, and attendance rates. As with the other items in this survey, the list of the types of information for this question was developed based on a review of the literature and from feasibility call discussions with district respondents.
Consultations Outside the Agency
In addition to the four rounds of feasibility calls and the pretest conducted with district respondents and some school-level staff (in the first round of feasibility calls only), general topics were identified through literature reviews and in consultation with Mark Dynarski, an expert on dropout prevention and Director of the IES What Works Clearinghouse. Additional comments were requested on various drafts of the survey from several reviewers outside of NCES, including Ed Pacchetti (Special Assistant to the Senior Advisor on the Secretary's Initiative on College Access), Theda Zawisza (Office of Elementary and Secondary Education (OESE)), and Braden Goetz (OESE).
Survey Cost and Time Schedule
The survey is estimated to cost the Federal government about $480,000, including about $440,000 for contractual costs and $40,000 for salaries and expenses. Contractual costs include the costs for survey preparation, data collection, data analysis, and report preparation and dissemination.
Mailing of the survey is planned for September 2010. About 3 weeks after mailout of the surveys, Westat will begin telephone follow-up for nonresponse. Data collection is scheduled for completion about 16 weeks after the initial mailout.
Plan for Tabulation and Publication
Most of the analyses of the questionnaire data will be descriptive in nature, providing data users with tables and appropriate explanatory text. Reports of the findings will be distributed to survey respondents and, upon request, to other interested individuals and organizations, as well as published on the NCES website. Survey responses will be weighted to produce national estimates. Tabulations will be produced for each data item. Crosstabulations of data items will be made with selected classification variables, such as the following (an illustrative sketch of such a weighted tabulation appears after the list):
District enrollment size (less than 2,500, 2,500-9,999, and 10,000 or more);
Region (Northeast, Southeast, Central, West);
Community type (city, suburban, town, rural); and
Poverty concentration (less than 10 percent, 10 to 19 percent, 20 percent or more).
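To illustrate how such weighted national estimates and crosstabulations might be computed, the following is a minimal Python sketch; the records, weights, and variable names below are hypothetical and not drawn from any survey file:

    # Hypothetical responding districts: (nonresponse-adjusted weight, community type,
    # whether the district reports a given dropout prevention service).
    records = [
        (12.5, "city", True),
        (40.0, "rural", False),
        (25.0, "suburban", True),
        (33.0, "town", True),
    ]

    def weighted_percent(rows):
        # Weighted percent of districts reporting the service.
        total = sum(w for w, _, _ in rows)
        yes = sum(w for w, _, flag in rows if flag)
        return 100 * yes / total

    print("National:", weighted_percent(records))
    for ctype in ("city", "suburban", "town", "rural"):
        subset = [r for r in records if r[1] == ctype]
        if subset:
            print(ctype, weighted_percent(subset))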
Statistical Methodology
Reviewing Statisticians
Adam Chu, Senior Statistician, Westat, (301) 251-4326, was consulted about the statistical aspects of the design. Westat is the contractor currently conducting the FRSS surveys for NCES.
Respondent Universe and Statistical Methodology
The respondent universe for the proposed FRSS survey on dropout prevention services and programs will include all regular public school districts in the United States (50 states and the District of Columbia). School districts in the outlying U.S. territories will be excluded from the survey. The most recent Common Core of Data (CCD) Local Education Agency (LEA) Universe File maintained by the National Center for Education Statistics (NCES) will be used to select a stratified sample of school districts for the proposed survey. Table 2 summarizes the distribution of the 13,645 regular public school districts (i.e., districts with an NCES type-of-agency code of 1 or 2) in the 2007-08 CCD universe file (the most recent file currently available).
Districts in the sampling frame will be stratified by instructional level of schools in the district and enrollment size class. A stratified sample of 1,200 public school districts will be allocated to strata in rough proportion to the aggregate square root of the enrollment of districts in the stratum. Such an allocation gives large districts relatively higher selection probabilities than smaller ones, and is expected to provide acceptable sampling precision for both prevalence estimates (e.g., the proportion of districts with a specified characteristic) and numeric measures correlated with enrollment (e.g., the number of students in districts with various dropout prevention services or programs). Prior to sample selection, districts in the sampling frame will be sorted by community type (city, suburban, town, rural) and region (Northeast, Southeast, Central, and West) to induce additional implicit stratification. Within each primary stratum, districts will be selected systematically and with equal probabilities. Assuming an overall response rate of 90 percent, the initial sample of approximately 1,200 districts will yield about 1,080 completed questionnaires. Table 3 summarizes the proposed sample allocation and the expected sample yields by primary sampling stratum.
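The square-root allocation described above can be sketched as follows (a simplified Python illustration; the strata and enrollment values are placeholders, and the actual allocation will be computed from the full CCD frame):

    import math

    # Placeholder strata: stratum label -> district enrollments on the frame.
    strata = {
        "elementary, under 1,000": [350, 620, 900],
        "unified/secondary, 2,500 to 9,999": [3200, 7500, 4100],
        # ... one entry per primary stratum
    }
    total_sample = 1200

    # Allocate in proportion to the aggregate square root of enrollment,
    # which samples large districts at higher rates than proportional-to-count
    # allocation would.
    sqrt_totals = {s: sum(math.sqrt(e) for e in sizes) for s, sizes in strata.items()}
    grand = sum(sqrt_totals.values())
    allocation = {s: round(total_sample * t / grand) for s, t in sqrt_totals.items()}
    print(allocation)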
Table 2. Distribution of public school districts in the 2007-08 NCES Common Core of Data (CCD) Local Education Agency Universe File
Instructional level | Enrollment size class | Number of districts* | Total enrollment | Number of schools
Elementary only | Less than 1,000 | 1,618 | 343,740 | 1,744
 | 1,000 to 2,499 | 47 | 70,469 | 147
 | 2,500+ | 34 | 230,746 | 378
 | Subtotal | 1,699 | 644,955 | 2,269
Unified/secondary | Less than 1,000 | 4,771 | 2,282,376 | 12,197
 | 1,000 to 2,499 | 3,262 | 5,313,409 | 14,105
 | 2,500 to 9,999 | 3,047 | 14,395,630 | 27,463
 | 10,000 to 24,999 | 586 | 8,905,141 | 14,561
 | 25,000 to 99,999 | 253 | 10,633,326 | 16,776
 | 100,000+ | 27 | 6,009,021 | 8,605
 | Subtotal | 11,946 | 47,538,903 | 93,707
Total | | 13,645 | 48,183,858 | 95,976
* Counts are of regular school districts in the 2007-08 CCD Local Education Agency universe file. If available, the more current 2008-09 CCD will be used for sampling.
Table 3. Proposed sample sizes for the FRSS district survey
Stratum | Instructional level | Enrollment size class of district | Number of districts to be sampled | Expected number of responding districts*
1 | Elementary | Less than 1,000 | 65 | 58
2 | | 1,000 to 2,499 | 3 | 3
3 | | 2,500+ | 7 | 6
4 | Unified or secondary | Less than 1,000 | 191 | 172
5 | | 1,000 to 2,499 | 241 | 217
6 | | 2,500 to 9,999 | 427 | 384
7 | | 10,000 to 24,999 | 141 | 127
8 | | 25,000 to 99,999 | 101 | 91
9 | | 100,000+ | 27 | 24
Total | | | 1,203 | 1,082

* Assumes an overall response rate of 90 percent.
Expected Levels of Precision
Table 4 summarizes the approximate sample sizes and standard errors to be expected under the proposed design for selected subgroups. Because the sample sizes in table 4 are based on preliminary tabulations of the 2007-08 CCD file, the actual sample sizes achieved may differ from those shown. Note that the sample sizes represent the expected numbers of completed questionnaires from eligible districts, not the initial numbers of districts to be sampled. The standard errors in table 4 reflect design effects ranging from 1.2 to 1.4, depending on subgroup. The design effect primarily reflects the fact that under the proposed stratified design, large districts will be sampled at relatively higher rates (i.e., have smaller sampling weights) than small districts. The standard errors in table 4 can be converted to 95 percent confidence bounds by multiplying the entries by 2. For example, an estimated proportion on the order of 20 percent (P = 0.20) for suburban districts will be subject to a margin of error of ±5.2 percent at the 95 percent confidence level. Similarly, an estimated proportion on the order of 50 percent (P = 0.50) for districts in the Northeast will be subject to a margin of error of ±7.8 percent at the 95 percent confidence level. (A computational check of these figures appears after table 4.)
Table 4. Expected standard error of an estimated proportion under proposed design for selected analytic domains
Domain (subset) | Expected sample size* | Standard error† (P = 0.20) | Standard error† (P = 0.33) | Standard error† (P = 0.50)
Total sample | 1,082 | 0.014 | 0.017 | 0.018
Community type | | | |
City | 152 | 0.038 | 0.045 | 0.048
Suburban | 325 | 0.026 | 0.031 | 0.033
Town | 206 | 0.033 | 0.039 | 0.041
Rural | 400 | 0.024 | 0.028 | 0.030
Region | | | |
Northeast | 227 | 0.031 | 0.037 | 0.039
Southeast | 195 | 0.034 | 0.040 | 0.042
Central | 325 | 0.026 | 0.031 | 0.033
West | 336 | 0.026 | 0.030 | 0.032
District enrollment class | | | |
Under 2,500 | 450 | 0.021 | 0.024 | 0.026
2,500 to 9,999 | 390 | 0.022 | 0.026 | 0.028
10,000 or more | 242 | 0.028 | 0.033 | 0.035

* Expected number of responding eligible districts, assuming a response rate of 90 percent. The standard errors in this table are given for illustration; actual standard errors may differ from those shown.
† Assumes an unequal weighting design effect of 1.2 to 1.4, depending on subgroup.
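The entries in table 4 follow from the usual formula for the standard error of an estimated proportion adjusted by a design effect, SE = sqrt(deff x p(1 - p) / n). The following minimal Python check reproduces the margins of error cited above; the design effect of 1.4 used here is an assumption for these particular subgroups, chosen from within the stated 1.2 to 1.4 range:

    import math

    def se_proportion(p, n, deff):
        # Design-effect-adjusted standard error of an estimated proportion.
        return math.sqrt(deff * p * (1 - p) / n)

    # Suburban districts (n = 325), P = 0.20, assumed design effect 1.4:
    se = se_proportion(0.20, 325, deff=1.4)
    print(round(se, 3), round(2 * se, 3))   # 0.026 and 0.053; table 4 shows 0.026,
                                            # giving the +/-5.2 percent cited above
    # Northeast districts (n = 227), P = 0.50:
    se = se_proportion(0.50, 227, deff=1.4)
    print(round(se, 3), round(2 * se, 3))   # 0.039 and 0.079; roughly the
                                            # +/-7.8 percent cited above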
Estimation and Calculation of Sampling Errors
For estimation purposes, sampling weights reflecting the overall probabilities of selection and adjustments for nonresponse will be attached to each data record. To properly reflect the complex features of the sample design, standard errors of the survey-based estimates will be calculated using jackknife replication. Under the jackknife replication approach, 50-100 subsamples or "replicates" will be formed in a way that preserves the basic features of the full sample design. A set of estimation weights (referred to as "replicate weights") will then be constructed for each jackknife replicate. Using the full sample weights and the replicate weights, estimates of any survey statistic can be calculated for the full sample and each of the jackknife replicates. The variability of the replicate estimates is used to obtain a measure of the variance (standard error) of the survey statistic. Previous surveys, using similar sample designs, have yielded relative standard errors (i.e., coefficients of variation) in the range of 2 to 10 percent for most national estimates. Similar results are expected for this survey.
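As a minimal sketch of the jackknife computation described above (Python; the data, weights, and the choice of 3 replicate groups are placeholders for illustration, and the sketch uses a simplified delete-one-group jackknife rather than the production replicate design):

    # Jackknife variance estimate for a weighted proportion.
    weights = [10.0, 12.0, 8.0, 15.0, 9.0, 11.0]   # full-sample weights
    values  = [1, 0, 1, 1, 0, 1]                   # e.g., district offers a service
    groups  = [0, 1, 2, 0, 1, 2]                   # replicate group assignments
    G = 3                                          # number of jackknife replicates

    def weighted_mean(w, y):
        return sum(wi * yi for wi, yi in zip(w, y)) / sum(w)

    full_estimate = weighted_mean(weights, values)

    # Replicate weights: zero out one group, inflate the rest by G/(G-1).
    replicate_estimates = []
    for g in range(G):
        rep_w = [0.0 if gi == g else wi * G / (G - 1)
                 for wi, gi in zip(weights, groups)]
        replicate_estimates.append(weighted_mean(rep_w, values))

    # Variance of the full-sample estimate from the spread of replicate estimates.
    variance = (G - 1) / G * sum((r - full_estimate) ** 2 for r in replicate_estimates)
    print(full_estimate, variance ** 0.5)          # estimate and its standard error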
1 If available, the more current 2008-09 CCD will be used for sampling.