U.S. DEPARTMENT OF EDUCATION
INSTITUTE OF EDUCATION SCIENCES
NATIONAL CENTER FOR EDUCATION STATISTICS

Date:     August 11, 2010

To:       Rochelle W. Martinez, OMB

Through:  Kashka Kubzdela, NCES

From:     Peter Tice, NCES

Subject:  Fast Response Survey System (FRSS) 98: District Survey of Distance Education Courses for Public Elementary and Secondary School Students: 2009-10
Justification
The National Center for Education Statistics (NCES), U.S. Department of Education (ED), proposes to employ the Fast Response Survey System (FRSS) to conduct a district survey about technology-based distance education for public elementary and secondary school students. The survey was requested by the Office of Educational Technology (OET), ED. It will provide current information about enrollments in distance education courses in the nation's public elementary and secondary schools, and will also cover the tracking and monitoring of student progress in distance education courses, district record-keeping, entities with which districts partner to deliver distance education courses, reasons for offering distance education, types of distance education courses, and technologies used to deliver these courses. The survey will provide the only current nationally representative data on this topic.
The FRSS survey, under OMB clearance #1850-0733, is authorized under the Education Sciences Reform Act of 2002 (20 U.S.C. 9543), which authorizes NCES to collect and report statistical data related to education in the United States.
Design
Overview of Survey Development
Westat will collect the information for the Early Childhood, International, and Crosscutting Studies Division, NCES, U.S. Department of Education, using the FRSS. Westat is responsible for the questionnaire development; sample design and selection; data collection by mail and web; telephone follow up; editing, coding, keying, and verification of the data; and production of tabulations and the report detailing the results of the survey.
Two iterations of the district survey Distance Education Courses for Public Elementary and Secondary School Students were previously conducted by NCES, for school years 2002-03 and 2004-05. Development of the current (2009-10) survey was based on the previous versions, with modifications informed by four rounds of feasibility calls, conducted between February 2009 and February 2010, with the public school district personnel most knowledgeable about distance education. The first round of calls focused on potential survey topics and potential revisions to the definition of distance education. In the second round, the feasibility of and burden associated with providing specific enrollment numbers were explored, and the definition was finalized. The third round provided a review of the entire questionnaire, and the fourth round focused on a brief review of a subset of modified questions. The resulting draft of the survey was then reviewed by the NCES Quality Review Board (QRB).
Based on feedback from the QRB, the survey was revised and a pretest of the questionnaire was conducted with 15 respondents to identify problems respondents might have in providing the requested information. The purpose of the pretest was to verify that all questions and corresponding instructions were clear and unambiguous, to determine if the information would be readily available to respondents, and to determine whether the burden on respondents could be further reduced. Responses and comments on the pretest questionnaire were collected by fax, email, and telephone. Changes to the questionnaire were made based on the feedback received from the pretest, and documented in a memorandum summarizing the pretest results. OET, the data requester for this survey, reviewed and approved the questionnaire changes made after the pretest.
Assurance of Confidentiality
Data to be collected will not be released with institutional or personal identifiers attached. Data will be presented in aggregate statistical form only. In addition, each data file undergoes extensive disclosure risk analysis and is reviewed by the NCES/IES Disclosure Review Board before use in generating report analyses and before release as a public use data file. Each respondent will be assured that all information identifying them or their school will be kept confidential in compliance with the Education Sciences Reform Act of 2002 (P.L. 107-279).
Description of Sample and Burden
The proposed sample design is a nationally representative sample of 2,306 public school districts from the NCES Common Core of Data (CCD) 2008-09 Local Education Agency (School District) Universe File. The data collection will be accomplished by means of a self-administered survey. Respondents will have the option of completing the survey on a traditional paper and pencil questionnaire or on a Web version of the questionnaire that will be accessed through the Internet. The questionnaire is limited to three pages of items readily available to respondents and can be completed by most respondents in 30 minutes or less. These procedures are typical for FRSS surveys and result in minimal burden on respondents.
Questionnaires and information needed to access the Web survey will be mailed in October 2010 to the superintendent of each sampled school district. Follow-up for nonresponse will be conducted by both mail and telephone and will begin about 3 weeks after the questionnaires have been mailed to the districts. Experienced telephone interviewers will be trained to conduct the nonresponse follow-up and will be monitored by Westat supervisory personnel. The telephone nonresponse follow-up is used to prompt respondents to complete the survey by web or mail and is expected to take about 5 minutes.
The response rates for FRSS surveys of districts typically have been 90 percent or greater. At a response rate of 90 percent, the initial sample of 2,306 districts will yield about 2,076 completed questionnaires. At an estimated 30 minutes per completed questionnaire, the burden for completing the questionnaires is about 1,038 hours (table 1). It is anticipated that about 25 percent of the sample will have returned the completed survey before nonresponse follow-up begins, so about 75 percent of the sample (i.e., 1,730 respondents) will receive a nonresponse follow-up call taking about 5 minutes. The total estimated burden for nonresponse follow-up is about 144 hours, bringing the total burden for data collection and nonresponse follow-up to about 1,182 hours.
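As a check on the arithmetic, the burden estimates above can be reproduced directly from the stated assumptions; the short Python sketch below (illustrative only, using the figures cited in this section) recomputes the entries in table 1.

```python
# Illustrative recomputation of the burden estimates described above.
completes = 2076              # expected completed questionnaires (about 90% of 2,306)
followup_calls = 1730         # about 75 percent of the 2,306 sampled districts

questionnaire_hours = completes * 0.50     # 30 minutes per questionnaire
followup_hours = followup_calls * 5 / 60   # 5 minutes per follow-up call
total_hours = questionnaire_hours + followup_hours

print(questionnaire_hours)    # 1038.0
print(round(followup_hours))  # 144
print(round(total_hours))     # 1182
```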
Table 1. Estimated burden for data collection and nonresponse follow-up

Type of collection                  | Sample size | Estimated response rate (percent) | Estimated number of respondents | Estimated number of responses | Burden hours per response | Total burden hours
District questionnaire              | 2,306       | 90                                | 2,076                           | 2,076                         | 0.50                      | 1,038
District nonresponse follow-up call | 2,306       | 75                                | 1,730                           | 1,730                         | 0.083                     | 144
Total burden                        |             |                                   | 2,076                           | 3,806                         |                           | 1,182
Procedures and Data Collection Instrument
A cover letter (Attachment 1), questionnaire (Attachment 2), and web information sheet (Attachment 3) will be mailed to each sampled district. The cover letter requests the participation of the district and introduces the purpose and content of the survey. It also notes that the survey should be completed by the person most knowledgeable about distance education courses available to students in public elementary and secondary schools in the district. The cover letter also includes instructions on how to complete and return the survey, as well as contact information in case of queries. A Web information sheet will also be included in the mailing which will provide information about the option to complete a Web version of the survey. On the cover of the survey, respondents are assured that their participation is voluntary and their answers may not be disclosed or used in identifiable form for any other purpose unless compelled by law (Education Sciences Reform Act of 2002, 20 U.S.C. 9573). The public law is cited on the front page of the survey (Attachment 2). All sampled districts that do not complete a survey within 3 weeks after the initial mailing of the survey will also receive a nonresponse follow-up letter (Attachment 4), another copy of the Web information sheet (Attachment 3), and a brief, scripted telephone call (Attachment 5) prompting the respondent to return a completed survey via the Web or mail.
The survey is designed to collect general information on distance education courses for public elementary and secondary school students in the nation's public school districts. The first three questions ask about enrollments in distance education courses. The first question "screens out" districts that do not have students enrolled in distance education courses. The second question asks for the number of enrollments in distance education courses (i.e., a duplicated count of students), reported by instructional level; this question is the same as in the 2002-03 and 2004-05 versions. Question 3 asks whether the district can provide the number of students enrolled in distance education courses (i.e., an unduplicated count). Respondents in the first three rounds of feasibility calls indicated that these counts were not always available and that, when they were available, providing them was burdensome. As a result, the question was modified to have a Yes-No response. OET is interested in collecting this information for possible future research.
Questions 4-6 focus on tracking and monitoring distance education courses. Question 4 asks whether the district distinguishes distance education courses from other academic courses on the academic records it keeps. This question provides information about the ease with which districts can identify these types of courses in student records. Question 5 asks about tracking of information on completions of these courses. During feasibility calls, respondents indicated that providing numbers of completions would be burdensome and problematic because of differing definitions and time periods for tracking completion information. Thus, the question was modified from the 2004-05 version to ask about fewer completion types and to use a Yes-No response rather than asking for counts. Question 6 asks about specific ways districts may be monitoring progress in distance education courses.
Questions 7-9 were included to obtain information about district policies and practices regarding distance education courses and programs. Question 7 asks about the written policies that specify the consequences of not successfully completing a distance education course. Questions 8 and 9 focus on students enrolled in regular high school programs in a district. Question 8 asks whether students can take a full course load of distance education courses in an academic term, while Question 9 asks whether students can take distance education courses to fulfill all high school graduation requirements. During survey development, respondents indicated that taking a full course load or a full program through distance education was often handled differently depending on the type of program (e.g., credit recovery, alternative schools, regular high school program). For this reason, questions 8 and 9 ask only about regular high school programs.
Districts may use many entities to deliver and/or develop distance education courses. Question 10 asks which entities deliver these courses and asks districts to rank the five entities that deliver distance education courses most frequently. This question has been slightly modified from the 2004-05 version of the questionnaire by the addition of the ranking task in part 2. Question 11 asks about the extent to which districts or other entities are developing distance education courses.
Question 12 asks about the types of distance education courses taken (e.g., Advanced Placement, credit recovery). Question 13 asks how important various reasons are for having distance education courses in the district; it is similar enough to a question on the 2002-03 version of the questionnaire to allow comparisons for at least some items.
Questions 14 and 15 address the technologies used for delivery of distance education courses. Question 14, which asks about the extent to which each technology is used, is a modified version of a similar question on the 2004-05 version of the questionnaire, with the extent scale replacing Yes/No response categories. Question 15 is the same as a question on the 2004-05 questionnaire. It asks which one technology is used as the primary mode of instructional delivery for the greatest number of courses.
Question 16 asks about courses delivered over the Internet and is included to set up the skip pattern for Question 17. Question 17 asks about the locations in which students access Internet-based courses and is similar to a question on the 2004-05 version. The question wording is slightly modified to reference courses "delivered over the Internet" rather than courses that are "online," but the response categories remain the same, so the question should be similar enough for comparisons across the surveys.
Because there is interest in district plans for distance education courses for the future, Question 18 asks if the district will expand the number of distance education courses offered in the next three years.
Lastly, Question 19 asks districts whether they deliver distance education courses to students not regularly enrolled in the district. The wording varies slightly from a question on the 2004-05 version and is similar enough for comparison across surveys.
Consultations Outside of Agency
In addition to the four rounds of feasibility and pretest calls conducted with district respondents, the Office of Educational Technology provided extensive input during survey development, and reviewed and approved all questions.
Survey Cost and Time Schedule
The survey is estimated to cost the Federal government about $480,000, including about $440,000 for contractual costs and $40,000 for salaries and expenses. Contractual costs include the costs for survey preparation, data collection, data analysis, and report preparation and dissemination.
Mailing of the survey is planned for October 2010. About 3 weeks after the mailout, Westat will begin telephone follow-up for nonresponse. Data collection is scheduled for completion about 16 weeks after the initial mailout.
Plan for Tabulation and Publication
Most of the analyses of the questionnaire data will be descriptive in nature, providing data users with tables and appropriate explanatory text. Reports of the findings will be distributed to survey respondents and, upon request, to other interested individuals and organizations, as well as published on the NCES website. Survey responses will be weighted to produce national estimates. Tabulations will be produced for each data item. Cross-tabulations of data items will be made with selected classification variables, such as the following:
District enrollment size (less than 2,500, 2,500-9,999, and 10,000 or more);
Region (Northeast, Southeast, Central, West);
Community type (city, suburban, town, rural); and
Poverty concentration (less than 10 percent, 10 to 19 percent, 20 percent or more).
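To illustrate the kind of weighted tabulation planned, the sketch below computes a weighted cross-tabulation of a yes/no survey item by region; the variable names, weights, and data are hypothetical stand-ins, not the actual FRSS data layout.

```python
import pandas as pd

# Hypothetical analysis file: one record per responding district, carrying a
# final sampling weight and the classification variables listed above.
df = pd.DataFrame({
    "offers_distance_ed": [1, 0, 1, 1, 0, 1],
    "region": ["Northeast", "West", "Central", "Southeast", "West", "Central"],
    "weight": [5.2, 7.1, 3.8, 6.4, 7.1, 3.8],
})

# Weighted percent of districts offering distance education courses, by region.
df["wy"] = df["offers_distance_ed"] * df["weight"]
tab = 100 * df.groupby("region")["wy"].sum() / df.groupby("region")["weight"].sum()
print(tab.round(1))
```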
Statistical Methodology
Reviewing Statisticians
Peter Tice, NCES, is the Project Officer for this survey. Adam Chu, Senior Statistician, Westat, was consulted about the statistical aspects of the design. Westat is the contractor currently conducting the FRSS surveys for NCES.
Respondent Universe and Statistical Methodology
The sampling frame for the proposed survey on distance education will be constructed from the 2008-09 NCES Common Core of Data (CCD) Local Education Agency (School District) Universe File. The CCD file contains over 17,000 public school districts of all types in the United States and outlying territories. However, districts in the outlying U.S. territories, along with certain types of "nonregular" districts specified by NCES, will be excluded from the survey. The resulting survey universe consists of 15,754 districts, of which 13,563 are "regular" local school districts and 2,191 are charter school districts (table 2). Regular districts are defined as those with an NCES type-of-agency code of 1 (local school district that is not a component of a supervisory union) or 2 (local school district component of a supervisory union). Charter school districts are those with an NCES type-of-agency code of 7 (nonregular districts consisting of charter schools only) or 8 (specifically, the subset of other nonregular districts that operate at least one charter school). This definition of the respondent universe is generally consistent with that used in the 2005 FRSS district survey on distance education.
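A minimal sketch of this frame-construction logic follows; the miniature data set and field names are hypothetical stand-ins for the CCD layout, but the exclusion rules are those described above.

```python
import pandas as pd

# Hypothetical miniature CCD extract; field names are illustrative only.
ccd = pd.DataFrame({
    "agency_id":       [1, 2, 3, 4, 5],
    "state":           ["VA", "PR", "OH", "AZ", "TX"],
    "type_of_agency":  [1, 1, 2, 7, 8],
    "charter_schools": [0, 0, 0, 3, 1],
})

outlying = ccd["state"].isin(["AS", "GU", "MP", "PR", "VI"])  # outlying territories
regular = ccd["type_of_agency"].isin([1, 2])                  # type-of-agency 1 or 2
charter = (ccd["type_of_agency"] == 7) | (
    (ccd["type_of_agency"] == 8) & (ccd["charter_schools"] >= 1))

frame = ccd[~outlying & (regular | charter)]  # the survey universe
print(frame["agency_id"].tolist())            # [1, 3, 4, 5]
```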
Table 2. Distribution of public school districts in the 2008-09 CCD file, by district type, enrollment size class, and poverty level

                                                               Percent of children in poverty(4)
District type | Enrollment size class | Number of districts(3) | Missing | Less than 10 | 10 to 19.9 | 20 to 29.9 | 30+
1. Regular(1) | Less than 1,000       | 6,373                  | 88      | 1,439        | 2,835      | 1,471      | 540
              | 1,000 to 2,499        | 3,272                  | 19      | 944          | 1,345      | 706        | 258
              | 2,500 to 9,999        | 3,044                  | 8       | 1,129        | 1,112      | 617        | 178
              | 10,000 to 99,999      | 848                    | 2       | 224          | 391        | 172        | 59
              | 100,000+              | 26                     | 0       | 6            | 12         | 7          | 1
              | Total                 | 13,563                 | 117     | 3,742        | 5,695      | 2,973      | 1,036
2. Charter(2) | 0 or NA               | 112                    |         |              |            |            |
              | Less than 1,000       | 1,973                  | –––     | –––          | –––        | –––        | –––
              | 1,000 to 2,499        | 88                     | –––     | –––          | –––        | –––        | –––
              | 2,500+                | 18                     | –––     | –––          | –––        | –––        | –––
              | Total                 | 2,191                  | –––     | –––          | –––        | –––        | –––

(1) Type 1 (local school district that is not a component of a supervisory union) or type 2 (local school district component of a supervisory union).
(2) Type 7 (districts consisting entirely of charter schools) or type 8 (other nonregular districts with at least one charter school).
(3) Excludes districts in the outlying territories and regular districts for which enrollment is 0 or not applicable.
(4) Based on district-wide estimates of the percent of children 5-17 years of age in families living below the poverty level (source: 2008 district estimate file on the Census Bureau's Small Area Income and Poverty Estimates [SAIPE] website, http://www.census.gov//did/www/saipe/).
A stratified sample of approximately 2,300 public school districts will be selected for the study. This sample size is designed to yield up to 900 completed questionnaires from districts that have distance education courses or programs, based on rates observed in the 2005 iteration of the FRSS distance education survey. The sample will be selected from strata defined by type of district (regular versus charter school districts), district enrollment size class, and poverty status. The total sample of 2,300 districts will be allocated to the strata in rough proportion to the aggregate square root of enrollment in each stratum. Allocation by the square root of enrollment is a compromise designed to achieve acceptably small standard errors both for district-level prevalence estimates and for numeric measures correlated with enrollment.
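A minimal sketch of the square-root allocation, assuming hypothetical stratum enrollment totals (the actual totals come from the CCD frame):

```python
import math

# Hypothetical stratum enrollment totals; illustrative values only.
stratum_enrollment = {
    "regular_small": 4_000_000,
    "regular_medium": 9_000_000,
    "regular_large": 18_000_000,
    "charter": 900_000,
}
total_sample = 2300

# Allocate the sample in proportion to the square root of stratum enrollment.
roots = {s: math.sqrt(e) for s, e in stratum_enrollment.items()}
scale = total_sample / sum(roots.values())
allocation = {s: round(r * scale) for s, r in roots.items()}
print(allocation)
```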
Implicit substratification by type of locale and region will also be employed to ensure, to the extent feasible, that the sample includes an appropriate cross-section of the population with respect to these characteristics. Within each sampling stratum, districts will be selected systematically and with equal probability, at rates determined by the sample allocation described above. In general, large districts will be sampled at relatively higher rates than small districts. Assuming a response rate of 90 percent, the initial sample of roughly 2,300 districts will yield over 2,000 responding districts, of which an estimated 800 to 900 are expected to have distance education programs. The sample sizes and expected numbers of responding districts offering distance education are shown in table 3.
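The within-stratum selection can be sketched as a simple equal-probability systematic draw; the sort keys below supply the implicit substratification, and the district records are hypothetical.

```python
import random

def systematic_sample(stratum, n):
    """Equal-probability systematic sample of n districts from one stratum.

    Sorting by locale and region before drawing provides the implicit
    substratification described above.
    """
    frame = sorted(stratum, key=lambda d: (d["locale"], d["region"]))
    interval = len(frame) / n            # selection interval
    start = random.uniform(0, interval)  # random start within the first interval
    return [frame[int(start + k * interval)] for k in range(n)]

# Example: draw 3 of 8 hypothetical districts.
districts = [{"id": i, "locale": i % 4, "region": i % 2} for i in range(8)]
print(systematic_sample(districts, 3))
```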
Finally, to allow for longitudinal analyses, the districts will be selected in a way that maximizes overlap with the district sample selected for the 2005 FRSS survey on distance education. Reselection procedures such as those described by Brick, Morganstein, and Wolters (1987)¹ will be used to select the district sample. Under these procedures, appropriate conditional probabilities of selection are derived and used to select the sample. These conditional selection probabilities depend on (a) the desired overall selection probability for the current survey, (b) the selection probabilities for the previous survey, and (c) the prior selection status (i.e., whether or not the district was selected for the previous survey). This approach was used in the 2005 FRSS survey on distance education, where over 90 percent of the districts selected for that sample had also been selected for the prior 2003 survey. Depending on the extent of the changes in the district universe files since the 2005 survey, the amount of overlap for the current survey may be smaller, but it is still expected to be sufficient for longitudinal analyses. Because the reselection procedures preserve the desired probabilities of selection for the current survey, unbiased cross-sectional estimates can also be derived from the survey results.
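The conditional probabilities at the heart of overlap-maximizing reselection can be illustrated with the simplified Keyfitz rule below; the Brick, Morganstein, and Wolters procedure generalizes this idea, so the sketch is illustrative rather than the production algorithm.

```python
def conditional_prob(p_new, p_old, in_old_sample):
    """Simplified Keyfitz-type conditional selection probability.

    Maximizes overlap with the prior sample while preserving the desired
    current probability, since
    p_old * P(select | in old) + (1 - p_old) * P(select | not in old) = p_new.
    """
    if in_old_sample:
        return min(1.0, p_new / p_old)
    return max(0.0, (p_new - p_old) / (1.0 - p_old))

# A district sampled before with p_old = 0.4 and desired p_new = 0.5 is
# retained with certainty; a previously unsampled district enters with
# probability (0.5 - 0.4) / (1 - 0.4) = 1/6.
print(conditional_prob(0.5, 0.4, True))   # 1.0
print(conditional_prob(0.5, 0.4, False))  # 0.1666...
```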
Table 3. Initial and expected sample sizes for the proposed survey on distance education

Type of district | District enrollment size class | Districts in frame | Districts included in initial sample | Districts responding to survey(1) | Responding districts with distance education(2)
Regular          | Less than 1,000                | 6,373              | 414                                  | 373                               | 152
                 | 1,000 to 2,499                 | 3,272              | 425                                  | 383                               | 165
                 | 2,500 to 9,999                 | 3,044              | 791                                  | 712                               | 263
                 | 10,000 to 99,999               | 848                | 544                                  | 490                               | 269
                 | 100,000+                       | 26                 | 26                                   | 23                                | 21
Charter schools  | 0 or NA                        | 112                | 5                                    | 5                                 | 0
                 | Less than 1,000                | 1,973              | 88                                   | 79                                | 5
                 | 1,000 to 2,499                 | 88                 | 9                                    | 8                                 | 2
                 | 2,500+                         | 18                 | 3                                    | 3                                 | 3
Total            |                                | 15,754             | 2,306                                | 2,076                             | 880

(1) Assumes a 90 percent response rate. Includes districts with and without distance education.
(2) Estimates based on results of the 2005 FRSS survey on distance education.
Expected Levels of Precision
Table 4 summarizes the approximate sample sizes and standard errors to be expected under the proposed design for selected analytic domains. Note that the sample sizes refer to districts with distance education programs (not the initial sample sizes) and represent the expected numbers of completed questionnaires assuming an overall response rate of 90 percent; because they are based on preliminary tabulations, the actual sample sizes may differ from those shown. The standard errors in table 4 reflect design effects ranging from 1.1 to 1.4. The design effects (i.e., unequal weighting effects) arise because large districts will be sampled at relatively higher rates (i.e., have smaller sampling weights) than small districts. The standard errors can be converted to approximate 95 percent confidence bounds by multiplying the entries by 2. For example, an estimated proportion on the order of 20 percent (P = 0.20) for suburban districts would be subject to a margin of error of ±0.058 (±5.8 percentage points) at the 95 percent confidence level. Similarly, an estimated proportion on the order of 50 percent (P = 0.50) for the total sample would be subject to a margin of error of ±0.040 (±4.0 percentage points).
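The tabled standard errors are consistent with the usual approximation for a proportion under a complex design, SE = sqrt(deff × p(1 − p)/n). The sketch below reproduces two entries of table 4 under an assumed design effect of 1.4.

```python
import math

def se_proportion(p, n, deff=1.4):
    """Approximate standard error of an estimated proportion with design effect deff."""
    return math.sqrt(deff * p * (1 - p) / n)

print(round(se_proportion(0.50, 880), 3))  # 0.020 (total sample, P = 0.50)
print(round(se_proportion(0.20, 266), 3))  # 0.029 (suburban, P = 0.20; MOE about +/-0.058)
```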
Table 4. Expected sample sizes (number of responding districts with distance education) and corresponding standard errors for estimates of proportions for selected analytic domains

                                                            Standard error(†) of an estimated proportion equal to ...
Domain                            | Expected sample size(*) | P = 0.20 | P = 0.33 | P = 0.50
Total sample                      | 880                     | 0.016    | 0.019    | 0.020

Metropolitan status
  Central city                    | 162                     | 0.037    | 0.044    | 0.047
  Suburban                        | 266                     | 0.029    | 0.034    | 0.036
  Town                            | 150                     | 0.039    | 0.045    | 0.048
  Rural                           | 303                     | 0.027    | 0.032    | 0.034

Percent of children below poverty
  Less than 10%                   | 235                     | 0.031    | 0.036    | 0.039
  10 to 19.9%                     | 370                     | 0.025    | 0.029    | 0.031
  20%+                            | 274                     | 0.029    | 0.034    | 0.036

Region
  Northeast                       | 166                     | 0.037    | 0.043    | 0.046
  Southeast                       | 175                     | 0.036    | 0.042    | 0.045
  Central                         | 246                     | 0.030    | 0.035    | 0.038
  West                            | 293                     | 0.028    | 0.033    | 0.035

Enrollment size class
  Less than 1,000                 | 154                     | 0.034    | 0.040    | 0.042
  1,000 to 2,499                  | 167                     | 0.032    | 0.038    | 0.041
  2,500 to 9,999                  | 266                     | 0.026    | 0.030    | 0.032
  10,000+                         | 293                     | 0.025    | 0.029    | 0.031

* Expected number of responding districts offering distance education courses, assuming a 90 percent response rate and rates of distance education comparable to the 2005 FRSS survey on distance education.
† Assumes design effects ranging from 1.1 to 1.4, depending on analytic domain.
Estimation and Calculation of Sampling Errors
For estimation purposes, sampling weights reflecting the overall probabilities of selection and adjustments for nonresponse will be attached to each data record. To properly reflect the complex features of the sample design, standard errors of the survey-based estimates will be calculated using jackknife replication. Under the jackknife replication approach, 50-100 subsamples or "replicates" will be formed in a way that preserves the basic features of the full sample design. A set of estimation weights (referred to as "replicate weights") will then be constructed for each jackknife replicate. Using the full sample weights and the replicate weights, estimates of any survey statistic can be calculated for the full sample and each of the jackknife replicates. The variability of the replicate estimates is used to obtain a measure of the variance (standard error) of the survey statistic. Previous surveys, using similar sample designs, have yielded relative standard errors (i.e., coefficients of variation) in the range of 2 to 10 percent for most national estimates. Similar results are expected for this survey.
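A simplified delete-a-group (JK1-style) illustration of the replication idea follows. The production replicate weights will be formed to preserve the actual design features, so this is a sketch of the principle, not the survey's weighting system.

```python
import random

def jackknife_se(values, weights, n_reps=50, seed=1):
    """Delete-a-group jackknife standard error of a weighted mean (JK1 sketch).

    Assumes at least n_reps observations so no replicate group is empty.
    """
    rng = random.Random(seed)
    groups = [i % n_reps for i in range(len(values))]
    rng.shuffle(groups)  # random assignment of units to replicate groups

    def wmean(wts):
        return sum(v * w for v, w in zip(values, wts)) / sum(wts)

    full = wmean(weights)
    replicates = []
    for g in range(n_reps):
        # Zero out one group and rescale the rest to preserve the weight total.
        rw = [0.0 if grp == g else w * n_reps / (n_reps - 1)
              for w, grp in zip(weights, groups)]
        replicates.append(wmean(rw))

    variance = (n_reps - 1) / n_reps * sum((r - full) ** 2 for r in replicates)
    return variance ** 0.5

# Example with hypothetical data.
vals = [float(i % 7) for i in range(200)]
wts = [1.0 + (i % 3) for i in range(200)]
print(round(jackknife_se(vals, wts), 4))
```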
1 Brick, M., Morganstein, D., and Wolters, C. (1987). Additional Uses for Keyfitz Selection. Proceedings of the Section on Survey Research Methods, American Statistical Association, pp. 787-791.