The Individuals with Disabilities Education Act (IDEA) State and Local Implementation Study 2019
Part A: Justification for the Study
Submitted to:
Institute of Education Sciences
Washington, DC 20202
Project Officer: Erica Johnson
Contract Number: ED-IES-17-C-0069
Submitted by:
Mathematica Policy Research
P.O. Box 2393
Princeton, NJ 08543-2393
Telephone: (609) 799-3535
Facsimile: (609) 799-0005
Project Director: Amy Johnson
Reference Number: 50538
CONTENTS
PART A. SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION
Justification
A1. Circumstances necessitating the collection of information
A2. Purpose and use of data
A3. Use of technology to reduce burden
A4. Efforts to avoid duplication of effort
A5. Methods of minimizing burden on small entities
A6. Consequences of not collecting data
A7. Special circumstances
A8. Federal register announcement and consultation
A9. Payments or gifts
A10. Assurances of confidentiality
A11. Justification for sensitive questions
A12. Estimates of hours burden
A13. Estimate of cost burden to respondents
A14. Annualized cost to the federal government
A15. Reasons for program changes or adjustments
A16. Plans for tabulation and publication of results
A17. Approval not to display the expiration date for OMB approval
A18. Exception to the certification statement
TABLES
A.1 Data collection
A.2 Schedule of major study activities
A.3 Estimated response time for data collection
PART A. SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION

This Office of Management and Budget (OMB) package requests clearance for data collection activities to support the Individuals with Disabilities Education Act (IDEA) State and Local Implementation Study 2019. The study is one component of a congressionally mandated National Assessment of IDEA that the Institute of Education Sciences (IES) within the U.S. Department of Education (ED) is conducting. The study will develop a national picture of how states, districts, and schools are implementing IDEA in order to provide ED, Congress, and other stakeholders with knowledge that can inform the next reauthorization of IDEA and, ultimately, how services are provided to children. IES has contracted with Mathematica Policy Research and its partners, the National Center for Special Education in Charter Schools, Inc. and Walsh Taylor, Inc., to conduct this study.
A1. Circumstances necessitating the collection of information

IDEA is a longstanding and important federal law, last reauthorized by Congress in 2004. It supports children and youth with disabilities from birth through age 21, who represent 13 percent of public school enrollment in the United States (more than 7 million children in 2015) (U.S. Department of Education 2017). IDEA and the services it provides continue to receive attention from Congress. Still, allocating resources for the provision of IDEA services is a challenge. For fiscal year 2016, IDEA funding for school-age children with disabilities covered about 16 percent of the estimated excess cost of educating these children. States and local districts assume the remainder of the costs, and many are providing fewer education dollars per child than they did before the 2007–2009 recession (Lav and Leachman 2017). States and districts may face choices in how to provide services within the constraints of limited resources.
For these reasons, as well as the mandate to evaluate the program, IES will examine core aspects of IDEA implementation at the state and local levels. The study will collect information and report on both Part C of the law, which provides financial assistance to states to coordinate early intervention services for infants and toddlers with disabilities, and Part B, which ensures that all students with disabilities have access to a free appropriate public education and authorizes funding to help education agencies meet those students' individual learning needs. This new study will build on an earlier IDEA National Assessment Implementation Study (IDEA-NAIS), which surveyed states and a nationally representative sample of districts in 2009.
A new study is critical because almost a decade has elapsed since the IDEA-NAIS. During the intervening years, the characteristics of students with disabilities and the context and implementation of early intervention and special education services may have changed. For example, since 2009, relevant court decisions have been issued (including Endrew F. v. Douglas County School District), and education legislation has been enacted (including the reauthorization of the Elementary and Secondary Education Act). In addition, the Office of Special Education Programs in ED has released IDEA regulations and guidance. These judicial, legislative, and regulatory developments may have implications for how states and local agencies are implementing services under IDEA.
A2. Purpose and use of data

The study will rely primarily on surveys of (1) all states and territories receiving IDEA funding, (2) a nationally representative sample of districts, and (3) a nationally representative sample of publicly funded schools. To avoid duplication of effort and to minimize respondent burden, the study will rely on extant data when possible. To examine changes since the IDEA-NAIS, the surveys include some items that were administered in 2009. The study will address the following research questions:
How do state and local agencies identify infants, toddlers, children, and youth for early intervention and special education services? How do they measure disproportionate identification and what policies and practices have been implemented with the goal of addressing disproportionate identification?
What policies and programs do states and local agencies have in place to support infants, toddlers, children, and youth identified for early intervention or special education services? What types of supports do schools provide to children and youth with disabilities to support their academic and behavioral learning?
To what extent do states and local agencies rely on evidence from research on the effectiveness of policies, programs, and supports for children with disabilities?
How do state and local agencies allocate resources—including funding and personnel—to support infants, toddlers, children, and youth with disabilities?
In Fall 2019, the study team will survey special education administrators in a census of states and territories receiving IDEA funding and in nationally representative samples of districts and schools. States, districts, and schools have important and non-overlapping roles in how IDEA is implemented, which necessitates data collection at each level. Additionally, the sample is constructed to provide the statistical precision necessary to examine the implementation of policies and practices and to detect differences in implementation across districts and schools, as well as across policy-relevant subgroups, including charter schools and elementary schools with pre-kindergarten programs. Because a growing percentage of special education students attend charter schools, it is important to study how these students are being served. In addition, the subgroup of elementary schools with pre-kindergarten programs will allow the study team to examine the implementation of the IDEA Part B program for preschool-age children within schools.
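To give a rough sense of the precision the school sample can support, the calculation below is a minimal illustrative sketch. It assumes simple random sampling, a 50 percent item prevalence, and the roughly 2,200 completed school surveys anticipated later in this package; it ignores the design effects introduced by clustering within districts, stratification, and weighting, so the study's actual standard errors (discussed in Part B) will differ.

```python
import math

# Illustrative precision check only; assumes simple random sampling and
# ignores clustering, stratification, and weighting (see Part B for the
# study's actual precision calculations).
expected_school_completes = 2200  # about 80% of the 2,750 sampled schools
prevalence = 0.50                 # worst-case proportion for a yes/no survey item

standard_error = math.sqrt(prevalence * (1 - prevalence) / expected_school_completes)
margin_of_error_95 = 1.96 * standard_error

print(f"Approximate 95% margin of error: +/- {margin_of_error_95:.1%}")
# Approximate 95% margin of error: +/- 2.1%
```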
The study will use six survey instruments at the state, district, and school levels to collect data (Table A.1).
Table A.1. Data collection
Instrument | Respondent | Mode | Schedule
State surveys | | |
IDEA Part C infants and toddlers program state survey | State IDEA Part C infants and toddlers program coordinator | 60-minute web-based survey | Fall 2019
IDEA Part B program for preschool-age children state survey | State IDEA Part B program for preschool-age children coordinator | 60-minute web-based survey | Fall 2019
IDEA Part B program for school-age children state survey | State special education director | 60-minute web-based survey | Fall 2019
District surveys | | |
IDEA Part B program for preschool-age children district survey | District IDEA Part B program for preschool-age children coordinator | 60-minute web-based survey | Fall 2019
IDEA Part B program for school-age children district survey | District special education director | 60-minute web-based survey | Fall 2019
School survey | | |
IDEA Part B school survey | School principal | 45-minute web-based survey | Fall 2019
State surveys
At the state level, the study team will administer three separate surveys that focus on the Part C program for infants and toddlers (administered to the Part C infants and toddlers program coordinator; see Appendix A.2), the Part B program for preschool-age children (administered to the Part B program for preschool-age children coordinator; see Appendix A.3), and the Part B program for school-age children and youth (administered to the special education director; see Appendix A.4), respectively. Three surveys are necessary because different state administrators are likely to oversee IDEA programs for children at those different age levels. The survey topics focus on state policies about the identification of children with disabilities, supports for children with disabilities, the use of evidence from research, and the allocation of special education resources.
The study team will administer the surveys to respondents in each state, the District of Columbia, and the other jurisdictions receiving IDEA funding. The contact information for respondents will be obtained from up-to-date online lists maintained by the National Association of State Directors of Special Education and by the Early Childhood Technical Assistance Center (for the IDEA Part C infants and toddlers program and IDEA Part B program for preschool-age children coordinators). The targeted time to complete each survey is 60 minutes, the same length as the IDEA-NAIS state surveys. IDEA is a comprehensive law, and state policies on the identification of and supports for children with disabilities, the use of evidence from research, and the allocation of special education resources are complex; obtaining information on such issues requires surveys of this length. The surveys will be administered electronically and optimized for mobile phones, allowing respondents to start and stop the survey as needed to accommodate their schedules and gather the data needed to complete it. Because they receive IDEA funding, all states are expected to respond to the surveys (Education Department General Administrative Regulations, 34 C.F.R. § 76.591).
District surveys
At the district level, the study team will administer two surveys that focus on the IDEA Part B program for preschool-age children (administered to the Part B program for preschool-age children coordinator; see Appendix A.5) and the IDEA Part B program for school-age children and youth (administered to the special education director; see Appendix A.6), respectively. If a district does not have an IDEA Part B program for preschool-age children coordinator, the study team will work with the district to identify the most appropriate respondent for the survey, likely someone in the district's preschool special education leadership. Two surveys are necessary because different district staff members are likely to oversee IDEA programs for students at those different age levels. The survey topics are similar to those covered by the state surveys, except that the focus is on district policies and programs.
The study team will administer the IDEA Part B program for preschool-age children survey to a nationally representative sample of 602 school districts and the IDEA Part B program for school-age children survey to a nationally representative sample of 665 school districts.1 The two surveys will be administered electronically, and each is expected to take about 60 minutes, the same length as the district survey for the IDEA-NAIS. As stated above, IDEA is a comprehensive law, and districts' policies and programs related to IDEA are complex; obtaining information on such issues requires surveys of this length. The study team intends to obtain responses from at least 85 percent of selected districts (512 districts for the Part B program for preschool-age children and 565 districts for the Part B program for school-age children). Because they receive IDEA funding, districts are expected to respond to the surveys (Education Department General Administrative Regulations, 34 C.F.R. § 76.591).
School survey
A single school survey covers the IDEA Part B program for school-age children, the Part B program for preschool-age children, the transition from the Part C infants and toddlers program, and transition planning for secondary school students (administered to the school principal or lead special education staff; see Appendix A.7). The survey topics focus particularly on the supports that schools are providing to children with disabilities through IDEA.
The study team will administer the survey electronically to a nationally representative sample of 2,750 schools from the 665 selected districts. The survey is expected to take about 45 minutes, and the study team expects to obtain responses from at least 80 percent (2,200) of the sampled schools.
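As a quick check on the figures above, the sketch below (an illustrative calculation, not part of the study's sampling plan) shows how the expected numbers of completed surveys follow from the sample sizes and the targeted response rates.

```python
# Expected completed surveys = sample size x targeted response rate (rounded).
samples = {
    "Part B preschool-age district survey": (602, 0.85),
    "Part B school-age district survey": (665, 0.85),
    "Part B school survey": (2750, 0.80),
}

for name, (sampled, response_rate) in samples.items():
    expected = round(sampled * response_rate)
    print(f"{name}: {sampled} sampled x {response_rate:.0%} = about {expected} completes")
# Part B preschool-age district survey: 602 sampled x 85% = about 512 completes
# Part B school-age district survey: 665 sampled x 85% = about 565 completes
# Part B school survey: 2750 sampled x 80% = about 2200 completes
```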
Table A.2 shows the schedule of data collection activities and the overall study timeline.
Table A.2. Schedule of major study activities
Activity | Fall 2019 | Winter 2020 | Spring 2020 | Summer 2020 | Fall 2020 | Winter 2021
Administer state surveys | X | | | | |
Administer district surveys | X | | | | |
Administer school surveys | X | | | | |
Conduct analyses, develop report, and prepare briefing | | X | X | X | X |
Release report | | | | | | X
Prepare data documentation | | | | | X | X
A3. Use of technology to reduce burden

For each data collection effort, the study team has selected the form of technology that will provide reliable information while minimizing respondent burden. Information technology will be used heavily in data collection tasks. Examples include the following:
All surveys will be administered electronically. To minimize the effort required to respond, the surveys will be optimized for mobile phones and will allow respondents to start and stop the survey as needed to accommodate schedules and gather data needed for completion. Hard copies will be provided to state and district respondents and to school respondents who request one.
The study will have a toll-free number and an email address, both of which will be hosted by Mathematica. Staff from Mathematica will respond in a timely and clear way to any questions from respondents.
A4. Efforts to avoid duplication of effort

No similar studies are being conducted, and there is no equivalent source for the information to be collected. Moreover, the data collection plan reflects careful attention to the potential sources of information for this study, particularly to the reliability of the information and the efficiency of gathering it. The plan avoids unnecessary collection of information from multiple sources and uses extant data when available. For example, states report annually on special education child counts, educational environments, personnel, and dispute resolution (IDEA Section 618), eliminating the need to collect these data in the 2019 surveys.
A5. Methods of minimizing burden on small entities

No small businesses or entities will be involved as respondents.
A6. Consequences of not collecting data

The data collection plan described in this submission is necessary for IES to comply with the requirements of the IDEA legislation. This study will provide essential information about the current implementation of IDEA. Without it, policymakers, states, districts, and schools will lack the information they need to make informed decisions about improving legislation, policies, and services for infants, toddlers, children, and youth with disabilities.
A7. Special circumstances

There are no special circumstances associated with this data collection.
A8. Federal register announcement and consultation

The 60-day notice to solicit public comments was published in Vol. 84, No. 72, pp. 15204–15205 of the Federal Register on April 15, 2019. The 30-day notice to solicit public comments was published in [insert volume, number, and page] of the Federal Register on [insert date].
The study team experts who formulated the study design and contributed to the content of the instruments include Drs. Amy Johnson (Mathematica Policy Research), Stephen Lipscomb (Mathematica Policy Research), Mary Cary “Cay” Bradley (Mathematica Policy Research), Margaret McLaughlin (University of Maryland), Sharon Walsh (Walsh Taylor, Inc.), Beth Rous (University of Kentucky), and Lauren Rhim (National Center for Special Education in Charter Schools). In addition, the study team relies on a technical working group (TWG) to provide input that ensures the study is of the highest quality and that findings are relevant to the President, policymakers, school districts, and the public. The TWG has reviewed the study plans described in this package. The TWG members have content expertise on the special education topics, described above, that are central to the study's research questions, and many also have relevant work experience at the state, district, and/or school levels. The TWG members are listed below.
Carl Beck, Pennsylvania Department of Education and Welfare
Cecelia Dodge, WestEd
John Eisenberg, Virginia Department of Education
Segun Eubanks, University of Maryland College Park
John Hosp, University of Massachusetts
Sheila Self, California Department of Education
Patricia Snyder, University of Florida
David Test, University of North Carolina – Charlotte
Gerald Tindal, University of Oregon
Laurie VanderPloeg, Kent Intermediate School District2
There are no unresolved issues.
A9. Payments or gifts

Providing data collection incentives is important in federal studies, given the recognized burden on school staff and the need for high response rates. The use of incentives has been shown to be effective in increasing response rates and reducing the number of contact attempts needed, which helps offset the cost of the payments (Dillman 2007; American Statistical Association and American Association for Public Opinion Research 2016; Jacob and Jacob 2012). In 2005, IES's National Center for Education Evaluation and Regional Assistance (NCEE) submitted a memorandum to OMB outlining guidelines for incentives in NCEE evaluation studies and tying recommended incentive levels to the level of burden (represented by the length of the survey).
Consistent with NCEE’s recommendations, the study team proposes to offer a $30 incentive to each school survey respondent to offset time and effort. School staff constitute a potentially challenging group in terms of survey participation. Schools are not required to participate in the data collection and often have a full schedule with competing demands for their time. The study team’s experience on numerous school-based collections indicates that monetary incentives increase cooperation. Incentives are proposed because high response rates are needed to make the study findings reliable. Because staff at some schools are not allowed to accept incentives directly, we will offer school staff the option of receiving a Donors Choose gift card in lieu of a participation incentive. Donors Choose gift cards can be used to support school or classroom projects, such as buying classroom supplies.
School responses are critical because the schools implement the policies that states and districts set. School staff have the most direct perspectives on what services children are receiving. The school survey will be useful for understanding school-level interventions, policies, and practices.
No incentives will be offered to state or district respondents because, as recipients of IDEA funding, they are expected to complete the surveys, and we believe that expectation will be sufficient to obtain their responses (U.S. Department of Education 2014).
A10. Assurances of confidentiality

The study team has established procedures to protect the confidentiality and security of its data. This approach will be in accordance with all relevant regulations and requirements, in particular the Education Sciences Reform Act of 2002, Title I, Subsection (c) of Section 183, which requires that the director of IES “develop and enforce standards designed to protect the confidentiality of persons in the collection, reporting, and publication of data.” The study also will adhere to the requirements of Subsection (d) of Section 183, which prohibits the disclosure of individually identifiable information and makes the publishing or inappropriate communication of individually identifiable information by employees or staff a felony.
The study team will protect the privacy and confidentiality of all people who provide data. The study will not maintain data associated with personally identifiable information (PII): study staff will assign random ID numbers to all data records and then strip any PII from those records. Other than the names and contact information of the survey respondents, which are typically already available in the public domain on state and district websites, no data collected for the surveys will contain PII. No names or contact information will be released. In addition to the data safeguards described here, the study team will ensure that no respondent names are identified in publicly available reports or findings, and, if necessary, the study team will mask distinguishing characteristics. The following statement to this effect will be included with all requests for data:
“Mathematica Policy Research follows the confidentiality and data protection requirements of IES (The Education Sciences Reform Act of 2002, Title I, Part E, Section 183). Responses to this data collection will be used only for research purposes. The reports prepared for the study will summarize findings across the sample and will not associate responses with a specific district, school, or individual. We will not provide information that identifies respondents to anyone outside the study team, except as required by law.”
The three state surveys will also include the following statement:
“Please note that data on state policies and resources/supports may be reported by state. Thus, while PII about individual respondents will not be released, data displayed by state could be attributed to the state agency or possibly an individual respondent.”
Mathematica uses the following safeguards to protect confidentiality:
All Mathematica employees sign a pledge that emphasizes the importance of confidentiality and describes their obligation (Appendix A.8).
All internal networks are protected from unauthorized access by using defense-in-depth best practices, which incorporate firewalls and intrusion detection and prevention systems. The networks are configured so that each user has a tailored set of rights, granted by the network administrator, to files approved for access and stored on the network. Access to hard-copy documents is strictly limited. Documents are stored in locked files and cabinets. Discarded materials are shredded.
Computer data files are protected with passwords, and access is limited to specific users, who must change their passwords on a regular basis and conform to strong password policies.
Especially sensitive data are maintained on removable storage devices that are kept physically secure when not in use.
After the study concludes, the study data will be transmitted to the National Center for Education Statistics (NCES) for safekeeping as a restricted-use file. Before transmittal, the data will be stripped of any individual identifiers. Researchers wishing to access the data for secondary analysis must apply for an NCES license and agree to the rules and procedures guiding the use of restricted-use files.
A11. Justification for sensitive questions

This study will include no questions of a sensitive nature.
A12. Estimates of hours burden

Table A.3 provides an estimate of time burden for the data collections, broken down by instrument and respondent. These estimates are based on our experience collecting administrative data from districts and schools.
Table A.3. Estimated response time for data collection
Respondent/Data request | Number of targeted respondents | Expected response rate (%) | Expected number of responses | Unit response time (hours) | Total burden time (hours) | Annual respondent burden (hours) | Respondent average hourly wage | Respondent labor cost
State-level staff | | | | | | | |
Part C infants and toddlers program state survey | 61 | 100 | 61 | 1.00 | 61 | 20.3 | $44.79 | $2,732.19
Part B program for preschool-age children state survey | 61 | 100 | 61 | 1.00 | 61 | 20.3 | $44.79 | $2,732.19
Part B program for school-age children state survey | 61 | 100 | 61 | 1.00 | 61 | 20.3 | $44.79 | $2,732.19
District-level staff | | | | | | | |
Part B program for preschool-age children district survey | 602 | 85 | 512 | 1.00 | 512 | 170.7 | $44.79 | $22,932.48
Part B program for school-age children district survey | 665 | 85 | 565 | 1.00 | 565 | 188.3 | $44.79 | $25,306.35
School-level staff | | | | | | | |
Part B school survey | 2,750 | 80 | 2,200 | 0.75 | 1,650 | 550 | $44.79 | $73,903.50
Total | 4,200 | | 3,460 | | 2,910 | 970 | | $130,338.90
Note: The 61 state respondents include the 50 states, the District of Columbia, eight U.S. territories, the Bureau of Indian Education, and the Department of Defense Education Activity.
The number of targeted respondents is 4,200, and the expected number of responses is 3,460. The total estimated burden is 2,910 hours.
The total of 2,910 burden hours includes the following efforts: one hour for each of the three state surveys, which will be administered to 61 coordinators in the 50 states, the District of Columbia, the U.S. territories, the Bureau of Indian Education, and the Department of Defense Education Activity; one hour for the preschool-age district survey, to be administered to 512 district coordinators; one hour for the school-age district survey, to be administered to 565 district coordinators; and 45 minutes (0.75 hours) for each school survey, to be administered to 2,200 school principals or special education directors. The estimated cost to respondents is $130,338.90, based on the 2016 Bureau of Labor Statistics average hourly wage of $44.79 for Education Administrators.
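The arithmetic behind Table A.3 can be reproduced from the expected response counts, the unit response times, and the Bureau of Labor Statistics wage rate. The sketch below is a verification aid only; the division by three to obtain annual burden reflects the usual practice of annualizing over a three-year clearance period, which appears to be the assumption behind the annual burden column in Table A.3.

```python
# (expected responses, hours per response) for each instrument in Table A.3
instruments = {
    "Part C state survey": (61, 1.00),
    "Part B preschool-age state survey": (61, 1.00),
    "Part B school-age state survey": (61, 1.00),
    "Part B preschool-age district survey": (512, 1.00),
    "Part B school-age district survey": (565, 1.00),
    "Part B school survey": (2200, 0.75),
}
hourly_wage = 44.79  # 2016 BLS average hourly wage, Education Administrators

total_hours = sum(n * hours for n, hours in instruments.values())
annual_hours = total_hours / 3          # annualized over an assumed 3-year clearance
labor_cost = total_hours * hourly_wage

print(f"Total burden: {total_hours:,.0f} hours")     # 2,910 hours
print(f"Annual burden: {annual_hours:,.0f} hours")   # 970 hours
print(f"Respondent labor cost: ${labor_cost:,.2f}")  # $130,338.90
```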
A13. Estimate of cost burden to respondents

There are no additional respondent costs associated with this data collection beyond the burden of staff time estimated in item A.12.
A14. Annualized cost to the federal government

The total cost to the federal government for this study is $3,453,247. The estimated average annual cost—including recruiting districts, designing and administering all collection instruments, processing and analyzing the data, and preparing reports—is $986,642 (the total cost divided by the 3.5 years of the study).
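As a simple, illustrative check of the arithmetic in the preceding paragraph:

```python
total_cost = 3_453_247  # total contract cost to the federal government
study_years = 3.5       # study duration used to annualize the cost
print(f"Average annual cost: ${total_cost / study_years:,.0f}")  # $986,642
```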
A15. Reasons for program changes or adjustments

This is a new collection.
A16. Plans for tabulation and publication of results

The report from the data collection will provide nationally representative, descriptive information on IDEA implementation that can inform future actions of policymakers and educators.
Applying appropriate methods. The study team will use descriptive methods to tabulate the data. The primary method will be to report point-in-time estimates of mean values for state, district, and school data. The analysis will also compare means for subgroups of schools and, where possible, examine trends across time. (See Part B, Section B.3.2 of the OMB package for information on the application of weights for selection probability and nonresponse.)
Point-in-time estimates. For state data, the study will report numbers of states and unweighted means. For district and school data, the study will report weighted means for survey variables and unweighted means from extant sources such as EDFacts where data exist for nearly all districts. Weights will reflect the probability of selection for districts and schools, with adjustments for survey nonresponse.
Point-in-time estimates for school subgroups. Subgroup analyses will provide a fuller understanding of how IDEA implementation varies across types of schools. For example, the sampling strategy will permit the study to document (and statistically test for) differences between charter schools and traditional public schools in how they identify and support children with disabilities. The sampling approach will also allow the study team to precisely estimate means for topics applicable only to specified school subgroups, such as on preschool inclusion for elementary schools with pre-kindergartens or on post–high school transition planning for secondary schools.
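The estimation approach described above can be illustrated with a short sketch. The example below uses made-up data and a hypothetical yes/no survey item; it is not the study's estimation code. It shows only the basic idea of weighting responses by the inverse of the selection probability and comparing weighted means for charter and traditional public schools.

```python
import numpy as np

# Hypothetical example records: one row per responding school.
rng = np.random.default_rng(0)
n_respondents = 2200
selection_prob = rng.uniform(0.01, 0.20, size=n_respondents)      # prob. each school was sampled
is_charter = rng.random(n_respondents) < 0.10                      # ~10% charter schools
item_response = (rng.random(n_respondents) < 0.60).astype(float)   # made-up yes/no survey item

# Design weight = 1 / selection probability. The study's final weights would
# also include nonresponse adjustments (see Part B, Section B.3.2).
weight = 1.0 / selection_prob

def weighted_mean(y, w):
    """Weighted point-in-time estimate of a survey item mean."""
    return float(np.sum(w * y) / np.sum(w))

print("Overall:", round(weighted_mean(item_response, weight), 3))
print("Charter:", round(weighted_mean(item_response[is_charter], weight[is_charter]), 3))
print("Traditional:", round(weighted_mean(item_response[~is_charter], weight[~is_charter]), 3))
```

In practice, standard errors for the subgroup comparisons would also account for the complex sample design, as described in Part B of this package.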
Trend analyses. The study team will document trends relative to the IDEA-NAIS on measures related to topics such as the development and quality of individualized education programs (IEPs), provided the item universes and framing are the same. Comparisons will be made across select items in the state and district surveys. (The IDEA-NAIS did not include a school survey.)
Study reports will use plain language, feature accessible tables and figure formats, and emphasize higher-level findings. The study team will work closely with content experts and IES on approaches for communicating findings in useful ways.
Reports will be accessible to a broad audience, will be no longer than 15 pages, and will comply fully with NCES standards and Section 508 accessibility requirements.
A17. Approval not to display the expiration date for OMB approval

IES is not requesting a waiver for the display of the OMB approval number and expiration date. The study will display the OMB expiration date.
A18. Exception to the certification statement

No exceptions to the certification statement are requested or required.
REFERENCES

American Statistical Association and American Association for Public Opinion Research. “Joint American Statistical Association/AAPOR Statement on Use of Incentives in Survey Participation.” 2016. Available at https://www.aapor.org/Publications-Media/Public-Statements/AAPOR-Statement-on-Use-of-Incentives-in-Survey-Par.aspx. Accessed August 17, 2017.
Dillman, Don A. Mail and Internet Surveys: The Tailored Design Method, Second Edition, 2007 Update. Hoboken, NJ: John Wiley, 2007. ISBN: 0-470-03856-X.
Griffith, Michael. “A Look at Funding for Students with Disabilities.” The Progress of Education Reform, vol. 16, no. 1, March 2015.
Jacob, R.T., and B. Jacob. “Prenotification, Incentives, and Survey Modality: An Experimental Test of Methods to Increase Survey Response Rates of School Principals.” Journal of Research on Educational Effectiveness, vol. 5, no. 4, 2012, pp. 401–418.
Lav, Iris, and Michael Leachman. “At Risk: Federal Grants to State and Local Governments.” Washington, DC: Center on Budget and Policy Priorities, March 13, 2017. Available at https://www.cbpp.org/research/state-budget-and-tax/at-risk-federal-grants-to-state-and-local-governments. Accessed August 15, 2017.
Leachman, M., N. Albares, K. Masterson, and M. Wallace. “Most States Have Cut School Funding, and Some Continue Cutting.” Washington, DC: Center on Budget and Policy Priorities, January 25, 2016. Available at https://www.cbpp.org/research/state-budget-and-tax/most-states-have-cut-school-funding-and-some-continue-cutting. Accessed August 15, 2017.
Rhim, Lauren. “Charter Schools in Special Education.” In Charting the Course: Special Education in Charter Schools, edited by Azure D.S. Angelov and David Bateman. Arlington, VA: Council for Exceptional Children, 2016.
Rhim, Lauren Morando, and Shaini Kothari. “Key Trends in Special Education in Charter Schools: A Secondary Analysis of the Civil Rights Data Collection.” National Center for Special Education in Charter Schools, 2018.
U.S. Congress. “Individuals with Disabilities Education Improvement Act of 2004.” Pub. Law 108-446, 118 Stat. 2647. December 3, 2004.
U.S. Department of Education. “Education Department General Administrative Regulations (EDGAR).” Sec. 75.591, 20 U.S.C. 1221e-3 and 3474. December 19, 2014.
U.S. Department of Education, Office for Civil Rights. “2013–2014 Civil Rights Data Collection, A First Look: Key Data Highlights on Equity and Opportunity Gaps in Our Nation’s Public Schools.” 2016. Available at https://www2.ed.gov/about/offices/list/ocr/docs/2013-14-first-look.pdf. Accessed March 7, 2018.
U.S. Department of Education, National Center for Education Statistics. Digest of Education Statistics: Children 3 to 21 Years Old Served Under Individuals with Disabilities Education Act (IDEA), Part B, by Race/Ethnicity and Age Group: 2000–01 Through 2015–16. 2017. Retrieved from https://nces.ed.gov/programs/digest/d17/tables/dt17_204.40.asp. Accessed March 7, 2018.
U.S. Department of Education, Office of Special Education and Rehabilitative Services, Office of Special Education Programs. “39th Annual Report to Congress on the Implementation of the Individuals with Disabilities Education Act, 2017.” Washington, DC: 2017.
Wolf, P. J., and S. Lasserre-Cortez. Special Education Enrollment and Classification in Louisiana Charter Schools and Traditional Schools (REL 2018–288). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Southwest, 2018. Available at http://ies.ed.gov/ncee/edlabs.
1 Of the 665 districts selected overall, 63 did not offer pre-kindergarten instruction and therefore are not eligible for the preschool-age district survey.
2 Subsequent to providing input on this study, Laurie VanderPloeg became the Director of the Office of Special Education Programs (OSEP), U.S. Department of Education.