Implementation of Title I/II-A Program Initiatives
Supporting Statement for Paperwork Reduction Act Submission
PART A: Justification
Contract ED-IES-11-C-0063
January 2018
Prepared for: Institute of Education Sciences, U.S. Department of Education
Prepared by: Westat and Mathematica Policy Research
Justification 1
A.1. Explanation of Circumstances That Make Collection of Data Necessary 3
A.2. How the Information Will Be Collected, by Whom, and For What Purpose 5
A.3. Use of Improved Information Technology to Reduce Burden 6
A.4. Efforts to Identify and Avoid Duplication 6
A.5. Efforts to Minimize Burden on Small Business or Other Entities 7
A.6. Consequences of Less-Frequent Data Collection 7
A.8. Federal Register Comments and Persons Consulted Outside the Agency 7
A.9. Payments to Respondents 7
A.10. Assurance of Confidentiality 8
A.11. Questions of a Sensitive Nature 9
A.12. Estimates of Respondent Burden 9
A.13. Estimates of the Cost Burden to Respondents 10
A.14. Estimates of Annualized Government Costs 10
A.15. Changes in Hour Burden 10
A.16. Time Schedule, Publication, and Analysis Plan 11
A.16.1. State and District Level Implementation 11
A.16.2. Patterns of Cross-Level Implementation 12
A.16.3. Changes in Implementation over Time 12
A.16.4. Student Achievement Trends 13
A.17. Display of Expiration Date for OMB Approval 14
A.18. Exceptions to Certification Statement 14
Appendix A. State Survey A-1
Appendix B. District Survey B-1
Appendix C. Notification Letters and Reminder Emails C-1
Tables
A-1. Estimates of respondent burden 10
Part A. Justification
This package is the second of two for the Implementation of Title I/II-A Program Initiatives study. The first data collection was conducted during spring and summer 2014 (OMB # 1850-0902; expiration date: 02/28/17). This package requests approval for a follow-up round of data collection that will include surveys of all 50 states and the District of Columbia, the same nationally representative sample of school districts selected for the 2014 data collection, and a new nationally representative sample of charter school districts. Unlike the first data collection, this follow-up will not include surveys of principals and teachers. We anticipate that the state education agency (SEA) and school district surveys will begin in April 2018.
A.1. Explanation of Circumstances That Make Collection of Data Necessary
The Title I and Title II-A programs are part of the Elementary and Secondary Education Act (ESEA). These programs are intended to promote equal access to education by providing financial assistance to schools and districts that have a high percentage of students from low-income families (Title I) and by improving teacher and principal quality (Title II-A). During the 2014–15 school year, more than 100,000 public schools used Title I funds, and the program served over 25 million children (U.S. Department of Education, 2016a). Ninety-eight percent of districts nationally received Title II-A funding for the 2015–16 school year (U.S. Department of Education, 2016b).
ESEA was most recently reauthorized as the Every Student Succeeds Act (ESSA) in December 2015. Under Title I, ESSA offers states and districts considerable autonomy while requiring them to adopt challenging academic content standards, aligned assessments, and accountability systems that set state-specific goals and identify and support low-performing schools. Under Title II-A, ESSA provides funding for a broad array of permissible activities to improve the effectiveness of educators and achieve an equitable distribution of effective educators. These four areas—accountability and turning around low-performing schools, improving teacher and leader effectiveness, state content standards, and assessments—were also focuses of Titles I and II under the version of ESEA in effect at the time of this study's 2014 data collection. Relative to the prior version of the law, however, ESSA gives states substantially more discretion over the details of implementation, particularly with respect to accountability and educator effectiveness.
The purpose of the Implementation of Title I/II-A Program Initiatives study is to describe the implementation of policies and practices funded through Titles I and II-A of ESEA at multiple points in time. The 2014 Title I/II-A data collection examined the implementation of policies and practices at the state, district, school, and classroom levels. That data collection and the resulting report1 focused on policies and practices funded through Titles I and II-A that were implemented under ESEA Flexibility and other related federal initiatives since the previous National Assessment of Title I concluded in 2006.
The 2018 follow-up data collection will be limited to surveys of states and school districts. Because the 2017–18 school year is early in the implementation of ESSA, we expect that most new policies and activities will not reach the school and classroom levels by the time the surveys are administered in spring 2018. This follow-up data collection will provide information on state and district activities in the same four core areas as the 2014 data collection. This approach will allow the study not only to describe the implementation of Title I and Title II-A provisions in 2018 but also to compare it with 2014, in order to see whether states are, for example, continuing to implement specific policies or are leaving more decisions to districts.
The 2018 data collection will also include a new nationally representative sample of charter school districts and questions on school choice, an area of increasing federal and state policy attention. With the addition of charter school districts, the study can investigate whether the implementation of policies varies in charter school districts compared to traditional school districts. A charter school district often consists of only a single charter school, which may operate quite differently from traditional school districts, not only due to small size but also due to the absence of an elected school board and freedom from state regulations.
In order to motivate the survey questions that are included in the 2018 data collection, we provide more detail below on what elements ESSA emphasizes in each of the four core content areas and how those emphases may differ from the prior version of the law.
School accountability and turning around low-performing schools
ESSA maintains requirements of the prior version of the law for several aspects of school accountability. State systems of school accountability must continue to include measures of reading and math achievement in grades 3–8 and high school, as well as measures of subgroup achievement and of high school graduation rates. ESSA also requires that states add a state-selected measure of school quality or student success and incorporate the achievement of English learners. ESSA differs from previous federal policy in that it provides states substantial discretion to design the specific aspects of their accountability systems, including setting long-term and interim goals for student achievement, determining measures of school performance and progress, and selecting the types of interventions they will implement in their lowest-performing schools. In addition, ESSA specifies that states must designate at least the lowest-performing 5 percent of their Title I schools as needing comprehensive support and improvement; this group must also include any high school where fewer than two-thirds of students graduate. An additional group of schools with underperforming subgroups is to be designated as needing targeted support and improvement. ESSA indicates that the supports and interventions provided to both types of schools are to be determined by states and districts.
Improving teacher and leader effectiveness
ESSA maintains requirements that states and districts ensure that low-income and minority students are not served at disproportionate rates by inexperienced, ineffective, or out-of-field teachers. States are also required to evaluate inequities and report on how they are addressing them. ESSA departs from the prior version of the law by eliminating requirements for “highly qualified teachers.” It also departs from prior federal initiatives by no longer requiring the development and implementation of teacher and leader evaluation systems, although it allows states to use Title II-A funds to do so using any evaluation components they choose. ESSA also allows Title II-A funds to be used for a broad range of activities to improve teacher and leader effectiveness, such as differentiating professional development and implementing innovative strategies for teacher preparation, including teacher academies and residencies.
State content standards
ESSA maintains requirements to adopt challenging content standards in reading/English language arts, mathematics, and science that promote student readiness for college and careers. However, states have flexibility regarding the specific content standards they adopt to fulfill that requirement and are not incentivized to adopt particular content standards as they were under previous federal initiatives. States also must adopt English language proficiency standards that are aligned with the state standards and may adopt alternate academic achievement standards for students with the most significant cognitive disabilities. Because ESSA requires that high schools be held accountable for graduation rates, there is also interest in the high school graduation requirements states have adopted.
Assessments
ESSA maintains annual testing requirements in reading and mathematics for public school students in each of grades 3–8 and at least once in grades 9–12, and in science at least once in each of the grade ranges 3–5, 6–9, and 10–12. These assessments are meant to indicate whether students have acquired the knowledge and skills identified in the content standards and, if not, the areas where students fall short and may require additional instruction and assistance. Although the assessments must be aligned with state content standards, ESSA provides flexibility in a number of areas: the types of assessments and assessment formats, assessments in supplemental grade levels and content areas, and accommodations for students with disabilities and English learners. ESSA also allows states to set a limit on the percentage of instructional time devoted to assessment administration. Schools are required to have at least 95 percent student participation on assessments, but each state determines how to address schools that fall short of that threshold.
The purpose of the follow-up data collection is to provide information on the core policies and practices funded by Title I and Title II-A being implemented at the state and district levels, and the resources and supports that states and districts provide to schools and teachers. The timing of the study’s data collection will provide information on the early implementation of ESSA in the 2017–18 school year. The study is authorized under Section 8601 of ESSA, which permits IES to conduct “comprehensive, high-quality evaluations of [ESEA] programs.”
Although there are research studies that cover similar topics of recent federal education policy, this follow-up Title I/II-A data collection is set apart by the breadth of research questions and the depth of responses from all SEAs and a nationally representative sample of school districts. The research questions for the Title I/II-A follow-up data collection are as follows:
How has student achievement changed over time?
What are states’ long-term goals for academic achievement and other measures? How do states and districts identify and support their lowest-performing schools, and how do they offer differentiated support for schools of varying performance levels? How has state and district identification and support for these schools changed since 2014?
What components/practices are required by states and used by districts to evaluate teacher and principal effectiveness, how are evaluation results used, and what supports do states and districts provide to improve effectiveness? Do states and districts assess the equitable distribution of teachers, and if so, how? What actions are taken to address any inequities? Are states assessing the effectiveness of teacher preparation programs, and if so, how? How are states using their Title II-A funds? How have these policies and practices changed since 2014?
Have states and districts made changes to their content standards and high school graduation requirements, and what materials and resources do states and districts provide to help school leaders and teachers implement the state content standards? Have these requirements and the materials and resources provided changed since 2014? What are states and districts doing to address the needs of students at risk of dropping out?
What types of assessments do states and districts use (in terms of assessment format, coverage of grade levels and content areas, and accommodations for students with disabilities and English learners)? What materials and resources do states and districts provide to support the implementation of assessments and use of assessment data? Have assessments and supports changed since 2014? How much time are students spending on state summative assessments and are states setting time limits? What is the extent of student opt out on state tests, and how are states and districts responding to student opt outs?
A.2. How the Information Will Be Collected, by Whom, and For What Purpose
In order to address the research questions described above, this study will rely on extant data and state and district surveys. The first research question (how student achievement has changed over time) will be addressed using trend data from the National Assessment of Educational Progress (NAEP) and the U.S. Department of Education’s EDFacts data. All other research questions will be addressed using responses from the state and district surveys.
There is no uniform source of current, detailed information on the study’s core areas. We will administer surveys to obtain this information. Each survey is described below.
State survey. The state survey, administered using an electronic, fillable PDF form, will focus on state policies, and in particular, accountability and support for low-performing schools, improving teacher and leader effectiveness, state content standards, and assessments. The survey will be sent to the chief school officer in each of the 50 states and the District of Columbia beginning in April 2018, with the expectation that different sections of the survey may be filled out by different SEA staff, as determined by their particular expertise. The state survey is in Appendix A.
District survey. This web-based survey will focus on the implementation of state policies, adoption of district policies such as state and district content standards and assessments, and supports provided to schools. The survey will be administered to superintendents or their designees at 719 school districts beginning in April 2018. The 719 school districts include the nationally representative sample of 567 districts used for the study’s 2014 data collection and a new sample of 152 charter school districts.2 The district survey is in Appendix B.
Extant data. The report will include national and state-level trends in student achievement as context for the implementation findings from the state and district surveys. The data for the student achievement analyses will come from the NAEP and EDFacts. NAEP is a nationally representative assessment of students in mathematics, reading, and other subjects that provides a common measure of achievement across states. EDFacts data are consolidated from annual state reports and include school-level performance on state assessments for all schools in the state and aggregated to the state level for all students and for student subgroups by grade level. The outcomes of interest for this study are state-level standardized test scores (using NAEP and state assessments). The analyses will focus on national and state-level trends in student proficiency levels over time. In addition, the study’s analyses will use EDFacts for school performance designations and Title I status.
A.3. Use of Improved Information Technology to Reduce Burden
The data collection plan is designed to obtain information in an efficient way that minimizes respondent burden. The state survey will use electronic fillable PDFs. The survey will be divided into modules, each a separate fillable PDF, which will allow the appropriate staff with expertise in each area to respond separately. This approach will reduce burden for respondents because (a) each individual will have fewer questions to answer and (b) respondents will be asked questions about topics in which they are well versed, so answers should be readily available. SEA staff may prefer to complete individual sections of the survey with the help of colleagues who can help ensure that responses to questions about state policies and programs are accurate. The fillable PDF documents facilitate this collaborative approach by making it easy to view and print an entire survey section or just parts of it, both before and after completion. Each state will be given access to a secure SharePoint site to facilitate access to the survey for multiple respondents and to allow respondents to forward any documents they feel would help us better understand their state policies. An electronic fillable PDF is also a more cost-effective electronic format for a survey of 51 entities.
A web-based survey will be the primary mode of data collection for the district survey. We have found web-based surveys to be a preferred method for survey completion among district respondents. Burden will be reduced with the use of skip patterns and prefilled information based on responses to previous items when appropriate. Built-in edits will reduce response errors. Like the state survey, the web-based district survey will facilitate the completion of the survey by multiple respondents, so that the most appropriate individual will be able to access each section and provide the data in their area of expertise. Each district will be assigned a single password for the district’s web survey, which can be shared with the most appropriate respondents within the district. More than one respondent can work on different sections of the web-based survey simultaneously.
Both the fillable PDF (state) and web-based survey (district) decrease the cost for postage, coding, keying and cleaning of the survey data. These survey modes also will allow respondents to complete the survey at a location and time of their choice.
For district respondents who choose not to use the web-based survey, we will offer the option of completion of an electronic version of the survey or a paper-and-pencil instrument. A phone survey option will be offered to all respondents as part of the nonresponse follow-up effort.
A.4. Efforts to Identify and Avoid Duplication
To avoid duplication, we will use the NAEP and EDFacts data for the achievement analysis rather than ask for this information from the states as part of the study’s data collection.
A.5. Efforts to Minimize Burden on Small Business or Other Entities
No small businesses will be involved as respondents. Every effort will be made to minimize the burden on respondents.
A.6. Consequences of Less-Frequent Data Collection
The Title I/II-A study most recently conducted data collection in 2014. ESSA was passed in December 2015, and the 2017–18 school year is early in its implementation. It is therefore necessary to examine initiatives in 2018 to identify any implementation changes since the passage of ESSA. As states transition to full implementation of ESSA, states and districts may be changing policies or changing how they implement existing policies, and the 2018 data collection will capture their activities in the areas of school accountability and support for low-performing schools, improving teacher and leader effectiveness, state content standards, assessments, and school choice. These data will provide policy makers with updated and detailed information on how initiatives in these core areas are playing out in states and school districts. It will be useful to understand how states and districts are implementing initiatives under Title I and Title II-A now that they have more autonomy over areas such as school accountability systems, supports for low-performing schools, and the assessment and support of educator effectiveness than they did in the 2013–14 school year. Collecting the data less frequently would leave policy makers and the public poorly informed about the state of current federal and state education reform initiatives.
A.7. Special Circumstances
There are no special circumstances involved with this data collection.
A.8. Federal Register Comments and Persons Consulted Outside the Agency
The 60-day Federal Register notice was published on October 17, 2017 (82 FR 48217), and the 30-day Federal Register notice was published. There were no public comments on the package from the 60-day notice.
A.9. Payments to Respondents
There will be no payment to respondents.
A.10. Assurance of Confidentiality
Other than the names and contact information of survey respondents, which are typically already available in the public domain (e.g., on state and school district websites), no data collected for the surveys will contain personally identifiable information. No names or contact information will be released.
Responses will be used for research or statistical purposes. States and districts receiving Title I and Title II-A funds have an obligation to participate in Department evaluations (Education Department General Administrative Regulations (EDGAR) (34 C.F.R. § 76.591)).
The following language will be included on the cover sheet of the school district survey under the Notice of Confidentiality: Information collected for this study comes under the confidentiality and data protection requirements of the Institute of Education Sciences (The Education Sciences Reform Act of 2002, Title I, Part E, Section 183). Responses to this data collection will be used only for statistical purposes. The reports prepared for the study will summarize findings across the sample and will not associate responses with a specific district or individual. We will not provide information that identifies you or your district to anyone outside the study team, except as required by law.
On the state survey, we will modify the Notice of Confidentiality statement, noting that reports will not associate responses with a specific individual. That is, while individual states may be identified in reporting, individual respondents will not be identified.
The Education Sciences Reform Act of 2002, Title I, Part E, Section 183 of this Act requires, “All collection, maintenance, use, and wide dissemination of data by the Institute” to “conform with the requirements of section 552 of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provision Act (20 U.S.C. 1232g, 1232h).” Respondents will be assured that confidentiality will be maintained, except as required by law.
Specific steps to guarantee confidentiality include the following:
Identifying information about respondents (e.g., respondent name, address, and telephone number) will not be entered into the analysis data file, but will be kept separate from other data and will be password protected. A unique identification number for each respondent will be used for building raw data and analysis files.
A fax machine used to send or receive documents that contain confidential information will be kept in a locked field room, accessible only to study team members.
Confidential materials will be printed on a printer located in a limited access field room. When printing documents that contain confidential information from shared network printers, authorized study staff will be present and retrieve the documents as soon as printing is complete.
In public reports, findings will be presented in aggregate by type of district respondent or for subgroups of interest. No reports will identify individual respondents or school districts.
Access to the sample files will be limited to authorized study staff only; no others will be authorized such access.
All members of the study team will be briefed regarding confidentiality of the data.
Most data will be entered via the web systems. However, a control system will be established to monitor the status and whereabouts of any hard copy data collection instruments during data entry.
All data will be stored in secure areas accessible only to authorized staff members. Computer-generated output containing identifiable information will be maintained under the same conditions.
Hard copies containing confidential information that is no longer needed will be shredded.
A.11. Questions of a Sensitive Nature
There are no questions of a sensitive nature asked in the surveys.
A.12. Estimates of Respondent Burden
Beginning in April 2018, surveys will be administered to respondents in the 50 states and the District of Columbia, the 567 school districts previously sampled for the 2014 Title I/II-A data collection, and a new sample of 152 charter school districts.
In all, responses will be required from 770 respondents (51 state officials for the state survey, 567 district officials from the original district sample, and 152 officials from the new charter school district sample). Although we expect there may be more than one respondent completing the survey, we are estimating the burden to complete the total survey as one respondent per state/district times the number of minutes for the total survey. In practice, the total number of minutes is likely to be divided among a larger number of respondents.
Based on the survey pretests, we estimate that it will take (1) state respondents an average of 180 minutes (in total, summed across the multiple respondents in each state working on separate survey sections) and (2) district respondents an average of 60 minutes, so the total burden for the 2018 data collection is 52,320 minutes, or 872 hours (see Table A-1 below).
Table A-1. Estimates of respondent burden
Informant/Data Collection Activity | Number of Respondents | Minutes per completion | Number of administrations | Burden in minutes | Total Burden Hours | Total Costs
State
SEA survey | 51 | 180 | 1 | 9,180 | 153 | $7,016.58
District
District survey | 719 | 60 | 1 | 43,140 | 719 | $32,973.34
Total | 770 | n/a | 1 | 52,320 | 872 | $39,989.92
NOTE: Assumes an hourly rate of $45.86 for educational administrators (derived from the Bureau of Labor Statistics’ Occupational Employment and Wages data for educational administrators, May 2016; see https://www.bls.gov/oes/current/oes119032.htm).
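The totals in Table A-1 follow directly from the respondent counts, the per-survey minutes, and the BLS hourly rate. As an illustrative check only (the function and variable names below are ours, not part of the study materials; costs are kept in integer cents to avoid rounding error), the arithmetic can be sketched as:

```python
# Illustrative check of the Table A-1 burden arithmetic; not a study instrument.
RATE_CENTS = 4586  # $45.86/hour (BLS May 2016, educational administrators)

def burden_row(respondents, minutes_each):
    """Return (burden minutes, burden hours, cost in cents) for one survey."""
    total_minutes = respondents * minutes_each
    total_hours = total_minutes // 60
    cost_cents = total_hours * RATE_CENTS
    return total_minutes, total_hours, cost_cents

state_row = burden_row(51, 180)     # SEA survey: 9,180 min, 153 h, $7,016.58
district_row = burden_row(719, 60)  # district survey: 43,140 min, 719 h, $32,973.34
total_cents = state_row[2] + district_row[2]
print(f"total: {state_row[0] + district_row[0]} minutes, "
      f"{state_row[1] + district_row[1]} hours, ${total_cents / 100:,.2f}")
```

Running the sketch reproduces the table's bottom row: 52,320 minutes, 872 hours, and $39,989.92.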
A.13. Estimates of the Cost Burden to Respondents
There are no annualized capital/startup costs or ongoing operation and maintenance costs associated with collecting the information.
A.14. Estimates of Annualized Government Costs
The cost for the design, conduct of surveys, and analysis and reporting related to the 2018 data collection is $3,671,068. The annualized cost over three years for these activities is $1,223,689.
A.15. Changes in Hour Burden
The follow-up data collection is a reinstatement, with change, of a previously approved information collection request. Relative to the previously approved request, this collection represents a reduction of 11,551 responses and 5,752 hours. Because the previous approval has expired, the requested 770 responses and 872 hours are recorded as an increase in burden.
A.16. Time Schedule, Publication, and Analysis Plan
We anticipate beginning the state and district data collection in April 2018 and continuing into the summer.
The follow-up report will follow the principles of the Federal Plain Language Action and Information Network and adhere to the requirements of the NCES Statistical Standards (2002), the IES Style Guide (2005), and other IES guidance and requirements for public reporting.
The focus of the follow-up report will be on early implementation of ESSA and changes since 2014. The report will answer a clearly established set of questions using information from the state and school district surveys and, where appropriate, extant sources of data. The report will open with an executive summary, and Chapter 1 will provide an overview of the study. The body of the report will contain chapters addressing each of the major themes of the study. Each chapter will begin with a brief context section summarizing the provisions of Titles I and II-A that embody policy-relevant issues in that area and how those policies have evolved since 2013–14. The report will include information at the state and district levels.
The follow-up report will be supported by analyses that will have four main objectives: (1) describing the extent to which policy and program initiatives related to the objectives of Title I and Title II-A are being implemented at the state and district levels, including how implementation varies by selected state and district characteristics; (2) describing patterns of cross-level implementation; (3) describing changes in implementation since 2014; and (4) describing trends in student achievement. Each set of planned analyses is described below.
A.16.1. State and District Level Implementation
The primary goal of the follow-up data collection is to describe the implementation of policy and program initiatives related to the objectives of Title I and Title II-A. To achieve this goal, extensive descriptive analyses will be conducted using the survey data. We anticipate that relatively straightforward descriptive statistics (e.g., means, frequencies, and percentages) and simple statistical tests (e.g., tests for differences of proportions) will typically be used to answer the research questions detailed in section A.1 above.
While simple descriptive statistics such as means and percentages will provide answers to many of our questions, cross-tabulations will be important to answer questions about variation across state and district characteristics. The primary characteristics of interest for the cross-tabulations are:
District poverty level: Poverty is included because Title I is specifically intended to ameliorate the effects of poverty on local funding constraints and educational opportunity.
District size: Size is included because it may be related to district capacity to develop and implement programs.
District urbanicity: Urbanicity is included because of the relationships between educational opportunity and rural isolation and the concentration of poverty in urban schools.
Concentration of English learners: Concentration of English learners is included because of the increased emphasis on ensuring that this group of students reaches English proficiency and meets state standards, and the recognition that modifications in testing as well as instruction will be needed to facilitate progress of these students.
District charter school status: There is interest in examining whether district policies and practices vary in districts consisting entirely of charter schools.
Because the districts are a statistical sample, survey data presented for districts will be weighted to national totals (tabulations will provide standard errors for the reported estimates). In addition, the descriptive tables will indicate where differences between subgroups are statistically significant. We will use chi-square tests for differences among distributions and t-tests for differences in means.
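As an illustration of the weighting just described, a national district-level percentage is a ratio of weighted response totals. The sketch below is purely hypothetical; the mini-sample, weights, and responses are invented for illustration and are not study data.

```python
def weighted_percentage(responses, weights):
    """Estimate the national percentage of districts answering 'yes',
    weighting each sampled district by its sampling weight."""
    yes_weight = sum(w for r, w in zip(responses, weights) if r)
    return 100.0 * yes_weight / sum(weights)

# Hypothetical mini-sample of four districts with unequal sampling weights
responses = [True, False, True, True]
weights = [200.0, 100.0, 50.0, 50.0]
print(weighted_percentage(responses, weights))  # 75.0
```

Here the unweighted share of "yes" districts is also 75 percent, but with different weights the two figures would diverge; the weighted estimate is what generalizes to the nation.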
A.16.2. Patterns of Cross-Level Implementation
Planned analyses of cross-level implementation involve examining responses of districts by categories of state responses. These analyses will examine the relationship between policies and programs originating at the state level and implementation “on the ground” in districts. Though the planned analyses cannot support causal conclusions about the effects of state actions on district implementation, they can provide evidence on the extent to which district practices are consistent with state policies.
Conceptually, these analyses posit that certain state policy choices influence what happens in districts. Examples of potential cross-level analyses include:
The actions districts plan to take to turn around their lowest-performing schools, by the interventions required by states and/or the guidance provided by states.
District reports of major challenges implementing state content standards by whether the state reported making major changes to content standards since April 2014.
These cross-level analyses will be based on cross-tabulations. For example, a straightforward set of descriptive tables will show the relationships between survey responses at the state level and the mean responses or percentages at the district level. Breakdowns by the previously discussed state and district characteristics will further sharpen the interpretation of these relationships.
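The cross-tabulation described above can be sketched as a simple merge of state-level responses onto district records, followed by a breakdown of the district item by state policy category. The state abbreviations, variable names, and values here are hypothetical placeholders, not survey data.

```python
# A minimal cross-level tabulation: district responses broken out by the
# policy category of each district's state. All values are hypothetical.
import pandas as pd

districts = pd.DataFrame({
    "state": ["AL", "AL", "AK", "AK", "AZ", "AZ"],
    "reported_major_challenge": [1, 0, 1, 1, 0, 0],   # district survey item
})
states = pd.DataFrame({
    "state": ["AL", "AK", "AZ"],
    "changed_standards_since_2014": ["yes", "yes", "no"],  # state survey item
})

# Attach each state's response to its districts, then tabulate the
# district-level percentage by state policy category.
merged = districts.merge(states, on="state")
table = merged.groupby("changed_standards_since_2014")["reported_major_challenge"].mean()
print(table)
```

In the study's tables, the cell entries would additionally be weighted to national totals, as discussed above.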
Analyses will address how implementation of policies and programs may have changed since the 2014 data collection. These analyses will document changes over time and will not attempt to derive any causal inferences or attribute any changes directly to the passage of ESSA.
These analyses will examine key policies that may have changed since 2014, such as:
The number of states requiring teacher evaluation systems with certain components compared with the number of states requiring systems with those same components in 2014.
The percentage of school districts with lowest-performing schools that reported receiving higher levels of professional development for teachers and principals in these schools compared with the percentage in 2014.
These analyses will be based on a comparison of responses to the same questions posed in 2014 and 2018. The analyses of state responses will report differences over time in the number of states reporting particular policy choices. The analyses of district responses will be weighted to national totals and the tables will indicate where differences between 2014 and 2018 are statistically significant, as discussed previously.
We will conduct a descriptive analysis of state-level and national trends in student proficiency levels, using NAEP results and results on state assessments. These analyses will address the research question of whether students are making progress on meeting state academic achievement standards within states and how this progress varies across states. The analyses will be entirely descriptive and will not attempt to derive any causal inferences. While these data do not provide evidence on the effects of any particular ESSA policy or of ESSA as a whole, they provide context for the implementation analyses in the remainder of the report.
The state-level analysis will use two data sources:
State-by-state NAEP proficiency rates since 2005, in math, reading, and science, for grades 4, 8, and 12; and
State-by-state proficiency rates on each state’s own assessments since 2007, in math and reading, for grades 4, 8, and 9-12 (from EDFacts).
For the analyses using state-by-state NAEP proficiency rates, we use 2005 as the baseline year because it was the final year included in the last national assessment of Title I. For the analyses using state-by-state proficiency rates on each state’s own assessments, we use 2007 as the baseline year because EDFacts does not consistently have proficiency data for all states prior to 2007. The results of this analysis will consist of a comparison of changes in student proficiency levels across states.
We will also examine national trends in student proficiency rates for the nation as a whole and for certain subgroups of students. For the nationwide average, racial/ethnic subgroups, students with disabilities, and English learners, we will present the following information:
Percent proficient on NAEP, in 2005 and the most recent year, and the difference between the two years; and
Percent proficient on state assessments, in 2007 and the most recent year, and the difference between the two years.
For each of these items we will report results separately for math and reading, and for grades 4, 8, and high school. NAEP assessments are given in grade 12, but the tested high-school grade(s) on state assessments vary across states, and we will report results for the tested high-school grade(s) for each state. Additionally, we will report the percentage of states that are improving, declining, or remaining the same in their proficiency rates on NAEP.
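The computation behind these trend tables is a straightforward difference between the baseline-year and most-recent-year proficiency rates for each subgroup. The sketch below shows the calculation with hypothetical values; the actual figures would come from NAEP and EDFacts.

```python
# Hypothetical illustration of the trend computation: percent proficient in the
# baseline year and the most recent year, and the difference, by subgroup.
import pandas as pd

naep = pd.DataFrame({
    "subgroup": ["All students", "English learners", "Students with disabilities"],
    "pct_proficient_2005": [31.0, 8.0, 11.0],     # hypothetical baseline values
    "pct_proficient_latest": [34.0, 10.0, 12.0],  # hypothetical latest values
})
naep["change"] = naep["pct_proficient_latest"] - naep["pct_proficient_2005"]
print(naep)
```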
Finally, we will use existing data to show the trend in the nationwide graduation rate and the latest graduation rates for each state using the four-year adjusted-cohort graduation rate data from EDFacts. These data will be examined by student subgroups (e.g., major racial and ethnic groups, economically disadvantaged).
A.17. Display of Expiration Date for OMB Approval
The Institute of Education Sciences is not requesting a waiver for the display of the OMB approval number and expiration date. The surveys (Appendices A and B) and notification letters (Appendix C) will display the expiration date for OMB approval.
A.18. Exceptions to Certification Statement
This submission does not require an exception to the Certification for Paperwork Reduction Act (5 CFR 1320.9).
1 Troppe, P., Milanowski, A., Heid, C., Gill, B., and Ross, C. (2017). Implementation of Title I and Title II-A Program Initiatives: Results from 2013-14. (NCEE 2017-4014). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.
2 The study sampled 570 districts for the 2014 data collection. However, three of these districts have since closed.