National Center for Education Statistics
National Assessment of Educational Progress
National Assessment of Educational Progress (NAEP)
2025 Long-Term Trend (LTT) Clearance Package
Supporting Statement
Part A
OMB# 1850-0928 v.35
October 2023
revised June 2024
A.1. Circumstances Making the Collection of Information Necessary
A.1.a. Purpose of Submission
A.1.b. Legislative Authorization
A.1.c. Overview of NAEP Assessments
A.1.c.2. Cognitive Item Development
A.1.d. Overview of 2025 NAEP Assessments
A.2. How, by Whom, and for What Purpose the Data Will Be Used
A.3. Improved Use of Technology
A.4. Efforts to Identify Duplication
A.5. Burden on Small Businesses or Other Small Entities
A.6. Consequences of Collecting Information Less Frequently
A.7. Consistency with 5 CFR 1320.5
A.8. Consultations Outside the Agency
A.9. Payments or Gifts to Respondents
A.10. Assurance of Confidentiality
A.12. Estimation of Respondent Reporting Burden (2025)
A.14. Estimates of Cost to the Federal Government
A.16. Time Schedule for Data Collection and Publications
A.17. Approval for Not Displaying OMB Approval Expiration Date
A.18. Exceptions to Certification Statement
Appendices Included:
Appendix A External Advisory Committees
Appendix B NAEP 2012 Long-Term Trend (LTT) Weighting Procedures Design
Appendix C NAEP 2025 Long-Term Trend (LTT) Sampling Memo
Appendix D NAEP 2022 Long-Term Trend (LTT) Communications and Recruitment Materials
Appendix E NAEP 2012 Long-Term Trend (LTT) Sampling Design
Appendix F1 NAEP 2025 Long-Term Trend (LTT) Assessment Management Systems (AMS) Screens
Appendix G NAEP 2025 Long-Term Trend (LTT) Survey Questionnaires
Appendix G-S NAEP 2025 Long-Term Trend (LTT) Bilingual Spanish Survey Questionnaires
The National Assessment of Educational Progress (NAEP) is a federally authorized survey of student achievement at grades 4, 8, and 12 in various subject areas, such as mathematics, reading, writing, science, U.S. history, and civics.
NAEP is conducted by the National Center for Education Statistics (NCES) in the Institute of Education Sciences of the U.S. Department of Education. As such, NCES is responsible for designing and executing the assessment, including designing the assessment procedures and methodology, developing the assessment content, selecting the final assessment content, sampling schools and students, recruiting schools, administering the assessment, scoring student responses, determining the analysis procedures, analyzing the data, and reporting the results.
The National Assessment Governing Board (henceforth referred to as the Governing Board or NAGB), appointed by the Secretary of Education but independent of the Department, is a bipartisan group whose members include governors, state legislators, local and state school officials, educators, business representatives, and members of the general public. The Governing Board sets policy for NAEP and is responsible for developing the frameworks and test specifications that serve as the blueprint for the assessments.
The NAEP assessments contain two broad types of items: “cognitive” assessment items, which measure what students know and can do in an academic subject, and “survey” or “non-cognitive” items, which gather demographic information as well as construct-related information, such as courses taken. The survey portion collects data from students, teachers, and school administrators. Because NAEP assessments are administered uniformly, using the same sets of test booklets across the nation, NAEP results serve as a common metric for all states and select urban districts. The assessment stays essentially the same from year to year, with only carefully documented changes, which permits NAEP to provide a clear picture of student academic progress over time.
NAEP consists of two assessment programs: the NAEP long-term trend (LTT) assessment and the main NAEP assessment. The LTT assessments are given at the national level only and are administered to students at ages 9, 13, and 17 in a manner that differs considerably from that used for the main NAEP assessments. LTT reports mathematics and reading results that present trend data since the 1970s. The timing of the LTT assessments also differs: October through December for 13-year-olds, January through March for 9-year-olds, and March through May for 17-year-olds. The LTT assessments for ages 9 and 13 were last administered in 2022, and the LTT Age 17 assessment was scheduled to take place in March 2020. When schools closed in the spring of 2020 in response to the global coronavirus pandemic, the LTT Age 17 assessment was postponed indefinitely. Of note, NCES and NAGB decided to administer LTT Age 9 again as part of NAEP 2022; although doing so delayed the age 17 collection further, repeating the age 9 collection in 2022 enabled more direct pre- and post-pandemic comparisons for age 9 students. Similarly, NAGB decided in November 2021 that NAEP would re-administer LTT Age 13 in the fall of 2022, allowing pre- and post-pandemic comparisons for age 13 students as well. The last administration of LTT Age 17 occurred in 2012. This submission covers the administration of the 2025 NAEP LTT assessments at Ages 9, 13, and 17.
NAEP provides results on subject-matter achievement, instructional experiences, and school environment for populations of students (e.g., all fourth-graders) and groups within those populations (e.g., female students, Hispanic students). NAEP does not provide scores for individual students or schools. The main NAEP assessments report current achievement levels and trends in student achievement at grades 4, 8, and 12 for the nation and, for certain assessments (e.g., reading and mathematics), states and select urban districts. The Trial Urban District Assessment (TUDA) is a special project developed to determine the feasibility of reporting district-level results for large urban districts. Currently, the following 27 districts participate in the TUDA program: Albuquerque, Atlanta, Austin, Baltimore City, Boston, Charlotte, Chicago, Clark County (NV), Cleveland, Dallas, Denver, Detroit, District of Columbia (DCPS), Duval County (FL), Fort Worth, Guilford County (NC), Hillsborough County (FL), Houston, Jefferson County (KY), Los Angeles, Miami-Dade, Milwaukee, New York City, Orange County (FL), Philadelphia, San Diego, and Shelby County (TN). Note that LTT reporting is at the national level only.
The possible universe of student respondents is estimated to be 12 million at grades 4, 8, and 12 for main NAEP, and at ages 9, 13, and 17 for LTT, attending the approximately 154,000 public and private elementary and secondary schools in the 50 states and the District of Columbia, including Bureau of Indian Education and Department of Defense Education Activity (DoDEA) schools. Note that territories, including Puerto Rico, are not included in the national samples.
This request is to conduct NAEP LTT assessments in 2025 at Ages 9, 13, and 17 in reading and mathematics. The assessments will follow the traditional NAEP design, in which each student is assessed for 45 minutes (three 15-minute blocks) in one cognitive subject and for 5 minutes with one survey questionnaire block. Additionally, school survey questionnaires will be administered on paper for all three ages, with an estimated burden of 30 minutes. NAEP LTT is a paper-based assessment (PBA) and thus will be conducted without the use of the technology NAEP utilizes in the operational main NAEP administrations.
The library of possible items to be used in the NAEP LTT 2025 questionnaires is provided in Appendices G and G-S, which include English and Spanish items. These documents include the final 2025 LTT questionnaires (approved under OMB# 1850-0928 v.32). The previously approved placeholder versions of the LTT Age 9, 13, and 17 communication and recruitment materials have now been updated, and the final versions are available in Appendix D. The final versions of the 2025 Assessment Management System (AMS) materials have also been updated accordingly in this Amendment #1 submission (see Appendix F1).
Some of the assessment, questionnaire, and recruitment materials are translated into Spanish. Specifically, Spanish versions of the student assessments and questionnaires are used for qualified English learner (EL) students when a bilingual accommodation is offered for ages 9 and 13 for LTT (note that no LTT bilingual accommodation is offered for reading or Age 17 mathematics). In addition, every year, Spanish versions of parent/guardian communication materials are used nationwide for Spanish-speaking parents/guardians.
In the current legislation that reauthorized NAEP, the National Assessment of Educational Progress Authorization Act (20 U.S.C. §9622), Congress mandates the collection of national education survey data through a national assessment program:
ESTABLISHMENT- The Commissioner for Education Statistics shall, with the advice of the Assessment Board established under section 302, carry out, through grants, contracts, or cooperative agreements with one or more qualified organizations, or consortia thereof, a National Assessment of Educational Progress, which collectively refers to a national assessment, State assessments, and a long-term trend assessment in reading and mathematics.
PURPOSE; STATE ASSESSMENTS-
(1) PURPOSE- The purpose of this section is to provide, in a timely manner, a fair and accurate measurement of student academic achievement and reporting of trends in such achievement in reading, mathematics, and other subject matter as specified in this section.
The National Assessment of Educational Progress Authorization Act also requires the assessment to collect data on specified student groups and characteristics, including information organized by race/ethnicity, gender, socio-economic status, disability, and English learners. This allows for the fair and accurate presentation of achievement data and permits the collection of background, non-cognitive, or descriptive information that is related to academic achievement and aids in the fair reporting of results. The intent of the law is to provide representative sample data on student achievement for the nation, the states, and a variety of populations of students, and to monitor progress over time.
The statute and regulation mandating or authorizing the collection of this information can be found at https://www.law.cornell.edu/uscode/text/20/9622.
This section provides a broad overview of NAEP assessments, including information on the assessment frameworks, the cognitive and survey items, inclusion policies, and the assessment types.
Main NAEP assessments follow subject-area frameworks developed by the Governing Board and use the latest advances in assessment methodology. Frameworks capture a range of subject-specific content and thinking skills needed by students in order to deal with the complex issues they encounter inside and outside their classrooms. The NAEP frameworks are determined through a development process that ensures they are appropriate for current educational requirements. Because the assessments must remain flexible to mirror changes in educational objectives and curricula, the frameworks must be forward-looking and responsive, balancing current teaching practices with research findings.
NAEP frameworks can serve as guidelines for planning assessments or revising curricula. They also can provide information on skills appropriate to grades 4, 8, and 12 and can be models for measuring these skills in innovative ways. The subject-area frameworks evolve to match instructional practices. Developing a framework generally involves the following steps:
widespread participation and reviews by educators and state education officials;
reviews by steering committees whose members represent policymakers, practitioners, and members of the general public;
involvement of subject supervisors from education agencies;
public hearings; and
reviews by scholars in the field, by NCES staff, and by a policy advisory panel.
The frameworks can be found at https://www.nagb.gov/naep-frameworks/frameworks-overview.html.
Information about what the LTT assessments measure can be found at NAEP - Long-Term Trend: What Mathematics Measures (ed.gov) for mathematics and NAEP - Long-Term Trend: What Reading Measures (ed.gov) for reading.
As part of the item development process, NCES calls on many constituents to guide the process and review the assessment. Item development follows a multi-year design plan that is grounded in the framework and establishes the design principles, priorities, schedules, and reporting goals for each subject. Based on this plan, the NAEP contractor creates a development plan outlining the item inventory and objectives for new items and then begins by developing more items than are needed. This item pool is then subjected to:
internal contractor review with content experts, teachers, and experts on political sensitivity and bias;
playtesting, tryouts, or cognitive interviews with small groups of students for select items (particularly those that have new item types, formats, or challenging content); and
refinement of items and scoring rubrics under NCES guidance.
Next, a standing committee of content experts, state and local education agency representatives, teachers, and representatives of professional associations reviews the items. The standing committee considers:
the appropriateness of the items for the particular grade or age;
the representative nature of the item set;
the compatibility of the items with the framework and test specifications; and
the quality of items and scoring rubrics.
For state-level assessments, this may be followed by a state item review where further feedback is received. Items are then revised and submitted to NCES and the Governing Board Assessment Development Committee for approval prior to pilot testing.
The pilot test is used to finalize the testing instrument. Items may be dropped from consideration or move forward to the operational assessment. The item set is once again subjected to review by the standing committee and NCES following generally the same procedure described above. A final set of test items is then assembled for NCES and the Governing Board’s review and approval. After the operational assessment, items are once again examined. In rare cases where item statistics indicate problems, the item may be dropped from the assessment. The remaining items are secured for reuse in future assessments, with a subset of those items publicly released.
In addition to assessing subject-area achievement, NAEP collects information that serves to fulfill the reporting requirements of the federal legislation and to provide context for the reporting of student performance. The legislation requires that, whenever feasible, NAEP include information on special groups (e.g., information reported by race, ethnicity, socio-economic status, gender, disability, and limited English proficiency). As part of most NAEP assessments, three types of questionnaires are used to collect information: student, teacher, and school. An overview of the questionnaires is presented below, and the survey questionnaires are available in Appendices G and G-S.
Each NAEP student assessment booklet includes non-cognitive items, also known as the student questionnaire. The questionnaires appear in separately timed blocks of items in the assessment forms. The items collect information on students’ demographic characteristics, classroom experiences, and educational support. Students’ responses provide data that give context to NAEP results and allow researchers to track factors associated with academic achievement. Students complete the questionnaires voluntarily (see Section A.10 for confidentiality provisions). Student names are never reported with their responses or with the other information collected by NAEP.
Each Operational NAEP student questionnaire includes three types of items:
General student information: Student responses to these items are used to collect information about factors such as race or ethnicity and parents’ education level. Answers on the questionnaires also provide information about factors associated with academic performance, including homework habits, the language spoken in the home, and the number of books in the home.
Other contextual/policy information: These items focus on students’ educational settings and experiences and collect information about students’ attendance (i.e., days absent), family discourse (i.e., talking about school at home), reading load (i.e., pages read per day), and exposure to English in the home. There are also items that ask about students’ effort on the assessment and the difficulty of the assessment. Answers on the questionnaires provide information on how aspects of education and educational resources are distributed among different groups.
Subject-specific information: In most NAEP administrations, these items cover three categories of information: (1) time spent studying the subject; (2) instructional experiences in the subject; and (3) student factors (e.g., effort, confidence) related to the subject and the assessment.
Note that teacher and school administrator survey questionnaires are collected in the operational main NAEP administrations. School survey questionnaires will be administered in LTT 2025 across all three ages, but teacher survey questionnaires are not applicable for the 2025 NAEP LTT data collection.
The school questionnaire provides supplemental information about school factors that may influence students’ achievement. It is given to the principal or another official of each school that participates in the NAEP assessment. While schools’ completion of the questionnaire is voluntary, NAEP encourages schools’ participation since it makes the NAEP assessment more accurate and complete.
The LTT 2025 school questionnaire will be administered via paper only across Ages 9, 13, and 17. The first part tends to cover characteristics of the school, including the length of the school day and year, school enrollment, absenteeism, dropout rates, and the size and composition of the teaching staff. Subsequent parts of the school questionnaire tend to cover tracking policies, curricula, testing practices, special priorities, and schoolwide programs and problems. The questionnaire also collects information about the availability of resources, policies for parental involvement, special services, and community services. In 2025, school administrators will answer questions about their school’s instructional organization and practices related to learning recovery following the COVID-19 outbreak and gaps in learning that have developed due to the extended period of remote and hybrid learning that took place during the pandemic.
The Background Information Framework and the Governing Board’s Policy on the Collection and Reporting of Background Data (located at https://www.nagb.gov/content/nagb/assets/documents/policies/collection-report-backg-data.pdf) guide the collection and reporting of non-cognitive assessment information. In addition, subject-area frameworks provide guidance on subject-specific, non-cognitive assessment questions to be included in the questionnaires. The development process is very similar to that for the cognitive items, including review of the existing item pool; development of more items than are intended for use; review by experts (including the standing committee); and cognitive interviews with students, teachers, and school staff. When developing the questionnaires, NAEP uses a pretesting process so that the final questions are minimally intrusive or sensitive, are grounded in educational research, and yield answers that provide information relevant to the subject being assessed.
In the web-based NAEP Data Explorer (located at https://www.nationsreportcard.gov/ndecore/landing), the results of the questionnaires are sorted into eight broad categories: Major Reporting Groups, Student Factors, Factors Beyond School, Instructional Content and Practice, Teacher Factors, School Factors, Community Factors, and Government Factors.
Previously approved versions of the LTT Age 9, 13, and 17 questionnaires are provided in Appendices G and G-S (OMB# 1850-0928 [v.16 for Age 17] [v.22 for Ages 9 and 13]). Slight updates were made to the school core items for Ages 9, 13, and 17, and all items (including the revisions) were approved under OMB# 1850-0928 v.32.
It is important for NAEP to assess as many students selected to participate as possible. Assessing representative samples of students, including students with disabilities (SD) and English learners (EL), helps to ensure that NAEP results accurately reflect the educational performance of all students in the target population and can continue to serve as a meaningful measure of U.S. students’ academic achievement over time.
The Governing Board, which sets policy for NAEP, has been exploring ways to ensure that NAEP continues to appropriately include as many students as possible and to do so in a consistent manner for all jurisdictions assessed and reported on. In March 2010, the Governing Board adopted a policy, NAEP Testing and Reporting on Students with Disabilities and English Language Learners (located at https://www.nagb.gov/content/nagb/assets/documents/policies/naep_testandreport_studentswithdisabilities.pdf). This policy was the culmination of work with experts in testing and curriculum and those who work with exceptional children and students learning to speak English. The policy aims to:
maximize participation of sampled students in NAEP;
reduce variation in exclusion rates for SD and EL students across states and districts;
develop uniform national rules for including students in NAEP; and
ensure that NAEP is fully representative of SD and EL students.
The policy defines specific inclusion goals for NAEP samples. At the national, state, and district levels, the goal is to include 95 percent of all students selected for the NAEP samples, and 85 percent of those in the NAEP sample who are identified as SD or EL.
Students are selected to participate in NAEP based on a sampling procedure designed to yield a sample of students that is representative of students in all schools nationwide and in public schools within each state. First, schools are selected, and then students are sampled from within those schools without regard to disability or English language proficiency. Once students are selected, those previously identified as SD or EL may be offered accommodations or excluded.
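For illustration purposes only, the sketch below (in Python) shows the general two-stage logic described above: schools drawn with probability proportional to enrollment, followed by a random draw of students within each selected school. The school frame, function names, and parameters here are hypothetical, the school draw is with replacement for simplicity, and NAEP's operational design involves stratification, measures of size, and weighting that this sketch omits (see Appendices C and E).

    import random

    def two_stage_sample(schools, n_schools, n_students_per_school, seed=12345):
        """Stage 1: draw schools with probability proportional to enrollment
        (with replacement, for simplicity). Stage 2: draw a simple random
        sample of students within each selected school, without regard to
        disability or English language proficiency."""
        rng = random.Random(seed)
        selected = rng.choices(
            schools, weights=[s["enrollment"] for s in schools], k=n_schools
        )
        sample = []
        for school in selected:
            k = min(n_students_per_school, len(school["students"]))
            sample.extend(rng.sample(school["students"], k))
        return sample

    # Hypothetical three-school frame.
    frame = [
        {"name": "School A", "enrollment": 400, "students": [f"A{i}" for i in range(400)]},
        {"name": "School B", "enrollment": 90, "students": [f"B{i}" for i in range(90)]},
        {"name": "School C", "enrollment": 250, "students": [f"C{i}" for i in range(250)]},
    ]
    print(len(two_stage_sample(frame, n_schools=2, n_students_per_school=25)))  # 50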
Accommodations in the testing environment or administration procedures are provided for SD and EL students. Some examples of accommodations permitted by NAEP are extra time, testing in small-group or one-on-one sessions, reading aloud to a student, and scribing a student’s responses. Some examples of testing accommodations not allowed are giving the reading assessment in a language other than English or reading the passages in the reading assessment aloud to the student.
States and jurisdictions vary in their proportions of students with disabilities and in their policies on inclusion and the use of accommodations. Despite the increasing identification of SD and EL students in some states, in particular of EL students at grade 4, NAEP inclusion rates have generally remained steady or increased since 2003. This reflects efforts on the part of states and jurisdictions to include all students who can meaningfully participate in the NAEP assessments. The NAEP inclusion policy is an effort to ensure that this trend continues.
NAEP uses three types of assessment activities, which may simultaneously be in the field during any given data collection effort. Each is described in more detail below.
Operational NAEP administrations, unlike pilot administrations, collect data to publicly report on the educational achievement of students as required by federal law. The NAEP results are reported in The Nation’s Report Card (http://nationsreportcard.gov/), which is used by policymakers, state and local educators, principals, teachers, and parents to inform educational policy decisions.
Pilot testing (also known as field testing) of cognitive and non-cognitive items is carried out in all subject areas. Pilot assessments are usually conducted in conjunction with operational assessments and use the same procedures as the operational assessments. The purpose of pilot testing is to obtain information regarding clarity, difficulty levels, timing, and feasibility of items and conditions. In addition to ensuring that items measure what is intended, the data collected from pilot tests serve as the basis for selecting the most effective items and data collection procedures for the subsequent operational assessments. Pilot testing is a cost-effective means for revising and selecting items prior to an operational data collection because the items are administered to a small nationally representative sample of students, and data are gathered about performance that crosses the spectrum of student achievement. Items that do not work well can be dropped or modified before the operational administration.
Prior to pilot testing, many new items are pre-tested with small groups of sample participants (cleared under the NCES pretesting generic clearance agreement; OMB# 1850-0803). All non-cognitive items undergo one-on-one cognitive interviews, which are useful for identifying questionnaire and procedural problems before larger-scale pilot testing is undertaken. Select cognitive items also undergo pre-pilot testing, such as item tryouts or cognitive interviews, in order to test out new item types or formats, or challenging content. In addition, usability testing is conducted on new technologies and digitally-based platforms and instruments.
Special studies are an opportunity for NAEP to investigate particular aspects of the assessment without impacting the reporting of NAEP results. Previous special studies have focused on linking NAEP to other assessments or linking across NAEP same-subject frameworks, investigating the expansion of the item pool, evaluating specific accommodations, investigating administration modes (such as digitally based assessment, or DBA, alternatives), and providing targeted data on specific student populations.
In addition to the overarching goal of NAEP to provide data about student achievement at the national, state, and district levels, NAEP also provides specially targeted data on an as-needed basis. At times, this may only mean that a special analysis of the existing data is necessary. At other times, this may include the addition of a short, add-on questionnaire targeted at specified groups. For example, in the past, additional student, teacher, and school questionnaires were developed and administered as part of the National Indian Education Study (NIES) that NCES conducted on behalf of the Office of Indian Education. Through such targeted questionnaires, important information about the achievement of a specific group is gathered at minimal additional burden. These types of special studies are intentionally kept to a minimum and are designed to avoid jeopardizing the main purpose of the program.
Since the early 1970s, NAEP has monitored student performance in mathematics and reading through LTT assessments. These assessments measure students’ educational progress over long time periods to identify and monitor trends in performance. Results from the LTT assessments are based on nationally representative samples of 9-, 13-, and 17-year-old students.
Unlike the main NAEP assessments, which are updated to reflect current educational content and assessment methodology, LTT has remained relatively unchanged since 1990. In the 1970s and 1980s, the assessments changed to reflect changes in curriculum in the nation's schools, but continuity of assessment content was sufficient to avoid a break in the trend lines. The LTT assessment is administered on paper only.
The LTT assessment measures students’ knowledge in mathematics and reading. Survey questionnaires, administered to students who participate in an LTT assessment, are used to collect and report contextual information about the students’ learning experience in and out of the classroom.
Student performance on the NAEP LTT assessment is presented in two ways: scale scores and performance levels:
Scale scores represent the average performance of students who took the LTT assessment in mathematics and reading. Scores are aggregated and reported for the nation and groups of students based on gender, race/ethnicity, etc.
Performance levels are reported as the percentages of students attaining specific levels of performance corresponding to five points on the NAEP long-term trend reading and mathematics scales (150, 200, 250, 300, and 350).
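For illustration purposes only, the sketch below shows how the percentage of students at or above each of the five fixed performance levels could be tabulated from a set of scale scores. The scores here are hypothetical, and operational NAEP estimates additionally incorporate sampling weights and the underlying measurement model.

    # Hypothetical scale scores; operational estimates use sampling weights
    # and the measurement model rather than raw score lists.
    def percent_at_or_above(scale_scores, levels=(150, 200, 250, 300, 350)):
        n = len(scale_scores)
        return {level: 100.0 * sum(s >= level for s in scale_scores) / n
                for level in levels}

    scores = [142, 187, 221, 254, 263, 272, 301, 318, 349, 356]
    for level, pct in sorted(percent_at_or_above(scores).items()):
        print(f"At or above {level}: {pct:.1f}%")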
Information related to the 1970-2022 Trends can be found at https://nces.ed.gov/nationsreportcard/ltt/.
The Governing Board determines NAEP policy and the assessment schedule, and future Governing Board decisions may result in changes to the plans represented here. Any changes will be presented in subsequent clearance packages or revisions to the current package.
The 2025 data collection will consist of the following:
Operational national PBA LTT assessments of 9-, 13-, and 17-year-old students. Each age group will be administered mathematics and reading assessments. The assessments are conducted in different time periods: October through December 2024 for 13-year-olds, January through March 2025 for 9-year-olds, and March through May 2025 for 17-year-olds.
As mentioned above in Section A.1.a, the LTT assessments for ages 9 and 13 were last administered in NAEP 2022, while the LTT Age 17 assessment, originally scheduled for March 2020, was postponed indefinitely when schools closed in response to the coronavirus pandemic; its last administration occurred in 2012. NCES currently plans to administer LTT Ages 9, 13, and 17 in 2025. Although it is considered a 2025 administration, LTT Age 13 will be administered in the fall of 2024.
Results will be reported on the 2025 operational LTT assessments in mathematics and reading.
The NAEP operational results are reported in The Nation’s Report Card, which is used by policymakers, state and local educators, principals, teachers, and parents to help inform educational policy decisions. The LTT NAEP report cards provide national trend results (both recent trends and trends extending back to the early 1970s), trends for different student groups, and results for different percentiles. If NCES elects to release sample items, percentage-correct statistics on those items will be provided in the report. NAEP does not provide scores for individual students or schools.
Results from each LTT NAEP assessment are provided online in an interactive website (http://nationsreportcard.gov/). Additional LTT data tools are available online for those interested in:
analyzing NAEP data and creating tables and graphics (https://www.nationsreportcard.gov/ndecore/xplore/ltt); and
searching, sorting, and providing data for sample NAEP items (https://nces.ed.gov/nationsreportcard/nqt/).
In addition to contributing to the reporting tools mentioned above, data from the questionnaires are used as part of the marginal estimation procedures that produce the student achievement results. Questionnaire data are also used to perform quality control checks on school-reported data and in special reports, such as the Black–White Achievement Gap report (http://nces.ed.gov/nationsreportcard/studies/gaps/) and the Classroom Instruction Report in reading, mathematics, and science based on the 2015 Student Questionnaire Data (https://www.nationsreportcard.gov/sq_classroom/#mathematics).
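For illustration purposes only, the sketch below conveys the general idea behind marginal estimation with "plausible values": proficiency is regressed on background (questionnaire) variables, and several draws from the fitted conditional distribution are made for each student, with reporting statistics computed on each draw and averaged. The data are hypothetical, and the operational procedure also conditions on students' responses to the cognitive items; this sketch is a deliberate simplification.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1000
    # Hypothetical background (questionnaire) variables: an intercept,
    # a binary group indicator, and one continuous contextual variable.
    X = np.column_stack([np.ones(n), rng.integers(0, 2, n), rng.normal(size=n)])
    theta = X @ np.array([250.0, 12.0, 20.0]) + rng.normal(0.0, 30.0, n)

    # Latent regression of proficiency on the background variables.
    beta, *_ = np.linalg.lstsq(X, theta, rcond=None)
    resid_sd = np.std(theta - X @ beta)

    # Draw five "plausible values" per student from the fitted conditional
    # distribution; a reporting statistic (here, the overall mean) is
    # computed on each draw and the results are averaged.
    plausible_values = [X @ beta + rng.normal(0.0, resid_sd, n) for _ in range(5)]
    print(np.mean([pv.mean() for pv in plausible_values]))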
Lastly, there are numerous opportunities for secondary data analysis because of NAEP’s large scale, the regularity of its administrations, and its stringent quality control processes for data collection and analysis. NAEP data are used by researchers and educators who have diverse interests and varying levels of analytical experience.
NAEP has continually moved to administration methods that include greater use of technology, as described below.
Each school participating in NAEP has a designated staff member to serve as its NAEP school coordinator. Pre-assessment and assessment activities include functions such as finalizing student samples, verifying student demographics, reviewing accommodations, and planning logistics for the assessment. NAEP is moving in the direction of paperless administrations. An electronic pre-assessment system (known as the Assessment Management System or AMS) was developed so that school coordinators would provide requested administration information online, including logistical information, updates of student and teacher information, and the completion of inclusion and accommodation information.
NAEP administers a combination of selected-response items and open-ended or constructed-response items. NAEP currently uses human scorers to score the constructed-response items, using detailed scoring rubrics and proven scoring methodologies. With the increased use of technologies, the methodology and reliability of automated scoring (i.e., the scoring of constructed-response items using computer software) has advanced. While NAEP does not currently employ automated scoring methodologies operationally, these are being investigated for possible future use. In particular, NCES recently held a competition to examine a variety of automated scoring engines and methods for consideration in NAEP (see: https://nces.ed.gov/whatsnew/press_releases/1_21_2022.asp).
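For illustration purposes only, the sketch below shows one common family of automated-scoring approaches: training a regression model on constructed responses that human raters have already scored. The responses, scores, and model choice are hypothetical and are not NAEP's scoring engine or any competition entry; production systems are substantially more sophisticated and are validated against human raters.

    # Hypothetical human-scored constructed responses (scores 0-3) and a
    # simple text-regression model; not NAEP's scoring engine.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import Ridge
    from sklearn.pipeline import make_pipeline

    responses = [
        "the graph shows a steady increase over time",
        "it goes up",
        "the values rise because the rate of change is positive",
        "i do not know",
    ]
    human_scores = [2, 1, 3, 0]

    model = make_pipeline(TfidfVectorizer(), Ridge(alpha=1.0))
    model.fit(responses, human_scores)
    print(model.predict(["the line on the graph keeps increasing"]))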
The proposed assessments, including the questionnaires, do not exist in the same format or combination in the U.S. Department of Education or elsewhere. The non-cognitive data gathered by NAEP comprise the only comprehensive cross-sectional survey performed regularly on a large-scale basis that can be related to extensive achievement data in the United States. No other federally funded studies have been designed to collect data for the purpose of regularly assessing trends in educational progress and comparing these trends across states. None of the major non-federal studies of educational achievement were designed to measure changes in national achievement. In short, no existing data source in the public or private sector duplicates NAEP.
While the survey items in NAEP are unique, the items are not developed in a vacuum. Their development is informed by similar items in other assessments and survey programs. In addition, in future rounds of development, NCES will continue to better align the NAEP survey questions with other surveys (particularly, but not limited to, those from other NCES and federal survey programs).
Historically, NAEP has served a critical national “audit” function, offering an extremely helpful reference point for interpreting score trends on “high-stakes” tests used for school accountability. The main NAEP scales have served this function well even though high-stakes state assessments were not always closely aligned with the corresponding NAEP assessments. Given the significant changes currently underway in the American educational landscape, including the Next Generation Science Standards, the Common Core State Standards, and the Partnership for Assessment of Readiness for College and Careers (PARCC) and Smarter Balanced consortia, this “audit” function is even more important.
NAEP has provided the best available information about the academic achievement of the nation’s students in relation to consensus assessment frameworks, maintaining long-term trend lines for decades. In addition to reporting at the national level, NAEP has offered achievement comparisons among participating states for more than two decades, and since 2003, all states have participated in the NAEP mathematics and reading assessments at the fourth and eighth grades. More recently, NAEP has also reported achievement for selected large urban school districts. In addition to characterizing the achievement of fourth-, eighth-, and twelfth-grade students in a variety of subject areas, NAEP has also served to document the often substantial disparities in achievement across demographic groups, tracking both achievement and achievement gaps over time. In addition to describing educational achievement, NAEP has furthered deliberation as to the scope and meaning of achievement in mathematics, reading, and other subject areas. NAEP assessments are aligned to ambitious assessment frameworks developed by a thoughtful process to reflect the best thinking of educators and content specialists. These frameworks have served as models for the states and other organizations to follow. Finally, NAEP has also served as a laboratory for innovation, developing and demonstrating new item formats, as well as statistical methods and models now emulated by large-scale assessments worldwide.
NAEP has functioned well as a suite of complex survey modules conducted as assessments of student achievement in fixed testing windows. The complexity of NAEP evolved by necessity to address its legal and policy reporting requirements and the complex sampling of items and students needed to make reliable and valid inferences at the subgroup, district, state, and national levels for stakeholders ranging from policymakers to secondary analysts, without creating an undue burden on students and schools.
The school samples for NAEP contain small-, medium-, and large-size schools, including private schools. Schools are included in the sample proportional to their representation in the population, or as necessary to meet reporting goals. It is necessary to include small and private schools so that the students attending such schools are represented in the data collection and in the reports. The trained field staff work closely with all schools to ensure that the pre-assessment activities and the administration can be completed with minimal disruption.
Under the National Assessment of Educational Progress Authorization Act, Congress has mandated the on-going collection of NAEP data. Failure to collect the 2025 NAEP LTT assessment data on the current schedule would affect the quality and schedule of the NAEP assessments and would result in assessments that would not fulfill the mandate of the legislation.
No special circumstances are involved. This data collection observes all requirements of 5 CFR 1320.5.
The NAEP assessments are conducted by an alliance of organizations under contract with the U.S. Department of Education. The Alliance includes the following:
Management Strategies is responsible for managing the integration of multiple NAEP project schedules and providing data on timeliness, deliverables, and cost performance.
Educational Testing Service (ETS) is responsible for coordinating Alliance contractor activities, developing the assessment instruments, analyzing the data, preparing the reports, and platform development.
Huntington Ingalls Industries (HII) is responsible for NAEP web technology, development, operations, and maintenance including the Integrated Management System (IMS).
Pearson is responsible for printing and distributing the assessment materials, and for scanning and scoring students’ responses.
Westat is responsible for selecting the school and student samples and managing field operations and data collection.
In addition to the NAEP Alliance, other organizations support the NAEP program, all of which are under contract with the U.S. Department of Education. The current list of organizations includes:
American Institutes for Research (AIR) is responsible for providing technical support, conducting studies on state-level NAEP assessments, and running the NAEP Validity Studies Panel.
CRP, Inc. is responsible for providing logistical and programmatic support.
Hager Sharp is responsible for supporting the planning, development, and dissemination of NAEP publications and outreach activities.
Manhattan Strategy Group is responsible for providing technical support.
State Education Agencies (SEAs) establish a liaison between the state education agency and NAEP, serve as the state’s representative to review NAEP assessment items and processes, coordinate the NAEP administration in the state, analyze and report NAEP data, and coordinate the use of NAEP results for policy and program planning.
Tribal Tech is responsible for providing support for the National Indian Education Study.
In addition to the contractors responsible for the development and administration of the NAEP assessments, the program involves many consultants and is also reviewed by specialists serving on various technical review panels. These consultants and special reviewers bring expertise concerning students of different ages, ethnic backgrounds, geographic regions, learning abilities, and socio-economic levels; the specific subject areas being assessed; the analysis methodologies employed; and large-scale assessment design and practices. Contractor staff and consultants have reviewed all items for bias and sensitivity issues, grade appropriateness, and appropriateness of content across states.
In particular, subject-area standing committees play a central role in the development of NAEP assessment instruments and have been essential in creating assessment content that is appropriate for the targeted populations, and that meets the expectations outlined in the Governing Board frameworks. One of the most important functions of the committees is to contribute to the validation of the assessments. Through detailed reviews of items, scoring guides, tasks, constructed-response item training sets for scorers, and other materials, the committees help establish that the assessments are accurate, accessible, fair, relevant, and grade-level appropriate, and that each item measures the knowledge and skills it was designed to measure. When appropriate, members of subject-area standing committees will also review the questionnaires with regards to appropriateness with existing curricular and instructional practices.
Appendix A lists the current members of the following NAEP advisory committees:
NAEP Validity Studies Panel
NAEP National Indian Education Study Technical Review Panel
NAEP Mathematics Standing Committee
NAEP Reading Standing Committee
NAEP Survey Questionnaires Standing Committee
NAEP Mathematics Translation Review Committee
NAEP Grade 4 and 8 Survey Questionnaire and eNAEP DBA System Translation Review Committee
NAEP Principals’ Panel Standing Committee
NAEP Delivery and Technology Panel
As has been the practice for the past few years, OMB representatives will be invited to attend the technical review panel meetings that are most informative for OMB purposes.
In addition to the contractors and the external committees, NCES works with the NAEP State Coordinators, who serve as the liaison between each state education agency and NAEP, coordinating NAEP activities in his or her state. NAEP State Coordinators work directly with the schools sampled for NAEP.
For the 2025 LTT clearance package, a 60-day notice was published in the Federal Register on January 18, 2024. There were no public comments. A 30-day notice was also published, which did not result in any public comments.
In general, there will be no gifts or payments to respondents, although students do get to keep the pencil they use to complete NAEP LTT assessments. On occasion, NAEP will leave educational materials at schools for their use (e.g., science kits from the science hands-on assessments). Some schools also offer recognition parties with pizza or other perks for students who participate; however, these are not reimbursed by NCES or the NAEP contractors. If any incentives are proposed as part of a future special study, they would be justified as part of that future clearance package. As appropriate, the amounts would be consistent with amounts approved in other studies with similar conditions.
Data security and confidentiality protection procedures have been put in place for NAEP to ensure that all NAEP contractors and agents (see section A.8 in this document) comply with all privacy requirements, including:
The Statements of Work of NAEP contracts;
National Assessment of Educational Progress Authorization Act (20 U.S.C. §9622);
Family Educational Rights and Privacy Act (FERPA) of 1974 (20 U.S.C. §1232g);
Privacy Act of 1974 (5 U.S.C. §552a);
Privacy Act Regulations (34 CFR Part 5b);
Computer Security Act of 1987;
U.S.A. Patriot Act of 2001 (P.L. 107-56);
Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9573);
Cybersecurity Enhancement Act of 2015 (6 U.S.C. §151);
Foundations of Evidence-Based Policymaking Act of 2018, Title III, Part B, Confidential Information Protection;
The U.S. Department of Education General Handbook for Information Technology Security General Support Systems and Major Applications Inventory Procedures (March 2005);
The U.S. Department of Education Incident Handling Procedures (February 2009);
The U.S. Department of Education, ACS Directive OM: 5-101, Contractor Employee Personnel Security Screenings;
NCES Statistical Standards; and
All new legislation that impacts the data collected through the contract for this study.
Furthermore, all NAEP contractors and agents will comply with the Department’s IT security policy requirements as set forth in the Handbook for Information Assurance Security Policy and related procedures and guidance, as well as IT security requirements in the Federal Information Security Management Act (FISMA), Federal Information Processing Standards (FIPS) publications, Office of Management and Budget (OMB) Circulars, and the National Institute of Standards and Technology (NIST) standards and guidance. All data products and publications will also adhere to the revised NCES Statistical Standards, as described at the website: http://nces.ed.gov/statprog/2012/. In addition, the Sampling and Data Collection (SDC) contractor has obtained an Authority to Operate (ATO) for the NCESLS System from the Education OCIO to operate at the FISMA moderate level through the Certification & Accreditation (C&A) process. Security controls include secure data processing centers and sites; properly vetted and cleared staff; and data sharing agreements.
An important privacy and confidentiality issue is the protection of the identity of assessed students, their teachers, and their schools. To assure this protection, NAEP has established security procedures, described below, that closely control access to potentially identifying information.
All assessment and questionnaire data are protected. This means that NAEP applications that handle assessment and questionnaire data:
enforce effective authentication and password management policies;
limit authorization to individuals who truly need access to the data, only granting the minimum necessary access to individuals (i.e., least privilege user access);
keep data encrypted, both in storage and in transport, utilizing volume encryption and transport layer security protocols;
utilize SSL certificates and HTTPS protocols for web-based applications;
limit access to data via software and firewall configurations as well as not using well known ports for data connections; and
restrict access to the portable networks utilized to administer an assessment to only assessment devices.
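For illustration purposes only, the sketch below shows one control from the list above, encryption of data in storage, using the third-party Python cryptography package. It is a generic example rather than a description of the NAEP systems' actual implementation, which also covers transport-layer encryption, authentication, and network controls.

    # Generic example of encrypting a record at rest with the third-party
    # "cryptography" package; not the NAEP systems' actual implementation.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()  # in practice, keys live in a managed key store
    fernet = Fernet(key)

    record = b"hypothetical questionnaire record"
    ciphertext = fernet.encrypt(record)  # safe to persist in encrypted form
    assert fernet.decrypt(ciphertext) == record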
Students’ names are submitted to the Sampling and Data Collection (SDC) contractor for selecting the student sample. This list also includes the month/year of birth, race/ethnicity, gender, and status codes for students with disabilities, English learners, and participation in the National School Lunch Program. This data request for NAEP fully conforms to the requirements of the Family Educational Rights and Privacy Act of 1974 (FERPA) [20 U.S.C. 1232g; 34 CFR Part 99]. FERPA is designed to protect the privacy rights of students and their families, by providing consistent standards for the release of personally identifiable student and family information. NCES and its agents are explicitly authorized under an exception to FERPA’s general consent rule to obtain student level data from institutions. For the purposes of this collection of data, FERPA permits educational agencies and institutions to disclose personally identifiable information from students’ education records, without consent, to authorized representatives of the Secretary of Education in connection with an evaluation of federally supported education programs (34 CFR §§ 99.31(a)(3)(iii) and 99.35).
After the student sample is selected, the data for selected students are submitted to the Materials Preparation, Distribution, Processing and Scoring (MDPS) contractor, who includes the data in the packaging and distribution system for the production of student-specific materials (such as labels to attach to the student booklets or log-in ID cards), which are then forwarded to field staff and used to manage and facilitate the assessment. These data are also uploaded to the AMS Prepare for Assessments online system for review by schools and added to the AMS School Control System (SCS) used by field staff to print materials used by the schools. Student information is deleted from the packaging and distribution system before the assessment begins. Student information is securely deleted from the AMS system typically two weeks after all quality control activities for the assessment are complete.
All paper-based student-specific materials linking personally identifiable information (PII) to assessment materials are destroyed at the schools upon completion of the assessment. The field staff remove names from forms and place the student names in the school storage envelope. The school storage envelope contains all of the forms and materials with student names and is kept at the school until the end of the school year and then destroyed by school personnel.
Furthermore, to protect collected data, NAEP staff will use the following precautions:
Assessment and questionnaire data files will not identify individual respondents.
No personally identifiable information about schools or respondents will be gathered or released by third parties. No permanent files of names or other direct identifiers of respondents will be maintained.
Student participation is voluntary.
NAEP data are perturbed. Data perturbation is a statistical data editing technique implemented to ensure privacy for student and school respondents to NAEP’s assessment questionnaires for assessments in which data are reported or attainable via restricted-use licensing arrangements with NCES. The process is coordinated in strict confidence with the IES Disclosure Review Board (DRB), with details of the process shared only with the DRB and a minimal number of contractor staff.
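Because the details of NAEP's perturbation procedure are shared only with the DRB, the sketch below illustrates only the generic idea behind one common perturbation technique, bounded random noise added to reported values. It is not NAEP's actual method; the values and parameters are hypothetical.

    import random

    def perturb_counts(counts, max_shift=1, seed=2025):
        """Add small, bounded random noise to reported values; a generic
        perturbation technique, not NAEP's confidential procedure."""
        rng = random.Random(seed)
        return [c + rng.randint(-max_shift, max_shift) for c in counts]

    print(perturb_counts([12, 7, 30, 5]))  # hypothetical cell counts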
The following text appears on covers of all booklets for all student assessments, school questionnaires, and the AMS system:
Paperwork Reduction Act (PRA) Statement
National Center for Education Statistics (NCES) conducts the National Assessment of Educational Progress to evaluate federally supported education programs. All of the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151). By law, every NCES employee as well as every NCES agent, such as contractors and NAEP coordinators, has taken an oath and is subject to a jail term of up to 5 years, a fine of $250,000, or both if he or she willfully discloses ANY identifiable information about you. Electronic submission of your information will be monitored for viruses, malware, and other threats by Federal employees and contractors in accordance with the Cybersecurity Enhancement Act of 2015.
NCES estimates the time required to complete this information collection to average XX minutes, including the time to review instructions and complete and review the information collection. This voluntary information collection was reviewed and approved by OMB (Control No. 1850-0928). If you have any comments concerning the accuracy of the time estimate, suggestions for improving this collection, or any comments or concerns regarding the status of your individual submission, please write to: National Assessment of Educational Progress (NAEP), National Center for Education Statistics (NCES), Potomac Center Plaza, 550 12th St., SW, 4th floor, Washington, DC 20202, or send an email to: nces.information.collections@ed.gov.
OMB No. 1850-0928 APPROVAL EXPIRES 2/28/2027
In addition, the following text appears on the log-in screen for the AMS.
AMS
When you have finished or if you need to stop before finishing, please LOG OUT of the survey system by clicking "Save and exit" and CLOSE ALL browser windows or screens to keep your responses secure. For example, if you used Chrome or Safari to open the survey, make sure no Chrome or Safari windows or screens are open after you end the survey. Not closing all browsers may allow someone else to see your responses.
More specific information about how NAEP handles PII is provided in the table below:
PII is created in the following ways
PII is moved in the following ways
PII is destroyed in the following ways
In addition, parents/legal guardians are notified of the assessment. See Appendix D for the updated communication materials, which include a parental notification letter. The letter is adapted for each grade or age/subject combination, and the school principal may edit it; however, the information regarding confidentiality and the relevant legal references will remain unchanged. Note that while NAEP does not require explicit parental consent, by law parents/guardians of students selected to participate in NAEP must be notified in writing of their child's selection prior to the administration of the assessment.
NAEP emphasizes voluntary respondent participation. Insensitive or offensive items are prohibited by the National Assessment of Educational Progress Authorization Act, and the Governing Board reviews all items for bias and sensitivity. The nature of the questions is guided by the reporting requirements in the legislation, the Governing Board’s Policy on the Collection and Reporting of Background Data, and the expertise and guidance of the NAEP Survey Questionnaire Standing Committee (see Appendix A-6). Throughout the item development process, NCES staff works with consultants, contractors, and internal reviewers to identify and eliminate potential bias in the items.
The NAEP student questionnaires may include items that require students to provide responses on factual questions about their family’s socio-economic background, self-reported behaviors, and learning contexts, both in the school setting as well as more generally. In compliance with legislation, student questionnaires do not include items about family or personal beliefs (e.g., religious or political beliefs). The student questionnaires focus only on contextual factors that clearly relate to academic achievement.
Educators, psychologists, economists, and others have called for the collection of non-cognitive student information that can help explain why some students do better in school than others. Similar questions have been included in other NCES-administered assessments such as the Trends in International Mathematics and Science Study (TIMSS), the Program for International Student Assessment (PISA), the National School Climate Survey, and other federal questionnaires, including the U.S. Census. The insights gained from the use of these well-established survey questions will help educators, policymakers, and other stakeholders make better informed decisions about how best to help students develop the knowledge and skills they need to succeed.
NAEP does not report student responses at the individual or school level, but only in aggregate form. To reduce the impact of any individual question on NAEP reporting, the program has shifted to a balanced reporting approach that uses multi-item indices, where possible, to maximize robustness and validity. In compliance with legislation and with practices established through previous NAEP administrations, students may skip any question.
The burden numbers for NAEP data collections fluctuate considerably, with the number of students sampled every other year being much larger than in the years in between.
Exhibit 1 provides the burden information per respondent group, by age and by year, for the 2025 NAEP LTT data collections.
Exhibit 2 summarizes the burden by respondent group.
A description of the respondents or study is provided below, as supporting information for Exhibit 1:
Students—Nine-, 13-, and 17-year-old students complete LTT assessment forms that contain 45 minutes of cognitive blocks, followed by non-cognitive block(s) that require a total of 5 minutes to complete. The core non-cognitive items are answered by students across subject areas and are related to demographic information. In addition, students answer subject-specific non-cognitive items. Based on timing data collected from cognitive interviews and previous digitally based assessments (DBA), fourth-grade students can respond to approximately four non-cognitive items per minute, while eighth- and twelfth-grade students can respond to approximately six non-cognitive items per minute. Using this information, the non-cognitive blocks are assembled so that most students can complete all items in the allocated amount of time. Each cognitive and non-cognitive block is timed so that the burden listed above is the maximum burden time for each student. Additional student burden accounts for the time to read directions and distribute test booklets (for PBA), which is estimated to be 10 minutes. The cognitive or assessment items are not included in the burden estimate because they are not subject to the Paperwork Reduction Act. Therefore, the total burden for LTT students is 15 minutes: the sum of the time allocated for the non-cognitive blocks (student survey) and the time allocated for reading directions and other small administrative procedural tasks (see the sketch following these descriptions).
Teachers— A teacher questionnaire is not administered as part of LTT.
Principals/Administrators—The school administrators in the sampled schools are asked to complete a questionnaire. The core items are designed to measure school characteristics and policies that research has shown are highly correlated with student achievement. Subject-specific items concentrate on curriculum and instructional services issues. In 2025 LTT, school administrators will also answer questions about their school’s instructional organization and practices related to students’ learning recovery following the COVID-19 outbreak and to gaps in learning that developed during the extended period of remote and hybrid learning that took place during the pandemic. The burden for school administrators is estimated to average 30 minutes per principal/administrator, although burden may vary depending on the number of subject-specific sections included.
SD and EL—SD and EL information is provided by school personnel concerning students identified as SD or EL. This information is used by those personnel to determine the appropriate accommodations for students. The burden for school personnel is estimated at 10 minutes, on average, for each student identified as SD and/or EL.
Submission of Samples—Survey sample information is collected from schools in the form of lists of potential students who may participate in NAEP. This sample information can be gathered manually or electronically at the school, district, or state level. If done at the state level, some states require a data security agreement, which is customized based on the specific requests of the state and provides verbatim security and confidentiality information from Section A.10. If done at the school or district level, some burden will be incurred by school personnel. It is estimated that it will take two hours, on average, for school personnel to complete the submission process. Based on data from the 2022 administration, it is estimated that approximately 26 percent of schools will complete the submission process.
Pre-Assessment and Assessment Activities—Each school participating in 2025 NAEP LTT has a designated staff member to serve as its NAEP school coordinator. Pre-assessment and assessment activities include functions such as finalizing student samples, verifying student demographics, reviewing accommodations, and planning logistics for the assessment. An electronic pre-assessment system (known as AMS) was developed so that school coordinators would provide requested administration information online, including logistical information, updates of student and teacher information, and the completion of inclusion and accommodation information. More information about the school coordinators’ responsibilities is included in Part B Section B.2. Based on information collected from previous years’ use of AMS, it is estimated that it will take four hours and 30 minutes, on average, for school personnel to complete these activities, including looking up information to enter into the system. We will continue to use AMS system data to learn more about participant response patterns and use this information to further refine the system to minimize school coordinator burden.
Assessment Feedback Survey—As part of the ongoing quality control of the assessment process, school coordinators will be asked to respond to an additional follow-up survey that solicits feedback about assessment day. A sample of the post-assessment follow-up survey, which is conducted via Survey Monkey, is provided in Appendix D-21. It is estimated that this survey will take 2 minutes, on average, to complete.
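As a concreteness check on the student figures above, the sketch below (Python; illustrative only: the 18-item block size is hypothetical, and mapping the grade-based response rates onto the corresponding LTT age groups is an assumption) verifies that a candidate non-cognitive block fits the 5-minute allocation and totals the per-student burden:

```python
# Illustrative check of the per-student LTT burden estimate.
# Response rates are those cited above (grade-based rates mapped to ages);
# the 18-item block size is hypothetical.
ITEMS_PER_MINUTE = {"age 9": 4, "age 13": 6, "age 17": 6}
SURVEY_MINUTES = 5    # allocated to the non-cognitive block(s)
ADMIN_MINUTES = 10    # directions and booklet distribution (PBA)

def block_fits(n_items: int, age_group: str) -> bool:
    """True if the block can be answered within the survey allocation."""
    return n_items / ITEMS_PER_MINUTE[age_group] <= SURVEY_MINUTES

assert block_fits(18, "age 9")        # 18 items / 4 per minute = 4.5 <= 5
print(SURVEY_MINUTES + ADMIN_MINUTES) # 15 minutes of burden per student
```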
EXHIBIT 1
Estimated Burden for 2025 NAEP LTT Assessments
(Note: all explanatory notes and footnotes are displayed following the table)
| Subjects | Students: # of students | Students: avg. minutes per response | Students: burden (hours) | Teachers | School questionnaire: # of principals | School questionnaire: avg. minutes per response | School questionnaire: burden (hours) | Pre-assessment: # of schools | Pre-assessment: burden (hours) | SD/EL: # of schools | SD/EL: # of SD/EL students | SD/EL: avg. minutes per response | SD/EL: burden (hours) | Total burden (hours) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Age 9: LTT Operational Mathematics and Reading | 16,000 | 15 | 4,000 | N/A | 449 | 30 | 225 | 449 | 2,269 | 449 | 4,320 | 10 | 720 | 7,214 |
| Age 13: LTT Operational Mathematics and Reading | 16,000 | 15 | 4,000 | N/A | 480 | 30 | 240 | 480 | 2,426 | 480 | 3,680 | 10 | 613 | 7,279 |
| Age 17: LTT Operational Mathematics and Reading | 16,000 | 15 | 4,000 | N/A | 471 | 30 | 236 | 471 | 2,380 | 471 | 2,560 | 10 | 427 | 7,043 |
| Total | 48,000 | N/A | 12,000 | N/A | 1,400 | N/A | 701 | 1,400 | 7,075 | 1,400 | 10,560 | N/A | 1,760 | 21,536 |

Total number of respondents: 52,200
Total number of responses: 61,360
Notes for 2025 NAEP LTT table in Exhibit 1
The burden for the school coordinator is as follows: pre-assessment burden is 4.5 hours, sample submission burden is 2 hours (incurred by the approximately 26 percent of schools expected to complete the submission process at the school or district level, based on 2022 data), and the post-assessment follow-up survey is 2 minutes. For the purposes of the burden calculation, the performance of all of these tasks constitutes one response (see the sketch following these notes).
The estimated percent of SD/EL students is 27%, 23%, and 16%, at ages 9, 13, and 17, respectively.
There are no survey questionnaires for teachers.
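Each row of Exhibit 1 follows from the counts and per-response times described above together with these notes. The following minimal sketch (Python; variable names and rounding conventions are assumptions, not taken from the source) recomputes the Age 9 row:

```python
import math

# Recompute the Exhibit 1 Age 9 row from the parameters stated above.
students, schools = 16_000, 449
sd_el_students = round(students * 0.27)          # 27% SD/EL at age 9 -> 4,320

student_hours = students * 15 / 60               # 15 min per student -> 4,000.0
principal_hours = math.ceil(schools * 30 / 60)   # 30 min questionnaire -> 225
sd_el_hours = sd_el_students * 10 / 60           # 10 min per SD/EL student -> 720.0

# School coordinator burden per the first note: 4.5 h pre-assessment,
# 2 h sample submission for ~26% of schools, and a 2-minute follow-up survey.
coordinator_hours = round(schools * (4.5 + 0.26 * 2 + 2 / 60))   # -> 2,269

print(student_hours + principal_hours + sd_el_hours + coordinator_hours)
# 7214.0, matching the Age 9 total in Exhibit 1
```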
Total Annual Estimated Burden Time Cost for 2025 NAEP LTT Assessments
| Data Collection Year | Number of Respondents | Number of Responses | Total Burden (in hours) |
| --- | --- | --- | --- |
| 2025 | 52,200 | 61,360 | 21,536 |
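These totals can be decomposed from Exhibit 1, as in the sketch below (Python; the grouping of school staff into one principal, one coordinator, and one SD/EL respondent per school is inferred from Exhibit 1 rather than stated explicitly):

```python
# Decomposition of the 2025 LTT respondent and response totals
# (inferred from Exhibit 1; the per-school grouping is an assumption).
students, schools, sd_el_records = 48_000, 1_400, 10_560

respondents = students + 3 * schools                      # 52,200
responses = students + schools + schools + sd_el_records  # 61,360
print(respondents, responses)
```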
The estimated respondent burden across all of these activities translates into an estimated total burden time cost of 21,536 hours12, broken out by respondent group in the table below.
| Year | Students: hours | Students: cost | Teachers and School Staff: hours | Teachers and School Staff: cost | Principals: hours | Principals: cost | Total: hours | Total: cost |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 2025 | 12,000 | $87,000 | 8,835 | $290,583 | 701 | $35,954 | 21,536 | $413,537 |
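These dollar figures follow from the hourly rates in footnote 12. A minimal sketch of the computation (Python; illustrative, and the attribution of the 8,835 teachers-and-staff hours to the coordinator and SD/EL lines of Exhibit 1 is an inference):

```python
# Burden time cost = hours x hourly rate; rates are from footnote 12.
# Teachers-and-staff hours (8,835) appear to equal the coordinator (7,075)
# plus SD/EL (1,760) hours in Exhibit 1.
rates = {"students": 7.25, "teachers_and_staff": 32.89, "principals": 51.29}
hours = {"students": 12_000, "teachers_and_staff": 8_835, "principals": 701}

costs = {g: round(hours[g] * rates[g]) for g in rates}
print(costs)                # students 87,000; staff 290,583; principals 35,954
print(sum(costs.values()))  # 413,537
```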
There are no direct costs to respondents.
The total cost to the federal government for the administration of the 2025 NAEP LTT data collections (contract costs and NCES salaries and expenses) is estimated to be $13,190,327. The 2025 LTT assessment cost estimate is shown in the table below.
| Cost category | Estimated cost |
| --- | --- |
| NCES salaries and expenses | $1,400,300 |
| Contract costs | $11,790,027 |
| Contract costs: printing, packaging, distribution, and scoring | $2,350,000 |
| Contract costs: item development | $150,000 |
| Contract costs: sampling, recruiting and training, data collection, and weighting | $7,290,027 |
| Contract costs: design, analysis, and reporting | $2,000,000 |
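As a consistency check, the contract line items sum to the contract total, and adding NCES salaries and expenses reproduces the overall estimate (a sketch):

```python
# Verify the 2025 LTT cost rollup shown in the table above.
contract_items = [2_350_000, 150_000, 7_290_027, 2_000_000]
contract_total = sum(contract_items)   # 11,790,027
print(contract_total + 1_400_300)      # 13,190,327, the total federal cost
```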
The previous package under this OMB number requested burden for the operational NAEP assessment, as opposed to the Long-Term Trend assessment described here. These programs have different research aims and different populations, and therefore different respondent burdens and budgets. Total responses and burden are 61,360 responses and 21,536 hours, respectively.
For this Amendment #1 submission, the costs to the Federal Government increased slightly from $13,148,725 (OMB# 1850-0928 v.32) to $13,190,327.
| | Program Change Due to New Statute | Program Change Due to Agency Discretion | Change Due to Adjustment in Agency Estimate |
| --- | --- | --- | --- |
| Total Burden | | -448,714 | |
| Total Responses | | -798,772 | |
| Total Costs (if applicable) | | | |
The time schedule for the LTT data collection for the 2025 assessments is shown below.
| Administration | Data collection window |
| --- | --- |
| Age 13 Long-term Trend | October–December 2024 |
| Age 9 Long-term Trend | January–March 2025 |
| Age 17 Long-term Trend | March–May 2025 |
Operational assessments are typically released about 12 months after the end of data collection.
The operational schedule for the NAEP assessments generally follows the same schedule for each assessment cycle. The dates below show the typical timeframe, applied to the 2025 LTT assessments:
Spring–Summer 2024: Select the school sample and notify schools
August–November 2024: States, districts, or schools submit the list of students
September–December 2024: Select the student sample
August 2024–January 2025: Schools prepare for the assessments using the AMS system
October 2024–May 2025: Administer the assessments, per the table above
June–July 2025: Process the data, score constructed response items, and calculate sampling weights
November 2025–February 2026: Prepare the reports, obtaining feedback from reviewers
No exception is requested.
No exception is requested.
1 The role of NCES, led by the Commissioner for Education Statistics, is defined in 20 U.S.C. §9622 (https://www.law.cornell.edu/uscode/text/20/9622) and OMB Statistical Policy Directives No. 1 and 4 (https://obamawhitehouse.archives.gov/omb/inforeg_statpolicy).
2 In some instances, students eligible for LTT may be a year younger or a year older depending on their birthday and on when the LTT assessment is administered.
3 See Section A.2 for more information about how NAEP results are reported.
4 See Section B.1.a for more information on the NAEP sampling procedures.
5 For LTT, there are separate scale scores for mathematics and reading.
6 The Governing Board assessment schedule can be found at https://www.nagb.gov/about-naep/assessment-schedule.html.
7 Additional information on the AMS site is included in the Part B, Section B.2 and Appendix F.
8 The current contract expires on October 31, 2024. As such, the vast majority of the work associated with conducting the 2025 NAEP LTT assessments will occur after this current contract expires.
9 The current contracts expire at varying times. As such, the specific contracting organizations may change during the course of the time period covered under this submittal.
10 In early May, schools receive an email from the AMS system reminding them to securely destroy the contents of the NAEP storage envelope and confirm that they have done so. The confirmation is recorded in the system and tracked.
11 For the purpose of this document, a placeholder for the number of minutes appears in the PRA statement in Section A.10. The number of minutes will be updated to reflect the correct PRA estimate for the system and the user.
12 The average hourly earnings of teachers and principals, derived from the May 2022 Bureau of Labor Statistics (BLS) Occupational Employment Statistics, are $32.89 for teachers and school staff and $51.29 for principals. If a mean hourly wage was not provided, it was computed assuming 2,080 hours per year. The exception is the student wage, which is based on the federal minimum wage of $7.25 an hour. Source: BLS Occupational Employment Statistics, http://data.bls.gov/oes/; occupation codes: Elementary school teachers (25-2021), Middle school teachers (25-2022), High school teachers (25-2031), Principals (11-9032); last accessed October 2023.