NCES System Clearance for Cognitive,
Pilot, and Field Test Studies 2022-2025
OMB# 1850-0803 v.312
Supporting Statement Part A
Prepared by:
National Center for Education Statistics
U.S. Department of Education
Washington, DC
March 2022
revised May 2022
Part A JUSTIFICATION
A.1 Importance of Information
A.2 Purposes and Uses of the Data
A.3 Improved Information Technology (Reduction of Burden)
A.4 Efforts to Identify Duplication
A.5 Minimizing Burden for Small Institutions
A.6 Frequency of Data Collection
A.7 Special Circumstances
A.8 Consultations Outside NCES
A.9 Payments or Gifts to Respondents
A.10 Assurance of Confidentiality
A.11 Justification for Sensitive Questions
A.12 Estimates of Burden
A.13 Total Annual Cost Burden
A.14 Annualized Cost to Federal Government
A.15 Program Changes or Adjustments
A.16 Plans for Tabulation and Publication
A.17 Display OMB Expiration Date
A.18 Exceptions to Certification Statement
PART B COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS
B.1 Respondent Universe and Response Rates
B.2 Procedures for Collection of Information
B.3 Maximizing Response Rates
B.4 Tests of Procedures
B.5 Individuals Consulted on Statistical Design
Appendix A NCES Authorization Legislation
This is a request for a 3-year renewal of the generic clearance for the National Center for Education Statistics (NCES) that will allow it to continue to develop, test, and improve its survey and assessment instruments and methodologies. The procedures utilized to this effect include but are not limited to experiments with levels of incentives for study participants, tests of various types of survey operations, focus groups, cognitive laboratory activities, pilot testing, exploratory interviews, experiments with questionnaire design, and usability testing of electronic data collection instruments.
This generic testing clearance is a valuable vehicle for evaluating questionnaires/assessments and various data collection procedures. It has allowed NCES to take advantage of a variety of methods to identify questionnaire/assessment and procedural problems, suggest solutions, and measure the relative effectiveness of alternative solutions. Through the use of these techniques, employed routinely in the testing phase of NCES surveys, questionnaires and assessments have been simplified for respondents, respondent burden has been reduced, and the quality of the questionnaires and assessments used in continuing and one-time surveys and assessments has been improved. As a result, the quality of the data collected through these surveys has improved as well.
During the three-year generic clearance, NCES will provide periodic reports on the testing activities, which, in addition to methods used in the past, may include expanded field tests including split sample questionnaire experiments in multiple panels and usability testing of electronic data collection instruments. The focus of these activities will include testing of items and research about incentives (cash and non-cash), mode (telephone, paper and pencil, computer-based, mail-out and mail-in, etc.), and other methodologies of questionnaires and assessments.
This request for clearance describes the scope of possible activities that might be covered, and NCES requests the same conditions that were included in the previous clearance agreements for Cognitive, Pilot, and Field Test Studies (OMB# 1850-0803 v.153, 194, and 248), last approved on June 12, 2019. This generic clearance will go through the usual two Federal Register review periods, after which NCES requests that OMB review and clear proposed studies within a two-week period, with no Federal Register notice period required under the generic clearance. This clearance is similar to the testing clearances held by the Census Bureau, the Bureau of Labor Statistics, and the National Science Foundation, which allow those statistical agencies to develop, redesign, and test data collection instruments and procedures in a timely manner.
Some of the programs that have submitted developmental studies under this clearance in the last three years include the Early Childhood Longitudinal Studies (ECLS), High School and Beyond 2020 (HS&B:20), Trends in International Mathematics and Science Study (TIMSS), International Computer and Information Literacy Study (ICILS), National Assessment of Educational Progress (NAEP), National Household Education Surveys (NHES), National Teacher and Principal Survey (NTPS), School Survey on Crime and Safety (SSOCS), Teaching and Learning International Survey (TALIS), Progress in International Reading Literacy Study (PIRLS), Program for International Student Assessment (PISA), Fast Response Survey System (FRSS), School Pulse Panel (SPP), Program for the International Assessment of Adult Competencies (PIAAC), National Public Education Financial Survey (NPEFS) and Local Education Agency Finance Survey (F-33), and a variety of postsecondary survey activities, including the National Postsecondary Student Aid Study (NPSAS), Baccalaureate and Beyond Longitudinal Study (B&B), Beginning Postsecondary Students Longitudinal Study (BPS), and Integrated Postsecondary Education Data System (IPEDS). We anticipate that other NCES programs will also be able to use this clearance for developmental projects.
The specific methods proposed for coverage by this clearance are described below. Also outlined are the procedures currently in place for keeping OMB informed about the identity of the surveys and the nature of the research activities being conducted.
The methods proposed for use in questionnaire and assessment development are as follows:
Field or pilot test. For the purposes of this clearance, we define field tests as data collection efforts conducted among either purposive or statistically representative samples, for which evaluation of the questionnaire and/or procedures is the main objective; only research and development (R&D) and methodological reports may be published from these tests, and no statistical reports or data sets will be released. Field tests are an essential component of this clearance package because they serve as the vehicle for investigating basic item properties, such as reliability, validity, and difficulty, as well as the feasibility of methods for standardized administration (e.g., computerized administration) of forms. Under this clearance a variety of surveys will be tested, and the exact nature of the surveys and the samples is undetermined at present. However, given the smaller scale of these tests, we expect that some will not involve representative samples. In these cases, samples will essentially be convenience samples, which will be limited to specific geographic locations and may involve expired rotation groups of a current survey or blocks that are known to have specific aggregate demographic characteristics. The needs of the particular sample will vary based on the content of the survey being tested, but the selection of sample cases will not be completely arbitrary in any instance.
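To make the analytic role of a field or pilot test concrete, the sketch below shows the kind of item-level screening such data can support: per-item difficulty, a simple discrimination index, and Cronbach's alpha as a rough reliability check. It is a minimal illustration only; the simulated data, function names, and flagging thresholds are assumptions and do not describe any actual NCES analysis.

```python
# Minimal sketch of item-level screening on scored pilot-test data.
# The data and thresholds are illustrative, not from any NCES collection.
import numpy as np

def item_statistics(scores: np.ndarray):
    """scores: respondents x items matrix of 0/1 scored responses."""
    n_resp, n_items = scores.shape
    difficulty = scores.mean(axis=0)  # proportion answering each item correctly
    total = scores.sum(axis=1)
    # Discrimination: correlation of each item with the rest-score (total minus that item)
    discrimination = np.array([
        np.corrcoef(scores[:, j], total - scores[:, j])[0, 1] for j in range(n_items)
    ])
    # Cronbach's alpha as a rough internal-consistency (reliability) check
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = total.var(ddof=1)
    alpha = (n_items / (n_items - 1)) * (1 - item_var / total_var)
    return difficulty, discrimination, alpha

# Example with simulated pilot data: 200 respondents, 20 items
rng = np.random.default_rng(0)
simulated = (rng.random((200, 20)) < 0.6).astype(int)
diff, disc, alpha = item_statistics(simulated)
print(f"alpha={alpha:.2f}; flag items with difficulty <0.2 or >0.9, or discrimination <0.2")
```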
Behavior coding. This method involves applying a standardized coding scheme to the administration of an interview or questionnaire, either by a coder using a recording of the interview or by a "live" observer at the time of the interview. The coding scheme is designed to identify situations that occur during the interview that reflect problems with the questionnaire. For example, if respondents frequently interrupt the interviewer before the question is completed, the question may be too long. If respondents frequently give inadequate answers, this suggests there are other problems with the question. Quantitative data derived from this type of standardized coding scheme can provide valuable information to identify problem areas in a questionnaire, and research has demonstrated that this is a more objective and reliable method of identifying problems than the traditional interviewer debriefing, which is typically the sole tool used to evaluate the results of a traditional field test (Cannell, Kalton, Oksenberg, Bischoping, and Fowler, New Techniques for Pretesting Survey Questions, 1989).
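As an illustration of how behavior codes can be turned into the quantitative evidence described above, the following sketch tallies coded behaviors by question and flags items with high rates of problem behaviors. The code labels, example events, and 15 percent threshold are hypothetical.

```python
# Tally behavior codes per question and flag likely problem items.
# Code labels, events, and threshold are illustrative assumptions.
from collections import Counter, defaultdict

# Each record: (question_id, behavior_code) produced by a coder reviewing interviews.
coded_events = [
    ("Q1", "interruption"), ("Q1", "adequate_answer"),
    ("Q2", "inadequate_answer"), ("Q2", "request_clarification"),
    ("Q2", "inadequate_answer"), ("Q1", "adequate_answer"),
]

PROBLEM_CODES = {"interruption", "inadequate_answer", "request_clarification"}
THRESHOLD = 0.15  # flag questions where >15% of coded behaviors signal trouble

counts = defaultdict(Counter)
for question, code in coded_events:
    counts[question][code] += 1

for question, tally in sorted(counts.items()):
    total = sum(tally.values())
    problem_rate = sum(tally[c] for c in PROBLEM_CODES) / total
    flag = " <-- review wording" if problem_rate > THRESHOLD else ""
    print(f"{question}: problem-behavior rate {problem_rate:.0%}{flag}")
```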
Interviewer debriefing. This method employs the knowledge of the employees who have the closest contact with the respondents. In conjunction with other methods, we plan to use this method in our field tests to collect information about how interviewers react to the survey instruments.
Exploratory polls and interviews. These may be conducted with individuals to understand a topical area and may be used in the very early stages of developing a new survey. They may cover discussions related to administrative records (e.g., what types of records, where, and in what format), subject matter, definitions, etc. Exploratory interviews may also be used in evaluating whether there are sufficient issues related to an existing data collection to consider a redesign. Exploratory polls give researchers an opportunity to briefly survey a pool of stakeholders or potential respondents in order to discover their readiness to provide more complex data.
Respondent debriefing questionnaire. In this method, standardized debriefing questionnaires are administered to respondents who have participated in a field test. The debriefing form is administered at the end of the questionnaire being tested and contains questions that probe how respondents interpret the questions and whether they have problems in completing the survey/questionnaire. This structured approach to debriefing enables quantitative analysis of data from a representative sample of respondents, to learn whether respondents can answer the questions, and whether they interpret them in the manner intended by the questionnaire designers.
Follow-up interviews (or reinterviews). This involves re-interviewing or re-assessing a sample of respondents after the completion of a survey or assessment. Responses given in the reinterview are compared with the respondents’ initial responses for consistency. In this way, reinterviews provide data for studies of test–retest reliability and other measures of the quality of the data collected. In turn, this information aids in the development of more reliable measures.
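The sketch below illustrates one simple reinterview consistency analysis for a single categorical item: observed agreement, the gross difference rate, and Cohen's kappa as a chance-corrected measure. The response values are invented for illustration.

```python
# Reinterview consistency check for one categorical item (illustrative data).
from collections import Counter

original    = ["yes", "no", "yes", "yes", "no", "yes", "no", "yes"]
reinterview = ["yes", "no", "no",  "yes", "no", "yes", "yes", "yes"]

n = len(original)
observed_agreement = sum(a == b for a, b in zip(original, reinterview)) / n
gross_difference_rate = 1 - observed_agreement

# Cohen's kappa corrects the observed agreement for chance agreement
p_orig, p_re = Counter(original), Counter(reinterview)
chance = sum((p_orig[c] / n) * (p_re[c] / n) for c in set(original) | set(reinterview))
kappa = (observed_agreement - chance) / (1 - chance)

print(f"agreement={observed_agreement:.2f}, "
      f"gross difference rate={gross_difference_rate:.2f}, kappa={kappa:.2f}")
```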
Split sample experiments. This involves testing alternative versions of questionnaires and other collection methods, such as mailing packages and incentive treatments, at least some of which have been designed to address problems identified in draft questionnaires or in questionnaires from previous survey waves. The use of multiple questionnaire versions, randomly assigned to permit statistical comparisons, is the critical component here; data collection can include mail, telephone, Internet, personal visit interviews, or group sessions at which self-administered questionnaires are completed. Comparison of revised questionnaires against a control version (preferably) or against each other facilitates statistical evaluation of the performance of the alternative versions. Split sample tests that incorporate questionnaire design experiments are likely to have a large sample size (e.g., several hundred cases per panel) to enable the detection of statistically significant differences and to facilitate methodological experiments that can extend questionnaire design knowledge more generally for use in a variety of data collection instruments.
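The following sketch illustrates why panels of several hundred cases are typically needed: it runs a two-proportion z-test comparing the response rates of a control panel and a revised-questionnaire panel. The panel sizes and counts are hypothetical.

```python
# Two-proportion z-test comparing response rates across two split-sample panels.
# Counts are hypothetical; this is not an analysis of any actual NCES experiment.
from math import sqrt, erf

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical panels of several hundred cases each, as the text suggests
z, p = two_proportion_ztest(success_a=260, n_a=400, success_b=300, n_b=400)
print(f"control 65% vs. revised 75% response rate: z={z:.2f}, p={p:.4f}")
```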
Cognitive and usability interviews. This method involves intensive, one-on-one interviews in which the respondent is typically asked to "think aloud" while answering survey questions. A number of different techniques may be involved, including asking respondents to paraphrase questions, asking probing questions to determine how respondents arrived at their answers, and so on. The objective is to identify problems of ambiguity or misunderstanding, or other difficulties respondents have in answering questions. This is frequently the first stage in revising a questionnaire.
Focus groups. This method involves group sessions guided by a moderator, who follows a topical outline containing questions or topics focused on a particular issue, rather than adhering to a standardized questionnaire. Focus groups are useful for surfacing and exploring issues which people may feel some hesitation about discussing (e.g., confidentiality concerns).
Procedures for Clearance
Before a testing activity is undertaken, NCES will provide OMB with a memo describing the study to be conducted and a copy of the questionnaires and debriefing materials to be used. Depending on the stage of questionnaire development, this may be a printed questionnaire, a set of prototype items showing each item type to be used and the range of topics to be covered by the questionnaire, or an interview script. When split sample experiments are conducted, either in small group sessions or as part of a field test, the different versions of the questionnaire to be used will be provided. For a test of alternative procedures, the description of and rationale for the procedures will be submitted. A brief description of the planned field activity will also be provided. NCES requests that OMB raise comments on substantive issues within 10 working days of receipt.
Data collection for this project is authorized under the legislation authorizing the questionnaire being tested. In most cases, this is the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543), although other legislation, such as Title 13 or Title 15 of the U.S. Code, may apply for surveys conducted in concert with other Federal agencies. A copy of ESRA 2002 (20 U.S.C. §9543) is attached (Appendix A). We do not now know what other titles will be referenced, since we do not know what survey questionnaires will be tested during the course of the clearance. If the authority is other than ESRA, the relevant authorizing statute will be specified.
The information collected in this program of developing and testing questionnaires will be used by staff from NCES and sponsoring agencies to evaluate and improve the quality of surveys and assessments before they are conducted. None of the data collected under this clearance will be published for its own sake. Because the questionnaires being tested under this clearance are still in the process of development, the data that result from these collections are not considered official statistics of NCES or other Federal agencies. Data will not be made public, except when included in research reports prepared for sponsors inside and outside of NCES. The results may also be prepared for presentations related to survey methodology at professional meetings or for publication on the NCES website and in professional journals.
Information quality is an integral part of the pre‑dissemination review by NCES (fully described in NCES’s Statistical Standards and IES Style Guide, both of which can be found at http://nces.ed.gov/statprog/standards.asp). Information quality is also integral to the information collections conducted by NCES and is incorporated into the clearance process required by the Paperwork Reduction Act (PRA).
During the past three years this generic testing clearance has been used for:
1850-0803 v.250 SSOCS-CRDC Incident Count Cognitive Interviews
1850-0803 v.251 NTPS 2020-21 Usability Testing
1850-0803 v.252 HS&B:20 Cognitive and Usability Testing Round 2
1850-0803 v.253 ECLS-K:2023 Preschool FT Instruments Usability Testing
1850-0803 v.254 FRSS 110: Educational Technology in Public Schools Pretest
1850-0803 v.255 ECLS-K:2023 Elementary School Administrator Focus Groups
1850-0803 v.256 NAEP Private School Focus Groups
1850-0803 v.257 PIRLS 2021 FT Pretest
1850-0803 v.258 PISA 2021 FT Pretest
1850-0803 v.259 NTPS 2020-21 Teacher Cognitive Testing
1850-0803 v.260 BPS:20/22 Cognitive and Usability Testing
1850-0803 v.261 Current Practices Survey for ED’s Data Protection Toolkit (DBT) Development
1850-0803 v.262 SSOCS-CRDC Incident Count Cognitive Interviews Update (revised v.250)
1850-0803 v.263 TIMSS 2023 Cognitive Testing
1850-0803 v.264 ECLS-K:2023 Parent, Teacher, and School Administrator Focus Groups
1850-0803 v.265 NHES:2022 Cognitive and Web Instrument Usability Testing
1850-0803 v.266 eNAEP 2021 Assessments Student Usability and Pretesting Study
1850-0803 v.267 BPS 20/22 Cog Testing Change Request
1850-0803 v.268 NHES:2022 Cognitive and Paper Instrument Usability Testing
1850-0803 v.269 2020 CPS School Enrollment Cognitive Testing
1850-0803 v.270 NAEP 2021 COVID-19 Developmental Clearance
1850-0803 v.271 Poll Questions on Annual Survey of State and Local Government Finances (F-33)
1850-0803 v.272 Test Assembly Survey Assessment Innovations Laboratory (SAIL)
1850-0803 v.273 TIMSS 2023 Cognitive Testing Update (revised v.263)
1850-0803 v.274 2023 NAEP Family Structure Study
1850-0803 v.275 NAEP SAIL Dynamic Assessment
1850-0803 v.276 CARES Act Poll Questions on NPEFS and F-33
1850-0803 v.277 Adding Questions to the Teacher Questionnaire on Sexual Orientation and Gender Identity (SOGI), and Branding Changes
1850-0803 v.278 Principal and Teacher Follow-up Surveys (PFS and TFS) to the NTPS 2020-21 Teacher Coglabs
1850-0803 v.279 NAEP Engagement Augmentation Study
1850-0803 v.280 ECLS-K:2023 K-1 Instrument Usability Testing
1850-0803 v.281 CRDC Burden Research Study
1850-0803 v.282 2022 School Crime Supplement to the National Crime Victimization Survey (SCS:22/NCVS) Cognitive Interviews
1850-0803 v.283 A National Survey with Principals about International Assessments (PISA, PIRLS, and TIMSS)
1850-0803 v.284 2023 NAEP Family Structure Study REVISION
1850-0803 v.285 School Survey on Crime and Safety (SSOCS) 2022 Cognitive Interviews
1850-0803 v.286 NAEP SAIL Dynamic Assessment REVISION
1850-0803 v.287 2023 National Household Education Survey (NHES) Web Usability and Cognitive Testing
1850-0803 v.288 A National Survey with Principals about International Assessments (PISA, PIRLS, and TIMSS) CHANGE
1850-0803 v.289 2023 National Household Education Survey (NHES) Web Usability and Cognitive Testing (revised)
1850-0803 v.290 2023 National Household Education Survey (NHES) English and Spanish Screener Cognitive Interviews
1850-0803 v.291 High School and Beyond Longitudinal Study of 2022 (HS&B:22) Focus Groups
1850-0803 v.292 2021 School Pulse Panel Cognitive/Usability Testing
1850-0803 v.293 National Assessment of Educational Progress (NAEP) 2022 eNAEP and Field Trial Pretesting Study
1850-0803 v.294 Adding Questions to the Teacher Questionnaire on Sexual Orientation and Gender Identity (SOGI), and Branding Changes (CHANGE was 277)
1850-0803 v.295 Test Assembly Survey Assessment Innovations Laboratory (SAIL) (revision from 272)
1850-0803 v.296 Contact Materials Testing for the 2023 National Household Education Surveys Program (NHES:2023)
1850-0803 v.297 Message Testing Focus Groups with Teachers, Principals, and Prospective Study Participants (PIRLS, PISA, PIAAC, and TIMSS)
1850-0803 v.298 NAEP SAIL Mathematics Fluency and Collaborative Study
1850-0803 v.299 National Assessment of Educational Progress (NAEP) 2022 eNAEP and Field Trial Pretesting Study - Puerto Rico
1850-0803 v.300 School Pulse Panel Cognitive/Usability Testing Round 2
1850-0803 v.301 National Assessment of Educational Progress (NAEP) 2022 Next Generation Usability Study
1850-0803 v.302 Message Testing Focus Groups with Teachers, Principals, and Prospective Study Participants (PIRLS, PISA, PIAAC, and TIMSS) CR
1850-0803 v.303 COVID-19 Federal Assistance Poll Questions on National Public Education Financial Survey (NPEFS) and Annual Survey of State and Local Government Finances (F-33)
1850-0803 v.304 International Computer and Information Literacy Study (ICILS 2023) Pilot Field Test
1850-0803 v.305 National Assessment of Educational Progress (NAEP) 2022 eNAEP Assessment Delivery Study
1850-0803 v.306 Message Testing Focus Groups with Teachers, Principals, and Prospective Study Participants (PIRLS, PISA, PIAAC, and TIMSS) CR
1850-0803 v.307 National Assessment of Educational Progress (NAEP) 2026 Math Skills and Behaviors Study
1850-0803 v.308 NAEP SAIL Mathematics Fluency and Collaborative Study Revision
1850-0803 v.309 Early Childhood Longitudinal Study, Kindergarten Class of 2023-24 (ECLS-K:2024) Parent and Teacher Focus Groups
1850-0803 v.310 COVID-19 Federal Assistance Poll Questions on National Public Education Financial Survey (NPEFS) and Annual Survey of State and Local Government Finances (F-33)
1850-0803 v.311 2023-24 National Teacher and Principal Survey (NTPS) Cognitive and Usability Testing
1850-0803 v.313 International Computer and Information Literacy Study (ICILS) 2023 Pilot Field Test Data Collection Revision
1850-0803 v.314 National Assessment of Educational Progress (NAEP) 2023 eNAEP Field Trial and Field Test Studies
1850-0803 v.315 2023-24 National Teacher and Principal Survey (NTPS) Respondent Portal Usability Testing and Focus Groups
When the survey or assessment being tested employs automated methods for its data collection, the research conducted under this submission will also use automated data collection techniques. This clearance offers NCES the opportunity to try innovative technologies that would reduce burden and increase the use of information technology.
Research under this clearance does not duplicate any other questionnaire design work being done by NCES or other Federal agencies. Instead, its purpose is to stimulate additional research that would not be done under other circumstances due to time constraints. When appropriate, this research involves collaborations with staff from other federal and non-federal agencies. Additionally, to the extent possible, NCES makes use of existing information, including reviewing the results of previous evaluations of survey data; however, such information is typically not sufficient to refine survey questionnaires without conducting additional research.
This research will be designed as relatively small-scale data collection efforts so as to minimize the amount of burden required to improve questionnaires and procedures, test new ideas, and refine or improve upon positive or unclear results from other tests. The results of the research conducted under this clearance are expected to improve the methods and instruments utilized in full scale studies and thereby improve information quality while minimizing burden to respondents.
Without questionnaire development testing, the quality of the data collected in full surveys would suffer.
There are no special circumstances.
Consultation with staff from other Federal agencies that sponsor surveys conducted by NCES will occur in conjunction with testing individual surveys. Consultation with staff from other Federal laboratory facilities may also occur as part of joint research efforts. These consultations will include discussions concerning potential response problems, clarity of questions and instructions, and other aspects of respondent burden. Additional efforts to consult with potential respondents to obtain their views on the availability of data, clarity of instructions, etc., may be undertaken as part of the testing that is conducted under this clearance.
A 60-day notice was published in the Federal Register on March 3, 2022 (87 FR 12144). No public comments were received. A 30-day notice will be published.
Respondents for activities conducted in the laboratory (e.g. cognitive interviews and focus groups) under this clearance may receive compensation for travel and participation. This practice has proven necessary and effective in recruiting subjects to participate in such research, and is also employed by the other federal cognitive laboratories. Research on incentives that may be conducted under this clearance may also involve nonmonetary incentives. The Office of Management and Budget (OMB) has noted that effectiveness of such incentives is a worthwhile research topic. If incentives need to be proposed for any research activity under this clearance, justification will be provided and NCES will work closely with OMB on the incentive strategy to be employed. NCES will typically propose incentives at the level approved by the Office of Management and Budget for cognitive laboratories and focus groups. If a higher-level incentive is proposed for approval, a meaningful justification will be provided.
If the collection is under the authority of the Education Sciences Reform Act of 2002 (ESRA 2002), all respondents who participate in research under this clearance will be informed that their participation is voluntary and that all of the information they provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151). For personal visit and telephone interviews, this information will be conveyed verbally by the interviewer, and in personal visit interviews respondents will also receive this information in writing. For self-administered questionnaires, the information will be included in the mailing package, either as part of the communication materials or on the questionnaire or instructions. For Internet-based data collections, this information will be displayed prominently and in a format that allows the respondent to print it out. All participants in cognitive research will be required to sign a written notification concerning the voluntary and confidential nature of their participation. NCES will also inform respondents in writing that the collection carries an OMB control number. No participant direct identifiers will be maintained as part of research under this generic clearance.
Most of the questions that are included on NCES questionnaires are not of a sensitive nature and should not pose a problem to respondents. However, it is possible that some potentially sensitive questions may be included in questionnaires that are tested under this clearance. One of the purposes of the testing is to identify such questions, determine sources of sensitivity, and alleviate them insofar as possible before the actual survey is administered.
We estimate that the number of people involved in our exploratory, field test, pilot, cognitive, and focus group work will be at most 200,000 per year, the vast majority of whom will be contacted as part of screening and recruitment activities preceding the actual research. Given that screening and recruitment activities are included in the burden calculations, we estimate the annual burden at approximately 0.4 hours per person, or about 80,000 hours overall per year. The total estimated respondent burden is 240,000 hours for the 3-year period beginning on the date of OMB approval in 2022:
Time Period | Respondents | Responses | Respondent burden (hours)
2022 – 2023 | 200,000 | 200,000 | 80,000
2023 – 2024 | 200,000 | 200,000 | 80,000
2024 – 2025 | 200,000 | 200,000 | 80,000
Total | 600,000 | 600,000 | 240,000
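A quick arithmetic check of the estimates in the table above (all figures are taken directly from the table):

```python
# Arithmetic check of the burden estimates shown above.
respondents_per_year = 200_000
hours_per_respondent = 0.4   # average burden per person, including screening
years = 3

annual_hours = respondents_per_year * hours_per_respondent
print(f"annual burden: {annual_hours:,.0f} hours")          # 80,000
print(f"3-year burden: {annual_hours * years:,.0f} hours")  # 240,000
```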
A variety of forms will be used in conducting the research under this clearance, and the exact number of different forms, length of each form, and number of subjects/respondents per form are unknown at this time. However, we can project that our activities will likely include testing items, data collection modes, and incentive payment levels, in the form of “hothouse” field tests, expanded field tests including split sample questionnaire experiments in multiple panels, cognitive labs, exploratory interviews, reinterviews, and focus groups among students, teachers, parents, and other types of respondents.
There is typically no cost to respondents for participating in the research being conducted under this clearance, except for their time to complete the questionnaire.
There is no way to anticipate the actual number of participants, length of interview, and/or mode of data collection for the surveys to be conducted under this clearance. Thus, it is impossible to estimate in advance the cost to the Federal Government. Costs will be covered by divisions conducting the research from their data collection budgets. We will include information about costs in the individual submissions.
No change to the estimated respondent burden is being requested.
This research program is for questionnaire and procedure development purposes. Data tabulations will be used to evaluate the results of questionnaire and methods testing. The information collected in this effort will not be the subject of population estimates or other statistics in NCES reports; however, it may be published in research and development reports or be included as a methodological appendix or footnote in a report containing data from a larger data collection effort. The results of this research may be prepared for presentation at professional meetings or for publication on the NCES website and in professional journals. Due to the nature of this clearance, there is no definite or tentative time schedule for individual testing activities at this point. We expect work to continue more or less continuously throughout the duration of the clearance.
No exemption is requested.
There are no exceptions to the certification.