National Center for Education Statistics
National Assessment of Educational Progress
Volume I
Supporting Statement
National Assessment of Educational Progress (NAEP)
2023 NAEP Family Structure Study
OMB#1850-0803 v.274
September 2020
1) Submittal-Related Information
2) Background and Study Rationale
3) Recruitment and Data Collection
4) Consultations Outside the Agency
5) Justification for Sensitive Questions
6) Paying Respondents
7) Assurance of Confidentiality
8) Estimate of Hourly Burden
9) Cost to Federal Government
10) Project Schedule
This material is being submitted under the generic National Center for Education Statistics (NCES) clearance agreement (OMB# 1850-0803), which provides for NCES to conduct various procedures (such as pilot tests, cognitive interviews, and usability studies) to test new methodologies, question types, or delivery methods to improve survey and assessment instruments and procedures.
The National Assessment of Educational Progress (NAEP) is a survey of student achievement at grades 4, 8, and 12 in various subject areas, such as mathematics, reading, writing, science, U.S. history, civics, and technology & engineering literacy, federally authorized by the National Assessment of Educational Progress Authorization Act (20 U.S.C. §9622). NAEP is conducted by NCES, which is part of the Institute of Education Sciences within the U.S. Department of Education. NAEP’s primary purpose is to assess student achievement in the different subject areas and to collect survey questionnaire (i.e., non-cognitive) data to provide context for the reporting and interpretation of assessment results.
Socioeconomic status (SES) is a legislatively mandated reporting category in NAEP, and questions related to SES have been included in all past NAEP survey questionnaires. The current domain-general (Core) NAEP Student Survey Questionnaires include five items used to collect information about household composition, parental education, and parental employment status. However, these existing items assume students live in a single home with a mother and a father, and they do not allow for complete reporting of multiple-household living arrangements (e.g., children in shared custody situations) or other family types, such as households with non-parent caregivers like grandparents or aunts and uncles.
Previous special studies, including the Socioeconomic Status Indicator Development study (OMB# 1850-0803 v.201, August 2017) and the Extended Student Questionnaire study (OMB# 1850-0928 v.10-13, 2018-19), have been conducted to better understand the lives of students living in non-traditional homes and to capture household and caregiver information from students living in a broad range of households. Building upon those efforts, the present study concerns the development of an interactive approach to collecting data regarding family structure. This approach uses a series of prompts that allow students to “build” their family/household instead of asking discrete or matrix items about the number of homes, the number of adults over 18 years old in the home(s), the number of caregivers, who those caregivers are, and their educational and employment status. The intent of this approach is to (a) make these survey questions more inclusive of the diverse student sample, (b) make these survey questionnaire items more engaging and intuitive to answer, given that the data show SES-related items are difficult for some students to answer, and (c) reduce student cognitive load by summarizing student-provided family/household information and presenting that information at key points in the task.
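To make the prompt-and-summary flow described above concrete, the following is a minimal Python sketch of a hypothetical “family builder” sequence. The prompt wording, caregiver categories, and data structures are illustrative assumptions for this document, not the actual NAEP interactive items.

```python
from dataclasses import dataclass, field

@dataclass
class Caregiver:
    """One caregiver the student reports (hypothetical structure)."""
    relation: str  # e.g., "mother", "stepfather", "grandparent"

@dataclass
class Household:
    """One home the student lives in, with its caregivers."""
    caregivers: list = field(default_factory=list)

def build_family():
    """Walk a student through 'building' their household(s), one prompt at a time."""
    households = []
    n_homes = int(input("How many homes do you live in? "))
    for h in range(1, n_homes + 1):
        home = Household()
        n_adults = int(input(f"How many adults take care of you in home {h}? "))
        for _ in range(n_adults):
            relation = input("Who is this caregiver (e.g., mother, stepfather, grandparent)? ")
            home.caregivers.append(Caregiver(relation))
        households.append(home)
    # Summarize the student-provided information at a key point in the task,
    # as the item design intends, so the student can confirm or revise it.
    for h, home in enumerate(households, start=1):
        print(f"Home {h}: " + ", ".join(c.relation for c in home.caregivers))
    return households
```

Follow-up prompts about each caregiver’s education and employment would extend the same structure.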
This request is to conduct, as part of the NAEP survey questionnaire development process, pretesting activities including cognitive interviews and usability testing to collect data on the recently developed family structure interactive items for the 2023 grades 4, 8, and 12 survey questionnaires. Pretesting provides essential data about whether the newly developed assessment instruments are achieving their intended goals. Pretesting occurs before piloting and helps to identify and eliminate problems with items and tasks. This can mean fewer challenges in scoring and analysis and higher pilot item survival rates. Results of this pretesting will be used to finalize the family structure interactive items for grades 4, 8, and 12, to be piloted in 2022 and administered nationally as part of the 2023 operational assessment(s).
In general, the focus of this pretesting is (a) to investigate whether this set of interactive items elicits the targeted knowledge students have about their households and caregivers; (b) to investigate whether any item content, interaction, or presentation causes confusion or introduces construct-irrelevant errors; and (c) to gather information about how long students take to complete the overall task. This study employs aspects of multiple pretesting methods, including cognitive interviews and usability testing, conducted in the same one-on-one interview sessions with a single sample. Cognitive interviews allow for the gathering of qualitative data about how students work through item sets and offer opportunities to probe potential sources of construct irrelevance. Usability testing entails observing student interactions with one or more variants of the interface for the interactive items to identify any sources of confusion, which can be another source of construct-irrelevant variance. The methodology used in this study will incorporate aspects of both methods.
In cognitive interviews, an interviewer uses a structured protocol in a one-on-one interview, drawing on methods from cognitive science. A retrospective think-aloud and verbal probing technique will be used for this cognitive interview study. The student’s on-screen actions will be recorded while they complete the task without interruptions. After the student completes the survey questions, the interviewer will play the video recording for the student and have the student describe and explain their thinking while they completed the task. In addition, probes or questions will be asked as necessary to explore issues with item text and presentation that have been identified a priori as being of interest. Usability testing strategies will be added to this approach, allowing for examination of the clarity and intuitiveness of item interaction and presentation variants (e.g., screens with or without progress bars) alongside the standard examination of item wording.
The main purposes of this pretesting study are to:
Identify potential problems with the items (i.e., ensure the items are understood by all participants and confirm items are not sensitive in nature and do not make participants uncomfortable); and
Evaluate the effects of different item interaction and presentation variants on student completion of the task.
Volume I of this submittal contains descriptions of the design and sampling, as well as burden, cost, and schedule information for the study. Volume II contains the welcome script, cognitive interview instructions and user testing scripts, and probes for the interviewers. The appendices contain recruitment materials, notifications, and thank you documents.
Recruitment and Sample Characteristics
Educational Testing Service (ETS) is the survey questionnaire developer for NAEP survey questionnaires and will be responsible for the overall conduct and management of the cognitive interview activity described in this package. EurekaFacts will conduct the cognitive interviews (see Section 4).
Students will be recruited for this study by EurekaFacts from the following demographic populations:
students who are enrolled in grades 4, 8, and 12 for the 2020-2021 school year;
students who live in a range of housing situations (i.e., one home, or two or more homes);
students who live with a range of caregivers (i.e., parents, stepparents, non-parent adults); and
students who represent a mix of genders, races/ethnicities, and urban/suburban/rural locations.
Please note that housing situation, caregiver type, and SES will be given higher priority than other respondent characteristics during recruitment, while a sufficient balance across the other criteria is maintained. ETS will document the information collected in the screeners using a tracking sheet, which will be used to determine the targeted sample, including diversification on key characteristics (see Appendix Q).
Table 1 summarizes the numbers of interviews that are planned for these pretesting activities. A minimum of five respondents per subgroup is recommended to identify major problems with an item and to support a meaningful analysis of data from exploratory cognitive interviews.1 Some demographic populations will be oversampled to better ensure that a variety of caregivers and family structures is represented and that sources of confusion or sensitivity issues can be identified. Grade 4 students will be further oversampled to better ensure that younger participants can navigate and understand the task. This also allows for a meaningful analysis of data from exploratory cognitive interviews testing the usability of prototype questions and interactive design variants.
Table 1. Sample Size for Student Cognitive Interviews
Respondent Group (Housing Situation) | Grade 4 | Grade 8 | Grade 12 | Total
Single Household Students | 5* | 5* | 5* | 15*
Multiple Household Students | 10-15* | 5-10* | 5-10* | 20-35*
Respondent Group (Caregiver Type) | Grade 4 | Grade 8 | Grade 12 | Total
Two Parents Living Together | 5-7 | 3-5 | 3-5 | 11-17
Stepparents (One or More) | 5-7 | 4-5 | 4-5 | 13-17
Non-Parent Adults | 5-6 | 3-5 | 3-5 | 11-16
Overall Total | 15-20 | 10-15 | 10-15 | 35-50
*Note: Counts marked with an asterisk are included in the totals of the Respondent Group (Caregiver Type).
Various resources will be employed to recruit student participants2 for the cognitive interviews, including some combination of the following:
existing participant databases (i.e., a list of potential participants composed of parents/guardians of prior EurekaFacts study participants, individual referrals provided by prior study participants’ parents/guardians, and people recruited for the database via EurekaFacts social media);
targeted telephone, email, and mail contact lists (i.e., lists that consist of individuals meeting basic criteria such as age or school grade);
school system research/assessment directors; and
outreach/contact methods via community organizations (e.g., Boys & Girls Clubs and Parent-Teacher Associations) and limited mass media recruiting.
Interested participants will be screened to ensure that they meet the criteria for participation in the study (e.g., parents/legal guardians of minor students have given consent, and participants are from the targeted demographic groups outlined above). In addition, all participants will be screened to ensure that they have access to a computer or tablet with a video camera and microphone. When recruiting participants, EurekaFacts will contact the parent/legal guardian of any potential student participant under the age of 18 (see Appendix B). Grade 12 students age 18 and older will be contacted directly (see Appendix C). The parent/legal guardian or adult student will be informed about the objectives, purpose, and participation requirements of the data collection effort, as well as the activities it entails. Participation criteria screening will be conducted via either a phone call (see Appendices D and E) or a web-surveyor intake form (see Appendices F and G). In order to participate, students will need to be able to join a virtual meeting from a quiet place using a computer or tablet with a video camera and a microphone. After confirmation that participants are qualified, willing, and available to participate in the research project, they will receive a confirmation email (see Appendices H and I). Informed consent from parents/legal guardians and adult students will be obtained for all respondents who are interested in participating in the data collection efforts (see Appendices J and K).
Data Collection Process
Student cognitive interviews will be conducted via videoconferencing (e.g., Skype or Zoom) to comply with social distancing mandates. The interviewer and the participant will both share their video, allowing the interviewer to build rapport, observe nonverbal student reactions to the items (e.g., facial expressions), and monitor the participant’s emotional state. Each session will include an interviewer and an observer. Each cognitive interview session will last no more than 60 minutes.
The interviewer and the observer will introduce themselves to each participant and explain that they are there to help answer research questions about how people answer survey questions. Participants will be reassured that their participation is voluntary and that their responses will be used for research purposes only (see Section 7). The interviewer will then explain the cognitive interview process.
After these introductory steps, students will be asked to complete the set of draft items while the interviewer observes their progress. There may be more than one variant of some item text or user interface elements across the sample, but each student will complete one version of the set of questions.
This study employs a retrospective think-aloud and probing method, in which students are asked about their experience with completing the set of items. Students will be asked to complete the family structure items without assistance, and the interviewer will intervene only if the student expresses that they are unable to finish without help. The task will have no time limit, but the interviewer will record how long the student takes to complete this section of the process. The student’s on-screen actions will be recorded as the student completes the survey questions. The interviewer will also note any behavioral cues (e.g., looks of frustration, smiling, or a long response time for one specific item) for discussion during the probing phase. The on-screen recording will stop when the student completes the survey questions. The interviewer will then play the recorded video for the student and ask the student to describe and explain their thinking as they watch their recorded actions. The interviewer will then administer probing questions about each screen in the interactive item set. At the end of the session, a set of debrief questions will be administered.
The probing questions that the interviewers ask the students concern both item wording and usability topics, such as whether the student understands how to use certain item functionality (see Volume II). For example, students may be asked what each item is asking them to do, how they decided to respond to each item, and if they experienced any difficulty or confusion related to the item text or using the user interface elements (e.g., progress bar, drag-and-drop) while completing the items. The protocol may be supplemented with additional ad hoc questions about what the student is thinking or requests for additional details on the student’s response to a question. To minimize the burden on the student, efforts will be made to limit the number of probes used in any one session.
Audio and video from the cognitive interviews will be recorded. Interviewers will also record their own notes separately, such as behaviors (e.g., “the participant appeared confused”), questions posed by students, and observations of how long various items take to complete.
The types of data collected about task items will include:
student reactions and responses to items and presentation details;
behavioral data (e.g., observable actions recorded in interviewer notes, process data (if available), and screen-captures);
student verbalizations during the think aloud;
responses to all questions and probes (i.e., debriefing, generic, item-specific, and usability);
responses to targeted questions specific to the item(s); and
additional volunteered participant comments.
Analysis Plan
The general analysis approach will be to compile the different types of data to facilitate identification of response patterns for the interactive items. Types of response patterns can include frequency counts of verbal report codes and responses to probes or debriefing questions, or student actions observed at specific points in a given item or item set. This overall approach will help to ensure that the data are analyzed thoroughly and systematically, will enhance identification of weaknesses in items and components, and will support recommendations for addressing those problems. In cases where an alternative item wording or user interface element is being examined, student responses to probing questions for each version will be compared.3
For the cognitive interview aspects of this study, information gathered from the think aloud and answers to probing questions will be analyzed across participants. After the session, the notes and audio recording will be summarized to report main findings and illustrative statements that will be analyzed by the NAEP questionnaire development team. The cognitive interview results will be used to help improve the tested survey items and inform the specific item wording that should be administered during the 2023 assessments.
Students’ ease or difficulty in completing assigned tasks will be analyzed to determine which information or design elements are more effective in supporting successful completion of anticipated user tasks. While successful completion of tasks will be recorded, it is the information and design elements that are being evaluated, not the students. All results will be used only to make recommendations regarding the design and development of the interactive item wording and features.
User testing results will be analyzed chiefly in terms of descriptive statistics detailing the distribution of success rates and subjective user ratings. An example finding would be: “40 percent of participants were able to edit their caregivers’ employment status without assistance.” Such a finding would be used to determine whether the mechanism for revising answers needs to be redesigned to make it easier to use. Other statistical comparisons may be performed as appropriate to the variables and populations.
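As a concrete illustration of these descriptive statistics, the short Python sketch below tallies a success rate from coded observation records. The participant records, field names, and task are hypothetical and are invented to reproduce the example finding above; they are not study data.

```python
from collections import Counter

# Hypothetical coded observations: one record per participant for a single
# usability task (e.g., editing a caregiver's employment status).
observations = [
    {"participant": "P01", "completed_without_assistance": True},
    {"participant": "P02", "completed_without_assistance": False},
    {"participant": "P03", "completed_without_assistance": True},
    {"participant": "P04", "completed_without_assistance": False},
    {"participant": "P05", "completed_without_assistance": False},
]

# Count successes and compute the share of participants who succeeded.
counts = Counter(obs["completed_without_assistance"] for obs in observations)
success_rate = counts[True] / len(observations)
print(f"{success_rate:.0%} of participants completed the task without assistance.")
# With these example records: "40% of participants completed the task without assistance."
```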
Educational Testing Service (ETS), headquartered in New Jersey, is the Item Development, Data Analysis, and Reporting contractor for NAEP and will develop the interactive items, analyze results, and draft a report of the results.
EurekaFacts will recruit, schedule, and conduct the cognitive interviews, and provide the results to ETS. EurekaFacts is located in Rockville, Maryland. It is an established for-profit research and consulting firm, offering facilities, tools, and staff to collect and analyze both qualitative and quantitative data. The NAEP State Coordinators serve as the liaisons between state education agencies and NAEP, coordinating NAEP activities within their respective states. The coordinators will be notified about this study before any recruitment takes place and may also support recruitment efforts.
Throughout the survey development process, effort has been made to avoid asking for information that might be considered sensitive or offensive. Given the nature of the study, some questions about sensitive topics will need to be asked, such as questions about the relationships between household members, the number of people in the household, or how students allocate their time across multiple households. In the unlikely event that a student exhibits distress during participation, the distress protocol will be implemented, and students will be provided with a list of resources at the end of the interview, should they have a need for counseling as a result of the topic (see Volume II and Appendices L, M, and N). Reviewers have attempted to identify and minimize potential bias in questions.
To encourage participation in a 60-minute cognitive interview session, a $25 virtual gift card from a major credit card company will be offered to each student who participates in a pretesting session as a thank-you for their time and effort. The parent or legal guardian facilitating the student’s remote participation will receive a $15 virtual gift card from a major credit card company to thank them for their time and effort. Additionally, the parent or legal guardian will receive a thank-you letter (Appendix O) for allowing the student to participate in the study. Students 18 years of age or older will also receive a thank-you letter for their participation (Appendix P).
The study will not retain any personally identifiable information. Prior to the start of the study, students will be notified that their participation is voluntary. As part of the study, students will be notified that the information they provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).
For all participants, written consent will be obtained from parents/legal guardians (of minor students) before interviews are administered. Participants will be assigned a unique student identifier (ID), which will be created solely for data file management and used to keep all participant materials together. The participant ID will not be linked to the participant name in any way or form. The consent forms, which include the participant name, will be separated from the participant interview files, secured for the duration of the study, and will be destroyed after the final report is released. Pretesting activities may be recorded using audio or video capture. The only identification included on the files will be the participant ID. These recorded files will be secured for the duration of the study and will be destroyed after the final report is completed.
The estimated burden for recruitment assumes attrition throughout the process.4 All pretesting sessions will be scheduled for no more than 60 minutes. Table 2 details the estimated burden for the survey questionnaire pretesting activities.
Table 2. Hourly Burden for Students and Parents or Legal Guardians for Family Structure Pretesting Activities
Respondent | Number of Respondents | Number of Responses | Hours per Respondent | Total Hours
Student Recruitment via Youth Organizations, PTAs, and After School Programs | | | |
Initial contact | 50 | 50 | 0.05 | 3
Follow-up and identify students | 33* | 33 | 1.0 | 33
Sub-Total | 50 | 83 | | 36
Students (Over 18) and Parent/Legal Guardian for Student Recruitment | | | |
Initial contact | 250 | 250 | 0.05 | 13
Follow-up via phone | 166* | 166 | 0.15 | 25
Consent and confirmation | 83* | 83 | 0.15 | 13
Sub-Total | 250 | 499 | | 51
Participation (Cognitive Interviews) | | | |
Students** | 50*** | 50 | 1 | 50
Sub-Total | 50 | 50 | 1 | 50
Total Burden | 300 | 632 | | 137
* Subset of initial contact group
** Group pooled from the two parallel recruitment efforts; all respondents previously counted
***Maximum of the range of participants.
Note: Numbers have been rounded, which may affect totals.
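The total-hours column in Table 2 can be reproduced by multiplying each row’s number of responses by its hours per response and rounding up to the nearest whole hour; the rounding-up rule is an inference from the table values, not stated in the source. A minimal Python sketch of this arithmetic:

```python
import math

# (activity, number of responses, hours per response), taken from Table 2.
rows = [
    ("Initial contact (youth organizations)", 50, 0.05),
    ("Follow-up and identify students",       33, 1.0),
    ("Initial contact (direct recruitment)", 250, 0.05),
    ("Follow-up via phone",                  166, 0.15),
    ("Consent and confirmation",              83, 0.15),
    ("Participation (cognitive interviews)",  50, 1.0),
]

total = 0
for activity, responses, hours_each in rows:
    hours = math.ceil(responses * hours_each)  # round up to whole hours
    total += hours
    print(f"{activity}: {hours} hours")
print(f"Total burden: {total} hours")  # prints 137, matching Table 2
```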
The total cost of the study is $312,000.
Organization | Costs
ETS | $150,000
EurekaFacts | $162,000
Total Costs | $312,000
Table 3 provides the overall schedule.
Table 3: Pretesting Schedule
Activity (each activity includes recruitment, data collection, and analyses) | Dates
Recruitment and cognitive interviews | Late October 2020 – February 2021
Recruitment and usability testing | Late October 2020 – February 2021
Pretesting report submitted | March 2021
1 Roach, A. T., & Sato, E. (2009). White paper: Cognitive interview methods in reading test design and development for alternate assessments based on modified academic achievement standards (AA-MAS). Dover, NH: Measured Progress and Menlo Park, CA: SRI International.
2 For students under age 18, parents/legal guardians will receive the various contact materials.
3 Minor changes may be made to item wording or presentation during the study. Changes would be made based on early indications that item text or presentation is consistently causing student confusion or difficulty.
4 Based on our experiences in other similar NAEP studies, the estimated attrition rates for direct participant recruitment are 33 percent from initial contact to follow-up, 50 percent from follow-up to confirmation, and 40 percent from confirmation to participation for students. The estimated attrition rate for the initial youth organization contact for student identification is 25 percent from contact to follow-up.