2019 National Household Education Survey (NHES) Web Usability Testing
Volume I
OMB# 1850-0803 v.234
National Center for Education Statistics (NCES)
June 2018
Justification
The National Household Education Survey (NHES) is conducted by the National Center for Education Statistics (NCES) and provides descriptive data on the educational activities of the U.S. population, with an emphasis on topics that are appropriate for household surveys rather than institutional surveys. NHES topics have covered a wide range of issues, including early childhood care and education, children’s readiness for school, parents’ perceptions of school safety and discipline, before- and after-school activities of school-age children, participation in adult and career education, parents’ involvement in their children’s education, school choice, homeschooling, and civic involvement.
Beginning in 1991, NHES was administered approximately every other year as a landline random-digit-dial (RDD) survey. During a period of declining response rates across RDD surveys, NCES conducted a series of field tests to determine whether using self-administered mailed questionnaires would improve NHES response rates. NCES conducted the first NHES full-scale mail-out administration in 2012, which included the Early Childhood Program Participation (ECPP) and the Parent and Family Involvement in Education (PFI) surveys. In NHES:2016, a web response mode was offered in addition to the paper questionnaire. In 2017, an NHES web test was conducted. In NHES:2019, redesigned ECPP and PFI topical surveys will be fielded. NHES, which is administered in English and in Spanish, uses a two-stage design in which sampled households complete a screener questionnaire to enumerate household members and their key characteristics. Within-household sampling from the screener data determines which household member is sampled for the topical survey.
The ECPP, previously conducted in 1991, 1995, 2001, 2005, 2012, and 2016, surveys families of children ages 6 or younger who are not yet enrolled in kindergarten and provides estimates of children’s participation in care by relatives and non-relatives in private homes and in center-based daycare or preschool programs (including Head Start and Early Head Start). Additional topics addressed in ECPP interviews have included family learning activities; out-of-pocket expenses for non-parental care; continuity of care; factors related to parental selection of care; parents’ perceptions of care quality; child health and disability; and child, parent, and household characteristics. Few changes have been made to the survey instrument between the 2017 Web Test and the 2019 administration; they include additional items about the characteristics of chosen care arrangements and a reorganization of the Finding and Choosing Care section.
The PFI, previously conducted in 1996, 2003, 2007, 2012, and 2016, surveys families of children and youth enrolled in kindergarten through 12th grade or homeschooled for these grades, with an age limit of 20 years, and addresses specific ways that families are involved in their children’s school; school practices to involve and support families; involvement with children’s homework; and involvement in education activities outside of school. Parents of homeschoolers are asked about their reasons for choosing homeschooling and the resources they used in homeschooling. Information about child, parent, and household characteristics is also collected. In NHES:2016 and the 2017 NHES Web Test, separate instruments existed for parents of homeschoolers and for parents of enrolled children. Starting in 2019, the two questionnaires have been combined into one, because many homeschooled children also participate in more structured classroom environments and a combined instrument can better capture their experiences. The testing described in this document focuses on whether the combined instrument flows smoothly for respondents or whether it is repetitive.
NHES:2019 Web Usability Test
This request is to conduct usability testing to refine the functionality of the survey for the 2019 web-based data collection. Specific goals of this testing include:
gathering user feedback about the length of the new combined PFI instrument and whether there are redundant questions, focusing on respondents who have homeschooled children who also participate in courses at physical and/or virtual schools;
gathering more information about how respondents answer the PFI question that asks them to report what type of school(s) their child attends and whether there is measurement error in that reporting based on the current definition of those choices, again focusing on respondents who have homeschooled children who also participate in courses at physical and/or virtual schools;
determining whether the questions asked for each school type in the PFI are unique (i.e., not repetitious) and applicable;
identifying any usability issues with the redesign of the school look-up feature in the PFI;
identifying any cognitive issues with a subset of PFI questions (as identified in Attachment 3 debriefing questions);
identifying any issues with the Spanish translation of questions in both the PFI and ECPP; and
identifying any cognitive issues with the Finding and Choosing Care section in the ECPP and a subset of questions in the ECPP (as identified in Attachment 3 debriefing questions).
This usability testing of the NHES:2019 web instrument will be conducted with both the English and Spanish versions1 of the questionnaires and on both mobile devices and personal computers (PCs). In the Spanish testing, there will be less emphasis on homeschooling and on the combination of homeschooling and enrolled schooling, because homeschooling is less common among Spanish-only households. Rather, in addition to the goals listed above, the Spanish testing will focus on identifying any translation issues or difficulty respondents have finding the Spanish-language version of the survey.
During this usability testing, we will track how long each section of the questionnaire takes to complete, along with spontaneous comments about the questionnaire’s length. The timing data are not necessarily reflective of how long respondents will take to complete the questionnaire in production, but they will be used to relate completion times to any spontaneous comments about length. We will also note any response errors (i.e., either missing data or incorrect responses) based on observation alone. We will gather satisfaction data from a post-survey questionnaire used in usability sessions at the Census Bureau; because the same satisfaction survey was conducted in 2017, these data can be used to compare the 2017 and 2019 iterations of the survey. We will debrief both English and Spanish sessions, focusing on the redesigned screens, the combined aspects of the questionnaire, and translation issues.
Usability will be evaluated in terms of respondents’ effectiveness and efficiency in completing the survey, and their satisfaction with the experience of survey completion. The primary deliverable from this study will be the revised, final NHES:2019 online application. A report highlighting key findings will also be prepared.
Design
After instrument development is complete, two rounds of formative usability testing will be conducted with English speakers and one round with Spanish speakers. During testing, participants’ performance will be investigated using a think-aloud protocol, in order to identify usability problems and to better understand their causes.
During usability testing, each participant will be required to complete the screener and one of the topical surveys (the PFI or the ECPP, as determined by the age of his/her children). During response to the survey, the participant will be asked to think aloud (verbalizing what he/she is thinking). Interviewers will ask probing questions as needed. In addition, the eye-tracking technique will be employed to record the participant’s visual scan and gaze pattern so that the design of the instrument may be evaluated for English-speaking PC-laptop users only2. After the completion of the survey, the participant will be debriefed about his/her experience with the instrument. Each session is expected to last ninety minutes.
In July 2018, we will conduct a usability study with four English speakers to catch any major issues in the survey design. Two participants will complete the PFI and two the ECPP. Within each topical survey, one participant will answer on a Census Bureau provided laptop (the PC version) and the other on their own mobile device. Based on findings from this round of testing, changes to the instrument may be made prior to the main round of testing.
For the main round of testing, which will be conducted in September and October 2018, we will recruit 18 participants. The focus of recruiting and testing will be on the PFI because the majority of the changes were made in that instrument. We will recruit 12 English speakers and 6 monolingual or Spanish-dominant Spanish speakers to take part in that testing. Of the 12 English speakers, four will be recruited to complete the ECPP and eight the PFI. Of the six Spanish speakers, three will be recruited to complete the ECPP and three the PFI. For each topical instrument in each language, half of the cases will be tested on a Census-provided laptop and the other half on the participants’ own mobile phones.
Planned number of NHES:2019 web usability test participants, by topical survey and language
|         | ECPP respondents | PFI respondents | Total |
|---------|------------------|-----------------|-------|
| English | 6                | 10              | 16    |
| Spanish | 3                | 3               | 6     |
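As a cross-check, the allocations described above (four English speakers in the July round plus 18 participants in the main round) can be summed to reproduce the table totals; a minimal sketch in Python:

```python
# Round 1 (July 2018, English only): 2 ECPP + 2 PFI participants.
round1 = {("English", "ECPP"): 2, ("English", "PFI"): 2}

# Main round (Sept.-Oct. 2018): 12 English speakers (4 ECPP, 8 PFI)
# and 6 Spanish speakers (3 ECPP, 3 PFI).
round2 = {
    ("English", "ECPP"): 4, ("English", "PFI"): 8,
    ("Spanish", "ECPP"): 3, ("Spanish", "PFI"): 3,
}

# Combine the two rounds into planned totals by language and survey.
totals = {}
for plan in (round1, round2):
    for cell, n in plan.items():
        totals[cell] = totals.get(cell, 0) + n

english_total = totals[("English", "ECPP")] + totals[("English", "PFI")]
spanish_total = totals[("Spanish", "ECPP")] + totals[("Spanish", "PFI")]
```

The sums match the table: 6 and 10 English participants for the ECPP and PFI respectively (16 in all), and 3 and 3 Spanish participants (6 in all).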
For the PFI English speakers, we will attempt to recruit households where students participate in multiple school environments (such as homeschooling combined with community college coursework) in addition to households with students enrolled only in a public or private school. Spanish speakers will represent a variety of national origins and education levels. The English and Spanish speakers recruited to test the ECPP will each have at least one childcare arrangement and, if possible, will represent a variety of childcare arrangements.
The rationale for choosing laptops and smartphones as testing devices is as follows. Desktop and laptop computers share similar screen display and manual control technology, with a laptop typically being smaller. If a participant can successfully complete a survey on a laptop, the participant is likely to succeed on a desktop as well. The eye-tracking software has been installed on Census Bureau-owned laptop computers, and thus these laptops will be used in the study. Likewise, mobile devices share similar screen display and manual control technology, with smartphones being more difficult to operate because of their smaller size. If a participant can complete a survey on a smartphone, success can be expected on a larger mobile device.
Each participant will complete the NHES:2019 screener and their assigned topical survey. While completing the survey, the participant will also be asked to carry out the following two tasks: 1) log in, log out, and log back in; and 2) read or review the help/FAQ to determine how long the survey is expected to take. These are the same tasks used in the 2017 NHES usability testing; because both of these features have been redesigned, this test will confirm whether the redesign works well for users.
The following data collection methods will be used to collect participants’ performance data:
Think-aloud protocol with minimal probing, such as “Keep talking” and “What are you thinking?”, and acknowledgement tokens (what linguists call backchannels) such as “Um-hum”;
Real-time observation by the researcher;
Satisfaction questionnaire;
Retrospective Debriefing;
Audio and video recording;
Eye tracking recording for English-speaking respondents using laptops; and
General timing data for each section of the survey and the two extra tasks.
Analysis of the data will include behavioral observations, spontaneous verbalizations, and answers to debriefing questions in order to identify problems. We will also produce gaze patterns for PC sessions to investigate whether participants attended to or ignored important parts of the screens. Finally, we will compute overall satisfaction ratings.
Recruiting and Paying Respondents
To ensure that we are able to recruit participants from all desired populations, and to thank them for completing the interview, each respondent will be offered $60 for participation in a ninety-minute interview. The longer interview is necessary to give participants adequate time to complete the screener and a topical survey; this is the same amount offered in the 2017 NHES usability testing. Participants will be asked to bring their own mobile device for use during the study.
English-speaking participants will be recruited by the U.S. Census Bureau, using multiple sources, including the U.S. Census Bureau’s recruiting database, flyers posted in libraries, social media/craigslist, personal and professional contacts, a Bureau of the Census Broadcast email, and homeschooling listservs. Spanish-speaking participants will be recruited through contacts at community resource centers using the intercept method and using a Spanish-language flier sent to the centers. Recruitment contact materials are included in Attachment 1. The translation of the recruiting flier for Spanish-speaking participants will be provided to OMB for approval by early September 2018. The questions used to screen respondents for participation are included in Attachment 2. The usability interview protocols are included in Attachment 3. A testing session will be carried out in either the U.S. Census Bureau usability lab, or another offsite location such as a community center or library. The session will be conducted one-on-one, i.e., one participant and one test administrator (TA), with one note taker.
Assurance of Confidentiality
Participation is voluntary, and respondents will read a confidentiality statement and sign a consent form before interviews are conducted. The confidentiality statement and consent form are provided in Attachment 1. The translation of the consent form for Spanish-speaking participants will be provided to OMB for approval by mid-September 2018. No personally identifiable information will be maintained after the usability testing interview analyses are completed. Data entered into the survey will be stored on the U.S. Census Bureau’s secure data servers.
The interviews will be audio and video-recorded. Participants will be assigned a unique identifier (ID), which will be created solely for data file management and used to keep all participant materials together. The participant ID will not be linked to the participant name in any way or form. The audio/video/eye tracking recorded files will be secured for the duration of the study – with access limited to key U.S. Census Bureau and NCES project staff – and will be destroyed after completion of the testing. Interviews may also be observed by key project staff. Participants will be informed when observers attend.
Estimate of Hour Burden
We expect the usability interviews to last approximately ninety minutes and the screening of potential participants to require about 10 minutes per screening. We anticipate conducting 88 screening interviews to yield the 22 participants needed for sessions in English and Spanish.
Estimated response burden for 2019 NHES Web Usability Testing
| Respondents                    | Number of Respondents (English) | Number of Respondents (Spanish) | Total Number of Respondents | Total Number of Responses | Burden Hours per Respondent | Total Burden Hours |
|--------------------------------|---------------------------------|---------------------------------|-----------------------------|---------------------------|-----------------------------|--------------------|
| Recruitment Screener           | 64                              | 24                              | 88                          | 88                        | 0.17                        | 15                 |
| Formative Usability Interviews | 16                              | 6                               | 22                          | 22                        | 1.5                         | 33                 |
| Total Burden                   | 64                              | 24                              | 88                          | 110                       | -                           | 48                 |
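The burden figures follow from the per-response times stated above (0.17 hours is the 10-minute screening expressed as a decimal); a minimal sketch of the arithmetic:

```python
# Screener: 88 screenings at roughly 10 minutes (0.17 hours) each.
screener_responses = 88
screener_hours = round(screener_responses * 0.17)   # rounds 14.96 up to 15

# Usability interviews: 22 ninety-minute sessions (1.5 hours each).
interview_responses = 22
interview_hours = round(interview_responses * 1.5)  # 33

# Totals reported in the burden table.
total_responses = screener_responses + interview_responses      # 110
total_burden_hours = screener_hours + interview_hours           # 48
```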
Estimate of Cost Burden
There is no direct cost to respondents.
Project Schedule
Recruitment will begin upon OMB approval. Interviewing is expected to be completed within 4 months of OMB approval. The data collection instrument will be revised after the completion of both rounds of testing as documented in the table below.
Estimated Project Schedule for 2019 NHES Usability Testing
| Activity                                            | Start date | End date   |
|-----------------------------------------------------|------------|------------|
| Project Planning                                    | 1/15/2018  | 6/11/2018  |
| Expert Web Application Review                       | 6/7/2018   | 6/27/2018  |
| Expert Web Application Review Debriefing            | 6/28/2018  | 6/28/2018  |
| Round 1 Respondent Recruitment                      | 7/9/2018   | 8/2/2018   |
| Round 1 Testing (English Only)                      | 7/20/2018  | 8/2/2018   |
| Data Analysis / Quick Report Preparation and Delivery | 8/2/2018 | 8/9/2018   |
| Quick Report Debriefing                             | 8/9/2018   | 8/9/2018   |
| Round 2 Respondent Recruitment                      | 9/4/2018   | 10/15/2018 |
| Round 2 Usability Testing (English and Spanish)     | 9/17/2018  | 10/15/2018 |
| Accessibility Testing                               | 9/17/2018  | 10/15/2018 |
| Data Analysis / Quick Report Preparation            | 10/15/2018 | 10/19/2018 |
| Quick Report Delivery                               | 10/22/2018 | 10/22/2018 |
| ADDP Reviews Report                                 | 10/23/2018 | 10/29/2018 |
| Quick Report Debriefing                             | 10/30/2018 | 10/30/2018 |
| Final Report                                        | 11/1/2018  | 11/30/2018 |
Cost to the Federal Government
The cost to the federal government for this usability testing laboratory study is approximately $140,000.
1 Spanish language versions of the following study materials will be submitted to OMB for review as follows: (a) recruitment flier by early September 2018 (to allow recruitment of Spanish speaking participants) and (b) the consent form, incentive consent form, recruitment screener, usability testing protocol, background questionnaire, debriefing questions, and satisfaction questionnaire by mid-September 2018.
2 Because of the extra time needed to analyze eye-tracking data for mobile users, we have not found it cost effective to incorporate that technique into the protocol for mobile users. Additionally, we have found that the eye-tracking set-up time adds significantly to the length of Spanish-speaking sessions (partially because the participants tend to be wary of the technology and of the session itself), and so we will not incorporate eye-tracking into those sessions (whether PC or mobile).