National Household Education Survey 2019 (NHES:2019)

Full-scale Data Collection



OMB# 1850-0768 v.16

Part A
















May 2018

revised October 2018








Appendices


Appendix 1 – NHES 2019 Contact Materials

Appendix 2 – NHES 2019 Screener and Topical Surveys

Appendix 3 – NHES 2019 Web Screener and Topical Facsimile

Appendix 4 – Study of NHES 2019 Nonresponding Households

Appendix 5 – NHES 2019 Developmental Studies Results

Appendix 6 – NHES 2017 Web Test Results



JUSTIFICATION

NHES Program - Request for Clearance

The National Household Education Survey (NHES) is a data collection program of the National Center for Education Statistics (NCES) designed to provide descriptive data on the education activities of the U.S. population, with an emphasis on topics that are appropriate for household surveys rather than institutional surveys. Such topics have covered a wide range of issues, including early childhood care and education, children’s readiness for school, parents’ perceptions of school safety and discipline, before- and after-school activities of school-age children, participation in adult and career education, parents’ involvement in their children’s education, school choice, homeschooling, and civic involvement. This request is to conduct the NHES:2019 full scale data collection, as described in this submission.

Additionally, in conjunction with NHES:2019, NCES plans to conduct an In-Person Study of NHES:2019 Nonresponding Households, designed to provide insight about nonresponse that can help plan future survey administrations. This in-person study is described in detail in Appendix 4 of this submission. The contact materials, final interview protocols, and final sample selection details for this study are also included in Appendix 4.

A.1 Circumstances Necessitating Collection of Information

NCES is authorized to conduct NHES by the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543), which defines the legislative mission of NCES to collect, report, analyze, and disseminate statistical data related to education in the United States and in other nations. NHES is specifically designed to support this mission by providing a means to investigate education issues that cannot be adequately studied through the Center’s institution-based data collection efforts. For example, some school-age children are homeschooled rather than attending a public or private school. There is no available sample frame that includes all homeschooled students across the United States. It is more efficient and economical to interview parents about their children’s participation in child care programs and family participation in school and other education activities through a household-based approach than to incur the cost and nonresponse involved in enlisting schools, obtaining lists of parents, and sampling parents from those lists.

Repeating the NHES:2016 child surveys will provide trend data from the 2012 and 2016 NHES administrations. Tracking trends in education topics on a regular, repeated basis is a key research goal of the NHES program. The Adult Training and Education Survey (ATES), which was part of the NHES:2016 administration, will not be administered as part of NHES:2019. NCES, in collaboration with the National Science Foundation (NSF), is currently testing revisions to and planning new methods of data collection for ATES.

A.2 Purposes and Uses of the Data

The NHES:2019 data collection will provide policymakers and researchers with data on early childhood education, parent and family involvement in education, and homeschooling that are not available elsewhere. Researchers nationwide rely on NHES data for important policy analyses. Survey data from NHES have been used for a large number of descriptive and analytic reports and articles, including NCES education indicators, reports, and statistical abstracts; publications of other Federal agencies; policy analyses; theses and dissertations; conference papers; and journal articles. A list of NHES publications issued by NCES can be found on the NHES website at http://nces.ed.gov/nhes. A list of NHES publications from non-NCES researchers can be found on the NCES website at https://nces.ed.gov/bibliography/.

NHES Program

NHES uses a two-stage design in which sampled households complete a screener questionnaire to enumerate likely eligible household members and their key characteristics. Within-household sampling based on the screener data determines which household member is sampled for which topical survey. NHES typically fields 2 to 3 topical surveys at a time, although the number has varied across its administrations. Surveys are administered in English and in Spanish. Data from NHES are used to provide national cross-sectional estimates on populations of special interest to education researchers and policymakers.

Beginning in 1991, NHES was administered approximately every other year as a landline random-digit-dial (RDD) survey. During a period of declining response rates in all RDD surveys, NCES decided to conduct a series of field tests to determine if a change to self-administered mailed questionnaires would improve response rates. A feasibility test of the new design was conducted in 2009, followed by a field test in 2011. The field test results helped to inform the final design of a full-scale NHES mail-based collection in 2012 (OMB# 1850-0768 v.9), which included the Early Childhood Program Participation (ECPP), the Parent and Family Involvement in Education-Enrolled (PFI-E), and the Parent and Family Involvement in Education-Homeschooled (PFI-H) topical surveys.

NHES:2016 fielded the same child surveys as those fielded in 2012 and also fielded the Adult Training and Education Survey (ATES). NHES:2016 continued the mail-based collection methodology, while also experimenting with the use of web survey collection. In 2016, a subsample of 25,000 addresses was sent an invitation to complete the survey by web, with one follow-up letter asking them again to complete it by web. Two additional follow-up mailings included the paper survey and asked the sample members to complete the survey by paper. The overall response rate (the screener response rate multiplied by the topical response rate) for the 25,000 addresses in the web experiment exceeded the overall response rate for the main NHES:2016 collection because of the gain in topical response realized from web survey respondents who were able to complete the topical surveys in the same sitting as the screener survey.
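The overall response rate referenced above is simply the product of the two stage-level rates. As a point of reference, a minimal illustrative calculation using the final base-weighted NHES:2016 rates cited in footnote 2 (the full-collection rates, not figures specific to the 25,000-address web experiment) is sketched below.

```python
# Illustrative two-stage overall response rate calculation.
# Rates are the final base-weighted NHES:2016 figures cited in footnote 2.
screener_rate = 0.664
topical_rates = {"ECPP": 0.734, "PFI": 0.743, "ATES": 0.731}

for survey, topical_rate in topical_rates.items():
    overall = screener_rate * topical_rate
    print(f"{survey}: overall response rate = {overall:.1%}")
# ECPP: 48.7%, PFI: 49.3%, ATES: 48.5%
```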

NHES:2019 will again field the ECPP survey plus a single, combined PFI survey. The PFI surveys for parents of enrolled and homeschooled students have been merged into one survey instrument. NHES:2019 will use the same methodology developed in the NHES:2016 web experiment for the majority of sampled addresses.

NHES:2016 Mixed-Mode Experiment and NHES:2017 Web Test

In NHES:2016, the subsample of 25,000 households in the web experiment was assigned to a mixed-mode protocol, with two survey requests for web completion followed by two survey requests for paper survey completion. This group had a slightly higher final response rate (52 percent) than the paper-only group (48 percent), and about two-thirds of the screener responses from the mixed-mode group were received via the web. NCES also conducted a large web test of NHES in 2017. It was the first time NHES responses were collected entirely online. Sampled households were sent contact materials that included information about how to access the NHES web instrument; they did not have the option to complete a paper questionnaire. Sample members with no internet access were asked to call the Census Bureau to complete the first stage of the survey by phone. The intent of this test was to determine the feasibility of moving forward using web as a primary mode of data collection, given the willingness of most mixed-mode design respondents to respond by web in 2016. The 2017 web test experimented with strategies for contacting sample members; an alternate presentation of the household screener to maximize the accuracy of screener responses and the overall usability of the screener instrument; and asking respondents to complete two topical surveys instead of one. Based on the results of the 2016 and 2017 tests, a sequential mixed-mode design, from web survey to paper survey, will be implemented in 2019. NHES:2019 will include several experiments intended to continue to improve the implementation of this mixed-mode design (described later in this section).

NHES Cognitive Interviews

NCES conducted several rounds of cognitive interview and focus group studies that led to changes designed to improve survey items and contact materials. The ECPP survey was revised to add items about parents' decision-making around early care and education arrangements for young children. The PFI survey was revised to combine the separate instruments for enrolled and homeschooled students into one instrument, acknowledging that students are often educated in multiple educational settings, including virtual schools. The results of these studies, summarized in Appendix 5 and Appendix 6, informed the development of the instruments and contact materials in this submission.

Overview of NHES:2019 Target Population

NHES:2019 will include the PFI and ECPP topical surveys. Children from birth through 12th grade who are ages 20 years or younger will be eligible for the child-focused surveys. ECPP samples children ages 6 or younger who are not yet enrolled in kindergarten. PFI samples children and youth ages 20 or younger enrolled in kindergarten through 12th grade and children and youth ages 20 or younger who are homeschooled for the equivalent of kindergarten through 12th grade. Adults knowledgeable about the care and education of the sampled child are asked to respond to these surveys. Only one child per household will be sampled.

This submission includes several letters and postcards for each stage of the study, tailored for the screener phase or for a particular topical survey. All respondent contact materials, the text of the NHES web survey login page and of the corresponding links within that page, and example pages of the web survey are provided in Appendix 1.

NHES:2019 Screener Instrument

Because there is no adult education survey component in 2019, the household screener instrument was revised from the NHES:2016 version to request only a listing of all children in the household rather than all household members. The 2019 NHES screener is similar to the screener instrument used in NHES:2012, when the surveys were last fielded without an adult topical survey. The response rates for a 5-person child-only screener and a 10-person all-household-member screener were found to be comparable in an experiment conducted during the National Adult Training and Education Survey (NATES) 2013 Pilot Test (OMB# 1850-0803 v.72). English and Spanish versions of the paper screener are provided in Appendix 2, and the web screener versions are provided in Appendix 3.

NHES:2019 Topical Surveys

Each administration of NHES has included more than one topical survey. Exhibit 1 shows the years in which different NHES topical surveys were administered between 1991 and 2016. NHES:2019 will include two child-focused topical surveys (PFI and ECPP); English and Spanish versions of these surveys are provided in Appendix 2 (paper) and Appendix 3 (web). The NHES:2019 administration of the PFI and ECPP surveys is a repeat of the child-focused topics that were administered for the first time by mail as part of NHES:2012 and again in both mail and web formats in NHES:2016. Tracking changes in the population over time is a key research goal of the NHES program.

Exhibit 1. Topical surveys conducted under the NHES Program, by years administered: 1991–2016

Topical survey | 1991 | 1993 | 1995 | 1996 | 1999¹ | 2001 | 2003 | 2005 | 2007 | 2012 | 2016

Young children
Early childhood education/program participation | X |  | X |  | X | X |  | X |  | X | X
School readiness |  | X |  |  | X |  |  |  | X |  | 

School-aged children
School safety and discipline |  | X |  |  |  |  |  |  |  |  | 
Parent and family involvement in education |  |  |  | X | X |  | X |  | X | X | X
Homeschooling |  |  |  |  | X |  | X |  | X | X | X
After-school programs and activities |  |  | X² |  | X | X³ |  | X |  |  | 

Adults
Adult education | X |  | X |  | X | X | X | X |  |  | 
Credentials for work |  |  |  |  |  |  |  |  |  |  | X
Civic involvement |  |  |  | X | X |  |  |  |  |  | 
Household library use |  |  |  | X |  |  |  |  |  |  | 

¹ NHES:1999 was a special end-of-decade administration that measured key indicators from the surveys fielded during the 1990s.

² The After-School Programs and Activities Survey of NHES:1995 only collected data about children in the first through third grades.

³ The After-School Programs and Activities Survey of NHES:2001 also included items on before-school programs.

SOURCE: U.S. Department of Education, National Center for Education Statistics, National Household Education Surveys Program (NHES), 1991–2016.

The Parent and Family Involvement in Education Survey (PFI)

The PFI, previously conducted in 1996, 2003, 2007, 2012, and 2016, surveys families of children and youth enrolled in kindergarten through 12th grade or homeschooled for these grades, with an age limit of 20 years. It addresses specific ways that families are involved in their children's school, school practices to involve and support families, involvement with children's homework, and involvement in education activities outside of school. Parents of homeschoolers are asked about their reasons for choosing homeschooling and resources they used in homeschooling. New for 2019, parents of children who attend online or virtual schools will be asked about their reasons for choosing an online or virtual school and the cost of that type of schooling. Information about child, parent, and household characteristics is also collected. NHES:2019 will be the first administration that combines the enrolled student and homeschooling surveys into one instrument. Since 2016, content about virtual schools and online coursetaking has been added to the PFI.

The Early Childhood Program Participation Survey (ECPP)

The ECPP, previously conducted in 1991, 1995, 2001, 2005, 2012, and 2016, surveys families of children ages 6 or younger who are not yet enrolled in kindergarten and provides estimates of children’s participation in care by relatives and non-relatives in private homes and in center-based daycare or preschool programs (including Head Start and Early Head Start). Additional topics addressed in ECPP interviews have included family learning activities; out-of-pocket expenses for nonparental care; continuity of care; factors related to parental selection of care; parents’ perceptions of care quality; child health and disability; and child, parent, and household characteristics.

NHES:2019 Experiments

NCES is planning several experiments as part of NHES:2019 to evaluate the impact of different contact strategies and modes on survey response.

Targeted Screener Mailings

This experiment will test whether using targeted screener mailings for likely Spanish-speaking households, identified using information available on the sampling frame and from the American Community Survey (ACS), increases the response rate for those cases. The experiment will test the presentation and wording of a set of contact materials targeted specifically to these households, which may include different pictures, letter language, and a Spanish-first bilingual presentation. Approximately 15,000 randomly assigned cases will be sampled for this experiment, with the expectation that about 3,300 cases will be predicted as Spanish-speaking and will receive the targeted mailings. Incentives will be the same as in the main study. These materials are provided in Appendix 1.

Sequential Mixed Mode Contact Strategies

One of the goals of NHES is to improve participation relative to the resources expended to secure sample members' responses, by making response to the survey mailings easy and efficient. One way to achieve this is to vary contact methods during later contact attempts to encourage response among more reluctant respondents. Two related experiments will build on lessons learned in NHES:2016 and in the NHES:2017 web test.

First, because NHES:2019 will consist of only the child-focused surveys, an “opt-out” screener and cover letter will be tested (10,000 randomly assigned cases). The “opt-out” screener will include a question on the cover of the survey about whether any children live in the household. Households without children need to answer only one screener question, and placing that item on the cover of the survey should allow them to respond without having to move beyond the cover of the paper questionnaire. About 60 percent of the 10,000 cases (6,000 cases) are predicted to be households without children. Analyses of the results will compare the screener response rates overall, in households with children, and in households without children, for the experimental versus the control group.

Second, a 3x3 mailing experiment (70,000 randomly assigned cases) will vary the type of advance contact received and when a FedEx package is sent. The advance contact experiment is designed to assess (a) the effect of an advance letter within the context of a web survey (prior NHES experiments demonstrated that advance letters had a positive impact on NHES response rates before phone and mail collections) and (b) the effect of an advance mailer campaign. The advance mailer campaign consists of two glossy oversized postcards designed to build brand awareness of the NHES surveys. The hypothesis is that a household will be more receptive to the initial screener survey mailing if prior mailings lead the respondent to recognize the name of the survey. The advance contact experiment conditions are: advance mailing campaign / advance letter / no advance mailings (one third of cases in each). The advance mailing campaign will consist of the two glossy postcard mailings with information about NHES, followed by an advance letter in a letter-size envelope. The advance letter condition will include a single letter in a letter-size envelope. For households in the no advance mailings group, the first screener package will be their first NHES mailing.

Survey invitations sent by FedEx tend to produce a larger increase in survey response than NHES realizes from survey mailings sent using the U.S. Postal Service. However, FedEx mailing is more expensive than postal service First Class mailing. In prior NHES administrations, FedEx was used consistently for the third mailing. In NHES:2019, NHES will experiment with sending some addresses a survey package via FedEx at the second mailing and others a survey package via FedEx at the fourth mailing rather than at the third. Cost models will be used to determine the relative efficiency of changes to FedEx package timing, weighing the cost of the FedEx mailing against the cost of additional survey follow-up mailings. The FedEx experiment conditions will be: FedEx for the second survey package / FedEx for the fourth survey package / modeled FedEx timing (one third of cases in each). The modeled FedEx timing treatment will use data gathered from an NHES:2017 experiment on the effectiveness of the more expensive FedEx mailing, as compared to a regular First Class mailing, for different types of households to determine which households should receive FedEx for the second survey package and which should not receive it until the fourth package. For example, Spanish-speaking households responded well to the FedEx mailing, so they might be more likely to receive this mailing type as the second survey package.

Choice-Plus Incentive

The choice-plus incentive experiment builds on promising research conducted as part of the 2015 Residential Energy Consumption Survey (RECS) National Pilot study, an experimental component of the main RECS conducted by the U.S. Energy Information Administration (EIA). In the RECS, the choice-plus protocol was one in which both paper and web were offered and the respondent was offered a promised incentive for completing the survey by web. Across all combinations of mode offers in the RECS pilot study, the choice-plus protocol garnered the highest response rates.¹ In order to encourage respondents to respond via the web or the helpdesk phone (the Census Telephone Questionnaire Assistance (TQA) line in NHES:2019) instead of by paper, this experiment will offer concurrent web and paper response options and will test two levels of promised incentive for web or TQA response. For 24,000 randomly assigned cases, respondents will be offered a $10 promised incentive for web or TQA completion; for 6,000 randomly assigned cases, respondents will be offered a $20 promised incentive for web or TQA completion. Respondents will be made aware of the incentive in the first mailing after the advance letter, which will also contain the standard NHES $5 prepaid incentive. Contingent incentives will be mailed, in the form of cash, to the addresses specified by respondents at the end of the survey. Follow-up contacts will also reference the promised incentive. To receive the promised incentive, cases will need to complete all surveys assigned to them: if no one is sampled for a topical survey, submitting the screener will be sufficient; if a child is sampled for a topical survey, the household must also submit the topical. Promised incentives will be sent with a cover letter via First Class mail in a letter-size envelope shortly after the respondent completes the survey via the web or TQA.

Modeled Mode

This experiment (40,000 randomly assigned cases) will test whether targeting some cases for a paper-only protocol, specifically those modeled to be least likely to respond to a web survey invitation, increases the response rate among those cases. We will use the NHES:2016 data about characteristics of the households that were most likely to respond by paper or least likely to respond by web to develop the model predicting response mode preference. The model will be used to identify NHES:2019 households with high propensity for paper response and low propensity for web response. These households will be sent the paper questionnaire only and will not be offered a web version at any point. All follow-up contacts with nonresponders will ask for paper response. All other cases assigned to the modeled mode experiment but not modeled as least likely to respond by web will be given the same mixed-mode protocol administered to the majority of NHES:2019 cases, first inviting the respondent to respond on the web.

Web will not be offered to the experimental group cases in this treatment because we suspect that there is overlap between the types of households that respond at higher rates to paper than to web and the types of households that are “late responders,” that is, addresses from which we should not expect a response until they have received a third or fourth mailing, regardless of mode. If we offered web at the third or fourth contact in the paper-only group, we would not know whether the address responded because it preferred web or because, regardless of what we sent, it would have required three or four follow-up contacts to obtain a response. Offering only paper to this group provides the maximum amount of data to evaluate how best to use paper survey invitations in future NHES collections.
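The submission does not specify the functional form of the response-mode propensity model described above. A minimal sketch of one plausible approach, assuming a logistic regression fit to NHES:2016 case-level data, is shown below; the file names, feature names, and the 0.8 cutoff are hypothetical, not NHES specifications.

```python
# Hypothetical sketch of modeled-mode assignment: predict paper-response propensity
# from NHES:2016 cases and route the highest-propensity NHES:2019 addresses to a
# paper-only protocol. All file names, columns, and the cutoff are illustrative.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

train = pd.read_csv("nhes2016_cases.csv")          # hypothetical 2016 case-level file
features = ["has_children_flag", "internet_flag", "urban_flag", "low_income_flag"]  # 0/1 indicators

# Outcome: 1 = responded by paper (or not by web), 0 = responded by web.
model = LogisticRegression(max_iter=1000).fit(train[features], train["responded_by_paper"])

frame_2019 = pd.read_csv("nhes2019_sample.csv")    # hypothetical 2019 sample extract
paper_propensity = model.predict_proba(frame_2019[features])[:, 1]

# Addresses above an illustrative cutoff get paper only; all others get the
# standard web-first sequential mixed-mode protocol.
frame_2019["protocol"] = np.where(paper_propensity > 0.8, "paper_only", "web_first")
```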

In-Person Study of NHES:2019 Nonresponding Households

Given continued declines in response rates to both NHES and household surveys more broadly, and the growing challenges associated with conducting cost-efficient, high-quality, representative data collections, NCES will conduct the In-Person Study of NHES:2019 Nonresponding Households during the NHES:2019 administration. This study will focus on understanding the reasons for the growing level of nonresponse to NHES and to mail-based household surveys. The study will focus on screener rather than topical nonrespondents because of the impact that screener nonresponse has on both the screener response rate and each NHES topical survey's overall response rate, and because NHES screener response rates are lower than topical response rates.² This study will consist of two operations: (a) qualitative interviews and (b) address and neighborhood observations. Appendix 4 of this submission discusses the importance of conducting this study, provides a detailed overview of its methods, and includes the study's contact and interview protocol materials.

Qualitative interviews

Approximately 500 NHES:2019 screener nonrespondents will be selected to participate in qualitative interviews, with a goal of obtaining approximately 100 completed qualitative interviews (see Appendix 4 for sampling details). Sampled cases will be sent an invitation letter that includes $5 cash, invites them to participate in the in-person study, informs them of the contingent incentive, and provides them with NHES staff contact information for letting the researchers know that they would like to participate. Sample members who do not respond to the invitation letter will receive up to two additional reminder postcards. Addresses with telephone numbers will additionally receive up to four phone call reminders. An in-person recruitment period will also take place, during which all households that have neither agreed nor declined to participate will be visited by an interviewer. The follow-up mailings will be spread out over several weeks and will be paired with in-person recruitment. All sampled cases will be offered an additional $120 cash incentive for completing the 90-minute interview, and this incentive will be mentioned in all recruitment materials. Interviews will be conducted in both English and Spanish and will be audio-recorded with the participant's permission.

Address and neighborhood observations

Approximately 750 addresses will be sampled from among NHES:2019 early screener nonrespondents for address and neighborhood observations. Of these, 500 will be selected for and invited to take part in the qualitative interview study (to yield approximately 100 completed 90-minute interviews). If nonresponse clustering patterns result in lower-than-estimated costs of observing and interviewing the target numbers of addresses, more addresses will be selected for the address/neighborhood observations. The objective of these observations is to determine the types of addresses that are prone to nonresponse or to having their NHES mailings returned as undeliverable, and to assess the accuracy of the information available on the frame for such addresses.

A.3 Use of Improved Information Technology

NHES:2019 responses to paper and pencil instruments will be collected by the Census Bureau on behalf of NCES using three complementary survey systems: (1) Amgraf One Form Plus, (2) Docuprint, and (3) integrated Computer Assisted Data Entry (iCADE), chosen for their efficiency and accuracy in the data collection process.

  • Forms Design. Questionnaires will be created using Amgraf One Form Plus. Completed hardcopy forms can be processed by iCADE to capture responses through optical mark recognition (OMR) and keying from image (KFI). Questionnaires will be printed, trimmed, and stitched through an in-house print-on-demand process using a Docuprint system, which allows personalization of some survey items. The data from the questionnaires will be captured by the iCADE technology/software, which automatically extracts all check box entries (OMR) and captures and displays an image of all other entries to an operator for KFI.

  • Image Preprocessing. iCADE applies image preprocessing to the forms in their image format in order to correct any skewing at the time of scanning, and the iCADE software performs registration to align the individual questionnaire page template with the appropriate scanned image. The scanner despeckles the image to remove unwanted pixels.

  • Data Capture. iCADE reads the form image files, checks the presence of data, processes all check box fields through OMR, and presents an image of the handwritten fields to an operator for KFI.

  • Verification. Extracted KFI data are subject to 100% field validation according to project specifications. If a data value violates validation rules, the value is flagged for review by verifiers who interactively review the images and the corresponding extracted data, and resolve validation errors.

  • Archiving. Images will be scanned and archived to magnetic storage located on a secured server in case they are needed later. This eliminates the need to save paper copies of the completed questionnaires.

The NHES:2019 web-based instruments are designed to minimize respondent burden by eliminating the cumbersome skip patterns required in the paper and pencil instruments and allowing respondents to complete both the screener and a topical survey in one or more sittings. The instruments will be securely hosted on the Census Bureau’s server.

A.4 Efforts to Identify Duplication

Population: Most other surveys do not address the topics covered in NHES for the populations of interest. For example, the Head Start Family and Child Experiences Survey (FACES) focuses on children in Head Start, whereas all children who have not yet started kindergarten are of interest in the ECPP survey. The National Survey of Early Care and Education (NSECE) was fielded in 2012, and there are currently no plans to field the household component of this collection again. The National Survey of Parents of Public School Students and Survey of Family and School Partnerships in Public Schools focus on parents of children in public schools. Those whose children attend private or virtual schools or are homeschooled are not represented. Some studies, such as the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B); the Early Childhood Longitudinal Study, Kindergarten Class of 1998-1999 (ECLS-K); and the Early Childhood Longitudinal Study-Kindergarten Class of 2010-11 (ECLS-K:2011) focus on single-year cohorts that are followed over time and therefore do not provide nationally representative data on different age groups. The NHES surveys are designed to complement these longitudinal collections with more frequent and more inclusive cross-sectional data.

Survey Content: Extant studies are limited in the content that they include relative to the goals of the NHES surveys. Studies such as the National Survey of America’s Families and the National Study of the Changing Workforce have collected some information on child care or program participation, but their primary emphasis is on other topics, and the depth of information on early care and education experiences is limited. The Head Start FACES project collects information on Head Start program participation and some family measures, but does not account for all nonparental care and programs. The Current Population Survey October Education Supplement is limited to a relatively small number of items on education participation and does not address the roles that parents play in their children’s school, schoolwork, and home activities. Also, no nationally representative study other than NHES collects detailed data on homeschooling.

Current Estimates and Measuring Change Over Time: Many of the extant surveys follow one cohort or periodic cohorts (e.g., the ECLS-K, Head Start FACES, NSECE) or are no longer conducted (e.g., the household component of the NSECE; the National Survey of America's Families; and Family Involvement in Education: A National Portrait). As a result, they cannot meet the NHES goal of providing up-to-date cross-sectional estimates and measures of change over time for all children who have not started kindergarten and for children in kindergarten through 12th grade.

A.5 Collection of Data from Small Businesses

Not applicable.

A.6 Consequences of Less Frequent Data Collection

Topics covered in the NHES:2019 child-focused surveys have been addressed in previous NHES administrations. Repeating the surveys on a regular basis allows for analysis of trends over time. In the past, NHES was administered on a biennial cycle. The last full NHES study was conducted in 2016. Because of funding constraints and to allow for developmental testing between cycles, NCES moved to a triennial NHES survey administration. NCES believes that this is the longest interval between administrations that still allows NHES to maintain its purpose of tracking changes in key education estimates over time.

A.7 Special Circumstances of Data Collection

None of the special circumstances apply to NHES:2019.

A.8 Consultations Outside the Agency

A Technical Review Panel (TRP) comprising leading experts in survey methodology was established to provide input to the initial redesign of the NHES system. Most members of the panel met in February 2010 to discuss the proposed design for the field test, and their comments and suggestions led to changes reflected in the NHES redesign that took place from 2007 to 2012; such design changes carry into the NHES:2019 design and are reflected in this submission.

Technical Review Panel Participants and Their Affiliation at the Time of TRP Recruitment


Nancy Bates

U.S. Census Bureau

649 A. St. N.E.

Washington, DC 20002

nancy.a.bates@census.gov


Paul Beatty

National Center for Health Statistics

Division of Health Care Statistics

3311 Toledo Road,

Hyattsville, MD 20782

pbeatty@cdc.gov


Johnny Blair

Survey Sampling and Methodology

Abt Associates Inc.

4550 Montgomery Avenue

Bethesda, MD 20814-3343

Johnny_Blair@AbtAssoc.com


Stephen Blumberg

National Center for Health Statistics

3311 Toledo Road

Hyattsville, MD 20782

stephen.blumberg@cdc.hhs.gov


Mick Couper

Survey Research Center

University of Michigan

ISR, 426 Thompson Street

Ann Arbor, MI 48104

mcouper@umich.edu


Don Dillman

Social and Economic Sciences Research Center, Professor

Washington State University

133 Wilson Hall

Pullman, WA 99164-4014

dillman@wsu.edu


Robert Groves

Survey Research Center, Institute for Social Research

University of Michigan

426 Thompson Street

Ann Arbor, MI 48106-1248

bgroves@isr.umich.edu


Scott Keeter

Pew Research Center

1615 L. St. NW. Suite 700

Washington, DC 20036

skeeter@pewresearch.org


Kristen Olson

Survey Research and Methodology

University of Nebraska-Lincoln

201 N. 13th St.

Lincoln, NE 68588-0241

kolson5@unl.edu


Roger Tourangeau

Joint Program in Survey Methodology

University of Maryland

1218 LeFrak Hall, University of Maryland

College Park, MD 20742

RTourango@survey.umd.edu


Gordon Willis

Division of Cancer Control / Population Sciences

National Cancer Institute

6130 Executive Blvd, MSC 7344, EPN 4005

Bethesda, MD 20892-7344

willisg@mail.nih.gov


The content of the NHES:2019 child-focused topical surveys builds upon the content developed for the NHES:2016 and prior NHES administrations. As a result, the PFI and ECPP surveys reflect the cumulative input of many experts in the field and past NHES TRPs. However, in order to ensure that the ECPP and PFI surveys address important issues in the topical areas of interest and incorporate important emerging issues, the design phase of NHES:2019 included consultations with experts in the substantive areas addressed in the surveys. These experts included persons in government agencies, academe, and research organizations.


Substantive Experts: ECPP and Their Affiliation at the Time of TRP Recruitment


Margaret Burchinal

Senior Research Scientist; Director, Data Management and Analysis Center

Frank Porter Graham Institute

Sheryl-Mar South, Room 266, Campus Box 8185

Chapel Hill, NC 27599

(919) 966-5059

burchinal@unc.edu


Rupa Datta

Vice President and Senior Fellow

National Opinion Research Center

NORC at the University of Chicago 55 E Monroe St, Suite 2000

Chicago, IL 60637

(312) 759-4219

datta-rupa@norc.org


Dan Ferguson

Research Associate

National Center on Children in Poverty

215 West 125th St, 3rd Floor

New York, NY 10027

(646) 284-9647

ferguson@nccp.org


Walter Gilliam

Director

The Edward Zigler Center in Child Development & Social Policy

310 Prospect Street

New Haven, CT 06511

(203) 785-3384

walter.gilliam@yale.edu


Anna Johnson

Assistant Professor, Department of Psychology; Research Fellow

Georgetown University

Department of Psychology, White-Gravenor Hall

Washington, DC 20007

(202) 687-5320

anna.johnson@georgetown.edu


Katherine Magnuson

Professor of Social Work

University of Wisconsin-Madison

School of Social Work, 1350 University Ave

Madison, WI 53706

(608) 263-4812

kmagnuson@wisc.edu


Megan McClelland

Endowed Professor in Child Development

Oregon State University

Hallie E. Ford Center 245, 2631 SW Campus Way

Corvallis, OR 97331

(541) 737-9225

megan.mcclelland@oregonstate.edu


Marcia Meyers

Professor

University of Washington

Daniel J. Evans School of Public Policy and Governance, Parrington Hall, 4100 15th Ave NE

Seattle, WA 98195

(206) 616-4409

mkm36@u.washington.edu


Heather Sandstrom

Senior Research Associate

Urban Institute

2100 M Street NW

Washington, DC 20037

(202) 833-7200

hsandstrom@urban.org


Diane Schilder

Principal Research Scientist

Education Development Center

43 Foundry Avenue

Waltham, MA 02453

(617) 618-2757

dschilder@edc.org


Bobbie Weber

Research Associate

Oregon State University

Hallie E. Ford Center 231, 2631 SW Campus Way

Corvallis, OR 97331

(541) 737-9243

bobbie.weber@oregonstate.edu

Substantive Experts: PFI and Their Affiliation at the Time of TRP Recruitment


Bruce Baker

Professor

Rutgers University

10 Seminary Place, Room 17

New Brunswick, NJ 08901

(848) 932-0698

bruce.baker@gse.rutgers.edu


Michael Barbour

Director of Doctoral Studies

Isabelle Farrington College of Education

5151 Park Avenue

Fairfield, CT 06825

(203) 396-8446

mkbarbour@gmail.com


Anna Egalite

Assistant Professor

North Carolina State University

Poe Hall 300C, Box 7801, NCSU Campus

Raleigh, NC 27695

(727) 804-8290

anna_egalite@ncsu.edu


Milton Gaither

Professor of Education

Messiah College

One College Avenue

Mechanicsburg, PA 17055

(717) 766-2511

mgaither@messiah.edu; contact@icher.org


Charisse Gulosino

Assistant Professor

University of Memphis

Ball Hall 123G

Memphis, TN 38152

(901) 678-5217

cglosino@memphis.edu


Luis Huerta

Associate Professor of Education and Public Policy

Teachers College, Columbia University

212B Zankel Hall, 525 W. 120th Street

New York, NY 10027

(212) 678-4199

lah2013@tc.columbia.edu


Robert Kunzman

Professor

Indiana University

W.W. Wright Education Building Room 3288

Bloomington, IN 47405

(812) 856-8122

rkunzman@indiana.edu


Bryan Mann

Doctoral Student

Penn State University

300 Rackley Building

University Park, PA 16802

(267) 566-5234

bmann4@gmail.com


Gary Miron

Professor of Evaluation, Measurement and Research

Department of Educational Leadership, Research and Technology

Western Michigan University

1903 W Michigan Ave

Kalamazoo, MI 49008-5283

(269) 387-5122

gary.miron@wmich.edu


Richard Murnane

Juliana W. and William Foss Thompson Research Professor of Education and Society

Harvard University

Gutman 406B 13 Appian Way

Cambridge, MA 02138

(617) 496-4820

richard_murnane@gse.harvard.edu


Jennifer Rice

Professor and Associate Dean

University of Maryland

3112A Benjamin Building

College Park, MD 20742

(301) 405-5580

jkr@umd.edu


John Watson

Founder of the Evergreen Education Group

Evergreen Education Group

700 Main Ave Suite E

Durango, CO 81301

(303) 883-6068

john@evergreenedgroup.com

Additionally, Dr. Paul Lavrakas has provided independent consultation on the NHES survey design and operations since 2016, particularly on nonresponse follow-up and participant contact strategies and design.

A.9 Payments to Respondents

Screener incentives. Based on NHES:2003 experiments, small cash incentives were used in NHES:2005 and NHES:2007 to improve unit response. Based on NHES:2011 Field Test experiments, a $5 cash incentive included in the initial screener mailing was used in NHES:2012 and 2016. NHES:2016 also contained an incentive experiment that tested differential incentive amounts of $0, $2, $5, or $10 based on response propensity. Results from this experiment showed high response rates for $0 or $2 among addresses predicted to respond at high rates to the screener and no effect for the $10 incentive. The $5 versus $2 incentive experiment was repeated for the NHES:2017 web test and, as in 2011, the $5 incentive was associated with higher screener response rates than the $2 incentive (see Appendix 6 for details). We will thus use a $5 cash incentive in the NHES:2019 first screener survey package mailing.

Topical survey incentives. NHES:2012 included an incentive experiment at the topical level to further refine an optimal strategy for the use of incentives in NHES. For those households in which a child was selected as the subject of an ECPP or PFI questionnaire, cases that responded to the first or second mailing of the screener received a $5 cash incentive with the initial topical survey mailing. Evidence from the 2011 Field Test indicated that topical response rates could benefit significantly from providing later screener respondents with a larger topical incentive. To confirm this finding, NCES subsampled late screener respondents (those responding to the third or fourth questionnaire mailing) to receive either a $5 or $15 cash incentive with their first topical survey mailing. The results from NHES:2012 indicate that, among later screener respondents, the $15 incentive was associated with higher topical response rates compared to the $5 incentive. Based on these findings, for NHES:2016 we used an incentive model that provides a $5 incentive for early screener respondents and a $15 incentive for late screener respondents, and we will continue with this approach in NHES:2019, except for households allocated to incentive experiment treatment groups, as described in section A.2 above. On the web, a respondent can complete both the screener and the topical survey in one sitting and will not receive a separate topical incentive.

In-Person Study of NHES:2019 Nonresponding Households incentives. Cases sampled for the qualitative interview will be sent an advance letter that includes $5 cash and invites them to participate in the in-person study. All cases sampled for the qualitative interview will be offered an additional $120 in cash for completing the 90-minute in-person interview. These incentives are designed to demonstrate to participants that their time and participation are valued and to take into account that both samples consist of households that have already shown reluctance to participate in the survey (see Appendix 4 for additional discussion of the incentive amount).

A.10 Assurance of Confidentiality

Data security and confidentiality protection procedures have been put in place for NHES:2019 to ensure that all contractors and agents working on NHES:2019 comply with all privacy requirements including, as applicable:

  1. The Inter-agency agreement with NCES for this study and the statement of work of NHES contract;

  2. Family Educational Rights and Privacy Act (FERPA) of 1974 (20 U.S.C. §1232g);

  3. Privacy Act of 1974 (5 U.S.C. §552a);

  4. Privacy Act Regulations (34 CFR Part 5b);

  5. Computer Security Act of 1987;

  6. U.S.A. Patriot Act of 2001 (P.L. 107-56);

  7. Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9573);

  8. Confidential Information Protection and Statistical Efficiency Act of 2002;

  9. E-Government Act of 2002, Title V, Subtitle A;

  10. Cybersecurity Enhancement Act of 2015 (6 U.S.C. §151);

  11. The U.S. Department of Education General Handbook for Information Technology Security General Support Systems and Major Applications Inventory Procedures (March 2005);

  12. The U.S. Department of Education Incident Handling Procedures (February 2009);

  13. The U.S. Department of Education, ACS Directive OM: 5-101, Contractor Employee Personnel Security Screenings;

  14. NCES Statistical Standards; and

  15. All new legislation that impacts the data collected through the inter-agency agreement for this study.

The U.S. Census Bureau will collect data under an interagency agreement with NCES, and maintain the individually identifiable questionnaires per the agreement, including:

  1. Provisions for data collection in the field;

  2. Provisions to protect the data-coding phase required before machine processing;

  3. Provisions to safeguard completed survey documents;

  4. Authorization procedures to access or obtain files containing identifying information; and

  5. Provisions to remove printouts and other outputs that contain identification information from normal operation (such materials will be maintained in secured storage areas and will be securely destroyed as soon as practical).

The U.S. Census Bureau and contractors working on NHES:2019 will comply with the Department of Education's IT security policy requirements as set forth in the Handbook for Information Assurance Security Policy and related procedures and guidance, as well as IT security requirements in the Federal Information Security Management Act (FISMA), Federal Information Processing Standards (FIPS) publications, Office of Management and Budget (OMB) Circulars, and National Institute of Standards and Technology (NIST) standards and guidance. All data products and publications will also adhere to the revised NCES Statistical Standards, described at http://nces.ed.gov/statprog/2012/.

By law (20 U.S.C. §9573), a violation of the confidentiality restrictions is a felony, punishable by imprisonment of up to 5 years and/or a fine of up to $250,000. All government or contracted staff working on NHES:2019 who have access to the data, including NHES field staff, are required to sign an NCES Affidavit of Nondisclosure and to have received a public-trust security clearance. These security requirements include the successful certification and accreditation of the data collection system before it can be implemented. Appropriate memoranda of understanding and interconnection security agreements will be documented as part of the certification and accreditation process.

From the initial contact with the participants in this survey through all of the follow-up efforts, potential survey respondents will be informed that (a) the U.S. Census Bureau administers NHES on behalf of NCES; (b) NCES is authorized to conduct NHES by the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543); (c) all of the information they provide may only be used for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151); and (d) that their participation is voluntary.

The following language will be included in respondent contact materials and on data collection instruments:

The National Center for Education Statistics (NCES), within the U.S. Department of Education, is authorized to conduct the National Household Education Survey (NHES) by the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543). The U.S. Census Bureau is administering this voluntary survey on behalf of NCES. There are no penalties should you choose not to participate in this study. All of the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).

The above language, plus the Paperwork Reduction Act of 1995 (PRA) statement text shown below, is used on the covers of the paper screener and topical surveys and on the login screen of the web survey:

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number for this voluntary survey is 1850–0768. The time required to complete this survey is estimated to average [XX] minutes per response, including the time to review instructions, gather the data needed, and complete and review the survey. If you have any comments concerning the accuracy of the time estimate, suggestions for improving this survey, or any comments or concerns regarding the status of your individual submission of this survey, please e-mail: NHES@census.gov or write directly to: Sarah Grady, National Center for Education Statistics (NCES), PCP, 550 12th Street, SW, 4th floor, Washington, D.C. 20202.

In addition, on the login screen of the web survey, the following text is shown below the PRA statement:

** WARNING **

You have accessed a UNITED STATES GOVERNMENT computer. Use of this computer without authorization or for purposes for which authorization has not been extended is a violation of Federal law and can be punished with fines or imprisonment (PUBLIC LAW 99-474). System usage may be monitored, recorded, and subject to audit. Any information you enter into this system may be used by the Census Bureau for statistical purposes, including but not limited to improving the efficiency of our data collection programs. Use of this system indicates consent to the collection, monitoring, recording, and use of information provided inside this system.

A.11 Sensitive Questions

NHES is a voluntary survey, and no persons are required to respond to it. In addition, respondents may decline to answer any question in the survey. Respondents are informed of the voluntary nature of the survey in the cover letters that are sent to the household, as well as on the actual questionnaire. At the same time, some items in the surveys may be considered sensitive by some respondents:

Child development and education experts consider economic disadvantage and children’s disabilities to be important factors in children’s school experiences and their activities outside of school. As a result, the child surveys contain measures of these characteristics, including: household income; receipt of public assistance such as food stamps and the Women, Infants, and Children program (WIC); and children’s disability conditions.

Measures of household income and government assistance are important because access to early childhood programs by at-risk children and the education involvement of families of children from different socioeconomic backgrounds are of interest to policymakers, child development specialists, and educators. These items are important in identifying children at risk and have been administered successfully in previous NHES studies. Respondents are also asked the age at which they first became a parent, which may be sensitive for parents in some situations.

The 2016 response rates for these items were very high. For total household income, the 2016 PFI survey had an item response rate of 96.5 percent; for the Women, Infants, and Children Program, 95.1 percent; for Food Stamps, 97.5 percent; and for the set of child disability questions, 99.7 percent. Response to a question asking parents to rate the child's health was 99.6 percent and, for participation in developing an IEP, it was 92.5 percent. The PFI item response rate for the age at which the child's parent first became a parent to any child was 95.8 percent for the first parent reported and 97.2 percent for the second parent reported.

The PFI survey also includes items concerning children's school performance and difficulties in school, including school grades, grade retention, suspensions, and expulsions. Items concerning school performance and difficulty are important to the PFI survey as correlates of parent and family involvement in children's education. These items were asked in the NHES:2016 PFI, and item response rates were high: 99.3 percent for children's grades, 97.6 percent for out-of-school suspension, and 96.9 percent for expulsion.

Another element of the PFI survey that may be sensitive to some parents is the identification of children’s schools. This feature allows analysts to link the NHES data to other NCES datasets containing additional information about schools, greatly enhancing the ability to examine the relationships between students’ and families’ experiences and the characteristics of schools. The item response rate for the identification of the child’s school was 97.1 percent in NHES:2016.

The ECPP survey includes additional questions about assistance to pay for child care. This measure is important for understanding families' and children's access to early childhood programs. In NHES:2016, the response rate for this item among children in center-based care was 94.2 percent.

A.12 Estimated Response Burden

The response burden per instrument and the total response burden are shown in Table 1. The administration times for the main study are based on practice administrations and past experience. The expected number of respondents and number of responses are based on the expected numbers of completed surveys of each type, as discussed in section B.1.3 of the Supporting Statement Part B.


Table 1. Estimated response burden for NHES:2019

Interview forms | Number Sampled | Anticipated Response Rate | Estimated Number of Respondents | Estimated Number of Responses | Average Time Per Response (minutes) | Total Time (hours)

Main study¹
Screener | 205,000* | 53.545%* | 98,790 | 98,790 | 3 | 4,940
ECPP questionnaire | 8,495 | 82.220% | 6,984 | 6,984 | 20 | 2,328
PFI questionnaire | 19,893 | 82.910% | 16,493 | 16,493 | 20 | 5,498

Nonresponding household study²
Qualitative interview recruitment screener | 500 | 75.000% | 375 | 375 | 5 | 32
Qualitative interview | 375 | 26.667% | 100 | 100 | 90 | 150

Study Total |  |  | 99,165 | 122,742 |  | 12,948

* Approximately 10% of addresses are expected to be returned by USPS as invalid, reducing the final sample size to 184,500 addresses. Calculations of number of screener respondents and the response rate are based on 184,500 addresses rather than 205,000.

¹ The estimated number of respondents for the screener in the main study is different from the number of topical (ECPP and PFI) respondents because it is expected that 71% of households that complete the screener will not have an eligible child to complete the topical portion of the survey.

² The estimated number of respondents to the recruitment screeners for qualitative interviews is different from the number of completed interviews because it is expected that about 73% of recruited participants in the 90-minute qualitative interview will not participate, due either to not being home during the scheduled interview time or to refusing to start the interview after recruitment, given the burden.

Note: Eligibility and response rates for the national sample are estimated based on NHES:2016 and the NHES:2017 web test, and represent rounded weighted averages of the rates expected within each experimental treatment group. The response rate for the nonresponse study is estimated. In the main study, theoretically, in all households there could be a different person responding to the topical than the person who responded to the screener. In the nonresponding household study, theoretically, in all households there could be a different person responding to the screener interviews than the person who is responding to the in-person interviews. Therefore, we are clearing the maximum possible burden in the estimated number of respondents. Details may not sum to totals due to rounding.


NHES:2019 will screen 205,000 households. An expected screener response rate of approximately 54 percent and an address ineligibility³ rate of approximately 10 percent are assumed, bringing the total number of expected completed screeners to 98,790.⁴ Of these completed screeners, approximately 29 percent are expected to contain an eligible child. A detailed description of the planned sampling design is provided in this submission in Supporting Statement Part B. The hourly wage rate of $24.33 is based on the average for all civilian workers from the September 2017 National Compensation Survey (http://www.bls.gov/news.release/ecec.t02.htm). For NHES:2019, a total of 12,948 burden hours are anticipated, resulting in a total burden time cost to respondents of approximately $315,025.
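A minimal sketch of the arithmetic behind the screener count, total burden hours, and cost figure above is shown below; the figures are taken from Table 1, and the only assumption is that row-level hours are rounded up before summing.

```python
# Illustrative reconstruction of the NHES:2019 burden and cost estimates above.
import math

deliverable = 205_000 * (1 - 0.10)       # ~10% of addresses undeliverable -> 184,500
screeners = int(deliverable * 0.53545)   # expected screener completes -> 98,790

row_hours = {
    "Screener": screeners * 3 / 60,              # 3 minutes per response
    "ECPP": 6_984 * 20 / 60,                     # 20 minutes per response
    "PFI": 16_493 * 20 / 60,
    "Recruitment screener": 375 * 5 / 60,
    "Qualitative interview": 100 * 90 / 60,
}
total_hours = sum(math.ceil(h) for h in row_hours.values())  # 12,948 hours
cost = total_hours * 24.33                                   # average hourly wage -> ~$315,025
print(screeners, total_hours, round(cost))
```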

A.13 Cost to Respondents

There are no recordkeeping requirements associated with NHES and no costs to respondents beyond the time to participate as presented in table 1 above.

A.14 Cost to the Federal Government

The total cost of NHES:2019 to the federal government is approximately $10.7 million over a period of 20 months. This includes all direct and indirect costs of the design, data collection, analysis, and reporting phases of the study, and the creation of data sets. Detail is provided below.

NHES:2019 component | Cost
NHES:2019 survey design, statistical design, and planning | $1 million
NHES:2019 collection | $8 million
Collection and analysis of NHES:2019 In-Person Study of Nonresponding Households | $1 million
NHES:2019 data dissemination | $0.7 million


A.15 Reasons for Program Changes

The decrease in burden from the last approval reflects the fact that the last request, for NHES:2016, included a third topical survey, the Adult Training and Education Survey (ATES), which will not be fielded in NHES:2019.

A.16 Publication Plans and Project Schedule

Exhibit 2 presents the schedule of project activities for NHES:2019. Based on the results of NHES:2019, datasets, statistics, and reports will be produced. The following are the planned outcomes of NHES:2019:

  • A fully documented public-use data set that will be available for download from the NCES website;

  • A fully documented restricted-use data set that will be available for restricted-use data license holders only;

  • A codebook with weighted and unweighted frequencies of all variables; and

  • First Look Reports that highlight key findings from the study.

Exhibit 2. NHES:2019 schedule of major activities

Task | Date of Scheduled Conduct/Completion
Survey Letters and Instruments Formatting and Printing | July–December 2018
Advance Mailing Campaign | December 13, 2018
Data Collection Begins (advance letter mailing) | January 7, 2019
Data Collection Ends | September 20, 2019
First Look Reports Released | September 20, 2020
Public and Restricted-use Data Files Released | December 31, 2020


A.17 Approval to Not Display the Expiration Date for OMB Approval

The OMB authorization number and expiration date will be displayed on the paper questionnaires and web instrument.

A.18 Exceptions to the Certification Statement

There are no exceptions to the certification statement.

¹ Biemer, P., Murphy, J., Zimmer, S., Berry, C., Deng, G., and Lewis, K. (2017). Using Bonus Monetary Incentives to Encourage Web Response in Mixed-Mode Household Surveys. Journal of Survey Statistics and Methodology, 0, 1–22. Retrieved 3/28/18 from https://academic.oup.com/jssam/advance-article/doi/10.1093/jssam/smx015/3906559.

² The final base-weighted NHES:2016 screener response rate was 66.4%. ECPP yielded a 73.4%, PFI a 74.3%, and ATES a 73.1% topical response rate. These rates resulted in overall response rates (the product of the screener response rate and the topical response rate) of 48.7%, 49.3%, and 48.5%, respectively. The increased use of the web survey mode in 2019 is expected to increase topical response rates.

³ Ineligible addresses are those that are undeliverable. An address is coded as ineligible when one or more of its screener mailings are returned as a postmaster return (PMR) and no mailings are returned completed or refused.

⁴ Address eligibility and response rates are estimated based on NHES:2016 and are calculated to account for expected differential response rates within sampling strata and experimental treatment groups.


