Alternative Supporting Statement for Information Collections Designed for
Research, Public Health Surveillance, and Program Evaluation Purposes
Consumer Education and Parental Choice in Early Care and Education
Formative Data Collections for ACF Research
0970-0356
Supporting Statement
Part B
September 2022
Submitted By:
Office of Planning, Research, and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
4th Floor, Mary E. Switzer Building
330 C Street, SW
Washington, D.C. 20201
Project Officers:
Alysia Blandon
Bonnie Mackintosh
Part B
B1. Objectives
Study Objectives
The broad objectives of the proposed study are to describe states’ and territories’ consumer education (CE) practices and evaluations and to gather data that will inform the design of forthcoming project activities, including potential future research. Specifically, we aim to:
Understand the type of CE information Child Care and Development Fund (CCDF) lead agencies and their partners provide to families about early childhood education (ECE) in their state or territory;
Summarize the variation in CCDF lead agencies’ CE strategy types, modes, and materials;
Identify successes and challenges CCDF lead agencies encounter when implementing CE strategies; and
Determine whether CCDF lead agencies are evaluating their CE efforts, as well as any measures used to assess CE practices and policies.
Generalizability of Results
This study is intended to present an internally valid description of CE services in all states, D.C., and five territories, not to promote statistical generalization to a particular service population.
Appropriateness of Study Design and Methods for Planned Uses
Compared with a survey approach, semi-structured interviews afford our team the opportunity to gather rich information on an unknown set of CCDF lead agency CE strategies, challenges and successes, and evaluation efforts. A survey with limited response options, and no ability to follow up on participants’ responses, would restrict our research team’s ability to learn about lead agencies’ CE practices. We are also collecting documents to review in order to reduce the burden on CCDF lead agency administrators: administrators can provide existing documents for our team to review rather than spend additional interview time on content that already exists.
Our team will use these data to inform the Office of Planning, Research, and Evaluation’s (OPRE) understanding of CCDF lead agencies’ current CE strategies, challenges and successes, and evaluation efforts. The findings will also inform forthcoming research under the same contract, and ACF may incorporate them into documents or presentations. The Office of Child Care (OCC) may also use the findings to guide supports for CCDF lead agencies. Finally, the findings may inform future ACF research; for example, ACF may use them to identify places where novel CE strategies or community partnerships are implemented for case studies, to inform questions for a parent survey on CE, and to develop tools that states and territories can use to evaluate their CE. As noted in Supporting Statement A (SSA) section A2, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information. Further, the results of this study are not generalizable. This key limitation will be noted in all written products associated with this study.
B2. Methods and Design
Target Population
The target population comprises all states, D.C., and the five territories. The research team will use public sources and information from ACF to identify the universe of CCDF lead agency staff who can provide information on the study’s key constructs.
Sampling
We will collect data from 56 CCDF administrators, or other CCDF lead agency staff whom administrators identify as best positioned to answer the questions, representing every state, D.C., and each of the five territories. The implementation of CE reflects the wide variability in states’ and territories’ populations, policies, and geography. Given the small number of potential participants and the inability to sample from this pool in a representative manner, we elected to use a census approach.
B3. Design of Data Collection Instruments
Development of Data Collection Instrument
We developed a semi-structured interview guide (see Instrument 1), aligned to our research questions about states’ and territories’ CE strategies, challenges and successes, and evaluation efforts. We developed questions and prompts based on information our team gathered through a literature review, CE training and technical assistance documents, state and territory 2019-2021 CCDF plans, and our team’s anecdotal experience.
Data collected from the 2022-2024 CCDF plans and CCDF lead agency web content also informed our development of the questions in the interview guide. Our research team included questions that could answer our research questions and could not be answered by reviewing other publicly available information.
We conducted cognitive testing of the interview guide with three former state administrators to (1) refine the questions based on the information administrators are likely to have, (2) clarify the language for CCDF lead agency staff, and (3) establish an accurate timeframe for the interview.
B4. Collection of Data and Quality Control
Training and Oversight of Research Staff
The research team will conduct the interviews with CCDF administrators, or with other CCDF lead agency staff members whom administrators identify as best positioned to answer the questions. The selected staff have experience conducting similar semi-structured interviews. In addition to this experience, staff will receive training from research team leadership on the study-specific interview guide. Leads will conduct two sessions in which they review the interview questions and conduct interactive exercises with the interviewers. Interviewers will then practice conducting the interview with other team members, and leads will review recordings of these practice interviews and provide feedback to the interviewers. Leads will also review a subset of the study interview recordings to ensure that interviewers continue to apply the skills they learned during training.
Recruiting and Scheduling Interviews
Using email addresses gathered from public sources and from OCC, we will email each CCDF administrator to request their participation in the study (see Appendix A: Recruitment and Follow-up Materials). This email includes a description of the study’s purpose, research questions, human subjects protections, and timeline. In the email, we will ask administrators whether they would like to participate in the interview or to identify another staff person for us to contact. We will send a follow-up email if we do not hear from an administrator within 5 business days. If we still do not hear back after the follow-up email, we will call the administrator’s office phone number listed in the CCDF plans. As needed, we will coordinate with OCC to reach administrators who are unresponsive to these recruitment efforts.
When CCDF administrators (or their delegate) agree to participate, we will schedule the interview at a time that is convenient for them. After we schedule the interview, we will send a confirmation email. We will also send a reminder email the day before the interview is scheduled to occur.
Conducting Interviews
A team member will conduct each interview remotely, using either a video conferencing platform approved by the data security team (with both video and call-in options) or a phone call. With the CCDF administrator’s permission, the interview will be recorded. If the CCDF administrator declines to be recorded, a second team member will attend the interview to take detailed notes.
Collecting Lead Agency Documents
At the end of the interview, the interviewer will ask the CCDF administrator (or their delegate) if they have documents that they would like to share with our team (see Instrument 1). If the CCDF administrator (or their delegate) would like to share documents with our team, we will send them an email with instructions for how to securely submit documents through a document portal (see Appendix A: Recruitment and Follow-up Materials).
B5. Response Rates and Potential Nonresponse Bias
Response Rates
The interviews are not designed to produce statistically generalizable findings, and participation is wholly at the respondent’s discretion. Response rates will not be calculated or reported.
Nonresponse
Because participants will not be randomly sampled and findings are not intended to be representative, nonresponse bias will not be calculated. Respondent demographics will be documented and reported in written materials associated with the data collection.
B6. Production of Estimates and Projections
The data will not be used to generate population estimates, either for internal use or dissemination.
B7. Data Handling and Analysis
Data Handling
Procedures to minimize errors due to data entry, coding, and data processing: We will use the virtual meeting platform’s transcription feature, which automatically generates a transcript from a recorded meeting. The transcript will serve as our raw source of data. Because voice-to-text transcription technology is not perfect, another team member will correct transcription errors by listening to the recording and revising the text as needed. The interviewer will then do a final review of each transcript, cross-referencing it against their shorthand notes. The final transcripts will be uploaded to NVivo, a qualitative coding program, and the team member who reviewed each transcript will confirm that it was uploaded accurately. Submitted documents will also be uploaded to NVivo.
The research team will code the transcripts and documents for recurring themes. To minimize errors in the coding process, research team leadership will train staff on deductive coding with a priori codes and on best practices for inductively coding the transcript data in NVivo. The coding team will meet regularly to discuss convergent or conflicting coding and themes, and to review inductive codes that have emerged. The research team lead will conduct a random spot-check of 20% of the coded transcripts and documents to check for agreement across coders.
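As an illustration only (the study text calls for a spot-check of agreement but does not prescribe a particular statistic), the following minimal Python sketch computes percent agreement and Cohen’s kappa for a hypothetical set of excerpts double-coded by two coders. All code labels and data in the sketch are invented for the example.

```python
# Minimal sketch of an intercoder agreement check on double-coded excerpts.
# Assumes each coder's code assignments have been exported (e.g., from NVivo)
# as parallel lists, one label per excerpt. All data here are hypothetical.
from collections import Counter

def percent_agreement(coder_a, coder_b):
    """Share of excerpts to which both coders applied the same code."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    """Agreement corrected for chance, using each coder's code frequencies."""
    n = len(coder_a)
    observed = percent_agreement(coder_a, coder_b)
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical codes applied by two coders to the same ten excerpts.
coder_1 = ["website", "hotline", "website", "social", "social",
           "website", "hotline", "social", "website", "hotline"]
coder_2 = ["website", "hotline", "social", "social", "social",
           "website", "website", "social", "website", "hotline"]
print(f"Agreement: {percent_agreement(coder_1, coder_2):.0%}, "
      f"kappa: {cohens_kappa(coder_1, coder_2):.2f}")
```

In a workflow like this, low agreement on the spot-checked subset would prompt the coding team to revisit code definitions and reconcile discrepancies before proceeding.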
Data Analysis
After all interviews and documents are coded, we will look for patterns, relationships, and themes across interviews, especially as they relate to the types and modes of CE strategies, evaluation methods and metrics, and priority populations. The team will discuss overlap and divergence in CCDF lead agency CE strategies. Then, the team will crosswalk the interview and document themes with data from other sources, including a web and social media scan and CCDF plan reviews. Our research team will summarize the CCDF lead agencies’ planned, implemented, and forthcoming CE strategies and evaluation efforts by presenting the proportion of states and territories engaging in these activities. We will also describe patterns in CE strategy use, as well as challenges and successes, across all states, D.C., and the five territories.
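As a minimal illustration of the proportion summaries described above, the sketch below tallies hypothetical coded strategies across lead agencies; the agency names and strategy labels are placeholders, not study data.

```python
# Minimal sketch: proportion of the 56 CCDF lead agencies (50 states, D.C.,
# and 5 territories) coded as using each CE strategy. Data are hypothetical.
from collections import Counter

N_AGENCIES = 56

coded_strategies = {
    "State A": ["consumer website", "social media campaign"],
    "State B": ["consumer website", "parent hotline"],
    "Territory C": ["parent hotline"],
}

counts = Counter(
    strategy
    for strategies in coded_strategies.values()
    for strategy in set(strategies)  # count each agency at most once per strategy
)
for strategy, count in counts.most_common():
    print(f"{strategy}: {count} of {N_AGENCIES} ({count / N_AGENCIES:.0%})")
```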
Data Use
The data from this study will inform other project activities, such as identifying states/territories for the case studies, developing a parent survey, and developing evaluation tools. The team will maintain interview responses and documents linked to the identity of each state/territory and will not share them outside of our team; ACF will, however, see interview responses and summaries of the documents (but not the documents themselves) linked to each state/territory. The data will inform future research and planning, and information from this study may be securely shared with qualified researchers to help guide future research.
The findings from this descriptive study are meant to inform ACF activities. While the primary purpose is not publication, some findings may be incorporated into documents or presentations that are made public. Examples of ways in which we may share information resulting from these data collections include research design documents or reports; contextualization of research findings from a follow-up data collection that has full PRA approval; and informational reports to technical assistance (TA) providers. In sharing findings, we will describe the study methods and their limitations with regard to generalizability and use as a basis for policy.
B8. Contact Person(s)
Name | Affiliation | Email Address
A. Rupa Datta, PhD | NORC at the University of Chicago |
Rebecca Berger, PhD | NORC at the University of Chicago |
Attachments
Instrument 1: CCDF Administrator Interview Guide
Appendix A: Recruitment and Follow-Up Materials