Alternative Supporting Statement for Information Collections Designed for
Research, Public Health Surveillance, and Program Evaluation Purposes
Study of Coaching Practices in Early Care and Education Settings 2021: Follow-up
OMB Information Collection Request
0970 - 0515
Supporting Statement
Part B
April 2021
Submitted By:
Office of Planning, Research, and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
4th Floor, Mary E. Switzer Building
330 C Street, SW
Washington, D.C. 20201
Project Officer:
Wendy DeCourcey
Part B
B1. Objectives
Study Objectives
The Study of Coaching Practices in Early Care and Education 2021: Follow-up (SCOPE 2021: Follow-up) has three primary objectives:
To understand the practice and processes of coaching—and professional development more broadly—in supporting early care and education (ECE) settings (both ECE centers and family child care [FCC] homes) throughout the COVID-19 pandemic.
To understand how those practices and processes have been adapted or changed in the context of the COVID-19 pandemic (compared to before the pandemic), with a particular focus on understanding the use of remote versus in-person strategies for coaching and professional development.
To learn lessons about changes to coaching and professional development during the COVID-19 pandemic that could be beneficial to maintain after the pandemic subsides because they have the potential to benefit quality improvement efforts in ECE more broadly.
Generalizability of Results
This study is intended to present an internally valid description of coaching and professional development in ECE centers and FCC homes that were purposively selected, not to promote statistical generalization to other sites or service populations. Publications resulting from the study will acknowledge this limitation.
Appropriateness of Study Design and Methods for Planned Uses
We propose to conduct web-based surveys and qualitative interviews with purposively selected coaches, center directors, and FCC providers who participated in the previously approved SCOPE 2019 web-based surveys in 2019 (OMB #0970–0515; approved September 18, 2018). The participants in SCOPE 2019 are uniquely positioned to provide information about how coaching has changed during the pandemic (their responses to the 2019 web-based surveys can be compared to their responses on the 2021 web-based surveys). As discussed in greater detail in section B2, these respondents were also initially selected for the 2019 web-based surveys to ensure variation in both coaching approaches and in characteristics of ECE settings receiving coaching. This variation in respondents helps ensure we can identify lessons learned relevant for approaches to coaching and ECE settings with different characteristics. The qualitative interviews (which will be completed with a subset of those who respond to the SCOPE 2021: Follow-up web-based surveys) will provide important context for interpretation of web-based survey data and provide additional lessons learned. However, these data will not be representative of centers and FCC homes receiving coaching and professional development; all publicly available products associated with this study will clearly describe key limitations.
As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information.
B2. Methods and Design
Target Population
The target population for this study includes ECE professional development coaches, center directors, and FCC providers who participated in the previously approved SCOPE 2019 web-based surveys (OMB #0970–0515). The center directors and FCC providers are from ECE centers and FCC homes, respectively, that serve preschool-age children from low-income families (with a primary target of ECE centers and FCC homes funded through a Head Start grant or serving children supported by Child Care Development Fund [CCDF] subsidies). For the SCOPE 2019 web-based surveys, the research team used non-probability, purposive sampling to ensure variation in the state policy context, the ECE setting type, and setting funding sources, as well as funders and providers of coaching and features of the coaching they provide. Because participants were purposively selected, they are not representative of the population of ECE centers and FCC homes receiving professional development coaching or of coaches who provide that coaching. Instead, we aimed to obtain sufficient variation to understand the wide range of ways that coaching features are combined and implemented as coaches provide support to center teachers and FCC providers to improve their classroom practices.
Respondent Recruitment and Site Selection
Identifying respondents for the SCOPE 2019 web-based surveys. Steps to identify the ECE settings and respondents to recruit for the SCOPE 2019 web-based surveys are described in previously approved Supporting Statements (OMB #0970–0515).1
Identifying respondents for the SCOPE 2021: Follow-up web-based surveys. For the SCOPE 2021: Follow-up web-based surveys (Instruments 1, 2, and 3), respondent recruitment will focus on the 100 coaches, 66 center directors, and 38 FCC providers who participated in the SCOPE 2019 web-based surveys. For center directors only, if we determine that many have left their position since responding to the 2019 web-based survey (that is, there is a high rate of turnover as indicated by email bouncebacks when sending out survey information and/or survey nonresponse), we will aim to locate and recruit (through web searches and a recruitment phone call to the center) the new center director. If the new center director has been in their position for at least four months, we will ask them to complete the 2021 web-based survey. We will take this approach for center directors, because the survey items are about the center setting and supports for teachers rather than the center directors themselves. This approach is not feasible for coaches or FCC providers, who answer about their own experiences (so responses from new study participants in 2021 could not be compared to responses from 2019).
Identifying respondents for the SCOPE 2021: Follow-up qualitative interviews. For the SCOPE 2021: Follow-up qualitative interviews (Instruments 4, 5, and 6), we will recruit a subset of the SCOPE 2021: Follow-up web-based survey respondents.
For coaches, we will prioritize for recruitment those who report in the web-based survey that they are currently providing coaching. If there are not enough coaches still providing coaching to meet our target, we will also attempt to recruit coaches who are providing other forms of professional development (but are not coaching at the time of the web-based survey). We aim to recruit at least 8 and up to 12 coaches.
For both center directors and FCC providers, we will attempt to recruit those who work in a setting that is currently open for either in-person or remote services to families and that has received some type of professional development support since COVID-19 began. We aim to recruit at least 16 and up to 24 center directors and at least 8 and up to 12 FCC providers.
B3. Design of Data Collection Instruments
Development of Data Collection Instruments
We developed six data collection instruments for SCOPE 2021: Follow-up:
Web-based surveys
Coach Survey (Instrument 1)
Center Director Survey (Instrument 2)
FCC Provider Survey (Instrument 3)
Qualitative interview protocols
Coach Interview (Instrument 4)
Center Director Interview (Instrument 5)
FCC Provider Interview (Instrument 6)
Developing the web-based surveys. We adapted the SCOPE 2019 web-based surveys to create the SCOPE 2021: Follow-up web-based surveys for coaches, center directors, and FCC providers. To achieve the study’s objectives, we must answer the range of research questions listed in Section A2 of Part A on the purpose of the study. The web-based surveys each include different modules that capture the range of topics addressed in the study’s research questions. Within each module, the questions align with the key constructs relevant to the research questions.
The web-based surveys include several items from the SCOPE 2019 web-based surveys, which were drawn from several existing sources described in the previously approved OMB package. All three SCOPE 2021: Follow-up web-based surveys also include new items, as the topic of the influence of COVID-19 on coaching and professional development in ECE is a new area of research.
Two coaching and professional development experts/practitioners from the Children’s Learning Institute (CLI) at the University of Texas Health Science Center at Houston who are on the study team reviewed all three SCOPE 2021: Follow-up web-based surveys to assess the clarity and appropriateness of the questions, given the study’s objectives. Two coaches and one center director also reviewed the web-based surveys and provided feedback on whether the questions were relevant for the study’s purposes, whether the instrument terminology was clear, and whether the respondents’ understanding of the questions was consistent with what was intended. These activities informed final revisions to all three web-based surveys to ensure they fit within the allotted burden, were clear for respondents, and included only the key questions necessary for addressing the study’s research questions. The programming of the web-based surveys will be thoroughly tested prior to deployment to ensure accuracy and minimize measurement error.
Developing the qualitative interview protocols. We adapted relevant questions and probes from the qualitative interview protocols that had been developed for the SCOPE 2019 study site visits (which were suspended due to the COVID-19 pandemic) for the SCOPE 2021: Follow-up qualitative interviews with coaches, center directors, and FCC providers. We also developed new questions and probes to address how aspects of coaching and professional development have changed in response to the COVID-19 pandemic. The qualitative interview protocols include the universe of potential questions; we will use the responses to the 2021 web-based surveys to determine which interview protocol questions to ask respondents.
The coaching and professional development experts/practitioners on the study team from CLI reviewed all three qualitative interview protocols to assess the clarity and appropriateness of the questions, given the study’s objectives. Their feedback informed final revisions to all three qualitative interview protocols to ensure they fit within the allotted 45-minute timeframe, were clear for respondents, and included only the key questions necessary for addressing the study’s research questions.
For SCOPE 2021: Follow-up there are four primary research questions (listed in Section A2 of Part A under Research Questions or Tests) to support achievement of the study objectives. Appendix C includes a detailed web-based survey item by research question crosswalk, and Appendix D includes a detailed qualitative interview item by research question crosswalk.
B4. Collection of Data and Quality Control
Project team members from the agency contractor will field the web-based surveys and conduct the qualitative interviews by telephone.
Web-based surveys. We will recruit all coaches, center directors, and FCC providers who participated in the SCOPE 2019 web-based survey data collection to participate in the SCOPE 2021: Follow-up web-based survey data collection. We have prepared an advance notification email and an invitation email, tailored to the type of respondent (coach, center director, or FCC provider), that describe the study and web-based survey and provide an estimate of the amount of time required to complete the web-based survey. For any new center director respondents we identify, we have also prepared an email that introduces the study and provides the survey information, as well as a script for a recruitment phone call. Recruitment materials are included in Appendix B.
Throughout the data collection, we will monitor the response rate daily and conduct regular quality assurance checks on the data. Over the course of the data collection period, we will send out periodic reminder emails to nonrespondents (included in Appendix B). We will conduct reminder phone calls to respondents who do not complete the web-based survey following multiple email reminders.
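To illustrate this monitoring process, the following is a minimal sketch (in Python) of how survey completion could be tracked and nonrespondents queued for reminders. It assumes a list of invited respondents and an export of completed cases from the survey platform; the file and field names (invited_respondents.csv, completed_surveys.csv, respondent_id, respondent_type) are hypothetical placeholders, not actual study files.

import pandas as pd

# Hypothetical exports: one row per invited respondent and one row per completed survey.
invited = pd.read_csv("invited_respondents.csv")     # respondent_id, respondent_type, email
completed = pd.read_csv("completed_surveys.csv")     # respondent_id, completion_date

# Daily monitoring: share of invited respondents who have completed the survey to date.
completion_share = completed["respondent_id"].nunique() / invited["respondent_id"].nunique()
print(f"Completions to date: {completion_share:.0%}")

# Nonrespondents queued for the next periodic reminder email (and, after multiple
# email reminders, a reminder phone call).
nonrespondents = invited[~invited["respondent_id"].isin(completed["respondent_id"])]
print(nonrespondents[["respondent_id", "respondent_type", "email"]])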
Qualitative interviews. Coaches, center directors, and FCC providers will be sampled from respondents to the SCOPE 2021: Follow-up web-based surveys. They will be recruited first by email and then by phone (if email is unsuccessful) within two to three weeks after completing the web-based survey. Though we aim to speak with all respondents for 45 minutes, we will give respondents the option of a 30-minute interview if they are not otherwise able to participate, in which case we will prioritize questions based on their web-based survey responses. Recruitment materials are included in Appendix B.
For the qualitative interviews, to ensure we collect high quality data, team members will undergo a training focused on: (1) how to prepare for the interview; (2) how to efficiently move through the interview protocol while collecting high quality information; and (3) how to synthesize notes after each interview to confirm completeness of the data.
The qualitative interviews will be completed by phone; with the permission of respondents, the interviews will be recorded to later confirm responses. Throughout the data collection period, team members will conduct weekly meetings to share information and strategies, help troubleshoot challenges, and ensure that all data are collected uniformly.
B5. Response Rates and Potential Nonresponse Bias
Response Rates
The web-based surveys and qualitative interviews are not designed to produce statistically generalizable findings and participation is wholly at the respondent’s discretion. Response rates will not be calculated or reported.
Nonresponse
Based on responses to the SCOPE 2019 web-based surveys, we do not expect substantial nonresponse on the SCOPE 2021: Follow-up web-based surveys. Based on previous experience with qualitative interviews with similar respondents, we also do not expect substantial nonresponse on the SCOPE 2021: Follow-up qualitative interviews. Because participants will not be randomly sampled and findings are not intended to be representative, we will not calculate nonresponse bias. As part of study reporting, we will present information about characteristics of the participating respondents and settings.
B6. Production of Estimates and Projections
The data will not be used to generate population estimates, either for internal use or dissemination.
B7. Data Handling and Analysis
Data Handling
Web-based surveys (Instruments 1, 2, and 3). The use of web-based surveys will make completing them easier for respondents and support efforts to maintain accurate data. The web-based surveys will not allow respondents to enter out-of-range or inconsistent responses. Weekly reviews of web-based survey data will allow us to identify potential errors and follow up with respondents prior to the end of data collection. If respondents are unable to complete the survey via the web-based instrument, we can offer them the option to respond to the survey by phone. Note that all of the respondents to the SCOPE 2019 surveys chose to complete the survey using the web-based instrument.
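As an illustration of these weekly quality checks, the sketch below (in Python) flags out-of-range and internally inconsistent responses in an export of survey data. The column names (respondent_id, hours_coached_per_week, still_coaching, coaching_mode) and the range limits are hypothetical placeholders rather than actual instrument items.

import pandas as pd

def flag_data_issues(df: pd.DataFrame) -> pd.DataFrame:
    """Return survey records with out-of-range or internally inconsistent responses."""
    issues = []

    # Out-of-range check: weekly coaching hours assumed to fall between 0 and 60.
    out_of_range = df[(df["hours_coached_per_week"] < 0) | (df["hours_coached_per_week"] > 60)]
    issues.append(out_of_range.assign(issue="hours out of range"))

    # Consistency check: respondents who report they are not currently coaching
    # should not also report a coaching delivery mode.
    inconsistent = df[(df["still_coaching"] == "no") & (df["coaching_mode"].notna())]
    issues.append(inconsistent.assign(issue="coaching mode reported but not coaching"))

    return pd.concat(issues).drop_duplicates(subset=["respondent_id", "issue"])

# Weekly review: load the current export and list records to follow up on with respondents.
survey_df = pd.read_csv("coach_survey_export.csv")
print(flag_data_issues(survey_df)[["respondent_id", "issue"]])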
Qualitative interviews (Instruments 4, 5, and 6). To ensure that interview notes are complete and consistently prepared, we will use a standard note template for each data collection protocol. With respondent permission, the notetaker will make use of audio recordings to ensure that notes are complete. Data collectors will review the interview notes and a senior member of the team will review a subset of the notes to ensure that data are complete and error free. We will train a small team of coders. To ensure consistency in coding, the team will have a lead coder who develops the coding scheme, trains the coders, oversees the work, and ensures reliability. All coders will code the first set of notes and discuss any differences to establish coder agreement. The coding team will have ongoing meetings to discuss emerging themes from the data in response to the research questions.
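To illustrate how coder agreement on the first set of notes could be checked, the sketch below (in Python) computes percent agreement and Cohen’s kappa for two coders’ assignments to the same excerpts. The code labels shown are hypothetical examples, and the study team’s actual reliability procedure may differ.

from sklearn.metrics import cohen_kappa_score

# Hypothetical example: codes applied by two coders to the same five interview excerpts.
coder_a = ["remote_coaching", "barriers", "remote_coaching", "supports", "barriers"]
coder_b = ["remote_coaching", "barriers", "supports", "supports", "barriers"]

percent_agreement = sum(a == b for a, b in zip(coder_a, coder_b)) / len(coder_a)
kappa = cohen_kappa_score(coder_a, coder_b)

print(f"Percent agreement: {percent_agreement:.0%}")
print(f"Cohen's kappa: {kappa:.2f}")

# Excerpts where coders disagree are flagged for discussion and reconciled through consensus.
to_reconcile = [i for i, (a, b) in enumerate(zip(coder_a, coder_b)) if a != b]
print("Excerpts to reconcile:", to_reconcile)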
Data Analysis
The instruments included in this OMB package will yield data that we will analyze using quantitative and qualitative methods. We will carefully link the research questions guiding the study with the data collected, and we will look across patterns in the quantitative data and themes in the qualitative data to guide our interpretation of findings. SCOPE 2019 is registered with ClinicalTrials.gov (ClinicalTrials.gov ID: NCT03743337); upon approval of this OMB package, we will update SCOPE’s registration to include the analysis plan for the proposed activities.
Web-based surveys (Instruments 1, 2, and 3). For all of the web-based survey data on coaching and professional development, we will examine descriptive statistics (frequencies, means, and standard deviations as appropriate) for the overall sample and for subgroups of interest (for example, settings that have funding from a Head Start grant versus those that do not; ECE centers versus FCC homes; settings that are providing in-person services to children and families versus those that are not). For the web-based survey data focused on coaching, we will compare responses to those from the SCOPE 2019 web-based surveys (for any items that match between the two surveys) to understand changes in coaching. As context for the data on coaching and professional development, we will also examine what respondents tell us about the status of their setting (e.g., whether the setting has closed during the pandemic) and/or the scope of their job (e.g., whether the respondent took on different duties). For data from center directors specifically, we will also examine the pattern of responses on items that match between the 2019 and 2021 web-based surveys, both including and excluding any new center directors. As appropriate, we will construct summary variables that draw on multiple items to describe aspects of coaching and professional development.
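The following is a minimal sketch (in Python) of this descriptive analysis, assuming the two survey waves are available as flat files. The variable names (respondent_id, head_start_funded, coaching_hours, remote_coaching) are hypothetical placeholders for items in Instruments 1, 2, and 3, not the actual instrument variables.

import pandas as pd

df_2019 = pd.read_csv("scope_2019_survey.csv")
df_2021 = pd.read_csv("scope_2021_followup_survey.csv")

# Frequencies for a categorical item and means/standard deviations for a continuous
# item, for the overall sample and for a subgroup of interest (Head Start funding).
remote_frequencies = df_2021["remote_coaching"].value_counts(normalize=True)
overall_hours = df_2021["coaching_hours"].agg(["mean", "std", "count"])
hours_by_funding = df_2021.groupby("head_start_funded")["coaching_hours"].agg(["mean", "std", "count"])

# Change over time on items that match across the 2019 and 2021 surveys, restricted
# to respondents who completed both waves.
matched = df_2019.merge(df_2021, on="respondent_id", suffixes=("_2019", "_2021"))
matched["change_in_hours"] = matched["coaching_hours_2021"] - matched["coaching_hours_2019"]

print(remote_frequencies, overall_hours, hours_by_funding, sep="\n")
print(matched["change_in_hours"].describe())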
Qualitative interviews (Instruments 4, 5, and 6). Using qualitative coding software, we will take an open coding approach to preparing the qualitative data for analysis. To begin, two trained researchers will code and analyze the data to identify emerging themes. Researchers will begin with an initial set of codes based on the constructs that guided development of the qualitative interview protocols, while also allowing codes or themes to emerge throughout the analysis. Researchers will meet to review how codes are applied and reconcile any discrepancies through consensus. Initial themes may begin to emerge during these discussions and will be documented. During the analysis process, the initial codebook will be adjusted as necessary to reflect the discussion (and new codes will be applied to all notes). This coding process will continue in a similar manner until all notes are coded. Once the data are coded, the team will analyze the data by research question to identify themes and gauge consistency across respondents.
Data Use
Once the analysis is complete, we will synthesize information relevant to each of the study’s research questions across the quantitative and qualitative sources. We will draw conclusions based on that synthesis to highlight important lessons learned about changes to coaching and professional development during the COVID-19 pandemic that could be beneficial to maintain after the pandemic subsides because they have the potential to benefit quality improvement efforts in ECE more broadly. We will then share the findings through a short study report and/or research briefs as well as presentations or briefings. Data from this study will also be transmitted to the Child & Family Data Archive or a similar data archive at the end of the study so they can be used by other researchers to broaden understanding in the ECE field about how to support quality improvement during a time of change or crisis. Before archiving the data, we will screen the data content to reduce the risk of confidentiality breaches, either directly or through deductive analysis, and we will ensure the data are stripped of any identifying information, such as uniquely identifying detail. We will prepare a data-use manual to accompany the archived data that describes the study design, data collection procedures, and analysis approaches employed by the study team in order to support understanding of how to properly interpret, analyze, and evaluate the information collection. The manual will also describe study limitations.
B8. Contact Person(s)
Wendy DeCourcey
Social Science Research Analyst
U. S. Department of Health and Human Services
Administration for Children and Families
Office of Planning, Research & Evaluation
Tracy Carter Clopet
Contract Social Science Research Analyst
VPD Government Solutions
U. S. Department of Health and Human Services
Administration for Children and Families
Office of Planning, Research & Evaluation
Emily Moiduddin
Director of Early Childhood Research
Mathematica
emoiduddin@mathematica-mpr.com
Elizabeth Cavadel
Senior Researcher
Mathematica
Caroline Lauver
Survey Researcher
Mathematica
Attachments
Instrument 1: Coach survey
Instrument 2: Center director survey
Instrument 3: Family child care provider survey
Instrument 4: Coach interview protocol
Instrument 5: Center director interview protocol
Instrument 6: Family child care provider interview protocol
Appendices:
Appendix A1: Findings from the SCOPE 2019 web-based surveys – summary of findings
Appendix A2: Findings from the SCOPE 2019 web-based surveys – NRCEC presentation of findings
Appendix B1: Survey recruitment materials
Appendix B2: Interview recruitment materials
Appendix C: Survey item by research question crosswalk
Appendix D: Interview item by research question crosswalk
1Materials related to SCOPE 2019 are accessible at: https://www.reginfo.gov/public/do/PRAViewICR?ref_nbr=201803-0970-005