Supporting Statement B
Assessing Strategies to Promote Children’s Engagement and Active Participation in Virtual Home Visits
OMB Control No. 0906-XXXX
Population
There are currently approximately 3,200 local early childhood home visiting (ECHV) programs. Participation in this study is restricted to programs that deliver Maternal, Infant, and Early Childhood Home Visiting (MIECHV)-funded services and whose home visitors are currently serving families through virtual or hybrid home visits. To minimize burden across MIECHV-funded programs while maintaining the variability needed to support a rapid cycle evaluation framework, the study team will select four programs and, from these programs, will use the data collection instruments to collect information from program staff and family participants.
The respondent universe for the data collection consists of home visiting program staff and families participating in home visiting.
Program staff. Program directors, managers, supervisors, and home visitors will participate in a focus group as part of the site selection process and in the study phases described in Supporting Statement A. Home visitors will also complete a brief weekly questionnaire. The program staff asked to respond will be those directly involved in implementing and testing a strategy as part of this study. The study team will ask program directors to select managers, supervisors, and home visitors who would be involved in the study activities as a part of their job responsibilities. Program leaders overseeing all program operations will also be recruited to participate.
Families. Families will be asked to participate in semi-structured focus groups and complete a brief web survey. Family participation in focus groups will be limited to caregivers. Though home visiting programs often provide services to the whole family, the study team will engage whichever family member experienced the improvement strategy; for many strategies, this may be the primary enrollee. Families involved in data collection will be a convenience sample and may not be representative of the population that the programs serve. The study team will ask programs to help recruit families who are on the caseloads of home visitors implementing the virtual engagement strategies and who experienced the strategy. Program staff will also recruit families within home visitor caseloads to complete self-administered surveys, which home visitors will provide at the end of each visit by email, or on paper if web-based data collection is not possible.
Sampling and site selection
The study uses a purposive sampling strategy to identify four MIECHV-funded programs that are currently serving families through virtual or hybrid home visits. Data collection efforts are intended to support rapid cycle learning activities. In rapid cycle learning, programs iteratively implement a practice or strategy designed to address a challenge or improve an existing practice, collect feedback on the use and promise of the strategy, use the feedback to identify opportunities to strengthen and refine the strategy, and then implement and collect feedback on the refined strategy. The primary purpose of the data collection is to gather feedback on virtual engagement strategies to inform refinements. This study is intended to present an internally valid description of the feasibility of implementing a strategy in the chosen programs, not to support statistical generalization to other programs or service populations.
To narrow down the potential program sites for selection, the study team will (1) ask for nominations from HRSA staff, subject matter experts, and members of the interested party advisory board via email; (2) review the literature and other materials examined as part of an environmental scan conducted by the study team to identify sites that have implemented virtual engagement strategies; and (3) aim to recruit a diverse set of sites, prioritizing sites according to service delivery modality, geographic location, program size, home visiting model, age ranges of children enrolled, and community-level race and ethnicity.
The study team designed all instruments specifically for this study: the Program Eligibility Protocol (Instrument 1), Program Staff Focus Group Protocol 1 – Co-definition Phase (Instrument 2), Program Staff Focus Group Protocol 2 – Co-definition Phase (Instrument 3), Family Focus Group Protocol – Co-definition & Summary Phases (Instrument 4), Program Staff Focus Group Protocol – Installation & Refinement Phases (Instrument 5), Home Visitor Questionnaire – Installation & Refinement Phases (Instrument 6), Family Post-Visit Questionnaire – Refinement Phase (Instrument 7), Program Staff Summative Focus Group Protocol – Summary Phase (Instrument 8), and Focus Group Participant Characteristics Form – All Phases (Instrument 9).
Instrument 1 will be used to select sites according to the criteria specified in section B.1.
Instruments 2 and 3 (Co-definition Phase), 4 (Co-definition and Summary Phases), 5 (Installation and Refinement Phases), and 8 (Summary Phase) are informed by implementation science frameworks such as the Consolidated Framework for Implementation Research and the Active Implementation frameworks developed by the National Implementation Research Network (NIRN). These instruments contain semi-structured interview questions and topics designed to gather information about key components thought to contribute to strong implementation of evidence-based programs and practices, in order to identify and refine implementation strategies to improve ECHV programs' virtual child engagement.
Instrument 6 (Installation and Refinement Phases) is intended to collect rapid feedback about program staff members' use of a strategy, process, or tool, and was informed by instruments used on a number of other HRSA projects that have used the Learn, Innovate, Improve (LI2) framework to conduct formative rapid-cycle evaluations (for example, McCay et al. 2017).
Instrument 7 (Refinement Phase) is intended to collect rapid feedback about families’ perceptions of how strategies were implemented during a home visit and was informed by instruments used on a number of other HRSA projects that have used the LI2 framework to conduct formative rapid-cycle evaluations (for example, McCay et al. 2017).
Instrument 9 (All Phases) is intended to collect information on the characteristics of all respondents to understand the contexts in which strategies are being implemented and to enable strategies to be tailored in response to respondents' needs.
Further description of each instrument is below:
Instrument 1. Program Eligibility Protocol. The study team will gather information from interested sites using this protocol. Selection criteria for sites will include service delivery modality (virtual or hybrid [a combination of in-person and virtual]), geographic location, program size, home visiting model, age ranges of children enrolled in services, and community-level race and ethnicity.
Instrument 2. Program Staff Focus Group Protocol 1 - Co-definition Phase will be used to guide one focus group with program staff (i.e., program managers, supervisors, and home visitors) at each selected site. The first of two focus groups conducted with staff during the co-definition phase, this session centers on identifying and defining the parameters of the virtual child engagement strategy (or strategies) in use at each site and identifying challenges and facilitators to implementing these strategies.
Instrument 3. Program Staff Focus Group Protocol 2 - Co-definition Phase will be used to guide one focus group with program staff (i.e., program managers, supervisors, and home visitors) at each selected site. The second of two focus groups conducted with staff during the co-definition phase, this session centers on prioritizing and selecting a strategy with the potential to enhance delivery of virtual home visits for testing in subsequent phases.
Instrument 4. Family Focus Group Protocol - Co-definition & Summary Phases. A family focus group at each site will gather information from families who participate in home visiting on their awareness of, satisfaction with, and perception of the utility of the virtual child engagement strategies.
Instrument 5. Program Staff Focus Group Protocol - Installation and Refinement Phases. During this focus group, the study team will facilitate program staff reflection at each site on the data gathered through the Home Visitor Questionnaire, identification of potential refinements, and agreement on the refinements to be tested in the next phase.
Instrument 6. Home Visitor Questionnaire - Installation and Refinement Phases. Home visitors participating in the study at each site will complete a brief weekly questionnaire that will provide contemporaneous feedback on how virtual engagement strategies were implemented and their perceived utility.
Instrument 7. Family Post-Visit Questionnaire - Refinement Phase. Families at each site who participate in a home visit in which a home visitor implemented the virtual engagement strategies being tested will respond to brief self-administered post-visit questionnaires describing their perceptions of how the strategy was implemented.
Instrument 8. Program Staff Focus Group Protocol - Summary Phase. The study team will facilitate another program staff focus group at each site to identify and reflect on lessons learned from the refinement phase and to discuss the rapid cycle findings and their potential to improve services.
Respondents who participate in a focus group will complete Instrument 9. Focus Group Participant Characteristics Form – All Phases. After each focus group, all respondents (home visitors, other program staff, and families) will complete this web-based form, which describes the focus group sample and includes demographic information and, as applicable, program staff tenure and the length of time families have been receiving services.
Focus group information collection activities
The project team reviewed the instruments to ensure that they ask only questions necessary to achieve the objectives of the information collection. All instruments were created specifically for the project, do not include any scales or items that measure constructs, and do not require psychometric testing. All data collection activities are completely voluntary.
All facilitators and notetakers leading the focus groups will be trained. The study team will meet regularly during the data collection period to support ongoing training; for instance, the team will revisit the intent of questions and share tips to ensure phrasing elicits on-track responses. Study team members will be trained to review each type of administrative material received from the program, using a standardized checklist to confirm that the goals of each document request have been met. To consistently extract administrative information, the study team will develop standardized templates with clear guidance on the process for extracting information from each document.
After collecting data through focus groups during each learning cycle, the study team will (1) prepare data for analysis; (2) extract topics and themes from each source; and (3) summarize themes and findings. After each round of data collection, the study team will first use a professional service to transcribe all recorded conversations. After transcriptions are complete, interviewers will review the transcripts for accuracy and completeness; as needed, they will use the recording and their notes to fill in any information the transcriptionist omitted or transcribed incorrectly because portions of the recording were inaudible. Second, the study team will use a deductive approach to code all focus group transcripts, extract key ideas from each coded excerpt, and group data thematically for analysis. Finally, coders will summarize high-level themes and findings. These findings will inform modifications to the practice to test in subsequent rounds of testing and will allow the study team to evaluate the success of the practice within each individual study site.
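To make the deductive coding and thematic grouping steps concrete, the sketch below tallies coded focus group excerpts by theme. The codebook entries, site labels, and excerpt text are hypothetical placeholders rather than actual study codes; in practice this work would typically be carried out in qualitative analysis software rather than ad hoc scripts.

```python
from collections import Counter, defaultdict

# Hypothetical deductive codebook; actual codes would be derived by the
# study team from the implementation science frameworks named above.
CODEBOOK = {
    "ENG-1": "Child engagement techniques",
    "BAR-1": "Barriers to virtual delivery",
    "FAC-1": "Facilitators of virtual delivery",
}

# Each coded excerpt pairs a transcript passage with a code from the codebook.
coded_excerpts = [
    {"site": "A", "code": "ENG-1", "excerpt": "We use songs to hold attention."},
    {"site": "A", "code": "BAR-1", "excerpt": "Connectivity drops mid-visit."},
    {"site": "B", "code": "ENG-1", "excerpt": "Caregivers mirror the activity."},
]

# Tally how often each code appears and group excerpts under their theme.
frequencies = Counter(item["code"] for item in coded_excerpts)
excerpts_by_theme = defaultdict(list)
for item in coded_excerpts:
    excerpts_by_theme[CODEBOOK[item["code"]]].append(item["excerpt"])

# Summarize themes from most to least frequent for the analytic write-up.
for code, count in frequencies.most_common():
    print(f"{CODEBOOK[code]}: {count} excerpt(s)")
```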
Survey information collection activities
In collaboration with the programs, the study team will collect and analyze quantitative data regularly to examine whether strategies are affecting implementation or proximal outcomes. Questionnaire responses will be analyzed using descriptive summary statistics (such as frequencies and averages), synthesized both within and across learning cycles, and triangulated with findings from qualitative data collection efforts. When analyzing quantitative questionnaire data (from Instrument 6. Home Visitor Questionnaire - Installation & Refinement Phases, Instrument 7. Family Post-Visit Questionnaire - Refinement Phase, and Instrument 9. Focus Group Participant Characteristics Form - All Phases), the study team will use simple descriptive statistics and cross tabulations to assess sample size, characteristics, response rates, and data quality.
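As an illustration of the descriptive analyses described above, the following sketch computes frequencies, averages, and a cross tabulation from mock weekly questionnaire data. The column names and response values are hypothetical stand-ins for items from Instrument 6, not the fielded items themselves.

```python
import pandas as pd

# Mock weekly Home Visitor Questionnaire records (Instrument 6). Column names
# and response values are hypothetical placeholders for the fielded items.
responses = pd.DataFrame({
    "week":              [1, 1, 2, 2, 3, 3],
    "used_strategy":     ["yes", "no", "yes", "yes", "yes", "no"],
    "perceived_utility": [4, None, 5, 3, 4, None],  # 1-5 rating; None = skipped
})

# Cross tabulation: how often home visitors reported using the strategy each week.
print(pd.crosstab(responses["week"], responses["used_strategy"]))

# Averages: mean perceived utility by week, ignoring skipped items.
print(responses.groupby("week")["perceived_utility"].mean())

# Data quality: sample size and item-level missingness.
print(len(responses), responses.isna().sum().to_dict())
```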
The study team will develop several processes to ensure consistency in qualitative and quantitative data collection:
To support consistency in qualitative data collection, the study team will develop interview and focus group protocols that use consistent structures across programs and respondents. The study team will provide standardized initial training for interviewers and focus group facilitators, and ongoing, targeted retraining as needed, to ensure the team collects consistently high-quality data through all phases of collection. A senior researcher will communicate regularly with data collectors to identify and resolve any problems, questions, and inconsistencies quickly.
Consistency in quantitative data collection will be achieved in several ways. First, the survey software used has built-in mobile formatting to ensure that all respondents have the same user-friendly, high-quality experience in completing the survey, whether on a handheld device, a desktop, or a laptop. The team will also provide guidance to all data collectors to help ensure consistency of survey administration.
For all survey instruments, the study team has developed web-based survey response criteria to produce consistent data across varied respondents. For example, there will be numeric range restrictions on questions about caseload, age, and program start and end dates, among others. The team will implement skip patterns to ask respondents only the most relevant questions, and home visitor and family surveys will include items that form a standard summary index of the activities and strategies that are relevant to each program. Web surveys will have built-in verification and quality control features that will, for example, check for internal consistency across items, validate data as they are entered, and use range checks to prevent impossible outliers. The survey instruments also prioritize closed-ended questions (to enable descriptive quantitative analyses). Any open-ended probes will be accompanied by a framework for coding that facilitates descriptive analyses.
In addition to these automated tools, the study team will quickly review item frequencies after the first few surveys are completed to ensure that all potential response patterns and skip logic are functioning properly, that checks are preventing incorrect or missing values, and that issues such as unclear instructions are detected. The team will continue to monitor data quality and consistency throughout the field period and act swiftly to resolve any issues.
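The sketch below illustrates, under assumed field names, the kinds of checks described in the two preceding paragraphs: a numeric range restriction, a skip-pattern consistency check, and a quick item-frequency review. The field names (caseload, used_strategy, strategy_rating) and the caseload range are hypothetical; production checks would be configured directly in the web survey software.

```python
from collections import Counter

# Hypothetical field names and ranges; production checks would live in the
# survey software's built-in validation, as described above.
def validate_response(resp: dict) -> list:
    """Return a list of data-quality issues found in one survey response."""
    issues = []

    # Range restriction: caseload must fall within a plausible numeric range.
    caseload = resp.get("caseload")
    if caseload is not None and not 0 <= caseload <= 50:
        issues.append(f"caseload out of range: {caseload}")

    # Skip-pattern consistency: the follow-up rating should be blank unless
    # the gateway item was answered "yes".
    if resp.get("used_strategy") != "yes" and resp.get("strategy_rating") is not None:
        issues.append("strategy_rating answered but used_strategy is not 'yes'")

    return issues

# Quick item-frequency review after the first few completed surveys.
early_responses = [
    {"caseload": 18, "used_strategy": "yes", "strategy_rating": 4},
    {"caseload": 72, "used_strategy": "no", "strategy_rating": 3},  # both checks fire
]
for resp in early_responses:
    print(validate_response(resp))
print(Counter(r["used_strategy"] for r in early_responses))
```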
The data collection activities are not designed to produce statistically generalizable findings, and participation is wholly at the respondent’s discretion. Response rates will not be reported.
Nonresponse
We will work with each participating program to designate a site liaison, who will help with individual-level recruitment, coordination of focus groups with program staff, and ongoing data collection efforts. We will work with each program to determine the appropriate frequency and type of follow-up they will conduct to minimize nonresponse.
Since families and program staff will not be randomly sampled and findings are not intended to be representative, nonresponse bias will not be calculated. The data collection will document examples of programs that provide services to children and families. Respondent demographics will be documented and reported in written materials associated with the data collection.
The data collection instruments were not pilot tested because they are designed to be flexible so they can be tailored to each individual site based on the virtual engagement strategy that site is testing.
The study team gathered input from program staff, home visitors, and families participating in home visiting through the advisory boards developed for this study. The study team asked advisory board members, as described in section 8B of Supporting Statement A, to provide input on how the questions can be improved to ensure they are easy to understand and based on the real-world operations or actions of programs, home visitors, and families.
The research team consulted statistical experts within their organizations while designing the data collection strategy; no outside individuals were consulted on statistical aspects of the design. The research team designed the data collection strategy and will collect and analyze data. The project received feedback on study design from advisory boards developed for this study and made up of subject matter experts, program staff, home visitors, and families participating in home visiting. These advisory boards also provided feedback on data collection instruments.
HRSA has contracted with PRG and Mathematica to conduct this study. PRG and Mathematica are responsible for the collection and analysis of all information described in this ICR.