Home-Based Child Care Toolkit for Nurturing School-Age Children Study

OMB: 0970-0625


Alternative Supporting Statement for Information Collections Designed for

Research, Public Health Surveillance, and Program Evaluation Purposes






Home-Based Child Care Toolkit for Nurturing School-Age Children Study



OMB Information Collection Request

0970 – New Collection





Supporting Statement

Part B



January 2024







Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officers: Ann Rivera and Bonnie Mackintosh


Part B


B1. Objectives

Study Objectives

This validation study will build psychometric evidence about the Home-Based Child Care Toolkit for Nurturing School-Age Children (HBCC-NSAC Toolkit) provider questionnaire. As noted in Supporting Statement Part A Section A1, the study is part of the Home-Based Child Care Supply and Quality (HBCCSQ) project. The objectives of the validation study are to 1) administer the HBCC-NSAC Toolkit provider questionnaire to providers caring for school-age children (age 5 and in kindergarten, or ages 6 through 12) in a residential setting (i.e., home-based child care [HBCC]), survey the families they care for, and observe a subset of the providers, and 2) elicit enough responses across instruments to examine the psychometric properties of the dimensions in the HBCC-NSAC Toolkit provider questionnaire including convergent and discriminant validity evidence. By doing so, the study team aims to 1) assess the reliability of the HBCC-NSAC Toolkit provider questionnaire, 2) assess the evidence for its construct and concurrent/convergent validity, and 3) examine invariance across subgroups (that is, look for the absence of any differential item functioning [DIF]). The current approval builds upon the pilot study (completed under ACF’s generic clearance 0970 – 0355; approved March 2023), which provided findings used to revise the HBCC-NSAC Toolkit provider questionnaire and improve procedures for the validation study.


Generalizability of Results

HBCC, or child care and early education (CCEE) offered in a provider’s or child’s home, is the most common form of nonparental child care and is used by millions of families in the United States. This validation study is intended to present reliability and validity information for the English version of the HBCC-NSAC Toolkit provider questionnaire when used by home-based providers who care for school-age children, not to promote statistical generalization to other sites or service populations. Findings will not be representative of the experiences of all home-based providers and families, and any written products resulting from the study will acknowledge this limitation. Nonetheless, these data will provide information about a population of great policy interest that is not well-represented in existing research.1



Appropriateness of Study Design and Methods for Planned Uses

The validation study is designed to collect information efficiently from a diverse group of home-based providers to examine the psychometric properties of the domains and dimensions in the HBCC-NSAC Toolkit provider questionnaire. To adequately examine invariance across subgroups, the validation study requires that providers have a range of characteristics that may be associated with their responses, including: 1) providers who are licensed, regulated family child care (FCC) providers or family, friend, and neighbor (FFN) providers who are legally exempt from state licensing or other state regulations for child care, 2) providers residing in a mix of urban and rural locations, and 3) providers of diverse races and ethnicities. Working with staff from community organizations that have access to a wide range of providers allows for efficient recruitment of providers with different characteristics. Participants will be selected to vary by the characteristics discussed in Section B2, and the study team will closely monitor whether the respondent pool includes an adequate number of respondents across the characteristics to support subgroup analysis. The details of the methodological approach are described in Section B2.


The data collected are not intended to be representative, and findings may not necessarily apply to all home-based providers or families. The reliability and validity information from this study will be limited to English-speaking home-based providers and families from the range of characteristics obtained for the study. All publicly available products associated with this study (the HBCC-NSAC Toolkit provider questionnaire, study findings report, and archived data) will clearly describe key limitations. This study does not include an impact evaluation and will not be used to assess participants’ outcomes.


As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information.



B2. Methods and Design

Target Population

The study team will collect information from 150 home-based providers who at the time of collection regularly care for at least one school-age child (age 5 and in kindergarten, or ages 6 through 12), meaning they care for the school-age child(ren) in a home at least 10 hours per week and at least 8 weeks in the past year. The study team will also collect information from 150 family members with at least one school-age child in HBCC and who are most responsible for the care of the child when they are not in the provider’s care. All participants must be at least 18 years old and be able to read and answer questions in English; all providers must also be able to distribute the family survey to at least one eligible family. The proposed approach balances cost and burden with ensuring enough information to meet the study’s analytic needs.
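As a concrete illustration, the screening rules above can be expressed as a simple eligibility check. This is a sketch only; the function and field names are hypothetical and are not part of the study instruments.

```python
# Illustrative sketch (hypothetical names) of the provider screening rules
# stated above, applied to a single school-age child in the provider's care.

def provider_is_eligible(child_age, in_kindergarten, hours_per_week,
                         weeks_past_year, provider_age, reads_english,
                         can_distribute_survey):
    """Apply the stated eligibility rules for one provider-child pair."""
    # School-age: age 5 and in kindergarten, or ages 6 through 12.
    school_age = (child_age == 5 and in_kindergarten) or (6 <= child_age <= 12)
    # Regular care: at least 10 hours per week and 8 weeks in the past year.
    regular_care = hours_per_week >= 10 and weeks_past_year >= 8
    # Provider must be an adult who can respond in English and can
    # distribute the family survey to at least one eligible family.
    adult_english = provider_age >= 18 and reads_english
    return school_age and regular_care and adult_english and can_distribute_survey

# Hypothetical example: a 7-year-old in care 15 hours/week, 40 weeks last year.
print(provider_is_eligible(7, False, 15, 40, 35, True, True))  # True
```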


The study team will use a non-probability, purposive approach to select providers to participate in the validation study in at least 10 states (states vary on HBCC licensing rules). Because participants will be purposively selected, they will not be representative of the population of home-based providers. Table B.1 describes minimum provider sample sizes by provider characteristic needed to conduct the analyses described in section B7.


Table B.1. Minimum provider sample sizes by provider characteristic

Provider characteristic           Minimum provider sample size
Type of HBCC
  FCC                             80
  FFN                             35
Urbanicity
  Urban                           35
  Rural                           35
Race and ethnicity
  Black                           35
  White                           30
  Hispanic                        20
  Other                           ~
Target total providers            150

FCC = licensed, regulated family child care providers; FFN = family, friend, and neighbor providers who are legally exempt from state licensing or other state regulations for child care

Note: Beyond these minimums, the additional sample may be from any of the subgroups.
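The subgroup minimums in Table B.1 lend themselves to a simple recruitment-monitoring check. The sketch below is illustrative only; the counts are hypothetical, and this is not a study deliverable.

```python
# Illustrative sketch of monitoring recruitment against the subgroup
# minimums in Table B.1. The example counts below are hypothetical.

MINIMUMS = {"FCC": 80, "FFN": 35, "Urban": 35, "Rural": 35,
            "Black": 35, "White": 30, "Hispanic": 20}

def subgroups_below_minimum(counts, minimums=MINIMUMS):
    """counts: dict mapping subgroup -> number recruited so far.
    Returns each subgroup still short of its minimum and the shortfall."""
    return {g: m - counts.get(g, 0)
            for g, m in minimums.items()
            if counts.get(g, 0) < m}

# Hypothetical mid-recruitment counts: FFN and Hispanic providers still short.
print(subgroups_below_minimum({"FCC": 85, "FFN": 30, "Urban": 60, "Rural": 40,
                               "Black": 40, "White": 32, "Hispanic": 15}))
# {'FFN': 5, 'Hispanic': 5}
```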

The study team will recruit family respondents by asking participating providers to distribute the family survey to all eligible families in their care. The target sample size for the family survey is 150 respondents; the minimum number of family surveys needed for analysis is 100. There are no minimum target sample sizes by family characteristics. Family respondents will not be representative of the population of families using home-based child care. However, by recruiting providers with the varied characteristics noted above, the study team expects to obtain some variation in families’ demographic characteristics and relationships to providers. For example, providers who live in urban or rural areas will likely care for families who also live in urban or rural areas, respectively. Compared to FCC providers, FFN providers are more likely to care for relative children and to be of the same race or ethnicity as the families they care for.2


Recruitment and Site Selection

The study team will use non-probability, purposive recruitment to identify and recruit a convenience sample of home-based providers from at least 10 states in different regions of the United States. The study team will return to states from the pilot study to recruit providers.3 These states were selected by prioritizing states with:

  1. Community organizations that offer support and quality improvement opportunities to home-based providers, have existing relationships with the study team, may be willing to support the study, and likely have the capacity to work with the study team to recruit enough providers.

  2. Different policy contexts, such as child care licensing thresholds, numbers of children allowed in home-based child care settings, policies on subsidy eligibility, and participation of home-based providers in Quality Rating and Improvement Systems (QRIS).

  3. Different geographic regions to capture variation in state and regional context and conditions.

The study team will identify additional states as needed to reach recruitment targets. Any additional states selected for the validation study will also be selected using the above criteria.


In each of the study states, the study team will partner with organizations that provide support to home-based providers in order to recruit home-based providers. The study team will partner with about 15 organizations. The team will prioritize selecting community organizations that:

  1. Have existing relationships with the study team on this and other research studies (such as organizations that have participated in the pilot study).

  2. Work with home-based providers with the range of characteristics outlined above.

  3. Have experience providing quality improvement-related supports (for example, professional development, training, or coaching).

The team will work closely with all participating community organizations to recruit home-based providers and will adjust the selection of organizations as needed to reach recruitment targets (for example, if the team has difficulty recruiting rural FFN providers during the beginning of data collection, the team may invite new organizations that have experience working with such providers). The study team will work with each community organization to designate a site coordinator who will be the study’s main point of contact (Instrument 1, Appendix A).


The team will recruit 50 home-based providers who will participate in observations in a subset (at least 2) of the study states. The team will work with community organizations within these states to focus exclusively on recruiting home-based providers who will participate in observations. The team will also base selection of community organizations on the study team’s experience working with them during the pilot study. The study team will prioritize community organizations that successfully referred eligible providers to the pilot study. If the study team is unable to reach the desired targets for observations, the team will invite new community organizations within the selected study states until targets are reached.


In addition to partnering with organizations, the study team will draw from publicly available state licensing lists as needed to supplement providers referred to the study by community organizations. The study team will also use snowball recruitment, asking providers contacted during recruitment to recommend other providers who might like to participate in the study.


The study team will work with all participating home-based providers to recruit English-speaking families of school-age children who attend their HBCC to complete the family survey.


Because respondents will be purposively selected, they will not be representative of their populations. Instead, the study team aims to obtain variation in home-based providers’ characteristics to gather a diverse group of respondents to respond to the instruments.



B3. Design of Data Collection Instruments

Development of Data Collection Instruments

The recruitment information collection instruments (Instrument 2 and Instrument 3) were developed by the study team to include the key questions necessary to determine whether providers will be invited to participate in the study. These instruments also guide the discussion of the provider’s responsibilities when sharing the family survey with families (and, if families complete the survey on paper, collecting the completed surveys and returning them to the study team). Instrument 3 also determines providers’ eligibility and willingness to participate in the observation.


To develop the HBCC-NSAC Toolkit provider questionnaire (Instrument 5), the study team selected domain topics based on the larger HBCCSQ project’s conceptual framework, literature review, and measures review to highlight features of quality that are typically implemented differently or more likely to occur in HBCC than in other CCEE settings, and that might support positive outcomes for children and families. Since the HBCC-NSAC Toolkit provider questionnaire is designed to measure dimensions that were identified as inadequately measured or not measured in existing instruments, the study team developed new items after reviewing existing measures of quality used in HBCC or in school-age youth program settings and the literature on the most promotive and protective practices for school-age children in each domain and dimension. Academic experts and experts with lived experience providing HBCC provided input to measures development, from selection of the domains and dimensions through review of final items in the draft provider questionnaire. These experts provided written and/or verbal feedback. Additionally, staff from community organizations reviewed the draft provider questionnaire and provided feedback, which the study team used to revise it (see Supporting Statement Part A Section A.8). Through these efforts, the team did not request the same information from more than nine individuals, and therefore these activities were not subject to the Paperwork Reduction Act. In July 2022, the study team also pre-tested the English version of the HBCC-NSAC Toolkit provider questionnaire with 9 providers with a mix of characteristics, including FCC and FFN providers in urban and rural areas, with different racial and ethnic identities, and some who are bilingual in Spanish and English. The study team used participant feedback to revise and finalize the introduction, instructions, and items in the provider questionnaire.


After the pre-test, the study team conducted a pilot study, beginning in March 2023 (completed under ACF’s generic clearance 0970 – 0355). Based on findings from the pilot study, the study team made improvements to the provider questionnaire, including streamlining the introductory text, updating the response scale from a 4-point scale to a 7-point scale, dropping or revising items to remove the ‘not applicable’ option, and removing the ‘don’t know’ response option. To streamline the questionnaire to the most essential items, the study team also dropped items that (1) did not vary (for example, almost everyone responds “3”) or (2) had covariance and/or misfit problems.


Table B.2 describes the key dimensions within each domain of the HBCC-NSAC Toolkit provider questionnaire. Items were developed to assess each key dimension.

Table B.2. Domains and key dimensions in the HBCC-NSAC Toolkit provider questionnaire

Domain: Support for social development
Key dimensions:
  • Builds and strengthens a positive relationship with children
  • Supports children’s perspective-taking and nonverbal communication
  • Supports children’s social skills
  • Supports antibullying and antibias practices

Domain: Support for emotional development
Key dimensions:
  • Helps children understand and regulate emotions
  • Supports a positive sense of belonging
  • Supports a positive self-identity (including racial and ethnic identity)

Domain: Positive and proactive behavior management
Key dimensions:
  • Uses predictable and responsive routines
  • Uses proactive or positive disciplinary practices

Domain: Support for learning
Key dimensions:
  • Provides learning opportunities
  • Supports positive approaches to learning and a growth mindset
  • Scaffolds problem-solving
  • Collaborates with families to strengthen learning connections with home (a)

Domain: Support for health and physical development
Key dimensions:
  • Provides a variety of activities to support physical well-being
  • Communicates with families about health and physical development (a)
  • Scaffolds physical activity
  • Promotes health, safety, and nutrition

(a) These dimensions were removed from the provider questionnaire after the pilot study.


For validation purposes in this study, the provider questionnaire (Instrument 5) also includes items from the Multicultural Teaching Competency Scale (MTCS; Spanierman, 2011). The study team selected the MTCS from measures used in the field that the team hypothesized are associated with the constructs represented in the Supports antibullying and antibias practices and Supports a positive self-identity (including racial and ethnic identity) dimensions in the provider questionnaire.


The family survey (Instrument 6) includes child and family background information items and items from sub-scales in the Emlen Scales (Emlen 2000; composite subscale, parent’s perception of caregiver’s cultural sensitivity subscale, and happy, safe, secure subscale), an existing measure used for validation purposes. The study team selected the Emlen subscales from measures used in the field that the team hypothesized are associated with constructs represented in the domains and dimensions in the provider questionnaire. The study team will compare families’ responses to the Emlen scales against providers’ responses in the provider questionnaire to help validate the provider questionnaire.



B4. Collection of Data and Quality Control

ACF has contracted with Mathematica for this data collection. Staff from the Erikson Institute, a subcontractor to Mathematica, provided input on the design and will provide consultation on data collection and analysis. Study team members at Mathematica have experience recruiting respondents for studies that focus on CCEE, respondents who are experiencing high levels of poverty, and respondents who are members of historically marginalized communities. Importantly, study team members have experience recruiting for and conducting the pilot study. The study team made updates and improvements to all relevant study materials based on pilot study findings to help respondents better understand the proposed study and their role.


Recruitment protocol

The study team developed flyers and other recruitment materials to facilitate outreach to community organizations and potential participants (Appendices A and B) that explain the study clearly and concisely in plain language. These materials were updated based on pilot study findings to help participants better understand the current study.


The study team will primarily work with community organization site coordinators to identify providers who may be interested in participating in the study. The study team will discuss the study and site coordinator role on an onboarding call (Instrument 1) and Appendix A contains the other materials used to work with community organizations. The study team will also contact providers using contact information obtained from public lists of providers as needed. The study team will screen all providers for eligibility and to recruit them into the study. At the end of the recruitment call (Instrument 2 or 3), the protocol asks providers to make recommendations of other providers who might like to participate in the study.


The team will work with eligible providers to recruit families.


Provider recruitment

To generate provider awareness of and interest in the study, the study team will work with community organization site coordinators to communicate information about the study to providers (for example, posting or distributing flyers, or introducing the study during provider gatherings) or arrange for community organization liaisons to speak about the study at the organizations’ virtual gatherings. With the provider’s permission, the site coordinators will share the contact information of interested providers with the study team using a secure website (Instrument 1). Materials related to working with community organization site coordinators are in Appendix A.


Trained recruiters will send an email and flyer to introduce the study to providers identified from community organizations, state lists, or other providers (Appendix B). Recruiters will also call providers to discuss the study and learn about the providers’ characteristics and eligibility using a call script (Instrument 2 or 3, depending on whether the provider is in a site selected for observations). The study team will also ask participating providers to recommend other providers in the area who provide care to school-age children. The study team will continue recruitment until the number of overall respondents and minimums for each type of provider needed for analyses are reached (see Table B.1). Section B5 describes the study’s expected response rates.


Family recruitment

During the recruitment call (Instrument 2 or 3), recruiters will discuss the providers’ role in the family survey data collection and ask if providers are able to share the family survey with at least one of their families who have a school-age child and are able to complete the family survey in English. Only providers who can share the family survey with at least one eligible family will be included in the study. The study team will mail providers materials to distribute the family survey to all eligible families in their care. This mailing will include all materials needed to recruit and support the participation of eligible families in the study (Appendix F). These materials were updated based on pilot study findings to help respondents better understand their role in the family survey process (for example, who to give the family survey to and how to return the family survey). To encourage participation from family respondents, families will receive a token of appreciation in advance, along with the family survey information. Designated study team members will be available to answer study-related questions from providers or family members as needed to support family recruitment. The study team may contact family participants directly to encourage participation for participants who: 1) reach out to the study team via the study phone number or email provided in their materials, 2) give providers permission to share contact information with the study team, or 3) have incomplete web-based family surveys and completed the consent portion of the instrument.


Instrument mode and data collection

The provider and family instruments and consent forms will be available electronically and on paper. All participants will complete a consent form before completing their applicable instrument(s) (Appendix E). Providers and families will complete the consent form using the same mode they will use to complete their instrument(s). For example, providers who complete the web-based provider questionnaire (Instrument 5) will first complete a web-based consent form.


Based on provider preference, the study team will email providers a link to the web-based provider questionnaire (optimized so providers can respond on smartphones), mail the paper-based version, or call the provider to administer the questionnaire by phone. If providers give permission to be texted, the study team may send text reminders. Otherwise, reminders will be sent by email, mail, or telephone.


For the subset of providers who will participate in observations, the study team will collect scheduling information during the recruitment call (Instrument 3), schedule the date and time, and go over the logistics for the in-person visit during a pre-visit phone call (Instrument 4), in which the assigned observer will confirm logistics and reschedule visits if needed. The study team will also provide providers with a letter notifying parents about the observation (Appendix D). Assigned observers will administer the Family Child Care Program Quality Assessment (FCC PQA) observation during the site visit.4 If the provider questionnaire or family surveys are incomplete at the time of observation, the observer will remind the participants and offer each respondent the option to complete their respective instrument in-person (for example, offer the parent a paper copy to complete and hand back).


The study team will ship each participating provider a family survey distribution packet (Appendix F) with: 1) instructions for the provider about how to distribute to and collect the family surveys from their eligible families, 2) family packets for each eligible family (the content of the packets is explained below), and 3) a $10 gift card honorarium for the provider (honoraria described in Supporting Statement Part A Section A13). To aid the paper-based family survey option, the instructions to the providers will include easy-to-follow steps for distributing and collecting the surveys from families with a school-age child.


The family survey packets (Appendix F) for each eligible family will contain: 1) instructions with a link to complete the web-based family consent (Appendix E) and survey (Instrument 6), 2) the family survey flyer (Appendix B), 3) paper-based family consent (Appendix E) and survey (Instrument 6) with self-seal envelopes, 4) the study team phone number if the family would like to call in to complete the family survey over the telephone, and 5) a $5 gift card (tokens of appreciation described in Supporting Statement Part A Section A9).5 Families who choose the paper-based option will place their completed paper consent form and survey into the provided self-seal envelope and seal it before returning it to their provider. Providers will use the prepaid shipping materials provided by the study team to ship completed paper instruments to the study team.


Provider questionnaire and family survey data collection monitoring

The study team will monitor data quality and consistency throughout the data collection. Before data collection begins, recruiters and telephone interviewers will participate in training to discuss the purpose of the HBCC-NSAC Toolkit provider questionnaire, family survey, and the validation study. Recruiters will learn the importance of testing the provider questionnaire with a diverse group of home-based providers, as well as strategies to build rapport and execute the recruitment and data collection protocol. Also, during training, recruiters will practice using the recruitment script to ensure provider characteristics are collected accurately and consistently across respondents. Telephone interviewers will review the provider questionnaire and family survey and best practices for administration and entry into the web-based versions.


During data collection, the study lead will hold weekly check-ins with recruiters so staff can discuss progress, ask questions, troubleshoot problems, and share successful strategies. In addition, study team members will discuss any questions or comments received from participants to ensure the study team accurately responds to questions or issues reported by respondents. The team will monitor completion of the instruments, including conducting initial data checks as the data are received, for example, to identify high missingness. As described above, the team will follow up with providers and families as needed to address any issues. The web-based instruments will include checks for completeness to minimize missing data, for example, the programmed provider questionnaire may notify respondents if they skipped a question and ask them to provide a response. Every phone interviewer will be monitored at least twice during data collection.


Observation data collection monitoring. Before data collection begins, observers will participate in a training led by the FCC PQA trainers to ensure observers are certified to conduct observations according to the developers’ standards for reliability. In addition, observers will participate in a training led by the study team on the study’s background and logistics for data collection, such as interacting with providers and giving providers honoraria. Field supervisors will oversee the work of the field staff conducting FCC PQA observations (observers). Each observer will be required to check in so field supervisors can monitor daily data collection progress. Trained staff will also review all materials returned from the field for completeness and follow up with observers, if needed. Using daily status reports and weekly meetings with field supervisors, the study team will monitor observation completion rates. To prevent drift, the team will also check once, midway through the field period, that observers maintain their reliability on the FCC PQA, using videos and consensus scores provided by the FCC PQA developers. To be considered reliable, observers will need to match on at least 80 percent of the consensus scores. If observers are not reliable on the first video, they will receive feedback and then be asked to code a second video. Observers who are unable to achieve the desired level of inter-rater reliability will not be allowed to continue conducting observations.
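The 80 percent agreement criterion described above can be sketched as a simple item-by-item comparison against the developers’ consensus scores. This sketch is illustrative only; it is not the study’s actual scoring tool, and the example scores are hypothetical.

```python
# Illustrative sketch (not the study's scoring tool) of the 80 percent
# agreement check: an observer's video codes are compared item by item
# against the FCC PQA developers' consensus scores.

def meets_reliability(observer_scores, consensus_scores, threshold=0.80):
    """Return True if the observer matches the consensus on at least
    `threshold` of the items."""
    if len(observer_scores) != len(consensus_scores):
        raise ValueError("score lists must cover the same items")
    matches = sum(o == c for o, c in zip(observer_scores, consensus_scores))
    return matches / len(consensus_scores) >= threshold

# Hypothetical example: 9 of 10 items match (90 percent), so the check passes.
print(meets_reliability([3, 5, 2, 4, 4, 1, 5, 3, 2, 4],
                        [3, 5, 2, 4, 4, 1, 5, 3, 2, 5]))  # True
```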



B5. Response Rates and Potential Nonresponse Bias

Response Rates

The HBCC-NSAC Toolkit provider questionnaire and the family survey are not designed to produce statistically generalizable findings and participation is wholly at the respondent’s discretion. Response rates will not be calculated or reported.


However, because the study team will screen potential respondents for eligibility and interest in participating, the team will calculate and report to ACF basic information about how many screened potential respondents participated, were not eligible, declined to participate, or did not complete all study activities. Table B.3 describes the expected response rates and number of responses. The team’s expected response rates for the validation study are based on actual response rates from the pilot study. The study team expects that 73 percent of the potential Instrument 2 respondents and 50 percent of the potential Instrument 3 respondents screened will be eligible and agree to participate. Among those recruited, the study team expects that 67 percent of providers will complete the provider questionnaire (Instrument 5) and that 74 percent of recruited providers will distribute the family surveys (Instrument 6) to an average of two eligible families per provider. The study team also expects 50 percent of families who receive a family survey to complete it.

Table B.3. Expected response rates and number of responses by instrument

Instrument                                  Number of recruited participants   Expected response rate   Expected number of responses
1. Provider questionnaire (Instrument 5)    224                                67%                      150
2. Family survey (Instrument 6)             332 (a)                            50%                      166

(a) The study team expects 74 percent of recruited providers to distribute the family surveys to an average of two eligible families per provider. These assumptions are based on pilot study rates.
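The expected counts in Table B.3 follow directly from the stated pilot-based rates. As a rough arithmetic check (variable names are illustrative, not part of the study protocol):

```python
# Rough arithmetic check of the expected response counts in Table B.3,
# using the pilot-based rates stated in the text.

recruited_providers = 224
provider_completion_rate = 0.67   # expected provider questionnaire completion
distribution_rate = 0.74          # providers expected to distribute family surveys
families_per_provider = 2         # average eligible families per distributing provider
family_completion_rate = 0.50     # expected family survey completion

expected_provider_responses = round(recruited_providers * provider_completion_rate)
family_surveys_distributed = round(recruited_providers * distribution_rate
                                   * families_per_provider)
expected_family_responses = round(family_surveys_distributed
                                  * family_completion_rate)

print(expected_provider_responses)   # 150
print(family_surveys_distributed)    # 332
print(expected_family_responses)     # 166
```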


Nonresponse

As participants will not be randomly sampled and findings are not intended to be representative, non-response bias will not be calculated. Basic demographics of all potential respondents screened—including those who participate, are not eligible, or decline to participate—will be documented and reported in written materials associated with the data collection. 


The study team will collect basic demographic information and key characteristics about providers and the families they care for (Instrument 2 or 3, Instrument 5, and Instrument 6), including type of HBCC (family child care [FCC] or family, friend, and neighbor [FFN] care) and race and ethnicity of providers and children in care. The study team will assess whether non-response was more prevalent among particular types of providers, for example, FCC versus FFN. In the HBCC-NSAC Toolkit pilot study, the team examined item-level nonresponse on the provider questionnaire and dropped items with the goal of minimizing item-level nonresponse in the instruments used in this study. If item-level nonresponse in the validation study exceeds ten percent, the study team will examine the associations between nonresponse and provider characteristics, such as race or ethnicity of providers, race or ethnicity of children in care, provider experience levels, or the number of children cared for and their age groups. Examination of item-level nonresponse will inform the analyses of the items and dimensions.
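As an illustration of the ten-percent screen described above, the rate of missing responses per item could be tabulated as follows (the data layout, item names, and use of None for a missing answer are all hypothetical):

```python
# Illustrative item-level nonresponse screen (hypothetical data layout).
# responses: one dict per provider, mapping item IDs to answers; None = missing.
responses = [
    {"q1": 3,    "q2": None, "q3": 2},
    {"q1": 4,    "q2": 1,    "q3": None},
    {"q1": None, "q2": 2,    "q3": 1},
]

items = sorted({item for r in responses for item in r})
nonresponse_rate = {
    item: sum(r.get(item) is None for r in responses) / len(responses)
    for item in items
}

# Items exceeding the 10 percent threshold would trigger follow-up analyses
# of associations between nonresponse and provider characteristics.
flagged = [item for item, rate in nonresponse_rate.items() if rate > 0.10]
print(flagged)  # every toy item here is missing for 1 of 3 providers
```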


The study team will regularly follow up with recruited providers who have not completed all data collection activities by email, mail, and text message to encourage completion and to troubleshoot any technology, mailing, or logistical challenges related to their own or their invited families’ data collection activities.



B6. Production of Estimates and Projections

The data will not be used to generate population estimates, either for internal use or dissemination. Information reported will clearly state that results are not meant to be generalizable. 



B7. Data Handling and Analysis

Data Handling

Trained study team recruiters will enter data collected during the recruitment call into a secure electronic database. The original recruitment call script used to collect data about provider characteristics and eligibility will be saved to a secure Mathematica network folder. Data retrieved from the web-based instruments (including data collected by phone and paper that is entered into the web-based instruments) will be saved on a secure drive accessible only to Mathematica study team members. Direct export of the electronic data to the secure drive will result in minimal processing. See Supporting Statement Part A Section A10 for further discussion of data handling.


Providers will receive prepaid mailing materials to return completed paper provider questionnaires (Instrument 5) and/or completed paper family surveys (Instrument 6) to Mathematica. Families may also receive prepaid mailing materials to return their completed paper family survey directly to Mathematica if requested.


Data Analysis

The analysis methods will use both classical and item response theory (IRT) approaches to examine the domains and dimensions in the provider questionnaire, including:

  • Descriptive statistics (median, standard deviation, and frequencies) to examine the distribution of responses overall and by subgroup (i.e., FCC and FFN providers, urban and rural, and racial and ethnic groups).

  • Reliability estimates. To ensure the provider questionnaire provides reliable measurements of care for school-age and mixed-age groups of children in HBCC, the study team will estimate internal consistency reliability as measured by Cronbach’s alpha (Cronbach 1951) and examine item-total correlations.

  • IRT analysis. To examine item functioning within dimensions and evaluate domain validity, the study team will conduct one-parameter IRT analyses (Rasch models) informed by the pilot study findings. For example, the pilot study analysis indicated that the original response scale categories did not discriminate well among providers for most items, and the validation study analysis might indicate a need to collapse two of the response categories in the revised response scale for better measurement (Instrument 5).
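The internal consistency estimate named above has a simple closed form: alpha = k/(k-1) × (1 − Σ item variances / variance of total score). A minimal sketch with toy Likert-type scores (the data are invented for illustration only):

```python
# Illustrative Cronbach's alpha computation for one dimension (toy data).
from statistics import pvariance

# Rows = respondents, columns = items on a single dimension (hypothetical scores).
scores = [
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 3],
    [1, 2, 1, 2],
    [3, 3, 4, 4],
]

k = len(scores[0])                                    # number of items
item_vars = [pvariance(col) for col in zip(*scores)]  # per-item variances
total_var = pvariance([sum(row) for row in scores])   # variance of summed scores

alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(round(alpha, 3))  # 0.915 for this toy matrix
```

In practice the study team would also inspect item-total correlations and how alpha changes when each item is dropped, but the core statistic is just this ratio of variances.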


Additional analyses for the validation study will provide further evidence for construct, convergent, and discriminant validity, described below.


Construct validity evidence. The study team will examine whether the items in each dimension of the provider questionnaire provide similar evidence for that construct across subgroups of interest. The team will examine the Rasch rating scale model6 for each dimension (with the full sample) and the model’s replicability for specified subgroups of interest (for example, FCC and FFN providers, or urban and rural providers). The team will conduct analyses of potential DIF for subgroups with a minimum of 35 respondents (see Table B.1 for the subgroups expected to reach this minimum). The team will also examine interfactor correlations by subgroups with at least 35 respondents to understand whether the associations among factors (dimensions) are similar across subgroups.


The team will also conduct confirmatory factor analyses (CFA) for the overall sample for each domain and combinations of the domains (for example, all items in the emotional and the positive behavior management domains). The sample size of 150 is too small to examine all the items and dimensions in a single analysis. Some dimensions have 15 or more items. Given the sample sizes, the team will not conduct these analyses by subgroups.


Convergent and discriminant validity. The study team will examine evidence of convergent validity of the three validation measures: 1) the MTCS in Instrument 5, 2) the Emlen scales in Instrument 6, and 3) the FCC PQA observational measure. Each validation measure will be paired with the dimensions in the provider questionnaire that most closely align with it.


The team will also examine Pearson product-moment correlations between each dimension and the validation measures. The expected moderate positive correlations will provide evidence of convergent validity; low or negative correlations will provide evidence of discriminant validity.


Data Use

The study team will prepare data file(s) with documentation for ACF. If ACF opts to archive the data from this validation study for secondary use, documentation will include information necessary to contextualize and assist in interpretation of the data. In consultation with ACF, the study team will develop other deliverables presenting psychometric evidence found for the HBCC-NSAC Toolkit provider questionnaire and recommendations for future use. This may include a report on study findings and the HBCC-NSAC Toolkit provider questionnaire. The agency also intends to make the HBCC-NSAC Toolkit provider questionnaire available to the public for professional development use and research purposes.


B8. Contact Persons

Table B.4 lists the federal and contract staff responsible for the study, along with each individual’s affiliation and email address.

Table B.4. Individuals responsible for study

Name | Affiliation | Email address
Ann Rivera | Office of Planning, Research, and Evaluation, Administration for Children and Families | Ann.Rivera@ACF.hhs.gov
Bonnie Mackintosh | Office of Planning, Research, and Evaluation, Administration for Children and Families | Bonnie.Mackintosh@acf.hhs.gov
Ashley Kopack Klein | Mathematica | AKopackKlein@mathematica-mpr.com
Sally Atkins-Burnett | Mathematica | SAtkins-Burnett@Mathematica-mpr.com
Annie Li | Mathematica | ALi@mathematica-mpr.com
Laura Kalb | Mathematica | LKalb@mathematica-mpr.com
Myah Scott | Mathematica | MScott@mathematica-mpr.com


Attachments

Instrument 1. Community organization onboarding call

Instrument 2. Provider telephone script and recruitment information collection, non-observation

Instrument 3. Provider telephone script and recruitment information collection including observations

Instrument 4. Observation scheduling call

Instrument 5. HBCC-NSAC Toolkit provider questionnaire

Instrument 6. Family survey

Appendix A. Community organization outreach materials

Appendix B. Respondent outreach materials

Appendix C. Respondent reminders

Appendix D. Observation materials

Appendix E. Consent letters and forms

Appendix F. Family data collection instructions

Appendix G. Frequently asked questions

Appendix H. Respondent thank you letters





1 Bromer, Juliet, Toni Porter, Christopher Jones, Marina Ragonese-Barnes, and Jaimie Orland. “Quality in Home-Based Child Care: A Review of Selected Literature.” OPRE Report 2021-136. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services, 2021. 


Del Grosso, Patricia, Juliet Bromer, Toni Porter, Christopher Jones, Ann Li, Sally Atkins-Burnett, and Nikki Aikens. “A Research Agenda for Home-Based Child Care.” OPRE Report 2021-218. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services, 2021. 

2 Schochet, O., Li, A., Del Grosso, P., Aikens, N., Atkins-Burnett, S., Porter, T., & Bromer, J. (2022). A national portrait of unlisted home-based child care providers: Provider demographics, economic wellbeing, and health. OPRE Brief #2022-280. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

Schochet, O., Li, A., Del Grosso, P., Aikens, N., Atkins-Burnett, S., Porter, T., & Bromer, J. (2022). A national portrait of unlisted home-based child care providers: Learning activities, caregiving services, and children served. OPRE Brief #2022-292. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

3 The study team recruited providers from the following 10 states in the pilot study (ACF’s generic clearance 0970 – 0355): Arizona, California, Connecticut, Georgia, Illinois, Maryland, North Carolina, Nevada, New York, and Texas.

4 HighScope Educational Research Foundation. Family Child Care Program Quality Assessment (PQA) Administration Manual. Ypsilanti, MI: HighScope Press, 2009. 

5 See section A.9 for more information about the family pre-token of appreciation.

6 Rasch models are unidimensional.


