
Assessing the Implementation and Cost of High Quality Early Care and Education: Comparative Multi-Case Study, Phase 1



OMB Information Collection Request

0970-0355




Supporting Statement

Part B

September 2015


Submitted by:

Office of Planning, Research and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


7th Floor, West Aerospace Building

370 L’Enfant Promenade SW

Washington, DC 20447


Project officers:

Ivelisse Martinez-Beck, Senior Social Science Research Analyst and

Child Care Research Team Leader

Meryl Barofsky, Social Science Research Analyst



TABLES

B.1 Number of centers selected for Phase 1

B.2 OPRE project officers, study team leadership, and TEP for the ECE-ICHQ



ATTACHMENTS

ATTACHMENT A: CENTER DIRECTOR TELEPHONE SCRIPT

ATTACHMENT B: CENTER DIRECTOR SELF-ADMINISTERED QUESTIONNAIRE (SAQ)

ATTACHMENT C: SAQ COGNITIVE INTERVIEW PROTOCOL

ATTACHMENT D: IMPLEMENTATION INTERVIEW PROTOCOL

ATTACHMENT E: COST WORKBOOK

ATTACHMENT F: COST INTERVIEW PROTOCOL

ATTACHMENT G: TIME-USE SURVEY

ATTACHMENT H: FEDERAL REGISTER NOTICE

ATTACHMENT I: INITIAL EMAIL TO CENTER DIRECTORS

ATTACHMENT J: ADDITIONAL RECRUITMENT MATERIALS




B1. Respondent Universe and Sampling Methods

The target population for this information collection is center-based early care and education (ECE) providers that serve children from birth to age 5. The sampling plan prioritizes including different types of ECE centers from three states that represent different geographic regions and different types of investments in early care and education, in order to understand the cost of implementing quality care in a variety of contexts.

The first priority is to collect information in states with Quality Rating and Improvement Systems (QRIS) that clearly link ratings to classroom observational scores (such as the Environment Rating Scale or the Classroom Assessment Scoring System), so that these scores can serve as a proxy for identifying centers at different levels of quality. The second priority is to include states that fund and offer prekindergarten programs within public schools, to achieve variation in the types of ECE settings included. The third priority is to include states with high numbers of programs participating in their QRIS, and specifically programs that achieve the highest rating level, to ensure an adequate pool of centers from which to recruit. Fourth, from the states that meet the first three priorities, the study team will select three that vary in their child care licensing regulations (which set a floor for quality). Finally, the three states should be from different geographic regions.

Once the three states are selected, the study team will use information from state child care administrators, the Office of Head Start’s central and regional offices, state and local education agencies, and QRIS data to select centers.

Phase 1 will collect data in 8 centers in each state, for 24 centers in total (see Table B.1). To make sure the instruments function well in a variety of settings, the 8 centers in each state will include 6 community-based centers, 4 of which will have medium or high QRIS ratings. Two of these 4 centers will have mixed funding (one serving infants and toddlers and one not) and 2 will have no public funding (one serving infants and toddlers and one not). The remaining 2 community-based centers will have a low QRIS rating and mixed funding (one serving infants and toddlers and one not). Each state's sample will also include one Head Start program serving infants and toddlers and one school-based prekindergarten program.


Table B.1. Number of centers selected for Phase 1


|                                | State 1 |  | State 2 |  | State 3 |  |  |
|                                | Serves infants and toddlers | Does not serve infants and toddlers | Serves infants and toddlers | Does not serve infants and toddlers | Serves infants and toddlers | Does not serve infants and toddlers | Total |
| Community-based centers        |    |   |    |   |    |   | 18 |
| High or medium QRIS rating^a   |    |   |    |   |    |   | 9  |
|   No public funding            | 1  | 0 | 1  | 0 | 1  | 0 |    |
|   High-subsidy funding^b       | 0  | 0 | 0  | 0 | 0  | 0 |    |
|   Mixed funding^c              | 1  | 1 | 1  | 1 | 1  | 1 |    |
| Low QRIS rating^a              |    |   |    |   |    |   | 9  |
|   No public funding            | 1  | 0 | 1  | 0 | 1  | 0 |    |
|   High-subsidy funding         | 0  | 0 | 0  | 0 | 0  | 0 |    |
|   Mixed funding                | 1  | 1 | 1  | 1 | 1  | 1 |    |
| Head Start or Early Head Start | 1  | 0 | 1  | 0 | 1  | 0 | 3  |
| School-based prekindergarten   | NA | 1 | NA | 1 | NA | 1 | 3  |
| Total                          |    |   |    |   |    |   | 24 |

Note: The figures shown for community-based centers and for the QRIS rating categories are subtotals and are not counted again in the overall total.

^a The equivalent state-specific ratings for the high or medium and low QRIS rating categories will be determined once state selection is finalized. The study team will aim to select cutoffs that result in comparable characteristics among centers in each category across states.

^b High-subsidy centers are those that serve a high proportion (50 percent or more) of children receiving child care subsidies through the Child Care and Development Fund or Temporary Assistance for Needy Families.

^c Mixed-funding centers are those that draw from tuition and one or more public funding sources, or from multiple public funding sources.

The study team will build lists of centers that fit each of the selection profiles (for example, centers with a medium or high QRIS rating that receive mixed funding and serve infants and toddlers). A selected center that chooses not to participate will be replaced by another center fitting the same profile in the same state (for example, in State A, a center with a low QRIS rating, mixed funding, and infant and toddler enrollment). Once a center is recruited, the study team expects to complete all data collection activities at that site.
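
To make the profile-matched replacement procedure concrete, the following sketch illustrates one way it could be implemented. It is an illustration only, not part of the study's tooling; the Python code assumes hypothetical field names (state, qris_rating, funding, serves_infants_toddlers) for the center lists the study team would assemble.

```python
from dataclasses import dataclass
from typing import List, Optional, Set, Tuple

@dataclass(frozen=True)
class Center:
    """One center in the sampling frame (hypothetical fields for illustration)."""
    center_id: str
    state: str                     # e.g., "State A"
    qris_rating: str               # "high/medium" or "low"
    funding: str                   # "no public", "high subsidy", or "mixed"
    serves_infants_toddlers: bool

def profile(center: Center) -> Tuple[str, str, str, bool]:
    """The selection profile a replacement must match, including the state."""
    return (center.state, center.qris_rating, center.funding,
            center.serves_infants_toddlers)

def find_replacement(declining: Center,
                     frame: List[Center],
                     already_selected: Set[str]) -> Optional[Center]:
    """Return another center from the same state with the same profile,
    or None if the reserve list for that profile is exhausted."""
    for candidate in frame:
        if (candidate.center_id != declining.center_id
                and candidate.center_id not in already_selected
                and profile(candidate) == profile(declining)):
            return candidate
    return None
```

In practice, the reserve lists for each profile would be built from the QRIS, subsidy, and Head Start administrative data described above.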

B2. Procedures for Collection of Information

Prior to this information collection, the study team will secure the engagement of three states through calls with state administrators, including QRIS and state prekindergarten administrators. As noted above, the study team will select centers using information from QRIS, subsidy systems, and the Office of Head Start.

After an initial email notifying centers about their potential participation and explaining the study (Attachment I), members of the study team will recruit selected centers by telephone (Attachment A; additional recruitment letters and scripts can be found in Attachment J). Once centers agree to participate, site visits will be scheduled at a time most convenient for center staff. At least two weeks before the visit, the team will send the self-administered questionnaire (SAQ) and cost workbook to center directors and finance managers, each accompanied by a cover letter and instructions. As noted in Section A, center directors and finance managers will be asked to complete these instruments and return them to the study team before the site visit.

On site, the study team will complete the following activities:

  • Conduct semi-structured interviews with the program staff who completed the SAQ and cost workbook.

  • Review financial, staffing, planning, and relevant classroom documents, and debrief with staff on the document review.

  • Conduct a semi-structured interview with administrators of the larger organization, if applicable.

  • Work with the center director to distribute time-use surveys to program staff.

  • Collect completed time-use surveys from program staff.

  • Debrief staff after they have completed the time-use survey to determine whether the questions were easily understood and the response categories were relevant.

B3. Methods to Maximize Response Rates and Deal with Nonresponse

Expected Response Rates

During Phase 1 of the data collection, the study team will seek guidance from state QRIS administrators, the Office of Head Start, state Head Start liaisons, and state prekindergarten administrators to select the 24 centers. The team will work with these administrators to create a comprehensive list of centers that meet the selection criteria, with enough centers in reserve to replace those that are unable or unwilling to participate. Based on past studies, we expect to contact 72 centers to secure the participation of the 24 centers needed for this study. All 72 centers will take part in recruiting discussions; the 24 centers that agree to participate will complete the full study engagement call (a response rate of 100 percent). The team will then complete all of the on-site data collection activities with all 24 participating centers (a response rate of 100 percent). The team will schedule site visits only for times when the center director, education manager, and finance manager are available.

Dealing with Nonresponse

The potential for nonresponse exists mainly for the time-use survey, for which the study team intends to collect data from all teaching staff at a participating center. The team will work closely with each center to maximize teaching staff participation in the time-use survey, distributing and collecting the paper surveys during the course of the site visit. If a teacher does not complete the survey, the study team will attempt to understand why, either through follow-up with the teacher or with the center director. The time-use data will be analyzed by type of staff (such as lead teachers or aides) for use in the cost analysis. We are attempting to collect data from all teaching staff at a center in Phase 1 in order to understand the extent of variation within centers and among staff with similar roles. If some teaching staff do not respond to the survey, we will conduct the analysis using available data from other staff in similar positions. We may exclude centers with high nonresponse from the analysis of variation in staff time use. The team is not collecting the information on individual staff members' characteristics that would be needed to compare respondents with nonrespondents; however, we will compare the characteristics of centers with high nonresponse to those of other centers in the study sample.
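
To illustrate how this handling of time-use nonresponse could work, the sketch below aggregates available responses by staff role within each center and flags centers with high nonresponse. The column names (center_id, staff_id, role, hours_on_activity) and the 70 percent response-rate threshold are hypothetical placeholders, not study specifications.

```python
import pandas as pd

def summarize_time_use(responses: pd.DataFrame,
                       roster: pd.DataFrame,
                       min_response_rate: float = 0.7) -> pd.DataFrame:
    """Average reported hours by center and staff role, using whatever
    responses are available, and flag centers with high nonresponse.

    responses: one row per completed time-use survey
               (columns: center_id, staff_id, role, hours_on_activity)
    roster:    one row per teaching staff member on the center's roster
               (columns: center_id, staff_id, role)
    """
    # Response rate per center: completed surveys / teaching staff on roster.
    completed = responses.groupby("center_id")["staff_id"].nunique()
    eligible = roster.groupby("center_id")["staff_id"].nunique()
    rate = (completed / eligible).rename("response_rate")

    # Mean hours by center and role, based on available responses only.
    by_role = (responses
               .groupby(["center_id", "role"])["hours_on_activity"]
               .mean()
               .reset_index())

    # Attach the response rate and flag centers below the assumed threshold.
    out = by_role.merge(rate.reset_index(), on="center_id", how="left")
    out["high_nonresponse"] = out["response_rate"].fillna(0) < min_response_rate
    return out
```

Centers flagged in this way could then be set aside from the within-center variation analysis while still contributing available data to the cost analysis.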

Maximizing Response Rates

Mathematica has extensive experience collecting implementation information and cost data with high response rates from staff in education, social services, and health programs. To maximize response rates in this purposive sample, the study team will contact state QRIS administrators, the Office of Head Start, state Head Start liaisons, and state prekindergarten administrators to obtain a sufficient list of centers that meet the selection criteria in each of the three states. During the cognitive interview that follows each administration of a data collection tool, the study team will probe to identify questions that respondents have trouble answering or do not know the answer to.

B4. Tests of Procedures or Methods to be Undertaken

The methods used to understand the key functions and their costs (the SAQ, cost workbooks, time-use surveys, and interview protocols) have not previously been used together for the purpose of understanding the costs of high quality child care. This data collection will test these procedures and methods, in combination and across a variety of ECE settings, to produce measures that are feasible, applicable, and meaningful for informing decision making and resource use in center-based ECE settings. The study team will pre-test the tools and procedures for Phase 1 in three centers as part of a pilot study conducted prior to this information collection. In the pilot, the study team will take the same cognitive interviewing approach as in Phase 1 to make sure the language is easily understood by respondents.

B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

Mathematica and consultant Dr. Elizabeth Davis of the University of Minnesota are conducting this project under contract number HHSP23320095642WC. Mathematica developed the plans for this data collection and analysis and consulted with a technical expert panel (TEP). Leaders of the study team from the Office of Planning, Research and Evaluation (OPRE) and from Mathematica are listed in Table B.2, along with members of the TEP.

Table B.2. OPRE project officers, study team leadership, and TEP for the ECE-ICHQ

| Name | Affiliation |
| Ivelisse Martinez-Beck | Office of Planning, Research and Evaluation |
| Meryl Barofsky | Office of Planning, Research and Evaluation |
| Gretchen Kirby | Mathematica Policy Research |
| Kimberly Boller | Mathematica Policy Research |
| Pia Caronongan | Mathematica Policy Research |
| Andrew Burwick | Mathematica Policy Research |
| Cassandra Meagher | Mathematica Policy Research |
| Anne Gordon | Mathematica Policy Research |
| Melanie Brizzi | Office of Early Childhood and Out of School Learning, Indiana Family and Social Services Administration |
| Rena Hallam | Delaware Institute for Excellence in Early Childhood, University of Delaware |
| Lynn Karoly | RAND Corporation |
| Mark Kehoe | Brightside Academy |
| Henry Levin | Teachers College, Columbia University |
| Katherine Magnuson | School of Social Work, University of Wisconsin–Madison |
| Tammy Mann | The Campagna Center |
| Nancy Marshall | Wellesley Centers for Women, Wellesley College |
| Allison Metz | National Implementation Research Network, FPG Child Development Institute, University of North Carolina at Chapel Hill |
| Louise Stoney | Alliance for Early Childhood Finance |

