REL West: Study of the Program for Infant and Toddler Care (Task 2)
OMB Supporting Statement
Contract No: ED-06-CO-0014
Prepared for
Office of Information and Regulatory Affairs
Office of Management and Budget
Docket Library, Room 10102
725 17th Street NW
Washington, DC 20503
Prepared by
WestEd
730 Harrison Street
San Francisco, California 94107
Berkeley Policy Associates
440 Grand Ave, Suite 500
Oakland, California 94610
Revised Submission: July 17, 2007
Table of Contents
Supporting Statement For Paperwork Reduction Act Submission
A. Justification
1. Circumstances that Make Collection of Data Necessary
2. Purposes and Use of the Data
3. Use of Improved Information Technology to Reduce Burden
4. Efforts to Identify and Avoid Duplication
5. Efforts to Minimize Burden on Small Businesses or Other Entities
6. Consequences if the Information is Not Collected or is Collected Less Frequently
7. Special Circumstances
8. Federal Register Comments and Persons Consulted Outside the Agency
9. Payments to Respondents
10. Assurances of Confidentiality
11. Justifications for Questions of a Sensitive Nature
12. Estimate of Information Collection Burden
13. Estimate of Total Annual Cost Burden
14. Estimates of Annualized Costs to the Federal Government
15. Change in Reporting Burden
16. Plans for Tabulation and Reporting
17. Display of Expiration Date for OMB Approval
18. Exceptions
References for Part A
B. Collections of Information Employing Statistical Methods
1. Respondent Universe and Sampling Methods
2. Statistical Power of the Sample
3. Maximizing Response Rates
4. Pretesting
5. Contact Information
References for Part B
Appendix A: Draft Recruitment Flyers
Appendix B: Draft Consent Forms
Appendix C: Draft Instruments
Exhibits
Exhibit 1. Data Collection Instruments
Exhibit 1A. Copyright Status of Instruments
Exhibit 2. Procedures for Handling Class 1 Data
Exhibit 3. Burden Estimate for Each Data Collection Activity
Exhibit 4. Burden Estimate For Recruitment Meetings
Exhibit 5. Annual Number of Responses and Hour/Cost Burden by Task and Total
Exhibit 6. Study of PITC: Reporting Schedule
Exhibit 7. Overview of Study Timeline
Exhibit 8. Study of PITC: Waves for Recruitment, Random Assignment, Data Collection, and Program Implementation
Exhibit 9. Licensed Child Care Programs in Target Counties
Exhibit 10. Statistical Power Calculations
SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION
Study of the Program for Infant Toddler Care
The Study of the Program for Infant Toddler Care will be conducted by Berkeley Policy Associates in partnership with the University of Texas and SRM Boulder. The Program for Infant Toddler Care (PITC) was developed by WestEd in 1985 in partnership with the California Department of Education. Since that time it has grown to be a major provider of infant and toddler training. The primary goal of the evaluation is to conduct a rigorous impact assessment of the PITC, including impacts on both program quality and children’s development, particularly on the language, cognitive, and social skills that are closely associated with school readiness. Additional goals are to analyze differential impacts on child care centers and family child care homes, and on different sub-groups of children, by language and family background; to inform improvement of the program and guide its replication throughout the region; and to inform policy on the development of high-quality infant/toddler child care. The study will use a cluster-based random assignment design and will recruit 240 child care programs and about 1,000 children in Southern California and Arizona.
A. Justification
1. Circumstances that Make Collection of Data Necessary
This information collection is being conducted as one of the Task 2 Studies (Rigorous Applied Research and Development) of the 2005-2010 Regional Education Laboratories Program. The current authorization for the Regional Educational Laboratories program is under the Education Sciences Reform Act of 2002, Part D, Section 174, (20 U.S.C. 9564), administered by the Institute of Education Sciences’ National Center for Education Evaluation and Regional Assistance.
As required in the above legislation, this study addresses a regional and national priority: to improve the quality of infant child care. Research has documented that the majority of group-based infant child care programs in the U.S. and in the region are of poor to moderate quality (Whitebook, et al., 1990; Cost, Quality and Outcomes Study Team, 1995; NICHD Early Childhood Research Network, 2005). In a 1991 study of infant/toddler child care throughout the United States, only 8 percent of care was seen as developmentally appropriate and 40 percent was actually seen as harmful to children (Cost, Quality & Outcomes Study Team, 1995). The Carnegie Corporation of New York in 1994 noted a “quiet crisis” in the field of infant child care.
The Program for Infant Toddler Care is an important emerging strategy for improving the quality of care. Preliminary research on the PITC has found it to be associated with significant increases in program quality, including improvements in caregiver-child interactions and language/learning activities. To date these effects have been documented through pre-post comparisons of child care environment and caregiving quality. For example, in a California-wide evaluation, the Caregiver Interactions subscale of the Infant/Toddler Environment Rating Scale (ITERS) was rated in the “good” range post-training, a significant improvement from its rating in the “minimal” range pre-training. These pre-post findings are noteworthy in light of the stability of child care quality in the absence of intervention as documented elsewhere (e.g., Clifford, 2004).
However, an additional, more rigorous study is needed to provide evidence of the impact of the PITC on programs and on children, and to guide future decisions that will improve the quality of infant care in the region. The proposed study is informed by prior research on the association between the quality of early caregiving and children’s development of language, social, and cognitive skills. This study will be the first to investigate these relationships for the PITC and the first to meet the Department of Education’s standards of evidence for “what works” in improving infant toddler care.
2. Purposes and Use of the Data
The primary goal of the evaluation is to conduct a rigorous impact assessment of the Program for Infant Toddler Care, including impacts on both program quality and on children’s development. Evaluation results will be used by REL West to improve infant and toddler child care in the region. In addition, results will be disseminated throughout the Regional Educational Laboratory Network and to education/child development agencies nationwide, in order to inform and improve child care policies and caregiver training.
If the study finds that the PITC program has a positive impact on child care quality and child development, these results will be used as a basis for securing additional funding in order to further disseminate the program and replicate its positive impacts. If the study finds the PITC program does not have a positive impact, or has differential impacts on various dimensions of child care quality and child development, these results will be used to redesign and improve the program or its implementation.
The Program for Infant Toddler Care (PITC) was developed by WestEd in 1985 in partnership with the California Department of Education. Since that time it has grown to be a major provider of infant and toddler training. Over 1,000 early childhood trainers across 16 states have undergone intensive training in WestEd’s PITC, and these trained professionals have in turn trained over 10,000 caregivers. The program has been developed and fielded over two decades and has built strong connections with early childhood stakeholders in the western region. It was also closely involved with Early Head Start during that program’s first eight years, having trained over 1,200 EHS trainers.
PITC is a responsive, relationship-based approach to infant/toddler care and is based on extensive developmental research, theory, and practice. The PITC curriculum is divided into four modules: Social Emotional Growth; Group Care; Learning & Development; and Culture, Family and Providers. Each module includes between four and six topics. PITC training is delivered through multiple modalities including on-site training, group training, video, print and web-based materials. A unique aspect of PITC training is its highly accessible and individualized format. In working with programs, certified trainers deliver all or most training during the evenings or on weekends. All programs receive a combination of group training (on-site for centers) and on-site coaching or other individualized assistance. For each course “section” (a section includes two modules) trainers work with each program to develop a customized program improvement plan and to review progress toward the plan; programs must demonstrate progress in order to receive credit for each section. Trainers tailor teaching strategies to the learning styles, preferences, needs, and culture of the caregivers. Trainers are assigned to match the language and cultural backgrounds of the caregivers, to the extent possible.
PITC staff and evaluators will recruit programs for the study in areas of Southern California and Arizona where demand for PITC services is high. The counties targeted for the study include Los Angeles, San Bernardino, Orange, and Riverside Counties in California, and Maricopa and Pima Counties in Arizona. Urban, suburban, and rural parts of these counties will be included. Neither PITC nor other intensive infant-toddler training programs are yet widely established in these areas. In these areas, emerging interest and limited availability of the program will make it possible to recruit and assign applicants for PITC to treatment and control conditions, and to maintain those conditions with ongoing monitoring. The “counterfactual” condition in the study will be represented by control group programs and children, who will receive no special training other than what would ordinarily be available to them in the absence of the study.
The study will measure impacts of the PITC on both program quality and child development. Recruitment and baseline data collection, followed by random assignment, will take place in 2007. The PITC will be implemented in child care programs assigned to the treatment group over a 10-16 month period beginning in mid-2007. Program follow-up data collection will take place in mid-2008. Child follow-up data collection will take place in two rounds: the first in mid-2008 and the second in mid-2009. A detailed study timeline is included under Question 16.
Below we present a description of data collection activities and instruments for the study, followed by Exhibit 1, which provides an overview of the instruments and their time requirements. All instruments and measures will be available in both Spanish and English.
1A & 1B. Child Care Provider Screening Interview: Centers and Family Child Care Providers. During an initial telephone contact from recruiters, program directors who agree to be interviewed will be asked questions to determine their eligibility for the study. Questions will address the ages of children in care, languages spoken in the program, length of time program has been operating, and turnover rates among children and caregivers. Providers who complete the interview and who are eligible for the study will be invited to a meeting at which researchers and PITC staff will further explain the study and the intervention and will distribute informed consent forms.
2. Parent Baseline Questionnaire. Child care providers who consent to participate in the study will be asked to distribute consent forms to parents of children under the age of two, with this brief questionnaire to be completed by parents who grant consent. This questionnaire will ask about children’s ages, language spoken at home, and parents’ employment and educational status. Family contact information will also be obtained through this questionnaire so that researchers can reach families at the first and second follow-up points one year and two years later.
3. Center Director Questionnaire. At both baseline (2007) and follow-up (2008), all child care center directors will be asked additional questions about program structure, staffing, enrollment, and services, beyond those asked in the brief screening questionnaire. The baseline director and caregiver questionnaires will be mailed to programs after all consent forms have been completed. Completed questionnaires will be collected by field researchers during the observation visit. (We are also considering on-line administration of these questionnaires.)
4. Center Caregiver Questionnaire. The center caregiver questionnaire, also administered at baseline and follow-up, will ask about caregivers’ background and training. Additional items in this questionnaire are measures of the quality of caregiving and are based on the Arnett Caregiver Interaction Scale (Arnett, 1989). The questionnaire also includes two additional scales. Ideas about Raising Young Children is a 30-item self-administered caregiver questionnaire scaled on a 5-point Likert scale (1 = "Strongly Disagree" to 5 = "Strongly Agree"). Sample items include “children should always obey the teacher” and “children will be bad unless they are taught what is right”, with higher values indicating more traditional beliefs about raising young children (Cronbach's alpha = .89). The Taking Care of Young Children Scale is a 28-item, self-administered questionnaire intended to measure caregiver perceptions of concerns and rewards with regard to taking care of young children. Items on this questionnaire are rated on a 4-point Likert scale (1 = "not at all" a concern/reward, 4 = "extremely" a concern/reward). In the NICHD Study of Early Child Care and Youth Development (SECCYD), principal components analysis revealed four factors: Emphasis on Work Characteristics (6 items, Cronbach's alpha = .86), Emphasis on Caring for Children (8 items, Cronbach's alpha = .72), Emphasis on Working with Children (6 items, Cronbach's alpha = .75), and Emphasis on Caregiver's Own Needs (5 items, Cronbach's alpha = .72).
5. Family Caregiver Questionnaire. Family child care owner/directors and other caregivers will receive a questionnaire similar to the center director and center caregiver questionnaires. Program-level questions will be adapted as appropriate for family child care settings.
6. Individual Child Form for Caregiver. One of these forms will be completed for each child by a caregiver/teacher. This form serves as an indirect baseline measure of the children's language and social-emotional development. The form is divided into three sections. The first section has six items about the caregiver/teacher's relationship with the child. These items come from the ECLS-B 2-year Child Care Provider Interview. The second section has eight items about the child's behavior (i.e., social-emotional development). These eight items also come from the ECLS-B 2-year Child Care Provider Interview but they were originally adapted from the Infant/Toddler Symptom Checklist (DeGangi, Poisson, Sickel, and Wiener, 1995). The third section has three items about the child's language development. Some of these items were adapted from the ECLS-B 2-year Parent Interview and some were developed to include information about the language skills of pre-verbal children.
7A. Center Observation Instrument. All programs in both treatment and control groups will be observed by trained field researchers at both baseline (2007) and follow-up (2008). The center observation instrument will include the complete Infant/Toddler Environment Rating Scale-Revised (ITERS-R), selections from the PITC Program Assessment Rating Scale (PARS), and several additional items from the National Association for Family Child Care (NAFCC) standards. In each center, researchers will observe two infant-toddler classrooms for between four and six hours each to complete the measures.
The Infant/Toddler Environment Rating Scale-Revised (ITERS-R) measures the quality of care experienced by all children (infants to 2½ years) in center-based classrooms (Harms, Clifford, & Cryer, 2003). The 39 items of the ITERS-R comprise seven subscales: Space/Furnishings, Personal Care, Language/Reasoning, Activities, Interaction, Program Structure, and Parents and Staff. Internal consistencies of the ITERS-R subscales range from .47 to .93, with a total scale internal consistency of .93. Its counterpart for family child care settings, the FDCRS (see 7B below), groups its 32 items into six categories: space and furnishings for care and learning, basic care, language and reasoning, learning activities, social development, and adult needs. Eight additional items are included for use with settings that include children with special needs.
The PITC Program Assessment Rating Scale (PARS) was designed by PITC staff to measure the quality experienced by children from birth through age 3 in home-based and center-based settings, in accordance with PITC philosophy. The 98 items are scored either 1 (met) or 0 (not met). The five subscales are Quality of Caregiver Interaction with Infants (28 items, α=.90); Family Partnerships, Cultural Responsiveness, and Inclusion of Children with Special Needs (20 items, α=.78); Relationship-Based Care (16 items, α=.74); Physical Environment (28 items, α=.80); and Routines and Record Keeping (16 items, α=.68). Inter-rater reliability for the individual subscales ranged from .79 to .86. Concurrent validity of the Quality of Caregiver Interaction with Infants subscale with the Arnett subscales is moderate (warmth r=.60, criticalness r=-.70, and distance r=-.60). A high degree of concurrent validity was found between the PARS total score and the ITERS-R (r=.84), the ECERS-R (r=.88), and the FDCRS (r=.86) (Mangione, Kriener-Althen, Niggle, & Welsh, 2006).
7B. Family Child Care Program Observation Instrument. This instrument is similar to the center observation instrument, with use of scales and items that are appropriate for family child care settings. The counterpart of the ITERS-R for home-based settings has been the Family Day Care Rating Scale (FDCRS; Harms & Clifford, 1989); however, this scale is currently undergoing revision, and an updated scale, the FCCERS-R, will be available in spring 2007 and will be used for the study. Researchers will observe each family child care program for between four and six hours to complete the FCCERS-R, as well as selected items from the PITC-PARS and NAFCC standards.
8A and 8B. Center and Family Child Care Interview with Observation. These interviews are integral to the observation instruments above; interview questions have been incorporated to address items on the ITERS, FCCERS, or PARS that may not be observable at the time of the visit. Researchers will ask program directors some or all of these questions, as needed.
9. PITC Training Evaluation. Caregivers in the treatment group will be asked to complete this brief questionnaire after completion of the second and fourth of the four PITC modules (approximately six months and twelve months after start-up of program implementation).
10. Parent Follow-Up Questionnaire. This questionnaire will be administered along with child outcomes data collection twelve months and twenty-four months after random assignment (2008 and 2009). It will address the child’s caregiving arrangements over the past year and changes in family status since baseline, and will incorporate several child measures and a parenting measure.
Child measures include two social-behavioral measures. The Positive Behavior Scale was developed for the New Chance survey (Quint, Bos, and Polit, 1997), a study of over 2,000 low-income mothers and their children. Its 25 items can be divided into three subscales: compliance/self-control (for example, thinks before he/she acts, usually does what I tell him/her), social competence and sensitivity (for example, gets along well with other children, shows concern for other people’s feelings), and autonomy (for example, tries to do things for him/herself, is self-reliant). The parent responds on a five-point scale, ranging from “never” to “all of the time.” It has high internal consistency (Epps, Park, Huston, & Ripke, 2005). The Problem Behavior Scale from the Social Skills Rating System (Gresham and Elliott, 1990) has two components: externalizing problems and internalizing problems. Externalizing problems include aggression and lack of behavior control (e.g., “is aggressive toward people or objects,” “has temper tantrums”). Internalizing problems include social withdrawal and excessive fearfulness (e.g., “appears lonely,” “acts sad or depressed”). The internal consistencies for parents’ ratings of preschool children are satisfactory. To reduce response biases, the items from the positive and problem behavior scales are mixed together for administration.
The MacArthur Communicative Development Inventories, available in English and Spanish, are structured interviews to obtain parent reports of children’s language at three age levels: 8-16 months – words and gestures; 16-30 months – productive vocabulary, irregular word forms, overgeneralization; 30-42 months – the range of language. These scales are well-validated (e.g., Feldman et al., 2005). A short form of these inventories will be incorporated into the parent questionnaires.
The Home Observation for Measurement of the Environment (HOME) Inventory (Caldwell and Bradley, 1984) was designed to measure the quality and quantity of stimulation and support available to a child in the environment. Although the HOME is primarily an observational assessment, selected items from the inventory will be considered for incorporation into the parent questionnaire. We will consider items from the ECLS-B 9-month Parent Interview that were modified from the HOME Inventory and the National Household Education Survey (NHES).
11. and 12. In-Person Child Measures
In-person child outcomes data collection will take place in 2008 and 2009. Trained researchers/child assessors will arrange to meet with all treatment and control group children and their parents in their homes. The parent follow-up questionnaire will be mailed to parents in advance of the meeting. The in-person child assessment will require one hour, with up to one half hour of additional time for parents to complete the questionnaire (the researcher may administer it as an interview) if they have not done so prior to the meeting. During both rounds the researcher will begin the meeting with a brief review of the study and will read and discuss the informed consent protocol before asking the parents to sign the form.
During the first round of assessments children’s ages will range from approximately eighteen months to thirty-six months; during the second round, their ages will range from twenty-four through forty-eight months. Growth and development during this time necessitate that different measures be used at each round of assessment.
In 2008 children will be assessed with the Bayley Scales of Infant and Toddler Development, Third Edition. It measures five major developmental domains (cognitive, language, motor, social-emotional, and adaptive behavior) for children from infancy through 3.5 years. It has been used in several national evaluations of very young children, including the Early Childhood Longitudinal Study-Birth Cohort and the Early Head Start Study. It is standardized on a sample of children that is representative of the population based on the 2000 census. It has good reliability and is a valid indicator of children’s current developmental level. A short form of this measure can be administered in twenty to twenty-five minutes. Copies cannot be appended because the test involves large materials (e.g., blocks, pictures, objects) that the examiner manipulates. Information can be found at http://harcourtassessment.com.
In 2009, children will be assessed with the following two measures:
The Woodcock-Johnson Psycho-Educational Battery-III has tests of early letter and number skills in English and Spanish (Mather, 2002). It is normed for age 2 to adult. It is a widely used measure of school-related skills with excellent reliability and validity (McGrew, 2001), and can be administered to young children in twenty-five minutes. Two tests will be used for this study. Letter-Word Identification: the first five items involve symbolic learning, or the ability to match a pictographic representation of a word with an actual picture of the object; the remaining items measure the subject's skill in identifying isolated letters and words. It is not necessary that the subject know the meaning of any word correctly identified, and the items become more difficult as they present words that appear less and less frequently in written English. Applied Problems: this test measures the subject's skill in analyzing and solving practical problems in mathematics; to solve the problems, the subject must recognize the procedure to be followed and then perform relatively simple calculations.
The Peabody Picture Vocabulary Test-III is a measure of receptive vocabulary with norms for ages 2.5 – adult (http://ags.pearsonassessments.com). The test can be administered in fifteen minutes. Children are shown a card containing four pictures and asked to point to the one that corresponds to a word the examiner says. There is a Spanish equivalent version. Copies cannot be appended because the materials include a large cardboard book and an extensive manual for examiners.
Exhibit 1 presents an overview of all data collection instruments and their role in the study.
Exhibit 1. Data Collection Instruments
Instrument | Respondent Group | Content | Purpose | Mode of Administration | Time Needed | Timeline | Approx. Dates
1A. Child Care Provider Screening Interview-Centers; 1B. Child Care Provider Screening Interview-Family Child Care Programs (similar to 1A, with minor adjustments) | Child care directors (centers) or owners (family child care programs) | Basic program characteristics: number and ages of children, staffing, languages spoken, program stability | Determine or confirm eligibility for the study | Telephone | 20 min | Recruitment/baseline | April 07-June 07
2. Parent baseline questionnaire (attached to consent form) | Parents | Family and child background characteristics: children’s ages, parents’ education and employment status, ethnicity and languages spoken | Confirm eligibility for study; analyze subgroup differences; refine impact estimates; intent-to-treat analysis | Paper and pencil | 20 min | Recruitment/baseline | April 07-June 07
3. Center Director Questionnaire | Child care center directors | Center goals, services, practices, detailed enrollment and staffing characteristics | Describe subgroup differences; measure impact on structural quality | Paper and pencil or online | 20 min | Baseline and program follow-up | June 07-Sept 07 AND June 08-Sept 08
4. Center Caregiver Questionnaire | Child care staff working with children under age 3 | Caregiver attitudes, beliefs, background, training, education | Describe subgroup differences; measure impact on caregiving quality | Paper and pencil or online | 30 min | Baseline and program follow-up | June 07-Sept 07 AND June 08-Sept 08
5. Family Caregiver Questionnaire | Family child care owner/directors and other caregivers | Program/enrollment characteristics and attitudes, beliefs | Describe subgroup differences; measure impact on caregiving quality | Paper and pencil or online | 30 min | Baseline and program follow-up | June 07-Sept 07 AND June 08-Sept 08
6. Individual Child Form for Caregiver | One teacher/caregiver for each child | Child’s relationships, behavior, and language in the care setting | Analyze subgroup differences; refine impact estimates; intent-to-treat analysis | Paper and pencil | 20 min | Baseline | June 07-Sept 07
7A. Center Observation Instrument (includes ITERS-R, items from PARS); 7B. Family Child Care Observation Instrument (includes FCCERS-R, items from PARS) | Child care programs (observed) | Program quality indicators | Impact analysis, plus implementation analysis for PITC-related practices | Observation by field researchers | None (observation will take 4-6 hours, but no respondent time is required) | Baseline and program follow-up | June 07-Sept 07 AND June 08-Sept 08
8A. Center Interview with Observation; 8B. Family Child Care Interview with Observation | Child care directors or owners | Program quality indicators | Impact analysis, plus implementation analysis for PITC-related practices | Interview by field researcher, as needed, for non-observable items | 30 min | Baseline and program follow-up | June 07-Sept 07 AND June 08-Sept 08
9. PITC Training Evaluation | Caregivers in all treatment programs | Quality, usefulness, and fidelity of PITC training and technical assistance | Implementation analysis | Paper and pencil | 15 min | Throughout intervention, at completion of each of the 4 modules | Rolling, throughout 08
10. Parent follow-up questionnaire with child measures, including positive behavior and problem behavior scales, MacArthur Communicative Development Inventory (short form), plus a brief parenting measure | Parents | Update family/child data, including changes in care arrangements; also includes social-behavioral and language measures | Impact analysis; control-treatment contrast | Paper and pencil, or in-person in-home interview by researcher | 30 min (if parents do not complete prior to the in-person visit, this time will be shifted to extend the in-person visit to 90 min) | Child follow-up, both rounds | June 08-Sept 08 AND June 09-Sept 09
11. In-person child measures, 1st round (includes Bayley Scales of Infant and Toddler Development, short form) | Parent with child | Child language, cognitive, and general development | Impact analysis | Direct child assessment by researcher (in home) | 60 min (see above) | Child follow-up, 1st round | June 08-Sept 08
12. In-person child measures, 2nd round (includes Woodcock-Johnson Psycho-Educational Battery and Peabody Picture Vocabulary Test) | Parent with child | Child language, cognitive, and general development outcomes for impact analysis | Impact analysis | Direct child assessment by researcher (in home) | 60 min (see above) | Child follow-up, 2nd round | June 09-Sept 09
Exhibit 1A identifies copyrighted materials within the study instruments, and also describes the status of permission to use these copyrighted measures or items for the study. Permission will be strictly limited to use of the measures for the study and will not permit further reproduction of the measures or sharing in the public domain.
Exhibit 1A. Copyright Status of Instruments
Instrument | Copyrighted Items | Status of Permission | Comments
1A. Child Care Provider Screening Interview-Centers; 1B. Child Care Provider Screening Interview-Family Child Care Programs | No copyrighted items | NA | 
2. Parent baseline questionnaire (attached to consent form) | No copyrighted items | NA | 
3. Center Director Questionnaire | No copyrighted items | NA | 
4. Center Caregiver Questionnaire | No copyrighted items | NA | 
5. Family Caregiver Questionnaire | No copyrighted items | NA | 
6. Individual Child Form for Caregiver | No copyrighted items | NA | 
7A. Center Observation Instrument (includes ITERS-R, items from PARS); 7B. Family Child Care Observation Instrument (includes FCCERS-R, items from PARS) | All items are copyrighted and should be kept out of public view | Teachers College Press is reviewing our request for permission to reproduce the ITERS and FCCERS for use in the study. The permission will be limited to use of the scales for the study and will NOT permit placement of the scales in the public domain. | These two instruments are copyright protected and should be kept from public view.
8A. Center Interview with Observation; 8B. Family Child Care Interview with Observation | All items are copyrighted and should be kept out of public view | See above | These two instruments are copyright protected and should be kept from public view.
9. PITC Training Evaluation | No copyrighted items | NA | 
10. Parent follow-up questionnaire with child measures, including positive behavior and problem behavior scales, MacArthur Communicative Development Inventory (short form), plus a brief parenting measure from the HOME | Items 27-30 from the Problem Behavior Scale, the MacArthur CDI, and the HOME are copyrighted and should be kept out of public view. | We are in the process of purchasing these copyrighted measures, which includes obtaining permission to use the copyrighted items for the study; this permission will be limited as for the ITERS and FCCERS above. | The currently submitted instrument includes several copyrighted items, so please do not place it in public view.
11. In-person child measures, 1st round, using the Bayley Scales of Infant and Toddler Development (BSITD, short form) | The BSITD is copyrighted and should be kept out of public view. | Survey Research Management (SRM) will be purchasing copies of the measure for administration in 2008. Permission to use the measure will be obtained in the process of purchasing the kits. Permission will be limited to use of the measure for the study. | This measure cannot be submitted electronically. The current file includes a description of the measure, not the actual measure. (No copyrighted material is included.)
12. In-person child measures, 2nd round, using the Woodcock-Johnson Psycho-Educational Battery and the Peabody Picture Vocabulary Test (PPVT) | The Woodcock-Johnson and the PPVT are copyrighted and should be kept out of public view. | Same as above for the BSITD. | Same as above for the BSITD.
3. Use of Improved Information Technology to Reduce Burden
The respondents for this study, who include small, home-based child care programs and parents of varied backgrounds, may not have ready access to technology; hence, most data will be collected through paper-and-pencil questionnaires and telephone and in-person interviewing. However, we are considering on-line administration for center director and caregiver questionnaires.
4. Efforts to Identify and Avoid Duplication
Limited data on child care program characteristics, such as licensed capacity and addresses, are available through state databases, and these will be extracted for the study. The other data needed for the study are available only through the data collection activities proposed here; these include data on caregiver backgrounds, child care programs’ environments and practices, and children’s development.
5. Efforts to Minimize Burden on Small Businesses or Other Entities
Child care programs participating in the study will include nonprofit organizations and small for-profit businesses. We will limit burden on these entities by: 1) collecting only those data that are essential to meeting the goals of the study; 2) collecting much of the data through unobtrusive observation; 3) compensating child care providers for their time spent in providing data for the study.
6. Consequences if the Information is Not Collected or is Collected Less Frequently
In the absence of this study it would be difficult for REL West to carry out its responsibility under the authorizing legislation; in particular, it would be unable to meet the REL mission of serving the high-priority educational needs of the region using rigorous studies.
7. Special Circumstances
There are no special circumstances associated with this information collection.
8. Federal Register Comments and Persons Consulted Outside the Agency
The notice was published in the Federal Register on October 26, 2006, on page 62589. To date, no public comments have been received.
Lead researchers for the study have consulted on the content, form and frequency of data collection with staff of the Program for Infant and Toddler Care in California and Arizona, with members of the Western Regional Laboratory Technical Work Group, and with the Institutional Review Boards of the University of Texas at Austin and Berkeley Policy Associates.
9. Payments to Respondents
Study participants will be paid as follows:
Programs: For child care centers, we will provide a $15 gift card per classroom (a maximum of two classrooms per center will be included) to each program that returns the completed packet of caregiver and parent informed consent forms within two weeks. A completed packet will include a minimum of two caregiver forms per classroom and three to six (depending on classroom size) parent forms per classroom. For family child care homes, we will provide a $15 gift card for each home that submits completed informed consent forms from all parents of enrolled children under the age of twenty-four months. In both cases, all forms are counted, including those that indicate refusal to participate.
Caregivers/Teachers: Each individual caregiver will receive a $25 merchandise gift card for each questionnaire completed (one in 2007 and another in 2008).
Families: Families will receive a $10 merchandise gift card after completing a consent form with the baseline questionnaire, and $50 after each in-person research session (one in 2008 and another in 2009).
The purpose of these payments is to offset respondent burden and to help sustain respondents’ cooperation with the study. Similar payments have been used in comparable studies conducted by members of the research team. In the Milwaukee Family Study (also called the New Hope Study), conducted by the Manpower Demonstration Research Corporation and the University of Texas with SRM Boulder, parents were compensated $50 for participation in each parent-child interview session, and children over the age of six were given gift coupons worth $15-$20.
Caregivers in the program group who complete all requirements of the PITC training will receive professional growth compensation in the form of either academic units, or $350 in the form of cash or resource materials. This compensation is part of the PITC intervention and is not specifically related to participation in the study.
10. Assurances of Confidentiality
Berkeley Policy Associates follows the confidentiality and data protection requirements of IES (The Education Sciences Reform Act of 2002, Title I, Part E, Section 183). Berkeley Policy Associates will protect the confidentiality of all information collected for the study and will use it for research purposes only. No information that identifies any study participant will be released. Information from participating institutions and respondents will be presented at aggregate levels in reports. Information on respondents will be linked to their institution but not to any individually identifiable information. No individually identifiable information will be maintained by the study team. All institution-level identifiable information will be kept in secured locations and identifiers will be destroyed as soon as they are no longer required. Berkeley Policy Associates obtains signed NCEE Affidavits of Nondisclosure from all employees, subcontractors, and consultants who may have access to these data and submits them to our NCEE COR.
In addition, Berkeley Policy Associates, the University of Texas at Austin, and SRM Boulder all implement data security policies and programs. Data security protections for this study receive continuing review by the Institutional Review Boards of Berkeley Policy Associates and the University of Texas at Austin. Informed consent forms have been developed with input from these Boards, and approval from both the University of Texas IRB and Berkeley Policy Associates’ IRB (Independent Review Consulting) has been granted.
Below is an overview of Berkeley Policy Associates’ Data Security Policy:
Policies for Class 1 Data (Confidential data, with identifying information) are:
(1) Can never leave BPA premises.
(2) Always kept in a secure place.
(3) Only authorized persons can access and use.
(4) Must be properly disposed of or transferred.
Exhibit 2. Procedures for Handling Class 1 Data
Procedure | Electronic Data | Paper Data
Receipt and tracking of Class 1 materials |  | 
Can never leave BPA premises |  | 
Create separate working analysis file |  | 
Always kept in a secure place |  | 
Only authorized persons can access and use |  | 
Must be properly disposed of or transferred |  | 
Policies for Class 2 (Proprietary data and documents that are not Class 1) are:
(1) Only authorized persons can access and use.
(2) Must be used and stored under responsible person's oversight. Must not be left in public view (e.g. sitting out on a desk, open on computer monitor).
Consent forms for the study are attached. The forms and all data collection instruments include the following language regarding confidentiality:
“Responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific program or individual. We will not provide information that identifies you or your program/family to anyone outside the study team, except as required by law.”
11. Justifications for Questions of a Sensitive Nature
The questions do not address highly sensitive topics. However, the data collection includes observations of and interviews with young children, and children’s language, social, and cognitive skills will be assessed. Therefore, the study is subject to oversight by the Institutional Review Boards of Berkeley Policy Associates and the University of Texas at Austin, which ensure that the necessary human subjects protections are in place. Informed consent will be obtained from all study participants prior to data collection and random assignment. Informed consent forms are attached.
12. Estimate of Information Collection Burden
Estimates of annual number of responses and burden hours for each data collection activity are provided in Exhibit 3. Note that the number of years for data collection activities is 3.
Exhibit 3. Burden Estimate for Each Data Collection Activity
A. Respondent | B. Data Collection Activity or Form | C. Time Burden per Form or Session (in minutes) | D. Average Annual Number of Respondents | E. Frequency of Data Collection | F. Annual Number of Responses¹ | G. Annual Hour Burden²
Center Directors | 1A. Child Care Provider Screening Interview-Centers | 20 | 30 | 1 | 30 | 10
Family Child Care Owners or Directors | 1B. Child Care Provider Screening Interview-Family Child Care Programs | 20 | 50 | 2 | 100 | 33
Parents | 2. Parent baseline questionnaire | 20 | 313 | 1 | 313 | 104
Center Directors | 3. Center Director Questionnaire | 20 | 30 | 2 | 60 | 20
Center Directors and Caregivers | 4. Center Caregiver Questionnaire (directors + 4 caregivers per center = 5 x 90 = 450) | 30 | 150 | 2 | 300 | 150
Family Child Care Owners or Directors and Caregivers | 5. Family Caregiver Questionnaire (director/owner + 1 additional caregiver in 2/3 of programs = 150 + 100 = 250) | 30 | 83 | 2 | 166 | 83
Center and Family Caregivers identified as primary caregivers for participating children (about 150 family child care caregivers and 350 center caregivers) | 6. Individual Child Form for Caregiver | 20 | 167 | 1 | 167 | 56
N/A | 7A. Center Observation Instrument | 0 |  |  |  | 
N/A | 7B. Family Child Care Observation Instrument | 0 |  |  |  | 
Center Directors | 8A. Center Interview with Observation | 30 | 30 | 2 | 60 | 30
Family Child Care Directors/Owners | 8B. Family Child Care Interview with Observation | 30 | 50 | 2 | 100 | 50
All caregivers and directors/owners in treatment programs | 9. PITC Training Evaluation (directors and caregivers in treatment programs = 5 x 90 / 2 + 250 / 2 = 350) | 15 | 117 | 4 | 468 | 117
Parents | 10. Parent follow-up questionnaire | 30 | 313 | 2 | 626 | 313
Parents (with child) | 11. In-person child measures, 1st follow-up | 60 | 313 | 1 | 313 | 313
Parents (with child) | 12. In-person child measures, 2nd follow-up | 60 | 313 | 1 | 313 | 313
Child | 11. In-person child measures, 1st follow-up | 60 | 313 | 1 | 313 | 313
Child | 12. In-person child measures, 2nd follow-up | 60 | 313 | 1 | 313 | 313
Provider Total |  |  |  |  | 1,451 | 549
Parent Total |  |  |  |  | 1,565 | 1,043
Child Total |  |  |  |  | 626 | 626
GRAND TOTAL |  |  |  |  | 3,642 | 2,218
*The unduplicated numbers of person respondents in this study are 700 child care providers, 940 parents, and 940 children, for a total of 2,580 person-respondents.
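The per-row formulas behind Exhibit 3 are given in the footnotes at the end of this document (F = D × E and G = (C × F) / 60, rounded to whole hours). As a check on the arithmetic, the short script below, with the exhibit's values typed in by hand, reproduces the row and total figures. It is purely illustrative and is not part of the study's data systems.

```python
# Reproduces the Exhibit 3 burden arithmetic: for each row,
# annual responses F = D * E and annual hour burden G = round(C * F / 60).
rows = [
    # (group, C: minutes per form, D: annual respondents, E: frequency)
    ("provider", 20, 30, 1),    # 1A. Screening interview, centers
    ("provider", 20, 50, 2),    # 1B. Screening interview, family child care
    ("parent",   20, 313, 1),   # 2.  Parent baseline questionnaire
    ("provider", 20, 30, 2),    # 3.  Center director questionnaire
    ("provider", 30, 150, 2),   # 4.  Center caregiver questionnaire
    ("provider", 30, 83, 2),    # 5.  Family caregiver questionnaire
    ("provider", 20, 167, 1),   # 6.  Individual child form for caregiver
    ("provider", 30, 30, 2),    # 8A. Center interview with observation
    ("provider", 30, 50, 2),    # 8B. Family child care interview with observation
    ("provider", 15, 117, 4),   # 9.  PITC training evaluation
    ("parent",   30, 313, 2),   # 10. Parent follow-up questionnaire
    ("parent",   60, 313, 1),   # 11. In-person measures, 1st follow-up (parent)
    ("parent",   60, 313, 1),   # 12. In-person measures, 2nd follow-up (parent)
    ("child",    60, 313, 1),   # 11. In-person measures, 1st follow-up (child)
    ("child",    60, 313, 1),   # 12. In-person measures, 2nd follow-up (child)
]

totals = {}
for group, c, d, e in rows:
    f = d * e                    # annual number of responses (column F)
    g = round(c * f / 60)        # annual hour burden (column G)
    resp, hours = totals.get(group, (0, 0))
    totals[group] = (resp + f, hours + g)

for group, (resp, hours) in totals.items():
    print(f"{group}: {resp} responses, {hours} hours")
print("grand total:",
      sum(r for r, _ in totals.values()), "responses,",
      sum(h for _, h in totals.values()), "hours")
```

Running the script yields 1,451 responses and 549 hours for providers, 1,565 responses and 1,043 hours for parents, 626 responses and 626 hours for children, and a grand total of 3,642 responses and 2,218 hours, matching the exhibit.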
In order to explain the study to providers and to distribute informed consent forms for all relevant staff and parents, we are inviting child care center directors and family child care director/owners who complete the screening interview to attend an in-person meeting to be scheduled in their local areas. Exhibit 4 presents the estimated annual respondent burden for these meetings.
Exhibit 4. Burden Estimate For Recruitment Meetings
Task/Respondent Group | Number of Respondents | Frequency of Activity | Total Number of Responses | Meeting Time per Response (in hours) | Total Hour Burden (in hours) | Annual Number of Responses³ | Annual Hour Burden⁴
Recruitment/Gaining Cooperation—Child Care Center Directors | 90 | 1 | 90 | 1 | 90 | 30 | 30
Recruitment/Gaining Cooperation—Family Child Care Providers | 150 | 1 | 150 | 1 | 150 | 50 | 50
TOTAL Recruitment | 240 | 1 | 240 | 1 | 240 | 80 | 80
The estimated total annual number of responses and hour/cost burden for all data collection and recruitment activities is presented in Exhibit 5.
13. Estimate of Total Annual Cost Burden
There are no direct start-up costs to respondents other than their time to participate in the study, as estimated above. Estimations of the value of participation time for each respondent group, and for the study as a whole, are presented in Exhibit 5 below.
Exhibit 5. Annual Number of Responses and Hour/Cost Burden by Task and Total
Task/Respondent Group | Annual Number of Responses | Annual Hour Burden | Hourly Rate⁵ | Annual Cost Burden
Recruitment/Gaining Cooperation—Child Care Center Directors | 30 | 30.0 | $18 | $540
Recruitment/Gaining Cooperation—Family Child Care Providers | 50 | 50.0 | $8 | $400
TOTAL Recruitment | 80 | 80.0 |  | $940
Child Care Provider Data Collection—Center Directors | 270 | 105.0 | $18 | $1,890
Child Care Provider Data Collection—Center Caregivers/Teachers | 598 | 219.0 | $11 | $2,409
Child Care Provider Data Collection—Family Child Care Providers | 583 | 225.0 | $8 | $1,800
TOTAL Provider Data Collection | 1,451 | 549.0 |  | $6,099
Parent Data Collection (including time with child) | 1,565 | 1,043.0 | $22 | $22,946
Child Data Collection | 626 | 626.0 | -- | 
TOTAL Parent-Child Data Collection | 2,191 | 1,669.0 |  | $22,946
TOTAL All Tasks | 3,722 | 2,298.0 |  | $29,985
14. Estimates of Annualized Costs to the Federal Government
The total budget for the Study of the PITC is $2,947,767. The approximate budget for each year is as follows:
July 2006-December 2006: $309,261
January 2007-December 2007: $635,885
January 2008-December 2008: $1,069,366
January 2009-December 2009: $665,195
January 2010-August 2010: $268,059
The average annual cost (for three years) is $982,589.
15. Change in Reporting Burden
The change of 2,298 total annual burden hours reflects a new data collection.
16. Plans for Tabulation and Reporting
Exhibit 6. Study of PITC: Reporting Schedule
Product | First Draft Submission Date | Final Draft Submission Date
Interim Report | April 2009 | June 2009
Final Technical Report | May 2010 | July 2010
Final Non-Technical Report | Aug 2010 | Sept 2010
Web Version of Non-Technical Report |  | Sept 2010
Journal Article | TBD | TBD
The impact analysis will describe post-random assignment differences between the program and control groups in five distinct areas: (1) the receipt of training, support, and technical assistance, (2) caregiver understanding of and attitudes toward the development and needs of infants and toddlers, (3) program/caregiver practice and environment, (4) child development, and (5) parenting. The first of these describes the treatment contrast, answering the question of how much the PITC program adds to the existing program and training infrastructure available to those providing care to infants and toddlers. The second describes how the PITC training changes the way providers and their staff regard the developmental needs of young children and their own role in meeting these needs. This is the most immediate outcome of the training provided by PITC. Next, the impact analysis documents whether improvements in staff understanding and expressed commitment translate into observable changes in how staff interact with children and the quality of the environment in which the children receive care. Factor analysis will be used to consolidate different measures and capture distinct dimensions of child care quality, such as caregiver quality and quality of environment/materials. The fourth step documents the extent to which impacts on training, understanding, and practice translate into better cognitive, language, and social-emotional child outcomes. Finally, the fifth step describes whether and how parents are affected by PITC, either directly through their interaction with child care providers or indirectly through their children.
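To illustrate the consolidation step, the sketch below shows one way observed program-quality items could be reduced to a small number of factor scores before the impact regressions. It is a minimal illustration only: the file name, column names (e.g., iters_1 ... pars_28), and choice of two factors are assumptions made for the example, and the study's actual factor analysis may use different methods and software.

```python
# Minimal sketch of consolidating quality measures into a few dimensions.
# Assumes a hypothetical provider-level file of baseline observation scores;
# names are illustrative, not the study's actual data structure.
import pandas as pd
from sklearn.decomposition import FactorAnalysis

quality = pd.read_csv("provider_quality_baseline.csv")  # hypothetical file
item_cols = [c for c in quality.columns if c.startswith(("iters_", "pars_"))]

# Extract two latent dimensions, e.g., caregiver quality and
# environment/materials quality, from the observed items.
fa = FactorAnalysis(n_components=2, random_state=0)
scores = fa.fit_transform(quality[item_cols])

quality["caregiver_quality"] = scores[:, 0]
quality["environment_quality"] = scores[:, 1]
print(quality[["provider_id", "caregiver_quality", "environment_quality"]].head())
```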
Given the underlying experimental nature of the data, the impact analyses will be very straightforward. They will be multi-level regression analyses in which each outcome variable is regressed on a dummy variable measuring experimental status (1 for the program group and 0 for controls) and a small set of child- and program-level baseline covariates. The child-level analyses will be conducted with HLM software or PROC MIXED in SAS to account for the multi-level nature of the data and the clustering of child observations within providers. The provider-level analyses will be simple single-level regressions, with each observation weighted to account for the uneven distribution of children across providers. Given the purposive nature of the sample selection (see discussion of sample selection below), we do not plan to formally generalize our findings beyond our data. Because of this, we are not planning to conduct any random effects estimation and will assume that all parameters are estimated as fixed effects.
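As a concrete illustration of the child-level impact model described above, the sketch below fits a two-level regression with children nested within providers. It is a hedged example rather than the study's analysis code: the file, variable names, and covariates are hypothetical, and it uses Python's statsmodels in place of the HLM or SAS PROC MIXED software named above.

```python
# Minimal sketch of the child-level impact regression: an outcome regressed on
# a treatment dummy (1 = program group, 0 = control) plus baseline covariates,
# with a random intercept for the child care provider to account for the
# clustering of children within providers. All names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

children = pd.read_csv("child_followup.csv")  # hypothetical analysis file

model = smf.mixedlm(
    "bayley_score ~ treatment + child_age_months + home_language_english"
    " + baseline_provider_quality",
    data=children,
    groups=children["provider_id"],  # clustering of children within providers
)
result = model.fit()
print(result.summary())  # the 'treatment' coefficient is the impact estimate
```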
Subgroup analyses will be conducted across a number of different dimensions, including region, provider type, child age and primary language, and various levels of baseline program assessment. For example, we will explore whether providers with weaker preparation (less previous training, education, and experience) to provide high quality care at baseline experience greater benefits from PITC than providers with stronger preparation at baseline. However, due to the clustered nature of the data, there are significant limitations to these subgroup analyses. Individual subgroup estimates at the provider level will have limited statistical power and there will be very little power to establish whether differences in impacts across subgroups are statistically different from one another. Thus, we will likely only be able to identify large differences across providers, such as a case in which one group of providers experiences no impacts at all and all impacts are concentrated within a second group of providers.
There will be more statistical power for within-provider subgroup analyses. We will disaggregate child impacts by age, language, and gender, and by other baseline characteristics such as parental background.
All subgroup analyses will be conducted with fully interacted regression models, in which the program dummy and all the baseline covariates are interacted with the subgroup variable. Estimating such a model isolates the specific interaction effect of interest from other potential interaction effects with other baseline variables. At the program level we may have to reduce the number of baseline covariates in these analyses to preserve degrees of freedom.
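The sketch below illustrates the fully interacted specification for a single subgroup variable at the provider level, using a hypothetical center vs. family child care indicator; the weights stand in for the adjustment for the uneven distribution of children across providers noted above. The file, column names, and weighting variable are assumptions for illustration only, not the study's actual specification.

```python
# Minimal sketch of a fully interacted subgroup model at the provider level:
# the treatment dummy and each baseline covariate are interacted with the
# subgroup indicator, isolating the interaction of interest from other
# potential interactions. Column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

providers = pd.read_csv("provider_followup.csv")  # hypothetical analysis file

formula = (
    "quality_followup ~ (treatment + quality_baseline + caregiver_education)"
    " * is_center"
)
model = smf.wls(formula, data=providers, weights=providers["n_children"])
result = model.fit()

# 'treatment' is the impact for family child care homes (is_center = 0);
# 'treatment:is_center' estimates how the impact differs for centers.
print(result.params[["treatment", "treatment:is_center"]])
```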
Exhibits 7 and 8 below illustrate the overall study timeline and the timeline for two waves of recruitment and data collection.
Exhibit 7. Overview of Study Timeline
June 06-Apr 07: Revise Study Plan, Develop Measures and Data Collection Instruments, IRB and OMB Submissions
Mar 07-May 07: Train Recruiters; Refine Program and Train Trainers
Apr 07-May 07: Train Data Collection Staff
June 07-Aug 07: Participant Recruitment, Screening, Informed Consent (in waves)
July 07-Sept 07: Baseline Data Collection and Random Assignment (in waves)
Aug 07-Nov 08: Program Implementation (in waves)
June 08-Sept 08: 1st Child Assessments, 2nd Program Assessments, Other Follow-Up Data Collection (in waves)
Dec 08-June 09: Interim Analysis, Interim Report
June 09-Sept 09: 2nd Child Assessments (in waves)
Nov 09-Sept 10: Final Analysis and Reports
Exhibit 8. Study of PITC: Waves for Recruitment, Random Assignment,
Data Collection, and Program Implementation
2007 | Program N | Recruitment, Screening, Informed Consent | Baseline Data Collection (including baseline program observations) and Random Assignment | Program Implementation (approx.)
California, Wave 1 | 60* | June 07 | July 07 | Aug 07-Nov 08
California, Wave 2 | 60 | July 07-Aug 07 | Aug 07-Sept 07 | Sept 07-Dec 08
Arizona, Wave 1 | 60 | June 07 | July 07 | Aug 07-Nov 08
Arizona, Wave 2 | 60 | July 07-Aug 07 | Aug 07-Sept 07 | Sept 07-Dec 08
TOTAL | 240 |  |  | 
*Each wave of 60 includes about 22 centers and 38 family child care homes (4-7 family child care groups). The total sample of 240 includes about 150 family child care homes and 90 centers.
2008 | 1st Child Outcome Measures, 2nd Program Observations, and Other Follow-Up Data Collection
Wave 1 | June 08-July 08
Wave 2 | Aug 08-Sept 08
2009 | 2nd Child Outcome Measures
Wave 1 | June 09-July 09
Wave 2 | Aug 09-Sept 09
17. Display of Expiration Date for OMB Approval
We are not seeking approval to omit display of the OMB approval expiration date; the expiration date will be displayed.
18. Exceptions
We are able to certify compliance with each of the provisions.
References for Part A

Arnett, J. (1989). Caregivers in day care centers: Does training matter? Journal of Applied Developmental Psychology, 10, 541–552.
Bloom, H. S., Bos, J. M., & Lee, S. (1999). Using cluster random assignment to measure program impacts. Evaluation Review, 23(4), 445–469.
Borman, G.D., Hewes, G.M., Overman, L.T., & Brown, S. (2003). Comprehensive School Reform and Achievement: A Meta-Analysis. Review of Educational Research, 73, 125-230.
Quint, J., Bos, J., & Polit, D. (1997). New Chance: Final Report on a Comprehensive Program for Young Mothers in Poverty and Their Children. New York: Manpower Demonstration Research Corporation.
Burchinal, M. R., Roberts, J. E., Nabors, L. A., & Bryant, D. M. (1996). Quality of center child care and infant cognitive and language development. Child Development, 67(2), 606-620.
Caldwell, B.M. and Bradley, R.H. (1984). Home Observation for Measurement of the Environment. Little Rock: University of Arkansas at Little Rock.
Campbell, F. A., Pungello, E. P., Miller-Johnson, S., Burchinal, M., & Ramey, C. T. (2001). The development of cognitive and academic abilities: Growth curves from an early childhood educational experiment. Developmental Psychology, 37(2), 231-242.
Campbell, F. A., Ramey, C., Pungello, E. P., Sparling, J. J., & Miller-Johnson, S. (2002). Early childhood education: Young adult outcomes from the Abecedarian Project. Applied Developmental Science, 6, 42-57.
Clifford, R. M. (2004). Structure and stability of the Early Childhood Environment Rating Scale. Keynote Address at the Quality in Early Childhood Care and Education International Conference, Dublin, Ireland, September/October.
Clifford, R.M. & Rossbach, H. (in press). Structure and stability of the Early Childhood Environmental Rating Scale. In D. Cryer (Ed.) A world of improvement: Promoting quality early childhood programs for all children.
Cost, Quality & Child Outcomes Study Team (1995) Cost, Quality and Child Outcomes in Child Care Centers, Public Report, Second Edition. Denver: Economics Department, University of Colorado at Denver.
Crosnoe, R. (2005). “Double Disadvantage or Signs of Resilience: The Elementary School Contexts of Children from Mexican Immigrant Families.” American Educational Research Journal, 42, 269-303.
DeGangi, G., Poisson, S., Sickel, R., and Wiener, A. (1995). Infant/Toddler Symptom Checklist: A Screening Tool for Parents. San Antonio, TX: Therapy Skill Builders, Psychological Corporation.
Duncan, G. J., & NICHD Early Child Care Research Network (2003). Modeling the impacts of child care quality on children's preschool cognitive development. Child Development, 74(5), 1454-1475.
Epps, S. R., Park, S. E., Huston, A. C., & Ripke, M. N. (2005). A Scale of Positive Behaviors. In K. Moore & L. Lippman (Eds.), Conceptualizing and measuring indicators of positive development: What do children need to flourish? NY: Kluwer Academic/Plenum Publishers.
Feldman, H. M., Dale, P. S., Campbell, T. F., Colborn, D. K., Kurs-Lasky, M., Rockette, H. E., & Paradise, J. L. (2005). Concurrent and predictive validity of parent reports of child language at ages 2 and 3 years. Child Development, 76, 856-868.
Gresham, F. & Elliott, S. (1990) The Social Skills Rating System. Circle Pines, MN: American Guidance Service.
Harms, T. & Clifford, M. (1989). Family Day Care Rating Scale. New York: Teachers College Press.
Harms, T., Clifford, R.M., & Cryer, D. (2003). Infant-Toddler Environment Rating Scale-Revised. New York: Teachers College Press.
Howard, E. R. (2003). Biliteracy development in two-way immersion education programs: A multilevel analysis of the effects of native language and home language use on the development of writing ability in English and Spanish. Doctoral dissertation submitted to the Department of Human Development and Psychology, Harvard University Graduate School of Education. Cambridge, MA: Harvard University.
Lee, V. E., Loeb, S., & Lubeck, S. (1998). Contextual effects of prekindergarten classrooms for disadvantaged children on cognitive development: The case of Chapter 1. Child Development, 69 (2), 479-494.
Mangione, P., Kriener-Althen, K., Niggle, M. P., & Welsh, K. (2006). Program Quality Through the PITC Lens: Assessing Relationship-Based Care in Infant/Toddler Early Care and Education Programs. (Presentation). 15th National Institute for Early Childhood Professional Development. San Antonio, TX.
Mathematica Policy Research & Teachers College Center for Children and Families, Columbia University (2002). Making a difference in the lives of infants and toddlers and their families: The impacts of Early Head Start. Report submitted to the Office of Planning, Research and Evaluation, Administration for Children and Families and the Head Start Bureau, Department of Health and Human Services, United States Government.
Mather, N. (2002). The Woodcock-Johnson III: Reports, recommendations, and strategies. New York: Wiley.
McGrew, K. S. (2001). Woodcock-Johnson III Technical Manual. Itasca, IL : Riverside Publishers.
NICHD Early Child Care Research Network (2003). Does quality of child care affect child outcomes at age 4 1/2? Developmental Psychology, 39(3), 451-469.
NICHD Early Child Care Research Network (2005). Child Care and Child Development: Results from the NICHD Study of Early Child Care and Youth Development. New York: Guilford Press.
NICHD Study of Early Child Care, instruments listed in https://secc.rti.org.
Ramey, C. T., & Ramey, S. L. (2004). Early learning and school readiness: Can early intervention make a difference? Merrill-Palmer Quarterly, 50(4), 471-491.
Schaefer, E. S., & Edgerton, M. (1985). Parent and child correlates of parental modernity. In I. E. Sigel (Ed.), Parental belief systems (pp. 287-318). Hillsdale, NJ: Lawrence Erlbaum.
Schochet, P. (2005). Statistical power for random assignment evaluations of education programs. Document No. PR05-36. Princeton, NJ: Mathematica Policy Research.
U.S. Department of Health and Human Services, Administration for Children and Families (2005). Head Start Impact Study: First Year Findings. Washington, D.C.
¹ F = D × E.
² G = (C × F) / 60.
³ Annual number of responses = total number of responses / 3.
⁴ Annual hour burden = total hour burden / 3.
⁵ Based on wage data for 2005 found at http://www.bls.gov, with adjustment for inflation.