Recruitment of Divisions and Schools for the Evaluation of the REL Appalachia Teaching Math to Young Children Toolkit
PART A: Justification
November 2023
Submitted to:
Institute of Education Sciences
U.S. Department of Education
Submitted by:
SRI International
333 Ravenswood Ave
Menlo Park, CA 94025
(650) 859-2000
Tracking and OMB Number: (XX) XXXX-XXXX
Revised: 11/30/2023
Mathematics knowledge acquired in early childhood provides a critical foundation for long-term student success in math as well as reading (Duncan et al., 2007; Watts et al., 2014), but the professional development (PD) and curricular support for preschool teachers often lack specific content and training on high-quality math instruction delivered by math content experts. To address this problem, REL Appalachia is developing a toolkit to provide preschool teachers with support in implementing core teaching practices essential to promoting early math skills and knowledge in children. The toolkit is based on the Teaching Math to Young Children What Works Clearinghouse practice guide (Frye et al., 2013) and is being developed in collaboration with state and district partners in Virginia.
REL Appalachia (REL AP) is requesting clearance to conduct an evaluation to assess the efficacy of the professional development resources included in the toolkit. The evaluation will also assess how teachers implement the toolkit to provide context for the efficacy findings and guidance to improve the toolkit and its future use. The evaluation will take place in 50 schools across approximately 10 school divisions in Virginia and focus on mathematics teaching practices and student mathematics knowledge and skills in preschool classrooms.
As part of the REL solicitation request (Solicitation #91990020R0032), IES required each applicant to develop at least one research-based toolkit to support educators’ use of evidence-based practices, and to conduct an independent efficacy and implementation evaluation of the toolkit.
Per the solicitation:
“IES is invested in developing practitioner-friendly toolkits to help educators use evidence-based practices in classrooms — from preschool through postsecondary settings. Some of the best evidence available is consolidated in the WWC Practice Guides, in which researchers and practitioners review the evidence from the most rigorous studies available, develop recommendations for practice, and create action steps for how to use the recommended practices. To help get this evidence into the hands of stakeholders, RELs shall partner with educators and postsecondary instructors (if relevant) to develop one toolkit based on an assigned WWC Practice Guide, which shall include all materials necessary for effective implementation.” (pp. 44-45)
This data collection is consistent with the authorizing legislation of the REL Program, the Education Sciences Reform Act (ESRA) of 2002 (see Appendix A). Part D, Section 174(f)(2) of ESRA states that as part of their central mission and primary function, each regional educational laboratory “shall support applied research by . . . developing and widely disseminating, including through Internet-based means, scientifically valid research, information, reports, and publications that are usable for improving academic achievement, closing achievement gaps, and encouraging and sustaining school improvement, to—schools, districts, institutions of higher education, educators (including early childhood educators and librarians), parents, policymakers, and other constituencies, as appropriate, within the region in which the regional educational laboratory is located.”
The toolkit contains the following three parts: (1) Initial Diagnostic and Ongoing Monitoring Instruments, (2) PD Resources, and (3) Steps for Institutionalizing Supports for Evidence-Based Practice. The solicitation also states that RELs must evaluate the efficacy and implementation of the professional development resources in the finished toolkit. According to the solicitation, “[t]he evaluation shall examine changes in teacher practice and may also include measures of teacher knowledge and/or teacher self-efficacy.”
The Early Math Toolkit will address core teaching practices essential to promoting early math skills and knowledge in preschool children. Using the recommendations in the IES Teaching Math to Young Children practice guide (Frye et al., 2013) as a basis, the toolkit developers identified a set of teaching practices that operationalize the recommendations so teachers can focus on a specific set of actions to implement in the classroom. The toolkit content addresses Recommendation 1: Teach number and operations using a developmental progression; Recommendation 3: Use progress monitoring to ensure that math instruction builds on what each child knows; Recommendation 4: Teach children to view and describe their world mathematically; and Recommendation 5: Dedicate time each day to teaching math and integrate math instruction throughout the school day.
Preschool teachers assigned to the Early Math Toolkit intervention condition will be invited to participate in PD modules and implement the practice guide recommendations to promote early mathematics learning throughout the school year. The PD modules are designed to increase the teachers’ knowledge about how to plan for and apply evidence-based math teaching practices to increase the quantity and quality of math instruction. They include an introductory module that outlines the recommendations in the practice guide and associated teacher practices and describes how to use the PD resources and other toolkit components, as well as four content modules. All the professional development content is contained in the toolkit, and the toolkit provides facilitators with all the materials they need to implement it.
A small but rigorous evidence base suggests that the practices recommended in the WWC practice guide will lead to improved early math learning when implemented by teachers. However, past studies have not examined the impact of providing educators with a comprehensive resource toolkit that trains them to implement evidence-based practices. Therefore, a rigorous evaluation of the efficacy and implementation of the toolkit is necessary to gather evidence about this set of resources and determine whether this type of toolkit could serve as a model for implementation support for preschool teachers more broadly. The toolkit will be made publicly available after the study is conducted, and this study will provide critical evidence to its potential users, which could include preschool teachers across the country. In addition, the study will provide implementation findings that can inform how the toolkit could be improved so that it is as useful as possible to a wide range of districts, schools, and teachers.
This package only requests clearance for data collection related to recruitment activities. A separate OMB package will request clearance for data collection procedures and activities related to addressing the study research questions (RQs). Additional details about the study goals and design are included below for context.
The impact and implementation research questions addressed in this study include the following:
1. Do teachers in intervention-assigned schools (that is, teachers who are offered the toolkit PD resources) report greater confidence in, and more positive attitudes toward, using evidence-based practices in math than teachers in control-assigned schools?
2. Do teachers in intervention-assigned schools implement more math activities, spend more time on math through daily instruction, and include more math instruction across settings and activities than teachers in control-assigned schools?
3. Do teachers in intervention-assigned schools demonstrate more frequent use of evidence-based math teaching practices than teachers in control-assigned schools?
4. Do preschool students in intervention-assigned schools score higher on measures of math achievement in the spring of preschool than students in control-assigned schools?
5. Did implementation of the toolkit’s professional development components, classroom activities, and instruction occur as intended?
6. What are the different ways teachers engage with the toolkit PD resources? To what extent does teachers’ use of the PD resources vary? What helps or hinders effective learning from the PD resources?
7. What challenges do teachers face in implementing the toolkit, and how do teachers attempt to overcome those challenges? What additional supports are needed, and what improvements do participants recommend for the toolkit?
The impact study will be a school-level, cluster-randomized controlled efficacy trial. The evaluation team will recruit and randomly assign 50 schools across 10 school divisions to the treatment condition (toolkit) or business as usual (control) in the spring and fall of 2024 (recruitment materials are attached in appendix A). Random assignment of schools will occur after the collection of consent forms. In schools assigned to the toolkit group, preschool teachers will be invited to use the toolkit materials. In control schools, preschool teachers will not have access to the toolkit until after the study when it is made publicly available.
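For illustration only, the sketch below shows one common way school-level random assignment of this kind could be carried out: shuffling consented schools within each division (so that every division contributes schools to both conditions) using a fixed seed for reproducibility. The function name, identifiers, and the within-division blocking approach are assumptions for this example, not the study team’s specified procedure.

```python
# Illustrative sketch of school-level random assignment blocked by division.
# Not the study's actual randomization procedure; identifiers are hypothetical.
import random
from collections import defaultdict

def assign_schools(schools, seed=20240901):
    """schools: list of (school_id, division_id) pairs for consented schools."""
    rng = random.Random(seed)  # fixed seed makes the assignment reproducible
    by_division = defaultdict(list)
    for school_id, division_id in schools:
        by_division[division_id].append(school_id)

    assignment = {}
    for division_id, division_schools in by_division.items():
        rng.shuffle(division_schools)
        # Alternate conditions within each division so the split is balanced;
        # divisions with an odd number of schools get one extra toolkit school.
        for i, school_id in enumerate(division_schools):
            assignment[school_id] = "toolkit" if i % 2 == 0 else "control"
    return assignment

# Example with hypothetical school and division identifiers.
example = [("school_01", "div_A"), ("school_02", "div_A"),
           ("school_03", "div_B"), ("school_04", "div_B")]
print(assign_schools(example))
```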
Both groups will be asked to participate in study data collection using teacher instructional logs, surveys, and observations. The intervention group will be asked to participate in additional data collection to address the implementation questions, including completing implementation checklists and a toolkit satisfaction survey. The study will also collect administrative data, including demographic information on students and teachers, and student standardized mathematics assessment scores.
After data collection and analysis, the study team will develop a report to share findings from the efficacy study. The report’s primary audience will be preschool teachers and instructional leaders, who will benefit from information on the extent to which the toolkit improves outcomes, the conditions under which the toolkit is perceived to be most useful, and potential challenges that may emerge when implementing the toolkit. IES and the REL Appalachia team that developed the toolkit will be the secondary audience, who will benefit from information on potential refinements to the toolkit.
The efficacy study’s data collection activities are listed below. This package only requests clearance for district recruitment activities. The remaining data collection activities are provided as context. A separate OMB package will request clearance for the survey instruments, observation protocols, instructional logs, assessment and administrative data, and associated data collection procedures.
Division recruitment. The study team will first email division administrators in the targeted divisions to inform them about the study (see A1 in appendix A). The inference population is schools and preschool teachers in low-resourced communities. We will recruit schools from divisions with the highest percentages of students identified by the Virginia Department of Education (VDOE) as eligible for services due to economic disadvantage, and we will include divisions of different sizes. Using data from the 2022/23 school year, we identified the following 12 school divisions as serving students from low-resourced communities: Richmond City, Sussex County, Northampton County, Alexandria City, Henry County, Nottoway County, Petersburg City, Portsmouth City, Danville City, Hopewell City, and Martinsville City (VDOE, n.d.). If obtaining a sufficient number of schools from these divisions is not possible, we will recruit schools from the next most economically disadvantaged divisions until we reach our target of 50 schools.
If division leaders are interested in participating, the evaluation team will ask for their help securing school agreement. Then, the team will email all the division’s school leaders in schools with at least one preschool classroom and invite them to learn more about the study and what will be required of school and division staff (see A2 in appendix A). Once school leaders agree to participation by their school and teachers, the study team will schedule a series of webinars (at least one per division) with preschool teachers in participating schools to explain the study purpose, benefits, and time commitment. The team will then invite the teachers to join the study and data-collection activities by asking them to review and sign an online consent form prior to random assignment (see A3 in appendix A). This nonbinding agreement will indicate that they understand the intervention and the study and will participate to the best of their ability, regardless of the condition to which they are assigned.
Schools will be included in the random assignment pool if at least one preschool teacher in that school consents to participate in the study. The only exclusion criteria for divisions and/or schools will be if a division/school does not participate in the Virginia Kindergarten Readiness Program (VKRP) since this will be the source of outcome data for students.
We will not ask divisions, schools, or teachers to provide data during the recruitment process. We will only be collecting publicly available data during recruitment to identify school divisions to target for recruitment. We will not collect any sensitive data during the recruitment process.
Data collection activities for which clearance is not requested as part of this OMB package (provided for context)
Consent to participate. The evaluation team will request teacher rosters and email addresses to email teachers the invitations to complete the surveys and logs. Once the teachers have consented, the study team will follow division consent procedures for parents/guardians. The intention is to engage divisions that allow passive consent procedures for parents/guardians to opt out of their child’s participation in the study if they choose. However, if divisions do not allow passive consent procedures, we will follow the division consent procedures for active consent. The study team will ask schools to help communicate information about the study and opt-out procedures by emailing families and/or sending the information home in student backpacks with other school communications. Because all the student-directed study procedures will be part of the students’ typical classroom experience, including the assessments, families can only opt out of their child’s data being used in the research study. If families do not want their child’s data to be used in the research study, they will be given a website URL and QR code to notify the study team of their decision.
Teacher Instructional Log. All teachers will complete an instructional log to document time spent on math-focused activities as well as information about the learning goals and format of activities using a sampling approach (described in Supporting Statement Part B) capturing a week at pre-intervention, mid-intervention, and post-intervention. The study team decided to collect the log at three time points to balance the desire to collect more frequent data about teacher use of math-focused activities with the need to limit burden on teachers. The study team will decide the weeks during which teachers will complete the log. We estimate these forms will take less than 5 minutes daily to complete. The data in each round of logs should be representative of the classroom instruction, but not of the students. The team plans to develop an instructional log to capture the frequency of math instructional activities in a preschool classroom, as no such log currently exists in the literature. The log is in development as part of the efficacy evaluation and a subject matter expert has reviewed it. The study team will complete at least two cognitive interviews in the fall of 2023 with fewer than ten teachers who have pilot-tested the log. The final version of the log will be submitted for OMB approval in a separate package.
The Early Math Assessment (EMAS; Ginsburg & Pappas, 2016) is a direct assessment of young children’s math abilities and knowledge that is administered twice a year as part of the VKRP. Using these assessment data will limit the need for the study to test students, thus reducing burden. The study team also plans to collect Phonological Awareness Literacy Screening scores for students to use as a covariate in the analyses.
Student-level administrative data used in the study will include student characteristics such as free or reduced-price lunch (FRPL) eligibility, gender, English learner status, and Individualized Education Program (IEP) status. These data will be used as covariates in our statistical models to increase the precision of the estimates of the intervention’s effect and allow for analyses to test whether the intervention is more effective for certain groups of students. Student-teacher links (classroom rosters) will be requested. School-level data will include school characteristics, such as school enrollment and percentage of students considered economically disadvantaged.
Teacher Surveys. The team will use an existing assessment with established reliability and validity — the Attitudes, Beliefs, and Confidence Survey (Reid & Melgar, 2018) — to capture teacher attitudes, beliefs, and confidence about teaching math at both the beginning and end of the 2024/25 school year. All teachers will complete a survey about their background, education, training, and past professional development experiences only at the beginning of the 2024/25 school year. The final version of the teacher survey will be submitted for OMB approval in a separate package.
Intervention teachers will complete the Self-Assessment of Mathematics Instruction (SAMI) at the beginning and middle of the 2024/25 school year and again in March 2025, following completion of the last PD module. The SAMI is being developed as part of the Early Math Toolkit and will provide a self-reported measure of teachers’ practices. As part of the evaluation, rather than analyze teachers’ responses on the SAMI, the study team will simply track completion of the SAMI as part of the toolkit fidelity.
Intervention teachers will also complete a toolkit satisfaction survey and an implementation checklist in which they will report which modules and activities from the toolkit they completed. The final version of these instruments will be submitted for OMB approval in a separate package.
Teacher Observations. The study team will measure teachers’ use of evidence-based practices in math at the beginning and end of the school year in both the intervention and control conditions using the Teaching Practices Observation Tool (TPOT) for Early Math. The TPOT, being developed as part of the Early Math Toolkit, is an observation-based rubric that will align with the Self-Assessment of Mathematics Instruction (SAMI) and measure preschool teachers’ use of evidence-based math teaching practices. The current version, which will be refined through pilot testing, includes observation items rated on a four-point continuum. The study team will collect baseline teacher practice observations for all teachers at the beginning of the 2024/25 school year and again in March 2025, after the intervention teachers complete the PD modules.
If the school or preschool program participates in Virginia’s Quality Rating Improvement System, we will obtain teachers’ Classroom Assessment Scoring System (CLASS) instructional support scores to provide a measure of overall teaching quality practices in addition to the math-specific practices. The final version of these observation measures will be submitted for OMB approval in a separate package.
During recruitment, we will rely primarily on email communication, when possible, to minimize additional phone call burden on participants. The data-collection plan is designed to obtain information efficiently in a way that minimizes respondent burden and uses extant data whenever possible.
The other data collection activities, for which the study team will seek approval through a second OMB package, will use the online survey platform Qualtrics, which is Section 508 compliant. REL AP will use this platform to collect data that can be gathered directly only from school leaders, facilitators, and teachers. REL AP will manage the entire data-collection process, including questionnaire programming, sample management, and monitoring of responses. REL AP will email study participants an individual link to online surveys. To reduce the burden on respondents, the software will allow survey respondents to answer using their preferred device, such as a laptop, smartphone, or tablet, and will save survey progress if a respondent cannot complete the survey in one sitting. The questionnaire will include a telephone number for a staffed help desk and an email address where respondents can send questions. These procedures will minimize the survey burden on respondents.
When possible, the evaluation team will collect data from administrative sources rather than through primary data collection. Division staff will submit information electronically using secure file transfer procedures. The materials for preparing the teacher list will include an email address to which respondents can direct their questions.
The study team will not collect information that is already available from alternative data sources for this project. No population frame exists that the study team can use to identify divisions and schools that are most suitable for the study; the recruitment process is key to identifying suitable divisions and schools.
The other data collection activities, for which the study team will seek approval through a second OMB package, will draw on information that is already available from extant administrative records. When possible, the evaluation team will collect school-level characteristics such as size and percentage of students considered economically disadvantaged, as well as student-level characteristics, such as student achievement, directly from division administrators to minimize the length of surveys administered directly to principals and teachers and prevent duplication of effort.
The primary data collection for this study will include only information that is not available from other sources. Information obtained from the instructional logs, surveys, and observations is not available elsewhere.
The study team will primarily rely on email for recruitment to reduce the burden associated with phone calls. When recruiting divisions and schools, the study team will send one initial email and one follow-up email to non-responders. The study team will follow up with phone calls only if the two emails do not receive a response (see A1 and A2 in appendix A). The study team will rely on email communication when recruiting teachers, unless teachers specify a preference for a phone call.
The other data collection activities, for which the study team will seek approval through a second OMB package, will not affect any small businesses, but some of the schools might be small entities. The use of administrative records will reduce the burden on school educators by ensuring that only the minimum amount of original data will be requested from schools to meet the objectives of this study. Aside from requests for administrative records and the survey links emailed directly to participants, the evaluation team will not contact schools to request additional data. The study team will secure permission from individual schools to share their student-level assessment data and then request these data directly from the Virginia Department of Education (VDOE), to minimize burden on programs.
The Education Sciences Reform Act of 2002, Part D, Section 174 states that the central mission and primary function of the RELs includes supporting applied research and providing technical assistance to state and local education agencies within their region (20 U.S.C. 9564). Failure to approve the recruitment activities related to the evaluation of the early mathematics toolkit will jeopardize this attempt to study the impact of the toolkit and thereby prevent the REL AP contractor from fulfilling its mission.
This study aims to provide evidence relevant to the types of schools that may benefit most from the toolkit, including high-need schools and schools that reflect a variety of preschool classrooms. If the study does not collect these data during recruitment, the study team will be unable to successfully select and recruit into the study a sample of divisions and schools that meet this aim. In addition, the information the study team collects during recruitment will help select only those divisions and schools that can feasibly follow the study protocol and implement the toolkit. Not collecting this information will increase the risk of failing to learn about the efficacy of this publicly available toolkit in improving preschool teachers’ instructional practices and students’ early math skills.
The data collection activities for which the study team will seek approval in a second package will contribute to understanding the toolkit’s potential to affect student and teacher outcomes. If this study does not collect data from the surveys, implementation logs, focus groups, interviews, and division administrative records, the study team will be unable to draw conclusions about the toolkit’s effect on student and teacher outcomes.
No special circumstances are involved with this data collection. Data will be collected in a manner consistent with the guidelines in 5 CFR 1320.5.
A 60-day Federal Register Notice was published on September 14, 2023 (88 FR 63093). There were no public comments received during the 60-day comment period. A 30-day notice will be published.
In addition, throughout the course of this study, we will draw on the experience and expertise of Dr. Todd Grindal, the associate center director and principal senior researcher for the Center for Learning & Development, SRI Education, and the subject matter expert for this study.
The study proposal has also gone through external peer review as required by the Education Sciences Reform Act (ESRA) for all REL studies. The study proposal will undergo Institutional Review Board (IRB) review through SRI in summer 2024.
The current information request focuses on work with district and school staff for recruitment, and there are no incentives associated with the recruitment activities.
As part of the recruitment activities proposed in this information collection, the recruitment flyer and emails (Appendix A) describe the incentives for upcoming data collection activities. A future information collection request will provide detailed information on the data collection procedures and instruments, including incentives.
The study team will provide incentives only to PD and/or data-collection participants. Teachers in the intervention group will either be directly paid $40 per hour of PD (up to $800 total across the life of the study, with an estimated 20 hours to participate in the professional learning activities) if the PD is completed outside of working hours, or the study team will provide funds to the school or division to pay for substitute days to allow intervention teachers to complete the PD modules during their normally scheduled working hours. Teachers will be asked to self-report the modules and activities they complete on the Implementation Checklist, and the study team will calculate their stipends using their estimated number of hours for each module. Teachers will not be paid for implementing the activities in the classroom. Teacher compensation will be determined during the initial discussions with division administrators and school leaders and will follow division policy regarding teacher time, fair compensation, and availability of substitute teachers. Intervention teachers will also be given $50 per data-collection wave for completing the implementation surveys. In addition, all intervention and control teachers will be compensated for their time spent completing the surveys and teacher instructional logs ($50 per completed data-collection wave). The study team will also provide each participating division with $400 for its assistance with data exports from administrative data systems.
The information divisions provide during the recruitment process will include personally identifiable information (work email addresses and phone numbers) of school division staff, including division administrators, school principals, and preschool teachers. The study team needs this information to conduct outreach to these division staff, school principals, and teachers to inform them about the study and confirm their willingness to participate in the study.
The data collection activities, for which the study team will seek approval through a second OMB package, will collect classroom rosters from the participating preschool classrooms. The study team needs this information to inform parents/guardians about the study. The study team will also use the classroom rosters to link the students to the district administrative data.
All data-collection efforts, including those associated with recruitment, will be conducted in accordance with all relevant federal regulations and requirements. REL AP will be following the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, which requires “All collection, maintenance, use, and wide dissemination of data by the Institute” to “conform with the requirements of section 552 of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provisions Act (20 U.S.C. 1232g, 1232h).” These citations refer to the Privacy Act, the Family Educational Rights and Privacy Act, and the Protection of Pupil Rights Amendment.
Respondents will be assured that all information identifying them and their school will be kept confidential. All members of the study team have obtained their certification on the use of human subjects in research. The following safeguards to carry out confidentiality assurances are routinely employed at SRI International, the contractor executing this study:
All team members will participate in data-collection training that includes a focus on methods to maintain participant confidentiality and data security.
The study team will provide secure environments for housing all data collected for the study. Paper files will be stored in a locked file cabinet, and all digital files will be password protected so that only project researchers can access them.
The study team will immediately deidentify all data collected during the study that can potentially be linked to an individual and will delete temporary files that are stored on encrypted hard drives during on-site data-collection activities.
Only authorized members of the study team will have direct access to deidentified evaluation databases. Study team members will maintain a high level of focus on ensuring the confidentiality of both quantitative and qualitative data.
The team will not share data obtained for this study with any entity or individual other than the Department and will not use the data for purposes other than this evaluation.
The evaluation team will make certain that all data are held in strict confidentiality, as just described, and that in no instance will responses or data be made available except in aggregate statistical form. The following statement will appear on all letters to respondents on data collection:
Per the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific division, school, or individual. Any willful disclosure of such information for nonstatistical purposes, without the informed consent of the respondent, is a class E felony.
All survey responses will be kept strictly confidential. No school, district, or state staff member will have access to survey responses that include respondents’ names, school names, or other information that could potentially be used to identify individuals or schools. The project is currently under review by SRI International’s Institutional Review Board (10331).
In addition, for student information, the data-collection efforts will ensure that all individually identifiable information about students, their academic achievements, their families, and information with respect to individual schools, shall remain confidential in accordance with section 552a of Title 5, United States Code, the confidentiality standards of ESRA section 183(c), and sections 444 and 445 of the General Education Provisions Act. The study will also adhere to the requirements of ESRA section 183(d), which prohibits disclosure of individually identifiable information and makes the publishing or inappropriate communication of individually identifiable information by employees or staff a felony.
The evaluation team will protect the confidentiality of all information collected for the study and will use it for research purposes only. No information that identifies any study participant will be released publicly. Information from participating institutions and respondents will be presented at aggregate levels in reports. No individually identifiable information will be maintained by the study team upon study completion.
We plan to request administrative demographic data on students including gender and race/ethnicity, which we will use as covariates in our analyses. These covariates will increase the precision of the estimates of the intervention’s effect and allow for subgroup analyses to test whether the intervention is more effective for certain subgroups of students. These data are not sensitive primary data. These data will not be collected as part of recruitment activities.
The preliminary activities for which approval is requested in this submission include outreach to districts and schools for recruitment. The total response burden for these data collection activities is 95 hours. This is a one-time series of recruitment activities and there are no plans for follow-up or other recurring collections outside of what is being proposed in this package. Table 2 shows the hourly burden for the recruitment activities. The number of responses is 320.
Table 2. Estimated Annual Burden and Respondent Costs

| Information Activity | Sample Size | Response Rate | Number of Respondents | Responses per Respondent | Number of Responses | Average Burden Hours per Response | Total Annual Burden Hours | Estimated Respondent Hourly Wage | Total Annual Costs |
|---|---|---|---|---|---|---|---|---|---|
| Recruitment | | | | | | | | | |
| Division recruitment contact email | 12 | 100% | 12 | 1 | 12 | 0.10 | 1.2 | $50.00 | $60.00 |
| Recruitment follow-up for nonresponding divisions | 12 | 100% | 12 | 1 | 12 | 0.10 | 1.2 | $50.00 | $60.00 |
| Division recruitment phone call | 10 | 100% | 10 | 1 | 10 | 1.00 | 10 | $50.00 | $500.00 |
| First principal contact email | 60 | 100% | 60 | 1 | 60 | 0.10 | 6 | $50.00 | $300.00 |
| Recruitment follow-up for nonresponding principals | 60 | 100% | 60 | 1 | 60 | 0.10 | 6 | $50.00 | $300.00 |
| Principal recruitment phone call | 60 | 84% | 50 | 1 | 50 | 1.00 | 50 | $50.00 | $2,500.00 |
| Division informational webinar | 10 | 100% | 10 | 1 | 10 | 1.00 | 10 | $50.00 | $500.00 |
| First teacher contact email | 112 | 90% | 100 | 1 | 100 | 0.10 | 10 | $50.00 | $500.00 |
| Recruitment follow-up for nonresponding teachers | 12 | 50% | 6 | 1 | 6 | 0.10 | 0.6 | $50.00 | $30.00 |
| Subtotal | | | 320 | | 320 | | 95 | | $4,750.00 |
| Total | | | 320 | | 320 | | 95 | | $4,750.00 |
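As context for how the table’s derived columns relate, each row follows from simple products, and the totals are column sums:

\[
\text{total annual burden hours} = \text{number of responses} \times \text{average burden hours per response}, \qquad
\text{total annual cost} = \text{total burden hours} \times \text{hourly wage}.
\]

For example, principal recruitment phone calls contribute \(50 \times 1 = 50\) hours and \(50 \times \$50 = \$2{,}500\); summing across all rows gives 95 total burden hours and \(95 \times \$50 = \$4{,}750\) in total respondent costs.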
There are no start-up costs for respondents.
The total cost to the federal government for work conducted over all four years is $1,182,058.26, and the estimated annualized cost to the federal government for each year of the study is $295,514.57.
Funding includes staff time to recruit participants and to collect, clean, and analyze data from the study. The total also includes costs incurred by the study team related to study preparation and submission of the study information to IES (from proposed research design through reporting of results).
This is a request for a new collection of information.
The activities for which this first OMB package requests clearance will not be directly tabulated and published, but rather will be used to facilitate the selection of the study sample.
Once the sample is selected, the study team will conduct the data collection activities that will be included in the second OMB package. To estimate the impact of the toolkit PD resources on teacher and student outcomes (RQs 1 to 4), we will use hierarchical linear modeling (HLM) to adjust standard errors associated with the clustering of observations within schools and divisions (Raudenbush & Bryk, 2002), controlling for baseline scores on each outcome measure and other relevant covariates. HLM models estimating teacher-level impact will need to account for the nesting of teachers within schools, and models estimating student-level impacts will need to account for the nesting of students within classrooms and schools. Models will be estimated separately for each outcome. To analyze the implementation fidelity research questions (RQs 5 to 7), we will examine means, standard deviations, and frequencies (percentages) of outcome measures. Additionally, we will conduct qualitative analyses of the open-ended responses on the Toolkit Satisfaction Survey to examine how teachers used the toolkit resources as well as how teachers approached challenges and any suggested improvements teachers have for the toolkit.
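For context only, a simplified two-level specification of the kind implied by this plan (students nested in schools) is shown below; the final models will be documented in the analysis plan and may also include a classroom level and division fixed effects:

\[
\begin{aligned}
\text{Level 1 (students):}\quad & Y_{ij} = \beta_{0j} + \beta_{1}\,\mathrm{Pretest}_{ij} + \boldsymbol{\beta}_{2}^{\top}\mathbf{X}_{ij} + \epsilon_{ij} \\
\text{Level 2 (schools):}\quad & \beta_{0j} = \gamma_{00} + \gamma_{01}\,T_{j} + \boldsymbol{\gamma}_{02}^{\top}\mathbf{W}_{j} + u_{j}
\end{aligned}
\]

where \(Y_{ij}\) is the spring math outcome for student \(i\) in school \(j\), \(T_{j}\) indicates assignment to the toolkit condition, \(\mathbf{X}_{ij}\) and \(\mathbf{W}_{j}\) are student- and school-level covariates, \(\epsilon_{ij}\) and \(u_{j}\) are student- and school-level error terms, and \(\gamma_{01}\) is the estimated intervention effect. Teacher-outcome models would take an analogous form, with teachers nested in schools.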
The results of this study will be made available to the public through a peer-reviewed report published by IES. The study team will produce and disseminate a report on the efficacy study findings with an expected release in 2026. The primary audience consists of preschool teachers and instructional leaders, as the report will provide them with information on the extent to which the toolkit improved outcomes as well as implementation context and challenges. The secondary audience consists of IES and the REL team that developed the toolkit, as the findings will inform potential refinements to the toolkit.
The datasets from these studies will be turned over to the REL’s IES Contracting Officer’s Representative to become IES restricted-use data sets requiring a user’s license (see http://nces.ed.gov/pubs96/96860rev.pdf for procedures related to obtaining and using restricted-use datasets). These files will contain all the primary survey data collected for the study with all personal identifiers removed. All restricted-use files are required to be reviewed by IES’ Disclosure Review Board (DRB). The DRB is composed of members from each NCES division, representatives from IES’ Statistical Standards Program, and a member from each of the IES centers. The DRB will review disclosure risk analyses conducted by the REL contractor to ensure that released data do not disclose the identity of any individual respondent, and it approves the procedures used to remove direct identifiers from restricted-use data files. Administrative data will not be included in the data file, but instructions on how to obtain those data and information on how those data were used in the analysis will be included. These data files are made available so that other researchers can replicate the REL’s research or answer additional research questions.
No responses or data will be reported for individual staff members, students, or schools. Reported data will contain no fewer than four cases per reported table cell to protect confidentiality and mask individually identifiable data.
The Institute of Education Sciences is not requesting a waiver for the display of the OMB approval number and expiration date. The surveys and notification letters will display the expiration date for OMB approval.
This submission does not require an exception to the Certificate for Paperwork Reduction Act (5 CFR 1320.9).
Duncan, G. J., Dowsett, C. J., Claessens, A., Magnuson, K., Huston, A. C., & Klebanov, P. (2007). School readiness and later achievement. Developmental Psychology, 43, 1428–1446. https://doi.org/10.1037/0012-1649.43.6.1428
Education Sciences Reform Act, Public Law 107-279, 20 U.S.C. ch. 76 §§ 9501–9631 (2002).
Frye, D., Baroody, A. J., Burchinal, M., Carver, S. M., Jordan, N. C., & McDowell, J. (2013). Teaching math to young children practice guide (NCEE 2014-4005). U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance. https://ies.ed.gov/ncee/wwc/practiceguide/18
Ginsburg, H. P., & Pappas, S. (2016). Invitation to the birthday party: Rationale and description. ZDM Mathematics Education, 48, 947–960. https://doi.org/10.1007/s11858-016-0818-4
Raudenbush, S. W., & Bryk, A. S. (2002). Hierarchical linear models: Applications and data analysis methods (2nd ed.). Sage.
Reid, E. E., & Melgar, C. (2018, March). A center-based approach to changing teacher math attitudes in Head Start centers. Poster presented at the annual Erikson Research Symposium.
VDOE. (n.d.). Fall Membership Build-A-Table. https://p1pe.doe.virginia.gov/apex_captcha/home.do?apexTypeId=304
Watts, T. W., Duncan, G. J., Siegler, R. S., & Davis-Kean, P. (2014). What’s past is prologue: Relations between early mathematics knowledge and high school achievement. Educational Researcher, 43(7), 352–360.