Impact Evaluation of Teacher Residency Programs
Supporting Statement for Paperwork Reduction Act Submission
PART A: Justification
December 2020
Submitted to:
U.S. Department of Education
Institute of Education Sciences
550 12th Street, SW
Washington, DC 20202
Attn: Meredith Bachman, Project Officer
Contract: 91990019C0066

Submitted by:
Mathematica
P.O. Box 2393
Princeton, NJ 08543-2393
Phone: (609) 799-3535
Fax: (609) 799-0005
Project Director: Jill Constantine
Project Reference: 50911
Contents
A.1. Circumstances making collection of data necessary
A.2. Purpose and use of data
A.3. Use of technology to reduce burden
A.3.1. Classroom rosters from districts
A.3.2. Administrative data from districts on teachers and students
A.3.3. Teacher surveys and mentor teacher surveys
A.3.4. Residency program interviews and cost interviews
A.4. Efforts to identify and avoid duplication
A.5. Efforts to minimize burden on small businesses or other entities
A.6. Consequences of not collecting data
A.7. Special circumstances
A.8. Federal Register announcement and consultation
A.8.1. Federal Register Announcement
A.8.2. Consultations Outside the Agency
A.9. Payments to respondents
A.10. Assurance of confidentiality
A.11. Questions of a sensitive nature
A.12. Estimates of respondent burden
A.13. Estimates of the cost burden to respondents
A.14. Estimates of annualized government costs
A.15. Changes in hour burden
A.16. Plans for analysis, tabulation, and publication of results
A.17. Approval to not display expiration date
A.18. Exceptions to Certification Statement
Exhibits
Exhibit A.1. Key research questions
Exhibit A.2. Data to be collected and the purpose of each
Exhibit A.3. Estimate of respondent time burden by year, current request
Exhibit A.4. Estimate of respondent cost burden by year, current request
Exhibit A.5. Analysis methods planned to answer the research questions in each report
The U.S. Department of Education's (ED) Institute of Education Sciences (IES) requests clearance for data collection activities to support the first large-scale, rigorous study of teacher residency programs. These programs are rapidly growing in popularity as a potential way to address persistent inequities in student access to high-quality teachers. They combine education coursework with an extensive, full-year apprenticeship, or "residency," under the supervision of an experienced mentor teacher, to prepare teachers to fill hard-to-staff positions in partner districts. The programs also provide financial support to residents in exchange for a commitment to stay in those districts for at least three to five years.
This initial request covers the collection of classroom rosters from schools. The rosters are needed before the study begins so that students can be randomly assigned to participating teachers. A future request will seek clearance for the data collection activities needed later in the study to examine program outcomes.
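To illustrate how the rosters support the study design, the sketch below shows one simple way a grade's roster could be randomized across participating teachers. It is a minimal illustration under assumed conditions: the function, seed, and IDs are hypothetical, and the study's actual randomization procedure is not specified in this request.

```python
import random

def assign_students(roster, teachers, seed=20210901):
    """Randomly split one grade's roster across participating teachers.

    roster   -- list of student IDs from the fall classroom roster
    teachers -- list of participating teacher IDs in the same school and grade
    Returns a dict mapping each teacher ID to the students assigned to them.
    """
    rng = random.Random(seed)  # fixed seed makes the assignment reproducible and auditable
    shuffled = list(roster)
    rng.shuffle(shuffled)
    assignment = {teacher: [] for teacher in teachers}
    for i, student in enumerate(shuffled):
        assignment[teachers[i % len(teachers)]].append(student)  # round-robin after shuffling
    return assignment

# Example: six students split between a residency graduate and a comparison teacher
print(assign_students(["s1", "s2", "s3", "s4", "s5", "s6"], ["t_resident", "t_comparison"]))
```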
A.1. Circumstances making collection of data necessary

Collecting information about residency programs is critical given ED's need to provide rigorous evidence on promising approaches to teacher preparation and the increasing popularity of the residency approach. Better preparing new teachers and retaining effective ones is a central goal of Title IIA of the Every Student Succeeds Act (ESSA). Seen as a promising way to achieve these goals, teacher residency programs have grown considerably since the first programs began in the early 2000s, with substantial investment from ED. In addition, ESSA, passed in 2015, allows states and districts to use Title IIA funds to support teacher residency programs, and 15 states and the District of Columbia have promoted these programs in their ESSA plans (National Center for Teacher Residencies 2018).
Despite the growth in residency programs, there is little rigorous evidence on whether they are an effective approach for preparing new teachers and addressing persistent inequities in access to high-quality teachers. Some evidence indicates that residency graduates are more effective (Garrison 2019; Papay et al. 2012) and have higher retention in their districts (Papay et al. 2012; Silva et al. 2015) than other teachers. However, these findings come from small-scale, nonexperimental studies and are mixed. Additionally, there is no evidence on the costs or cost-effectiveness of teacher preparation programs. This study will provide evidence on the effectiveness and costs of teacher residency programs and will examine ways to improve teacher preparation more broadly.
A.2. Purpose and use of data

IES has contracted with Mathematica and its partners (the National Center for Teacher Residencies, Decision Information Resources, Clowder Consulting, and IRIS Connect; together, the "study team") to conduct the study, including all data collection. The study will collect data to answer and report on the research questions shown in Exhibit A.1.
Exhibit A.1. Key research questions

- What are the characteristics of residency programs nationally, and what strategies do they use to better prepare new teachers?
- Are residency graduates more effective than other teachers? Does this change as teachers progress in their careers?
- Do residency graduates remain in teaching longer than other teachers?
- What explains any differences in effectiveness and retention between residency graduates and other teachers?
- Are residency programs a cost-effective strategy for improving student achievement?
Exhibit A.2 lists the types of data to be collected and the purpose of each.
Exhibit A.2. Data to be collected and the purpose of each

| Data source | Sample | Respondent | Mode and timing | Use(s) in study |
|---|---|---|---|---|
| Data collections included in current clearance request | | | | |
| Classroom rosters | Students assigned to participating teachers in fall 2021 and fall 2022 | School administrative staff | Electronic or paper form collected four times in 2021–2022 and 2022–2023 (before and after random assignment) | Randomly assign students to teachers and track any mobility during the school year |
| Data collections included in future clearance request | | | | |
| Administrative data on students | Students assigned to participating teachers in fall 2021 and fall 2022 | District data office staff | Electronic records collected in fall 2022 and fall 2023 | Estimate residency graduates' effectiveness in improving student achievement in English language arts and math; describe the student sample |
| Administrative data on teachers | Participating teachers in 2021–2022 and 2022–2023 | District data office staff | Electronic records collected in fall 2022 and fall 2023 | Compare retention in school and district between residency graduates and other teachers in the study |
| Teacher surveys | Participating teachers in 2021–2022 and 2022–2023 | Teachers | Web survey in spring 2022 and 2023 | Describe teachers' preparation experiences, ongoing support from preparation program and district, preparedness for teaching, job satisfaction, and background characteristics |
| Classroom observations | Classrooms of participating teachers | Study team staff | Video recordings of two lessons per participating teacher in 2021–2022 and 2022–2023, scored using the Classroom Assessment Scoring System (CLASS) rubric | Compare teaching practices of residency graduates and other teachers in the study |
| Residency program interviews | All residency programs nationwide | Residency program directors | Telephone interviews in summer 2022 | Describe key features of residency programs (approaches to candidate recruitment and selection; placement of participants in residency and first teaching jobs; coursework requirements and integration with classroom experience; characteristics of the residency; teaching strategies for special populations; financial support for residents and other costs; selection, training, and compensation of mentors); examine how these features relate to residency graduates' effectiveness |
| District cost interviews | Participating districts | District residency coordinator | Telephone interviews in spring 2022 | Describe the cost to districts of hiring residency graduates |
| Mentor teacher surveys | Residency program mentor teachers in participating districts in 2021–2022 | Mentor teachers | Web survey in spring 2022 | Describe mentor teachers' perceptions of how serving as a mentor for a resident affects their own teaching skills |
A.3. Use of technology to reduce burden

To minimize burden on study participants, the study team will use strategies that have proven successful in past studies the team has conducted with similar populations of teacher preparation program directors, district and school administrators, teachers, and students (including the Impact Study of Feedback for Teachers Based on Classroom Videos, An Implementation Evaluation of Teacher Residency Programs, and the Impact on Secondary Student Math Achievement of Highly Selective Routes to Alternative Certification). Strategies that use technology to minimize burden are described below for each type of data collection.
A.3.1. Classroom rosters from districts

To minimize burden on school administrative staff, the study team will collect classroom rosters in electronic form using a secure file-sharing site (Appendix A). Schools will have the flexibility to submit electronic data in a wide range of file formats (for example, Excel, CSV, or SAS). Schools will also have the option of submitting paper versions of the classroom rosters.
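As an illustration of how a single intake step can accept the formats named above, the sketch below loads a submitted roster into one common structure. This is a hypothetical helper, not the study team's actual tooling; the file extensions and function name are assumptions.

```python
from pathlib import Path

import pandas as pd

# Map each accepted file extension to a pandas reader (extensions are assumptions).
READERS = {
    ".xlsx": pd.read_excel,    # Excel workbooks
    ".csv": pd.read_csv,       # comma-separated text
    ".sas7bdat": pd.read_sas,  # SAS data sets
}

def load_roster(path):
    """Load a submitted classroom roster into a DataFrame, whatever its format."""
    suffix = Path(path).suffix.lower()
    if suffix not in READERS:
        raise ValueError(f"Unsupported roster format: {suffix}")
    return READERS[suffix](path)
```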
A.3.2. Administrative data from districts on teachers and students

To minimize burden on district data office staff, the study team will collect administrative data in electronic form using a secure file-sharing site. Districts will have the flexibility to submit electronic data in a wide range of file formats (for example, Excel, CSV, or SAS).
A.3.3. Teacher surveys and mentor teacher surveys

To minimize burden on survey respondents, the study team will administer the surveys in a user-friendly, web-based format. Features of the surveys include the following:
Secure personalized access. All survey respondents will receive a customized link to the survey and be able to complete it at the time and location of their choice. The survey software will also allow respondents to save responses and return to the survey later to finish at their convenience.
Automated skip patterns. Skip logic embedded in the surveys will minimize respondent burden by omitting non-applicable questions. This type of programming also reduces entry errors that may require follow-up contacts to gather correct information.
Automated validation checks. The software will check numeric responses against allowable ranges, minimizing out-of-range or unallowable values and further reducing entry errors that could require follow-up contacts.
Closed-ended questions. These types of questions reduce burden on respondents by minimizing the effort required to respond. Some questions will include “other, specify” options to ensure respondents have an opportunity to enter information that does not fit pre-existing options.
A.3.4. Residency program interviews and cost interviews

Trained interviewers will schedule and conduct the telephone interviews at times convenient for the interviewees. Interviewers will follow a structured protocol designed to be completed within the established time frame. Interviewers will also alert interviewees in advance about any information they may need to compile to help them answer the questions, which will save time during the interview and allow respondents to gather the information at their convenience.
A.4. Efforts to identify and avoid duplication

Whenever possible, the study team will use administrative data and publicly available data from ED's website to gather the information needed to address the study's research questions. For example, the team will use student test scores from district administrative records to measure residency graduates' effectiveness and information from ED's Common Core of Data to measure school demographics. The study team has built on prior studies of teacher preparation programs (such as An Implementation Evaluation of Teacher Residency Programs and the Impact on Secondary Student Math Achievement of Highly Selective Routes to Alternative Certification) to design instruments that are clear and concise. The information to be collected in each of the data collection activities is not available elsewhere.
A.5. Efforts to minimize burden on small businesses or other entities

The study will not collect any information from small businesses but may collect information from small teacher residency programs. The data collection procedures have been designed to minimize burden on teacher residency programs both large and small. The study team will schedule the telephone interviews with residency program directors at their convenience to minimize any disruption to their regular responsibilities.
A.6. Consequences of not collecting data

Collecting the data for this study is necessary for ED to provide rigorous evidence on the effectiveness and costs of the promising approach to teacher preparation used by residency programs. The data are also needed to identify effective strategies used by residency programs that teacher preparation programs more broadly could adopt to better prepare new teachers. If these data are not collected, policymakers, districts, residency programs, and the public will be less informed about the effectiveness of an increasingly common approach to teacher preparation and about the effective use of Title IIA funds.
A.7. Special circumstances

There are no special circumstances associated with this data collection.
A.8. Federal Register announcement and consultation

A.8.1. Federal Register Announcement

The 60-day Federal Register notice (Appendix B) was published on December 15, 2020, at 85 FR 81189. There were no public comments. The 30-day Federal Register notice will be published to solicit additional public comment.
A.8.2. Consultations Outside the Agency

The study team, in partnership with ED, has formed an external technical working group to provide guidance on the study design, instrumentation, and data collection. The technical working group includes the following six members:
David Blazar, Assistant Professor of Education Policy and Economics, University of Maryland College of Education (evaluation methods)
Kwamé Floyd, Office of Strategic Operations, New Jersey Department of Education (residency program operations)
Brooke James, Dean, Relay Graduate School of Education (residency program operations)
James Kemple, Executive Director of the Research Alliance for New York City Schools and Research Professor at the Steinhardt School of Culture, Education, and Human Development at New York University (evaluation methods)
Rebecca Sieg, Urban Teachers (residency program operations)
James Wyckoff, Curry Memorial Professor of Education and Policy, Professor at the Frank Batten School of Leadership and Public Policy, and Director of EdPolicyWorks, University of Virginia (evaluation methods)
The study team convened the technical working group in June 2020 to review the study design and ensure the study will provide information that policymakers, school districts, residency programs, and other teacher preparation programs can use to improve teacher preparation. The study team plans to convene the technical working group again before the public release of the final report.
A.9. Payments to respondents

To maximize the success of the data collection effort, we propose incentives to offset the time and effort required of school administrative staff, teachers, and mentor teachers. Incentives will also help increase response rates, reduce nonresponse bias, and improve survey representativeness, all of which will increase the reliability of the study findings (James 1997; Goritz 2006; Groves et al. 2006; Messer and Dillman 2011; Singer and Kulka 2002; Singer and Ye 2013). The proposed incentive amounts are consistent with guidelines in the March 22, 2005, memorandum, "Guidelines for Incentives for NCEE Evaluation Studies," prepared for OMB, including the standard practice of linking the dollar amounts to the extent of burden.
Teacher incentive for collecting parent permission forms. To compensate teachers for their time collecting parent permission forms, we propose an incentive ranging from $25 to $50. Parents will use the forms to give permission for their children to be included in classroom video recordings. A high return rate will be critical for ensuring that the recordings accurately capture teachers' performance. All teachers will receive a $25 gift card for distributing the forms. In addition, in districts that require active parental consent to include students in the video recordings, we will offer teachers an additional $25 gift card for collecting parent permission forms from at least 85 percent of their students. The maximum incentive of $50 for any one teacher works out to roughly $2 per form for a typical class of about 25 students. This is less than the $3-per-student recommendation for low-burden teacher ratings of students, the closest analog in the NCEE memorandum. We expect teachers will have to remind students and call or email parents to obtain completed forms from 85 percent of their students.
Teacher incentive for teacher survey. To compensate teachers for the 30 minutes required to complete the teacher survey, we propose a $30 incentive for completion.
Mentor teacher incentive for mentor teacher survey. To compensate mentor teachers for the 30 minutes required to complete the mentor teacher survey, we propose a $30 incentive for completion.
A.10. Assurance of confidentiality

Mathematica and its research partners will conduct all data collection activities for this study in accordance with the following relevant regulations and requirements:
The Privacy Act of 1974, P.L. 93-579 (5 U.S.C. 552a)
The Family Educational Rights and Privacy Act (FERPA) (20 U.S.C. 1232g; 34 CFR Part 99)
The Protection of Pupil Rights Amendment (PPRA) (20 U.S.C. 1232h; 34 CFR Part 98)
The Education Sciences Reform Act of 2002, Title I, Part E, Section 183
All Mathematica employees sign a confidentiality pledge (Appendix C) that emphasizes the importance of confidentiality and describes employees’ obligations to maintain it. In addition, the study team will take the following steps to protect the confidentiality of all data collected for the study:
All data will be stored in secure areas accessible only to authorized staff members and will be destroyed as soon as they are no longer required.
Personally identifiable information will be kept separate from analysis data files and password-protected. The study team will assign each respondent a unique identification number and use those numbers to construct raw data and analysis files (see the sketch after this list).
Access to hard copy documents will be strictly limited. Documents will be stored in locked files and cabinets. Discarded materials will be shredded.
Secure transfer sites with limited access will be created and maintained for the duration of the administrative data collection task.
In public reports, residency program and district findings will be presented in aggregate by type of district respondent or for subgroups of interest. No reports will identify individual respondents or school districts.
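The separation of identifiers from analysis data can be illustrated with a minimal sketch. This is illustrative only; the study team's actual systems and procedures are not described in this request, and the function and ID format below are hypothetical.

```python
import secrets

def build_crosswalk(respondents):
    """Map each respondent to a random study ID.

    The returned crosswalk (PII -> study ID) would be stored separately from
    analysis files, in a password-protected location with restricted access.
    """
    crosswalk = {}
    for name in respondents:
        study_id = f"R{secrets.token_hex(4)}"  # hypothetical ID format, e.g., 'R9f3a1c2b'
        crosswalk[name] = study_id
    return crosswalk

# Analysis files would carry only the study IDs, never names or other PII.
crosswalk = build_crosswalk(["Respondent A", "Respondent B"])
```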
All data collection forms will include the following or similar language:
“Responses to this data collection will be used only for research purposes. The report prepared for this study will summarize findings across the sample and will not associate responses with a specific residency program, district, school, or individual. We will not provide information that identifies you, your school, or your district to anyone outside the study team, except as required by law. Additionally, no one at your school or in your district will see your responses.”
To ensure that study participants are properly protected, Mathematica’s Institutional Review Board will review the study design protocols, informed consent process, data security plan, and all data collection instruments and procedures.
A.11. Questions of a sensitive nature

The study does not include questions of a sensitive nature.
A.12. Estimates of respondent burden

The total annual respondent burden for the data collection effort covered by this clearance request is 117 hours. Exhibit A.3 presents the estimated time burden to respondents, and Exhibit A.4 presents the estimated cost burden. The following assumptions informed these burden estimates:
School administrative staff. Administrative staff at each participating school will compile and submit classroom rosters (Appendix A). The study team will request classroom rosters at four points during the school year: first to assign students to classes and then to verify that the students have remained in their assigned classes. Compiling and submitting these data will take approximately 0.25 hours per classroom per request. Each school will have approximately five participating classrooms, for a total of 175 classrooms across approximately 35 schools each year, or 350 classrooms over the two school years. With four requests per classroom, there will be 1,400 requests across all participating classrooms. The cost to schools is based on an average hourly wage of $20.87 in 2019 for Secretaries and Administrative Assistants (Bureau of Labor Statistics [BLS] 2020).
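For transparency, the burden figures in Exhibits A.3 and A.4 follow from this arithmetic (the division by three in the last line assumes the standard three-year OMB clearance period, which is the only reading consistent with the 117-hour annual figure above):

```latex
\begin{align*}
\text{Burden per school year} &= 175 \text{ classrooms} \times 4 \text{ requests} \times 0.25 \text{ hours} = 175 \text{ hours}\\
\text{Annual cost per school year} &= 700 \text{ responses} \times (\$20.87 \times 0.25) = \$3{,}652.25\\
\text{Total burden} &= 2 \text{ years} \times 175 \text{ hours} = 350 \text{ hours}\\
\text{Annualized burden} &= 350 \text{ hours} \div 3 \text{ years} \approx 117 \text{ hours per year}
\end{align*}
```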
Exhibit A.3. Estimate of respondent time burden by year, current request

| Respondent type | Time per response (hours) | Maximum number of responses | Number of respondents | Total time burden (hours) |
|---|---|---|---|---|
| 2021–2022 school year (July 1, 2021 – June 30, 2022) | | | | |
| School administrative staff | | | | |
| Classroom rosters | 0.25 | 4 | 175 | 175 |
| 2021–2022 total hours | | | | 175 |
| 2022–2023 school year (July 1, 2022 – June 30, 2023) | | | | |
| School administrative staff | | | | |
| Classroom rosters | 0.25 | 4 | 175 | 175 |
| 2022–2023 total hours | | | | 175 |
| Total burden across all years | | | | 350 |
| Average burden per year | | | | 117 |
Exhibit A.4. Estimate of respondent cost burden by year, current request

| Data collection | Annual salary estimate | Average hourly wage | Time per response (hours) | Maximum number of responses | Cost per response | Number of respondents | Total cost for responses |
|---|---|---|---|---|---|---|---|
| 2021–2022 school year (July 1, 2021 – June 30, 2022) | | | | | | | |
| School administrative staff | | | | | | | |
| Classroom rosters | $43,410 | $20.87a | 0.25 | 4 | $5.22 | 175 | $3,652.25 |
| 2021–2022 total cost | | | | | | | $3,652.25 |
| 2022–2023 school year (July 1, 2022 – June 30, 2023) | | | | | | | |
| School administrative staff | | | | | | | |
| Classroom rosters | $43,410 | $20.87 | 0.25 | 4 | $5.22 | 175 | $3,652.25 |
| 2022–2023 total cost | | | | | | | $3,652.25 |
| Total cost across all years | | | | | | | $7,304.50 |
| Average cost per year | | | | | | | $2,434.83 |

a The cost for school administrative staff is based on an average hourly wage of $20.87 in 2019 for Secretaries and Administrative Assistants (BLS 2020).
A.13. Estimates of the cost burden to respondents

There are no direct or start-up costs to respondents associated with this data collection.
A.14. Estimates of annualized government costs

The estimated cost to the federal government of the study, including its design, data collection activities, recruiting, analysis, and reporting, is $8,200,988. The estimated average annual cost is $1,366,831 (the total cost divided by the six years of the study).
A.15. Changes in hour burden

This is a request for a new collection of information.
A.16. Plans for analysis, tabulation, and publication of results

The study will produce two reports: one about the strategies residency programs use to recruit and train candidates, and one about the effectiveness and retention of residency graduates relative to other teachers in the same schools. Exhibit A.5 describes the main analyses for each report, along with the expected publication dates.
Exhibit A.5. Analysis methods planned to answer the research questions in each report

| Research question | Analysis method |
|---|---|
| Report 1: Strategies used by residency programs nationwide (expected publication fall 2023) | |
| What are the characteristics of residency programs nationally, and what strategies do they use to better prepare new teachers? | Descriptive analysis of residency program characteristics and strategies for recruiting and preparing teachers |
| Report 2: Effectiveness and retention of residency graduates (expected publication spring 2025) | |
| Are residency graduates more effective than other teachers? Does this change as teachers progress in their careers? | Regression analysis comparing test scores of students assigned to residency graduates and students assigned to other teachers in the same grades and schools, controlling for teacher experience; subgroup analyses for early-career teachers (those in their first five years of teaching) and more experienced teachers (those with at least six years of experience) |
| Do residency graduates remain in teaching longer than other teachers? | Survival analysis to examine whether residency graduates are more likely to remain in teaching than other teachers |
| What explains any differences in effectiveness and retention between residency graduates and other teachers? Are the differences explained by the types of candidates residency programs select? By particular features of residency programs? | Subgroup analyses by program features; mediation analyses using regression models with and without controls for teacher characteristics |
| Are residency programs a cost-effective strategy for improving student achievement? | Cost-effectiveness analysis comparing districts' costs of hiring a residency graduate (over and above the cost of hiring another teacher) with the benefits in terms of improved student achievement |
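As an illustration of the regression comparison planned for Report 2, the analysis could take a form like the model below. The study's actual specification is not given in this statement, so the functional form and variable names here are assumptions, not the study's method.

```latex
y_{ics} = \alpha_{gs} + \beta\,\text{ResidencyGrad}_{c} + \gamma\,\text{Experience}_{c} + \varepsilon_{ics}
```

Here, $y_{ics}$ is the test score of student $i$ in classroom $c$ at school $s$; $\alpha_{gs}$ are grade-by-school fixed effects, which restrict comparisons to teachers in the same grades and schools; $\text{ResidencyGrad}_{c}$ indicates that the classroom's teacher is a residency graduate; and $\text{Experience}_{c}$ controls for teacher experience. The coefficient of interest, $\beta$, is the achievement difference between students of residency graduates and students of other teachers.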
A.17. Approval to not display expiration date

No exemption is requested. The data collection instruments will display the expiration date.
A.18. Exceptions to Certification Statement

This submission does not require an exception to the Certification Statement for Paperwork Reduction Act submissions (5 CFR 1320.9).
References

American Association of Colleges for Teacher Education. "Teacher Quality Partnership Grants 2019 Fact Sheet." 2019. Available at https://secure.aacte.org/apps/rl/res_get.php?fid=1329&ref=rl. Accessed April 20, 2020.
Bureau of Labor Statistics, U.S. Department of Labor. “Occupational Employment and Wages.” 2020. Available at https://www.bls.gov/oes/current/oes_nat.htm. Accessed October 29, 2020.
Garrison, Anne Walton. “Memphis Teacher Residency: Teacher Effectiveness in 2017–18.” Memphis, TN: Memphis City Schools, Department of Research and Performance Management, February 2019.
Goritz, A. “Incentives in Web Studies: Methodological Issues and a Review.” International Journal of Internet Science, vol. 1, no. 1, 2006, pp. 58–70.
Groves, R.M., M.P. Couper, S. Presser, E. Singer, R. Tourangeau, G. Acosta, and L. Nelson. “Experiments in Producing Nonresponse Bias.” Public Opinion Quarterly, vol. 70, no. 5, 2006, pp. 720–736.
James, T. “Results of Wave 1 Incentive Experiment in the 1996 Survey of Income and Program Participation.” Proceedings of the Survey Research Section, American Statistical Association, 1997, pp. 834–883.
Messer, B., and D. Dillman. “Surveying the General Public Over the Internet Using Address-Based Sampling and Mail Contact Procedures.” Public Opinion Quarterly, vol. 75, 2011, pp. 429–457.
National Center for Education Statistics, U.S. Department of Education. Digest of Education Statistics, 2019. Available at https://nces.ed.gov/programs/digest/d19/tables/dt19_211.20.asp. Accessed November 11, 2020.
National Center for Teacher Residencies. “2017–18 Network Partner Report.” Chicago, IL: National Center for Teacher Residencies, 2018.
Papay, J.P., M.R. West, J.B. Fullerton, and T.J. Kane. “Does an Urban Teacher Residency Increase Student Achievement? Early Evidence from Boston.” Educational Evaluation and Policy Analysis, vol. 34, no. 4, 2012, pp. 413–434. doi:10.3102/0162373712454328.
Silva, Tim, Allison McKie, and Philip Gleason. “New Findings on the Retention of Novice Teachers from Teaching Residency Programs.” NCEE Evaluation Brief 2015-4015. Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education, August 2015.
Singer, E., and R.A. Kulka. “Paying Respondents for Survey Participation.” In Studies of Welfare Populations: Data Collection and Research Issues. Panel on Data and Methods for Measuring the Effects of Changes in Social Welfare Programs, edited by Michele Ver Ploeg, Robert A. Moffitt, and Constance F. Citro, Committee on National Statistics, Division of Behavioral and Social Sciences and Education. Washington, DC: National Academy Press, 2002, pp. 105–128.
Singer, E., and C. Ye. "The Use and Effects of Incentives in Surveys." Annals of the American Academy of Political and Social Science, vol. 645, no. 1, 2013, pp. 112–141.