Impact Evaluation of Teacher Residency Programs
Supporting Statement for Paperwork Reduction Act Submission
PART A: Justification
September 2021
Submitted to:
U.S. Department of Education
Institute of Education Sciences
550 12th Street, SW
Washington, DC 20202
Attn: Meredith Bachman, Project Officer
Contract: 91990019C0066

Submitted by:
Mathematica
P.O. Box 2393
Princeton, NJ 08543-2393
Phone: (609) 799-3535
Fax: (609) 799-0005
Project Director: Melissa Clark
Project Reference: 50911
Contents
A.1. Circumstances making collection of data necessary
A.2. Purpose and use of data
A.3. Use of technology to reduce burden
A.3.1. Classroom rosters from districts
A.3.2. Administrative data from districts on teachers and students
A.3.3. Teacher surveys
A.3.4. Residency program interviews and cost interviews
A.4. Efforts to identify and avoid duplication
A.5. Efforts to minimize burden on small businesses or other entities
A.6. Consequences of not collecting data
A.7. Special circumstances
A.8. Federal Register announcement and consultation
A.8.1. Federal Register announcement
A.8.2. Consultations outside the agency
A.9. Payments to respondents
A.10. Assurance of confidentiality
A.11. Questions of a sensitive nature
A.12. Estimates of respondent burden
A.13. Estimates of the cost burden to respondents
A.14. Estimates of annualized government costs
A.15. Changes in hour burden
A.16. Plans for analysis, tabulation, and publication of results
A.17. Approval to not display expiration date
A.18. Exceptions to Certification Statement
APPENDIX D: ADMINISTRATIVE DATA REQUEST FORM
APPENDIX E: TEACHER SURVEY
APPENDIX F: PROGRAM INTERVIEW
APPENDIX G: COST INTERVIEW
APPENDIX H: 60-DAY FEDERAL REGISTER NOTICE
APPENDIX I: RESPONSE TO 60-DAY PUBLIC COMMENTS
Exhibits

Exhibit A.1. Key research questions
Exhibit A.2. Data collection activities, samples, respondents, timing, and uses
Exhibit A.3. Estimate of respondent time burden by year, current request
Exhibit A.4. Estimate of respondent cost burden by year, current request
Exhibit A.5. Analysis methods planned to answer the research questions in each report
The U.S. Department of Education's (ED) Institute of Education Sciences (IES) requests clearance for data collection activities to support the first large-scale, rigorous study of teacher residency programs. These programs are rapidly growing in popularity as a potential way to address persistent inequities in student access to high-quality teachers. They combine education coursework with an extensive, full-year apprenticeship, or "residency," under the supervision of an experienced mentor teacher to prepare teachers to fill hard-to-staff positions in partner districts. They also provide financial support to residents in exchange for a commitment to stay in those districts for at least three to five years.
This request, the second of two for the study, covers data collection activities that support describing residency programs nationally and estimating the effectiveness of teachers who graduate from these programs. A prior request (OMB 1850-0960, approved 4/26/2021) covered the collection of classroom rosters from schools to support random assignment of students to participating teachers.
A.1. Circumstances making collection of data necessary

Collecting information about residency programs is critical given ED's need to provide rigorous evidence on promising approaches to teacher preparation and the increasing popularity of the residency approach. Better preparing new teachers and retaining effective ones are central goals of Title IIA of the Every Student Succeeds Act (ESSA). Seen as a promising way to achieve these goals, teacher residency programs have grown considerably since the first programs began in the early 2000s, with substantial investment from ED. In addition, ESSA, passed in 2015, allows states and districts to use Title IIA funds to support teacher residency programs, and 15 states and the District of Columbia have promoted these programs in their ESSA plans (National Center for Teacher Residencies 2018).
Despite the growth in residency programs, there is little rigorous evidence on whether they are an effective approach for preparing new teachers and addressing persistent inequities in access to high-quality teachers. Some evidence indicates that residency graduates are more effective (Garrison 2019; Papay et al. 2012) and have higher retention in their districts (Papay et al. 2012; Silva et al. 2015) than other teachers. However, these findings come from small-scale, nonexperimental studies, and results are mixed. In addition, there is no evidence on the costs or cost-effectiveness of teacher preparation programs. This study will provide evidence on the effectiveness and costs of teacher residency programs and will examine ways to improve teacher preparation more broadly.
A.2. Purpose and use of data

IES has contracted with Mathematica and its partners, the National Center for Teacher Residencies, Decision Information Resources, Clowder Consulting, and IRIS Connect (together, the "study team"), to conduct the study, including all data collection. The study will collect data to answer and report on the questions shown in Exhibit A.1.
Exhibit A.1. Key research questions

1. What are the characteristics of residency programs nationally, what strategies do they use to better prepare new teachers, and what are the characteristics of those graduating from these programs?
2. Are residency graduates more effective than other teachers? Does this change as teachers progress in their careers?
3. Do residency graduates remain in teaching longer than other teachers?
4. What explains any differences in effectiveness and retention between residency graduates and other teachers?
5. Are residency programs a cost-effective strategy for improving student achievement?
Exhibit A.2 lists the types of data to be collected and the purpose of each.
Exhibit A.2. Data collection activities, samples, respondents, timing, and uses

Data source | Sample | Respondent | Mode and timing | Use(s) in study
Data collections included in previous clearance request (OMB 1850-0960, approved 4/26/2021)
Classroom rosters | Students assigned to participating teachers in fall 2021 and fall 2022 | School administrative staff | Electronic or paper form collected four times in 2021–2022 and 2022–2023 (before and after random assignment) | Randomly assign students to teachers and track any mobility during the school year
Data collections included in current clearance request
Administrative data on students | Students assigned to participating teachers in fall 2021 and fall 2022 | District data office staff | Electronic records collected in fall 2022 and fall 2023 | Estimate residency graduates' effectiveness in improving student achievement in English language arts and math; describe the student sample
Administrative data on teachers | Participating teachers in 2021–2022 and 2022–2023 | District data office staff | Electronic records collected in fall 2022 and fall 2023 | Compare retention in school and district between residency graduates and other teachers in the study
Teacher surveys | Participating teachers in 2021–2022 and 2022–2023 | Teachers | Web survey in spring 2022 and spring 2023 | Describe teachers' preparation experiences, ongoing support from preparation program and district, preparedness for teaching, job satisfaction, and background characteristics
Residency program interviews | All residency programs nationwide | Residency program directors | Telephone interviews in summer 2022 | Describe key features of residency programs, including their approaches to candidate recruitment and selection; placement of program participants in residency and first teaching jobs; coursework requirements and integration with classroom experience; characteristics of the residency; teaching strategies for special populations; financial support for residents and other costs; and selection, training, and compensation of mentors. Examine how these features relate to residency graduates' effectiveness
District cost interviews | Participating districts | District residency coordinators | Telephone interviews in spring 2022 | Describe the cost to districts of hiring residency graduates
A.3. Use of technology to reduce burden

To minimize burden on study participants, the study team will use strategies that have proven successful in its past studies with similar populations of teacher preparation program directors, district and school administrators, teachers, and students (these include the Impact Study of Feedback for Teachers Based on Classroom Videos, An Implementation Evaluation of Teacher Residency Programs, and the Impact on Secondary Student Math Achievement of Highly Selective Routes to Alternative Certification). Strategies that use technology to minimize burden are described below for each type of data collection in the study.
A.3.1. Classroom rosters from districts

To minimize burden on school administrative staff, the study team will collect classroom rosters in electronic form using a secure file sharing site (Appendix A, approved 4/26/2021). Schools will have the flexibility to submit electronic data in a wide range of file formats (for example, Excel, csv, or SAS). Schools will also have the option of submitting paper versions of the classroom rosters.
A.3.2. Administrative data from districts on teachers and students

To minimize burden on district data office staff, the study team will collect administrative data in electronic form using a secure file sharing site (Appendix D). Districts will have the flexibility to submit electronic data in a wide range of file formats (for example, Excel, csv, or SAS).
A.3.3. Teacher surveys

To minimize burden on survey respondents, the study team will administer the teacher surveys in a user-friendly, web-based format (Appendix E). Features of the surveys include the following (a brief illustrative sketch follows this list):
Secure personalized access. All survey respondents will receive a customized link to the survey and be able to complete it at the time and location of their choice. The survey software will also allow respondents to save responses and return to the survey later to finish at their convenience.
Automated skip patterns. Skip logic embedded in the surveys will minimize respondent burden by omitting non-applicable questions. This type of programming also reduces entry errors that may require follow-up contacts to gather correct information.
Automated validation checks. The software will check for allowable ranges for numeric questions, minimizing out-of-range or unallowable values. This reduces entry errors that may require following up with contacts to gather correct information.
Closed-ended questions. These types of questions reduce burden on respondents by minimizing the effort required to respond. Some questions will include “other, specify” options to ensure respondents have an opportunity to enter information that does not fit pre-existing options.
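To make the skip-logic and validation features concrete, the sketch below shows how such rules might operate. It is illustrative only: the question names, skip condition, and allowable range are hypothetical, not items from the actual survey instrument or its software.

```python
# Minimal sketch of skip-logic and range-validation rules of the kind
# described above. All question IDs and bounds are hypothetical.

def applicable_questions(responses: dict) -> list:
    """Return the question IDs a respondent should see, applying skip logic."""
    questions = ["q1_taught_this_year", "q2_grade_level", "q3_years_experience"]
    # Skip pattern: the grade-level question applies only to respondents
    # who report teaching this year.
    if responses.get("q1_taught_this_year") == "no":
        questions.remove("q2_grade_level")
    return questions

def validate(question_id: str, value) -> bool:
    """Range check for numeric items, mirroring automated validation checks."""
    allowed_ranges = {"q3_years_experience": (0, 50)}  # hypothetical bounds
    if question_id in allowed_ranges:
        low, high = allowed_ranges[question_id]
        return isinstance(value, (int, float)) and low <= value <= high
    return True

# A respondent who did not teach this year skips the grade-level question,
# and an out-of-range experience value is flagged before submission.
print(applicable_questions({"q1_taught_this_year": "no"}))
print(validate("q3_years_experience", 62))  # False
```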
A.3.4. Residency program interviews and cost interviews

Trained interviewers will schedule and conduct the telephone interviews at times convenient for the interviewees (Appendices F and G). Interviewers will follow a structured protocol designed to be completed within the established time frame. Interviewers will also alert interviewees in advance about any information they may need to compile to help them answer the questions. This will save time during the interview and allow respondents to compile the information at their convenience.
A.4. Efforts to identify and avoid duplication

Whenever possible, the study team will use administrative data and publicly available data from ED's website to gather the information needed to address the study's research questions. For example, it will use student test score data from district administrative data to measure residency graduates' effectiveness and information from ED's Common Core of Data to measure school demographics. The study team has built on prior studies of teacher preparation programs (such as An Implementation Evaluation of Teacher Residency Programs and the Impact on Secondary Student Math Achievement of Highly Selective Routes to Alternative Certification) to design instruments that are clear and concise. The information to be collected in each of the data collection activities is not available elsewhere.
A.5. Efforts to minimize burden on small businesses or other entities

The study will not collect any information from small businesses but may collect information from small teacher residency programs. The data collection procedures have been designed to minimize burden on teacher residency programs both large and small. The study team will schedule the telephone interview with residency program directors at their convenience, to minimize any disruption to their regular responsibilities.
A.6. Consequences of not collecting data

Collecting the data for this study is necessary for ED to provide rigorous evidence on the effectiveness and costs of the promising approach to teacher preparation used by residency programs. It is also needed to learn about effective strategies used by residency programs that could be used by teacher preparation programs more broadly to better prepare new teachers. If these data are not collected, policymakers, districts, residency programs, and the public will be less informed about the effectiveness of an increasingly common approach to teacher preparation and about the effective use of Title IIA funds.
A.7. Special circumstances

There are no special circumstances associated with this data collection.
A.8. Federal Register announcement and consultation

A.8.1. Federal Register announcement

An initial 60-day Federal Register notice for the school roster data collection was published on March 8, 2021 (86 FR 13347) (Appendix B, approved 4/26/2021). The 60-day Federal Register notice for this request (Appendix H) was published on July 21, 2021 (86 FR 38471). One set of comments was received during the 60-day public comment period; no changes were made to the data collection in response. A response to the public comments is included in Appendix I. The 30-day Federal Register notice will be published to solicit additional public comment.
A.8.2. Consultations outside the agency

The study team has formed an external technical working group in partnership with ED to provide guidance on the study design, instrumentation, and data collection for the study. The technical working group includes the following six members:
David Blazar, Assistant Professor of Education Policy and Economics, University of Maryland College of Education (evaluation methods)
Kwamé Floyd, Office of Strategic Operations, New Jersey Department of Education (residency program operations)
Brooke James, Dean, Relay Graduate School of Education (residency program operations)
James Kemple, Executive Director of the Research Alliance for New York City Schools and Research Professor at the Steinhardt School of Culture, Education, and Human Development at New York University (evaluation methods)
Rebecca Sieg, Urban Teachers (residency program operations)
James Wyckoff, Curry Memorial Professor of Education and Policy, Professor at the Frank Batten School of Leadership and Public Policy, and Director of EdPolicyWorks, University of Virginia (evaluation methods)
The study team convened the technical working group in June 2020 to review the study design and ensure the study will provide information that policymakers, school districts, residency programs, and other teacher preparation programs can use to improve teacher preparation. The study team plans to convene the technical working group again before the public release of the final report.
A.9. Payments to respondents

To maximize the success of the data collection effort, we propose incentives to offset the time and effort required of teachers to complete the survey and of residency program directors to complete the residency program interview. Specifically, we propose a $30 incentive for completing the 30-minute teacher survey and a $50 incentive for completing the 75-minute residency program interview.
For the teacher survey, incentives will help increase response rates, reduce nonresponse bias, and improve survey representativeness, which will increase the reliability of the study findings (James 1997; Goritz 2006; Groves et al. 2006; Messer and Dillman 2011; Singer and Kulka 2002; Singer and Ye 2013). The proposed incentive amount is consistent with guidelines in the March 22, 2005, memorandum, “Guidelines for Incentives for NCEE Evaluation Studies,” prepared for OMB, including the standard practice of linking the dollar amounts to the extent of burden.
For the residency program interview, we anticipate that program directors will have considerable competing time demands. We expect that offering an incentive for this 75-minute interview will reduce non-response associated with competing demands for their time, especially for residency programs not associated with the 15 study districts.
A.10. Assurance of confidentiality

Mathematica and its research partners will conduct all data collection activities for this study in accordance with the following relevant regulations and requirements:
The Privacy Act of 1974, P.L. 93-579 (5 U.S.C. 552a)
The Family Educational Rights and Privacy Act (FERPA) (20 U.S.C. 1232g; 34 CFR Part 99)
The Protection of Pupil Rights Amendment (PPRA) (20 U.S.C. 1232h; 34 CFR Part 98)
The Education Sciences Reform Act of 2002, Title I, Part E, Section 183
All Mathematica employees sign a confidentiality pledge (Appendix C) that emphasizes the importance of confidentiality and describes employees’ obligations to maintain it. In addition, the study team will take the following steps to protect the confidentiality of all data collected for the study:
All data will be stored in secure areas accessible only to authorized staff members and will be destroyed as soon as they are no longer required.
Personally identifiable information will be kept separate from analysis data files and password-protected. The study team will assign each respondent a unique identification number and use those numbers to construct raw data and analysis files.
Access to hard copy documents will be strictly limited. Documents will be stored in locked files and cabinets. Discarded materials will be shredded.
Secure transfer sites with limited access will be created and maintained for the duration of the administrative data collection task.
In public reports, residency program and district findings will be presented in aggregate by type of district respondent or for subgroups of interest. No reports will identify individual respondents or school districts.
All data collection forms will include the following or similar language:
“Responses to this data collection will be used only for research purposes. The report prepared for this study will summarize findings across the sample and will not associate responses with a specific residency program, district, school, or individual. We will not provide information that identifies you, your school, or your district to anyone outside the study team, except as required by law. Additionally, no one at your school or in your district will see your responses.”
To ensure that study participants are properly protected, Mathematica’s Institutional Review Board will review the study design protocols, informed consent process, data security plan, and all data collection instruments and procedures.
A.11. Questions of a sensitive nature

The study does not include questions of a sensitive nature.
A.12. Estimates of respondent burden

The total annual respondent burden for the data collection effort covered by this clearance request is 277 hours (830 total hours, annualized over the standard three-year clearance period). Exhibit A.3 presents the estimated time burden to respondents, and Exhibit A.4 presents the estimated cost burden. The following assumptions informed these burden estimates (a worked check of the totals follows the list):
District data office staff. Data office staff at each participating school district will compile and submit administrative data on students and teachers twice (in fall 2022 and fall 2023). Collecting and submitting these data will take approximately 16 hours per request, with two requests per district across 15 districts. The cost to the district is based on an average hourly wage of $46.21 per hour in 2019 for Database Administrators and Architects (BLS 2020).
Teachers. Each teacher will complete a teacher survey once (in spring 2022 or spring 2023). There are 350 teachers in the sample (175 in each year). The survey will take approximately 30 minutes. The cost to the school is based on an average hourly wage of $30.74 in 2019 for Elementary School Teachers, Except Special Education (BLS 2020).
Residency program directors. The director of each of the estimated 105 residency programs nationwide, plus approximately 10 additional programs in Louisiana, will participate in the residency program interview (in summer 2022). The interview will take approximately 75 minutes: 15 minutes to gather records and complete a spreadsheet before the interview, plus 60 minutes for the telephone interview itself. The cost to the residency program is based on an average hourly wage of $54.04 in 2019 for Education Administrators, Postsecondary (BLS 2020).
District residency coordinators. The residency program coordinator within each of the 15 study districts will participate in the cost interview (in spring 2022). Each coordinator will spend approximately 2 hours in total: a 20-minute orientation call, approximately 1 hour gathering relevant cost records, and 40 minutes completing the telephone interview. The cost to the school district is based on an average hourly wage of $48.24 in 2019 for Education Administrators, Kindergarten through Secondary (BLS 2020).
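As a check on the figures in Exhibit A.3, the arithmetic below reproduces the time-burden totals from the assumptions above. The three-year annualization period is our assumption, consistent with a standard OMB clearance; it matches the reported 277-hour annual average.

```python
import math

# Reproduce the Exhibit A.3 time-burden totals from the assumptions above.
# Row-level hours are rounded up, matching the exhibit.
teacher_survey = math.ceil(175 * 0.5)      # 88 hours per cohort (30-minute survey)
program_interview = math.ceil(115 * 1.25)  # 144 hours (75-minute interview, 115 directors)
cost_interview = 15 * 2                    # 30 hours (2 hours per coordinator)
admin_data = 15 * 2 * 16                   # 480 hours (15 districts x 2 requests x 16 hours)

total_2021_22 = teacher_survey + program_interview + cost_interview  # 262
total_2022_23 = admin_data + teacher_survey                          # 568
total_hours = total_2021_22 + total_2022_23                          # 830
print(total_2021_22, total_2022_23, total_hours)  # 262 568 830
print(round(total_hours / 3))                     # 277 average hours per year
```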
Exhibit A.3. Estimate of respondent time burden by year, current request

Respondent type and data collection | Time per response (hours) | Maximum number of responses | Number of respondents | Total time burden (hours)
2021–2022 school year (July 1, 2021 – June 30, 2022)
Teachers: teacher survey | 0.5 | 1 | 175 | 88
Residency program directors: residency program interview | 1.25 | 1 | 115 | 144
District residency coordinators: cost interview | 2.0 | 1 | 15 | 30
2021–2022 total hours | | | | 262
2022–2023 school year (July 1, 2022 – June 30, 2023)
District data office staff: administrative data on teachers and students | 16.0 | 2 | 15 | 480
Teachers: teacher survey | 0.5 | 1 | 175 | 88
2022–2023 total hours | | | | 568
Total burden across all years | | | | 830
Average burden per year | | | | 277
Exhibit A.4. Estimate of respondent cost burden by year, current request

Data collection | Annual salary estimate | Average hourly wage | Time per response (hours) | Maximum number of responses | Cost per response | Number of respondents | Total cost for responses
2021–2022 school year (July 1, 2021 – June 30, 2022)
Teacher survey (teachers) | $63,930 | $30.74a | 0.5 | 1 | $15.37 | 175 | $2,689.75
Residency program interview (residency program directors) | $112,400 | $54.04b | 1.25 | 1 | $67.55 | 115 | $7,768.25
Cost interview (district residency coordinators) | $100,340 | $48.24c | 2.0 | 1 | $96.48 | 15 | $1,447.20
2021–2022 total cost | | | | | | | $11,905.20
2022–2023 school year (July 1, 2022 – June 30, 2023)
Administrative data on teachers and students (district data office staff) | $96,110 | $46.21d | 16.0 | 2 | $739.36 | 15 | $22,180.80
Teacher survey (teachers) | $63,930 | $30.74a | 0.5 | 1 | $15.37 | 175 | $2,689.75
2022–2023 total cost | | | | | | | $24,870.55
Total cost across all years | | | | | | | $36,775.75
Average cost per year | | | | | | | $12,258.58
a The cost for the teachers is based on an average hourly wage of $30.74 in 2019 for Elementary School Teachers, Except Special Education (BLS 2020).
b The cost for the residency program directors is based on an average hourly wage of $54.04 in 2019 for Education Administrators, Postsecondary (BLS 2020).
c The cost for the district residency coordinators is based on an average hourly wage of $48.24 in 2019 for Education Administrators, Kindergarten through Secondary (BLS 2020).
d The cost for the district data office staff is based on an average hourly wage of $46.21 in 2019 for Database Administrators and Architects (BLS 2020).
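Similarly, each cost figure in Exhibit A.4 is the product of the hourly wage, hours per response, number of responses, and number of respondents. The sketch below reproduces the totals; the 2,080-hour work year used to relate the BLS annual salaries to hourly wages is our assumption, though it matches the exhibit's figures.

```python
# Reproduce the Exhibit A.4 cost totals. Wages below are the exhibit's
# published values (annual salary / 2,080 hours, an assumed conversion).
rows = [
    # (hourly wage, hours per response, max responses, respondents)
    (30.74, 0.5, 1, 175),   # teacher survey, 2021-2022
    (54.04, 1.25, 1, 115),  # residency program interview, 2021-2022
    (48.24, 2.0, 1, 15),    # district cost interview, 2021-2022
    (46.21, 16.0, 2, 15),   # administrative data, 2022-2023
    (30.74, 0.5, 1, 175),   # teacher survey, 2022-2023
]
total_cost = sum(wage * hrs * n_resp * n for wage, hrs, n_resp, n in rows)
print(f"${total_cost:,.2f}")      # $36,775.75 across all years
print(f"${total_cost / 3:,.2f}")  # $12,258.58 average per year (3-year clearance)
```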
A.13. Estimates of the cost burden to respondents

There are no direct or start-up costs to respondents associated with this data collection.
A.14. Estimates of annualized government costs

The estimated cost to the federal government of the study, including its design, data collection activities, recruiting, analysis, and reporting, is $8,200,988. The estimated average annual cost is $1,366,831 (the total cost divided by the six years of the study).
A.15. Changes in hour burden

This is a request for a new collection of information.
A.16. Plans for analysis, tabulation, and publication of results

The study will produce two reports. The first will focus on the strategies residency programs use to recruit and train candidates and the extent to which those strategies help address districts' shortage areas and diversity needs. The second will focus on the effectiveness and retention of residency graduates relative to other teachers in the same schools. Exhibit A.5 describes the main analyses for each report, along with the expected publication dates.
Exhibit A.5. Analysis methods planned to answer the research questions in each report

Research question | Analysis method
Report 1: Strategies used by residency programs nationwide (expected publication fall 2023)
What are the characteristics of residency programs nationally, what strategies do they use to better prepare new teachers, and what are the characteristics of those graduating from these programs? | Descriptive analysis of residency program characteristics, strategies for recruiting and preparing teachers, and characteristics of program graduates
Report 2: Effectiveness and retention of residency graduates (expected publication spring 2025)
Are residency graduates more effective than other teachers? Does this change as teachers progress in their careers? | Regression analysis comparing test scores of students assigned to residency graduates and students assigned to other teachers in the same grades and schools, controlling for teacher experience; subgroup analyses for early-career teachers (those in their first five years of teaching) and more experienced teachers (those with at least six years of experience)
Do residency graduates remain in teaching longer than other teachers? | Survival analysis to examine whether residency graduates are more likely to remain in teaching than other teachers
What explains any differences in effectiveness and retention between residency graduates and other teachers? Are the differences explained by the types of candidates residency programs select? Particular features of residency programs? | Subgroup analyses by program features; mediation analyses using regression models with and without controls for teacher characteristics
Are residency programs a cost-effective strategy for improving student achievement? | Cost-effectiveness analysis comparing districts' costs of hiring a residency graduate (over and beyond the cost of hiring another teacher) to benefits in terms of improved student achievement
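To make the Report 2 regression approach concrete, the following is a minimal sketch of a student-level achievement model consistent with the description in Exhibit A.5. The notation and covariates are illustrative assumptions, not the study's specified analysis model.

```latex
% Illustrative student-level model; notation is an assumption, not the
% study's specified analysis model.
\begin{equation}
  y_{iks} = \beta\,\mathrm{Resident}_{k} + \gamma\,\mathrm{Exp}_{k}
          + X_{i}'\delta + \mu_{b(i)} + \varepsilon_{iks}
\end{equation}
% y_{iks}:    spring test score of student i taught by teacher k in school s
% Resident_k: 1 if teacher k is a residency graduate, 0 otherwise
% Exp_k:      teacher k's years of experience (the experience control)
% X_i:        student covariates, such as prior achievement
% mu_{b(i)}:  fixed effect for the grade-by-school block within which
%             student i was randomly assigned to a teacher
% beta:       the achievement difference associated with assignment
%             to a residency graduate
```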
A.17. Approval to not display expiration date

No exemption is requested. The data collection instruments will display the expiration date.
A.18. Exceptions to Certification Statement

This submission does not require an exception to the certification statement for Paperwork Reduction Act submissions (5 CFR 1320.9).
References

American Association of Colleges for Teacher Education. “Teacher Quality Partnership Grants 2019 Fact Sheet.” 2019. Available at https://secure.aacte.org/apps/rl/res_get.php?fid=1329&ref=rl. Accessed April 20, 2020.
Bureau of Labor Statistics, U.S. Department of Labor. “Occupational Employment and Wages.” 2020. Available at https://www.bls.gov/oes/current/oes_nat.htm. Accessed October 29, 2020.
Garrison, Anne Walton. “Memphis Teacher Residency: Teacher Effectiveness in 2017–18.” Memphis, TN: Memphis City Schools, Department of Research and Performance Management, February 2019.
Goritz, A. “Incentives in Web Studies: Methodological Issues and a Review.” International Journal of Internet Science, vol. 1, no. 1, 2006, pp. 58–70.
Groves, R.M., M.P. Couper, S. Presser, E. Singer, R. Tourangeau, G. Acosta, and L. Nelson. “Experiments in Producing Nonresponse Bias.” Public Opinion Quarterly, vol. 70, no. 5, 2006, pp. 720–736.
James, T. “Results of Wave 1 Incentive Experiment in the 1996 Survey of Income and Program Participation.” Proceedings of the Survey Research Section, American Statistical Association, 1997, pp. 834–883.
Messer, B., and D. Dillman. “Surveying the General Public Over the Internet Using Address-Based Sampling and Mail Contact Procedures.” Public Opinion Quarterly, vol. 75, 2011, pp. 429–457.
National Center for Education Statistics, U.S. Department of Education. Digest of Education Statistics, 2019. Available at https://nces.ed.gov/programs/digest/d19/tables/dt19_211.20.asp. Accessed November 11, 2020.
National Center for Teacher Residencies. “2017–18 Network Partner Report.” Chicago, IL: National Center for Teacher Residencies, 2018.
Papay, J.P., M.R. West, J.B. Fullerton, and T.J. Kane. “Does an Urban Teacher Residency Increase Student Achievement? Early Evidence from Boston.” Educational Evaluation and Policy Analysis, vol. 34, no. 4, 2012, pp. 413–434. doi:10.3102/0162373712454328.
Silva, Tim, Allison McKie, and Philip Gleason. “New Findings on the Retention of Novice Teachers from Teaching Residency Programs.” NCEE Evaluation Brief 2015-4015. Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education, August 2015.
Singer, E., and R.A. Kulka. “Paying Respondents for Survey Participation.” In Studies of Welfare Populations: Data Collection and Research Issues. Panel on Data and Methods for Measuring the Effects of Changes in Social Welfare Programs, edited by Michele Ver Ploeg, Robert A. Moffitt, and Constance F. Citro, Committee on National Statistics, Division of Behavioral and Social Sciences and Education. Washington, DC: National Academy Press, 2002, pp. 105–128.
Singer, E., and C. Ye. “The Use and Effects of Incentives in Surveys.” Annals of the American Academy of Political and Social Science, vol. 645, no. 1, 2013, pp. 112–141.