Impact Evaluation of Departmentalized Instruction in Elementary Schools
Part A: Supporting Statement for Paperwork Reduction Act Submission
CONTENTS
PART A. JUSTIFICATION
A1. Circumstances necessitating the collection of information
A2. Purpose and use of data
A3. Use of technology to reduce burden
A4. Efforts to avoid duplication of effort
A5. Methods of minimizing burden on small entities
A6. Consequences of not collecting data
A7. Special circumstances
A8. Federal register announcement and consultation
A9. Payments or gifts
A10. Assurances of confidentiality
A11. Justification for sensitive questions
A12. Estimates of hours burden
A13. Estimate of cost burden to respondents
A14. Annualized cost to the federal government
A15. Reasons for program changes or adjustments
A16. Plans for tabulation and publication of results
A17. Approval not to display the expiration date for OMB approval
A18. Exception to the certification statement
APPENDICES
APPENDIX A: ADMINISTRATIVE RECORDS DATA REQUEST
APPENDIX B: CONFIDENTIALITY PLEDGE FOR MATHEMATICA EMPLOYEES
TABLES
A.1 Technical Working Group Experts
A.2 Estimated response time for data collection
PART A. JUSTIFICATION

A1. Circumstances necessitating the collection of information

Finding creative ways to redeploy existing teachers in the classroom may yield academic benefits to students at little cost. One such strategy is departmentalized instruction, in which each teacher specializes in teaching certain subjects to multiple classes of students rather than teaching all subjects to a single class of students (self-contained instruction). While nearly ubiquitous in secondary schools, departmentalization has only recently become more popular in the upper elementary grades, and it is an improvement strategy that low-performing elementary schools identified under the Every Student Succeeds Act (ESSA) may consider adopting. The U.S. Department of Education, through its Institute of Education Sciences (IES), received OMB clearance in 2018 to collect information for an evaluation that will provide valuable evidence on the implementation of departmentalized instruction in fourth and fifth grades and on the outcomes for participating teachers and students.1 However, the coronavirus pandemic during the 2019-20 and 2020-21 school years created substantial delays in data availability. The purpose of this new package is to request an extension of the originally approved timeline so that the study team can finish collecting the district administrative records needed for the evaluation.
The justification for this evaluation and the associated data collection was detailed in the originally approved Supporting Statement and remains the same.
To briefly recap, this evaluation is authorized by Title VIII Section 8601 of ESSA. ESSA gives states considerable flexibility in designing systems to hold their schools accountable for improving student achievement. This flexibility extends to the types of strategies that states encourage or require their low-performing schools to adopt. However, many strategies in use have little to no evidence of effectiveness. More research is needed to help states identify strategies that are likely to help their low-performing schools improve.
Departmentalized instruction in elementary grades is one such strategy. Although it has become more prevalent over time, the educational community lacks large-scale, high-quality evidence on whether departmentalization helps or harms students. This study will address that gap by providing evidence on how teacher and student outcomes in schools that switched to departmentalized instruction compare to those of teachers and students in similar schools that continued with self-contained instruction.
A2. Purpose and use of data

The purpose and use of data for this evaluation were detailed in the originally approved Supporting Statement and remain the same.
To briefly recap, Mathematica and its partners (Public Impact; Clowder Consulting; Social Policy Research Associates; IRIS Connect) are conducting the evaluation. The evaluation will examine the implementation and outcomes of departmentalizing elementary fourth and fifth grades for a selected sample of schools across the nation. The overarching research questions are:
1. After two years of departmentalizing instruction, how do elementary teachers' and students' outcomes compare to those of similar teachers and students in self-contained schools?
2. How do schools structure departmentalization, and what challenges and benefits do principals and teachers perceive in switching from self-contained classrooms to departmentalization?
To address these overarching questions, Mathematica and its partners initially recruited a voluntary sample of 90 elementary schools in 12 districts across the country. All of these schools had been using traditional self-contained instruction at the time of recruitment, with their fourth and fifth grade teachers each teaching all of the core subjects (including math and English language arts). As part of the evaluation, approximately half of these schools then agreed to implement departmentalized instruction in their fourth and fifth grades for two school years, while the other half continued with self-contained classrooms.
The evaluation has collected or will collect a variety of data, including principal interviews to learn how teacher assignments were made and how departmentalization was structured; a teacher survey to examine teachers' perceptions of and approaches to departmentalization; and district administrative data on teacher retention, students' math and reading achievement, attendance, and disciplinary incidents, as well as teacher and student demographics. Some of these data will contribute to a better understanding of how to implement departmentalized instruction with fidelity, while the remaining data will be used to provide evidence on departmentalized instruction by comparing outcomes from the study’s departmentalized schools (treatment) to those of the study’s self-contained schools (comparison).
Additional details about the evaluation’s research questions, study design, and data collection activities can be found in the originally approved Supporting Statement, as noted above.
The purpose of this request is limited to carrying over a subset of the originally approved burden to complete one of the remaining data collection activities: the collection of district administrative records, including student test scores. Districts indicated that it was easiest for them to provide all of the requested years of records in a single submission, including state assessment scores. Because these assessments were universally cancelled in spring 2020 and may now be delayed in some states until fall 2021, it is not feasible to finish collecting these records by the expiration date of the originally approved collection (June 30, 2021). Therefore, this package requests an extension of the expiration date by 1.5 years, to December 31, 2022, for the district administrative records collection only. Such an extension is needed to ensure enough time and flexibility to complete this critical evaluation activity. All other data collection activities will be completed by the original June 30, 2021 expiration date.
To summarize, the only data collection that this package is requesting approval for through December 31, 2022 is:
District administrative student and teacher records between 2018 and 2021. Beginning in fall 2021, the study team will collect, as available, administrative records from the 2018–2019, 2019–2020, and 2020–2021 school years on (1) student outcomes and characteristics and (2) teachers’ school assignments and characteristics (see Appendix A).
Student records will allow a comparison of outcomes of students in departmentalized schools to outcomes of students in self-contained schools to provide evidence on departmentalization (student achievement in the 2020–2021 school year and behavior in the 2019–2020 and 2020–2021 school years). Data on student characteristics from 2018–2019 will allow the analysis to adjust for any residual differences in background between students in departmentalized and self-contained schools.
Records on teachers’ school assignments and characteristics between 2018 and 2021 will serve a number of purposes. For example, data on teaching placements will allow the study to examine whether departmentalized schools experienced more or less teacher attrition compared to self-contained schools, and whether there were differential changes in the types of teachers at these schools.
A3. Use of technology to reduce burden

The data collection plan is designed to obtain information efficiently while minimizing respondent burden, including through the use of technology when appropriate. For example, districts will be encouraged to provide electronic copies of student and teacher records. While the study team will specify the required data elements, it will accept any format the district wishes to use in order to reduce the district's burden. To help ensure study participants' confidentiality, districts will upload data files directly to a secure data site.
A4. Efforts to avoid duplication of effort

No similar evaluations are being conducted, and there is no equivalent source for the information to be collected. Moreover, the data collection plan reflects careful attention to the potential sources of information for this study, particularly to the reliability of the information and the efficiency in gathering it. The data collection plan avoids unnecessary collection of information from multiple sources. For example, student achievement will be measured using scores from state-administered student assessments, instead of administering an assessment as part of this study.
A5. Methods of minimizing burden on small entities

No small businesses or entities will be involved as respondents.
A6. Consequences of not collecting data

If the district administrative records are not collected, then the evaluation will not be able to examine the key outcomes of interest, including student outcomes (achievement and behavior) and teacher outcomes (retention and mobility). Thus, the evaluation would not be able to provide critical evidence that schools around the country need to decide whether departmentalized instruction is an improvement strategy that they should consider implementing. That would greatly diminish the usefulness of this evaluation and prevent the evaluation from meeting one of IES’s mandates under ESSA, which is to conduct evaluations to help identify effective educational strategies.
A7. Special circumstances

There are no special circumstances involved with this data collection. Data collection will be conducted in a manner consistent with the guidelines in 5 CFR 1320.5.
A8. Federal register announcement and consultation

The 60-day Federal Register notice was published on March 31, 2021. There were no public comments during the 60-day period. The 30-day Federal Register notice will be published.
In formulating the intervention and evaluation design for this evaluation, the study team sought input from several individuals with expertise in departmentalized instruction, including Lucy Steiner of Public Impact and Florence Chang of Jefferson County Public Schools. Additionally, this study has a technical working group (TWG) composed of experts in the relevant content and methodological areas, who have provided and will continue to provide guidance on all aspects of the evaluation to help ensure that it is of the highest quality and that findings are relevant to policymakers, school districts, and principals. Table A.1 lists the TWG members, their affiliations, and their relevant expertise.
Table A.1. Technical Working Group Experts
Name | Affiliation | Expertise
Allison Atteberry | Assistant Professor, University of Colorado Boulder | Teacher assignment policies; school reforms
Thomas Cook | Professor Emeritus of Sociology, Psychology, Education, and Social Policy, Northwestern University | Evaluation methods
Cassie Guarino | Professor of Education and Public Policy, UC Riverside | Methods for estimating teacher effectiveness
James Kemple | Executive Director, The Research Alliance for New York City Schools, New York University | School reforms; evaluation methods
Lisa Martin | Chief Academic and Accountability Officer, DeKalb County School District | Departmentalized instruction; teacher assignment policies
Audra Parker | Professor, George Mason University | Departmentalized instruction; teacher assignment policies
Chris Rhoads | Associate Professor, University of Connecticut-Neag School of Education | Evaluation methods
Jonah Rockoff | Professor of Finance and Economics, Columbia Business School | School reforms; methods for estimating teacher effectiveness
Brian Schultz | Interim Superintendent, Cabarrus County Schools | Departmentalized instruction; teacher assignment policies
A9. Payments or gifts

No incentives are proposed for the district administrative records collection, which is the only activity that this package requests approval for.
A10. Assurances of confidentiality

Mathematica and its research partners will conduct all data collection activities for this study in accordance with relevant regulations and requirements, which are:
The Privacy Act of 1974, P.L. 93-579 (5 U.S.C. 552a)
The “Buckley Amendment,” Family Educational Rights and Privacy Act (FERPA) of 1974 (20 U.S.C. 1232g; 34 CFR Part 99)
The Protection of Pupil Rights Amendment (PPRA) (20 U.S.C. 1232h; 34 CFR Part 98)
The Education Sciences Reform Act of 2002, Title I, Part E, Section 183
The research team will protect the confidentiality of all data collected for the study and will use it for research purposes only. The Mathematica project director will ensure that all individually identifiable information about respondents remains confidential. All data will be kept in secured locations, and identifiers will be destroyed as soon as they are no longer required. All members of the study team having access to the data will be trained and certified on the importance of confidentiality and data security. When reporting the results, data will be presented only in aggregate form, such that individuals, schools, and districts are not identified. Included in all voluntary requests for data will be the following or similar statement:
“Responses to this data collection will be used only for research purposes. The report prepared for this study will summarize findings across the sample and will not associate responses with a specific district, school, or individual. We will not provide information that identifies you, your school, or your district to anyone outside the study team, except as required by law. Additionally, no one at your school or in your district will see your responses.”
The following safeguards are routinely used by Mathematica to maintain data confidentiality, and they will be consistently applied to this study:
All Mathematica employees are required to sign a confidentiality pledge that emphasizes the importance of confidentiality and describes employees’ obligations to maintain it (see Appendix B).
Personally identifiable information (PII) is maintained on separate forms and files, which are linked only by random, study-specific identification numbers.
Access to hard copy documents is strictly limited. Documents are stored in locked files and cabinets. Discarded materials are shredded.
Access to computer data files is protected by secure usernames and passwords, which are only available to specific users who have a need to access the data and who have the appropriate security clearances.
Sensitive data is encrypted and stored on removable storage devices that are kept physically secure when not in use.
Mathematica’s standard for maintaining confidentiality includes training staff regarding the meaning of confidentiality, particularly as it relates to handling requests for information, and providing assurance to respondents about the protection of their responses. It also includes built-in safeguards concerning status monitoring and receipt control systems. In addition, all study staff who have access to confidential data must obtain security clearance from ED which requires completing personnel security forms, providing fingerprints, and undergoing a background check.
A11. Justification for sensitive questions

This study will include no questions of a sensitive nature.
A12. Estimates of hours burden

Table A.2 provides an estimate of district burden for collecting student and teacher administrative records, the only data collection activity that extends beyond the current OMB approval period. These estimates are based on Mathematica’s prior extensive experience collecting administrative data from school districts.
Table A.2. Estimated response time for data collection that would occur during the extended approval timeline
Respondent/Data request | Number of targeted respondents | Expected response rate (%) | Expected number of responses | Unit response time (hours) | Annual total response time over 3-year data collection (hours/year) | Total burden (hours)
Originally approved burden for district administrative records on students and teachers from OMB Control Number 1850-0942 (amount to be carried over to current request) | 12 | 100 | 12 | 16 | 64 | 192
New burden requested for district administrative records on students and teachers | 12 | 100 | 12 | 4 | 16 | 48
Total | 12 | | 12 | | 80 | 240
The number of targeted respondents is 12 districts, and the expected response rate is 100 percent because all of the districts signed MOUs agreeing to participate in the study and provide the requested data. Therefore, the expected number of responses is 12. Each response is expected to take 20 hours, which includes time for the district to compile all of the requested data elements, upload them for the evaluation team to review, and answer any follow-up questions about the data that the evaluation team may have.
The total burden across all 12 districts is estimated at 240 hours over a 3-year timeframe, or an average of 80 annual burden hours. This is an increase of 48 total hours over the originally approved OMB submission, reflecting the additional burden anticipated because of coronavirus pandemic delays and challenges in providing the necessary data. Therefore, 192 hours are being carried over from the originally approved burden, and an additional 48 hours are being newly requested in this package.
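For reference, the totals in Table A.2 follow directly from the per-district figures above; the calculation below simply restates them:

\[
\underbrace{12 \times 16\ \text{hours}}_{\text{carried over}} + \underbrace{12 \times 4\ \text{hours}}_{\text{newly requested}} = 192 + 48 = 240\ \text{hours}, \qquad \frac{240\ \text{hours}}{3\ \text{years}} = 80\ \text{hours per year}.
\]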
A13. Estimate of cost burden to respondents

There are no annualized capital/startup or ongoing operation and maintenance costs associated with collecting the information.
A14. Annualized cost to the federal government

The total cost to the federal government for this evaluation is $8,885,814. The estimated average annual cost, including recruiting districts, designing and administering all collection instruments, processing and analyzing the data, and preparing reports, is $1,777,163 (the total cost divided by the five years of the evaluation).
A15. Reasons for program changes or adjustments

The coronavirus pandemic created substantial disruptions to schools during the 2019–2020 and 2020–2021 school years. As a result, the ability to collect district administrative records for this evaluation was significantly hampered. For example, although student assessment data for the 2020–2021 school year are expected to be available, they will likely be provided on a delayed timeline due to the ongoing pandemic. For this reason, this new package requests a 1.5-year extension of the original June 30, 2021 expiration date in order to complete the district administrative records portion of the evaluation’s data collection.
Originally, 192 hours were approved for this part of the collection. The current request is to carry over those unused 192 hours and add 48 hours of burden (4 hours per district) because the request is expected to be slightly more complex due to the challenges of the coronavirus pandemic.
A16. Plans for tabulation and publication of results

The evaluation will provide evidence on implementing departmentalized instruction and on how outcomes differ between students and teachers in departmentalized schools and those in self-contained schools.
Student outcomes (standardized math and reading test scores, attendance, and disciplinary incidents) and teacher outcomes (amount of instructional planning and professional development, quality of student-teacher relationships, teaching practices, job satisfaction, and retention) will be compared using regression models. To increase the precision and validity of the comparisons, the analysis will account for a number of baseline student, teacher, and school characteristics. These regression models will be estimated for the full combined sample and separately in districts with and without teacher effectiveness scores, to see how the evidence on departmentalized instruction differs across these types of districts.
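As an illustration only (the study's actual specification is described in the originally approved Supporting Statement and analysis plan, not here), a regression of the following general form could be used to compare outcomes while adjusting for baseline characteristics:

\[
y_{is} = \beta_0 + \beta_1\, \mathrm{Dept}_s + X_{is}'\gamma + Z_s'\delta + \varepsilon_{is},
\]

where \(y_{is}\) is the outcome for student or teacher \(i\) in school \(s\), \(\mathrm{Dept}_s\) indicates a departmentalized (treatment) school, \(X_{is}\) and \(Z_s\) are baseline individual and school characteristics, \(\varepsilon_{is}\) is an error term, and \(\beta_1\) is the adjusted difference in outcomes between departmentalized and self-contained schools.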
The implementation analysis will describe schools’ approaches to departmentalization and benefits and challenges encountered. The analysis will document the structure of departmentalization in treatment schools, including number of subjects and classes per teacher, assignment of teachers to subjects, and time allocated to instruction and planning. The analysis will also examine how principals assigned teachers to subjects (in districts with and without teacher effectiveness scores) and any implementation challenges. In all schools, the evaluation will document time for instruction, planning, and teacher professional development. Understanding the implementation experiences and challenges of schools and teachers participating in the intervention will provide important information for districts and elementary schools considering departmentalizing instruction. The implementation analysis will also provide important context for interpreting the results from the outcomes analysis.
The findings for this evaluation are expected in 2022 and will be published in a report available on the IES website. The report will be 15 pages, with a set of technical appendices, and will be written for an audience of policymakers and practitioners. The report will follow the January 2020 IES style and report guidance and meet all Section 508 compliance requirements.
A17. Approval not to display the expiration date for OMB approval

IES is not requesting a waiver for the display of the OMB approval number and expiration date. All data collection forms will display the expiration date for OMB approval.
A18. Exception to the certification statement

This submission does not require an exception to the Certification for Paperwork Reduction Act Submissions (5 CFR 1320.9).
1 OMB Control Number 1850-0942 (https://www.reginfo.gov/public/do/PRAViewICR?ref_nbr=201801-1850-001)