Evaluation of the Innovative Assessment Demonstration Authority Pilot Program-Preliminary Activities
Supporting Statement for Paperwork Reduction Act Submission
PART A: Justification
July 2020
Contract # 91990019C0059
Submitted to:
Institute of Education Sciences
U.S. Department of Education
Submitted by:
Westat
An Employee-Owned Research Corporation®
1600 Research Boulevard
Rockville, Maryland 20850-3129
(301) 251-1500
Contents
A1. Circumstances Necessitating the Collection of Information
A2.1 Data Collection Activities for Which Clearance is Requested as Part of this Package
A3. Use of Technology to Reduce Burden
A4. Efforts to Avoid Duplication of Effort
A5. Methods of Minimizing Burden on Small Entities
A6. Consequences of not Collecting Data
A8. Federal Register Announcement and Consultation
A10. Assurances of Confidentiality
A11. Justification for Sensitive Questions
A12. Estimates of Hours Burden
A13. Estimate of Cost Burden to Respondents
A14. Annualized Cost to the Federal Government
A15. Reasons for Program Changes or Adjustments
A16. Plans for Tabulation and Publication of Results
A17. Approval not to Display the Expiration Date for OMB Approval
Appendix A. Instructions for Preparing and Submitting the Teacher List
Appendix B. Notification Letters and Follow-up Emails
Table
A-1. Estimated response time for preliminary activities
Part A. Justification
Since 1994, federal law has required states to regularly administer assessments to students in selected grades and subjects.1 The purpose of these assessments is to inform teaching and learning, and to hold schools accountable for student performance. To improve the quality and usefulness of these assessments, the law was most recently updated in 2015 to create an Innovative Assessment Demonstration Authority (IADA) Pilot Program. The program (Title I, Section 1204 of the Every Student Succeeds Act, or ESSA) allows the U.S. Department of Education (the Department) to exempt a handful of states from certain testing requirements if they agree to pilot new types of assessments. The Department, through its Institute of Education Sciences (IES), is requesting clearance to recruit school districts and collect teacher lists for the Congressionally mandated evaluation of the IADA program. A second package will request clearance for district, principal, and teacher survey instruments and the collection of these data.
Congress mandates two reports on the IADA program: (1) a Progress Report on pilot states developing and implementing innovative assessment systems, and (2) a Best Practices Report to inform future development and use of innovative assessment systems in more states. The Progress Report will be based only on existing documents from pilot states, as required by ESSA. This report will guide the Department’s technical assistance to pilot states and inform any expansion of the program beyond the handful of pilot states. The subsequent Best Practices Report will add, via surveys, the perspectives of district leaders, principals, and teachers on the development, implementation, and outcomes of IADA assessments. Not only will the collection of this information fulfill a Congressional mandate, it will also help the Department appropriately target its resources to tackle the largest barriers to adequate progress in pilot states, and provide a valuable guide for other states that may want to develop a new assessment in the future.
Westat and its partners HumRRO and Plus Alpha Research are conducting the evaluation. The evaluation will describe the development, implementation, and outcomes of innovative assessments in the first four states approved for the pilot.2 The evaluation’s research questions are:
1. Are IADA states developing innovative assessment systems that meet federal requirements? Are they developing technically compliant assessments, systems of support for districts and educators to implement the system, and systems of accommodations and supports for students participating in innovative assessments? How did states identify and address gaps in readiness or capacity for the innovative assessment system? What challenges did states face in developing the system, and how were they addressed?
2. How are states, districts, and educators using data from the innovative assessment system to inform curriculum, instruction, and accountability? What types of training and supports are available to help districts and educators understand the system and how to use its data to inform curriculum, instruction, and accountability?
3. Is the innovative assessment system considered an improvement compared to the state’s regular assessment system? Do states, districts, principals, and teachers consider the innovative system more useful than the regular system for informing curriculum, instruction, and accountability and for engaging with families? Do they consider the innovative assessment system burdensome compared with the regular system?
4. How are states planning for scale-up and sustainability? How are states using the continuous improvement process to refine their systems? What challenges have they encountered in the scale-up process, and how were they addressed?
To address the evaluation’s research questions and draft the two Congressionally mandated reports, the evaluation team will review existing state documents from Louisiana, New Hampshire, North Carolina, and Georgia; interview these states’ IADA program directors; and administer web surveys to all of the pilot districts from these four states, and a sample of principals and eligible teachers in pilot schools.
Because states are to scale up the IADA program over time, the evaluation team will increase the number of districts, principals, and eligible teachers surveyed between the first and second data collection so that the findings from the second year will better represent the mix of participants at that time. All participating districts in the four pilot states will be surveyed in each of the two survey years (spring 2021 and spring 2022). Two schools from each district will be randomly sampled for the spring 2021 data collection and three schools per district for the spring 2022 data collection.3 Approximately five teachers per sampled school will be randomly selected in each year. (See Part B for more information on the sampling approach.)
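For illustration only, the sketch below shows one way the two-stage selection described above could be carried out; the district and school rosters, field names, and seed are hypothetical, and the actual sampling procedure is specified in Part B.

```python
import random

# Hypothetical rosters: each district lists its participating schools, and
# each school lists its eligible (participating) teachers drawn from the
# teacher list provided by the District Coordinator.
district_schools = {
    "District A": ["School 1", "School 2", "School 3", "School 4"],
    "District B": ["School 5", "School 6"],
}
school_teachers = {
    "School 1": ["T01", "T02", "T03", "T04", "T05", "T06", "T07"],
    "School 2": ["T08", "T09", "T10"],
    # ... one entry per participating school ...
}

def sample_units(units, target, rng):
    """Randomly select `target` units; if fewer are available, take them all
    (mirroring the rule in footnote 3 for districts with few schools)."""
    return list(units) if len(units) <= target else rng.sample(units, target)

rng = random.Random(2021)  # fixed seed so the draw can be reproduced

# Spring 2021: two schools per district and about five teachers per school;
# spring 2022 would use three schools per district.
for district, schools in district_schools.items():
    for school in sample_units(schools, target=2, rng=rng):
        teachers = school_teachers.get(school, [])
        print(district, school, sample_units(teachers, target=5, rng=rng))
```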
The evaluation’s data collections are listed below. This package only requests clearance for district recruitment and teacher list collection. The remaining data collections are provided as context. A follow-up package will request clearance for the survey instruments and associated data collection procedures. The state interviews and extant document reviews are not part of the information collection request because they rely on responses from fewer than nine entities.
District recruitment and teacher list collection (winter 2021): The evaluation team will notify each sampled district of its selection, identify a District Coordinator to work with the evaluation team, and request from the Coordinator a list of participating teachers for each school sampled for the evaluation (see Appendix A for written instructions and Appendix B for accompanying notification letters and follow-up emails to the Superintendent and designated District Coordinator). Participating teachers are those whose grade and subject (or course for high school teachers) are identified as part of the IADA program in the data collection year. This information is needed to build the frame from which teachers will be sampled. During the ensuing survey data collection stage later in spring 2021, the Coordinator will encourage principals to complete their surveys and work with sampled schools to prompt teachers to complete their surveys. This process will be repeated in the 2021-22 school year to similarly prepare for the evaluation’s second round of surveys in spring 2022.
Extant document review: For each pilot state, the evaluation team will review documents such as the state’s Annual Performance Report (APR); technical, administration, accommodation, and scorer manuals for the innovative assessment; test specification documents; and the state’s own IADA evaluation report, if available.
The data collection plan is designed to obtain information in an efficient way that minimizes respondent burden. District Coordinators will have the option to submit teacher lists electronically. An email address to which respondents can direct questions will be included in the materials for preparing the teacher list.
There are no other sources that systematically and comprehensively report which schools and teachers are participating in the innovative assessment pilot in each year. This information is needed to accurately draw a survey sample that will be representative of these participants’ perspectives. To avoid duplication of effort and minimize respondent burden, the study will rely as much as possible on extant documents, such as the APR, to address the evaluation’s research questions.
No small businesses or entities will be involved as respondents. Every effort will be made to minimize the burden on all respondents, whether they are from larger or smaller districts and schools. To minimize the burden on the District Coordinator for the teacher list collection, respondents have the option to use existing staff lists, and edit them as needed to eliminate ineligible teachers. We will accept lists in all formats and assist respondents by telephone and email.
If the district recruitment and collection of teacher lists are not conducted, then it will be impossible to accurately sample and represent the perspectives of pilot districts, schools, and teachers. Without these survey data, the Best Practices Report will not include lessons learned from key stakeholders in the administration and use of the IADA assessments. Thus, the report would lack information critical to other states and local stakeholders as they decide whether to pursue innovative assessments and apply to the IADA program. Such a gap would limit the usefulness of the evaluation and prevent it from fulfilling a key objective of the Congressionally mandated evaluation.
There are no special circumstances involved with this data collection. Data will be collected in a manner consistent with the guidelines in 5 CFR 1320.5.
The 60-day Federal Register notice was published on July 14, 2020 (85 FR 42370). One nonsubstantive comment was received that did not result in changes to this request. A 30-day Federal Register notice will be published.
This study has a Technical Working Group (TWG) that includes members with expertise on the types of innovative assessments being used by the four IADA states in the evaluation (e.g., performance assessment, interim and formative assessments, computer adaptive testing); assessment development, including psychometric properties; the implementation of assessments at the state and local levels; and quantitative and qualitative research methods. The TWG members are:
Suzanne Lane, Professor, Research Methodology Program, Department of Psychology in Education, University of Pittsburgh
Peter Leonard, Director of Assessment, Chicago Public Schools
Richard Patz, Distinguished Research Advisor, University of California, Berkeley
Andy Porter, Director, The Center on Standards, Alignment, Instruction, and Learning; Professor Emeritus of Education, University of Pennsylvania
Michael Rodriguez, CEHD Associate Dean for Undergraduate Programs, Diversity, and Equity; Campbell Leadership Chair in Education and Human Development; and Professor, University of Minnesota
Phoebe Winter, independent consultant, who has held positions as an assessment measurement specialist in the South Carolina and Virginia Departments of Education and is the former research director of the Center for the Study of Assessment Validity and Evaluation at the University of Maryland
This TWG will advise on the conduct of this evaluation including, but not limited to, sample design, instrumentation related to survey design, data collection and analysis, as well as the reporting of best practices related to the development, implementation, and use of innovative assessments.
If allowed by district policy, the evaluation team will give District Coordinators a small incentive ($50) for providing the lists of eligible teachers in sampled schools. Obtaining teacher rosters is critical to ensuring the quality of the teacher sample. As discussed in Section A12, we expect that it may take up to two hours for Coordinators to compile the rosters. Depending on the state’s IADA program, only certain grades or subjects (or courses in high school) are eligible to participate in the pilot. But even within eligible grades or subjects, not all teachers are necessarily participating in the pilot. Coordinators may need to do a fair amount of investigation to accurately compile this information, depending on how thorough their district’s documentation is. Particularly in the current environment, where districts are likely to be juggling fiscal uncertainty, public health concerns, and logistical challenges related to the coronavirus, we recognize that district staff have tremendous demands on their time. We expect that the incentive will reduce the nonresponse follow-up (and associated costs) necessary to achieve the desired response rate of at least 85 percent for the teacher lists.
The Education Sciences Reform Act of 2002, Title I, Part E, Section 183 requires “All collection, maintenance, use, and wide dissemination of data by the Institute” to “conform with the requirements of section 552 of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provisions Act (20 U.S.C. 1232g, 1232h).” The names and email addresses of potential survey respondents will be collected for the limited purpose of drawing a sample, contacting those selected to complete the survey, and following up with non-respondents. This information is typically already available in the public domain as directory information (i.e., on district and school websites). The following language will be included on the cover sheet of all information collection forms under the Notice of Confidentiality:
“Information collected for this study comes under the confidentiality and data protection requirements of the Institute of Education Sciences (The Education Sciences Reform Act of 2002, Title I, Part E, Section 183). Responses to this data collection will be used only for statistical purposes. The reports prepared for the study will summarize findings across the sample and will not associate responses with a specific district, school, or individual. All of the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).”
Specific steps to guarantee confidentiality of the information collected include the following:
Identifying information about respondents (e.g., respondent name, email address) will not be entered into the analysis data file; it will be kept separate from other data and will be password protected. A random, study-specific identification number for each survey respondent will be used for building raw data and analysis files (an illustrative sketch of this separation appears after this list).
A fax server used to send or receive documents that contain confidential information will be kept in a locked field room, accessible only to study team members.
Confidential materials will be printed on a printer located in a limited access field room. When printing documents that contain confidential information from shared network printers, authorized study staff will be present and retrieve the documents as soon as printing is complete.
In public reports, survey findings will be presented in aggregate or by IADA pilot state. No reports will identify individuals, districts, or schools.
Access to the sample files will be limited to authorized study staff only; no others will be authorized such access.
All members of the study team will be briefed regarding required procedures for handling any confidential data.
Most survey data will be entered via web-based systems. However, a control system will be established to monitor the status and whereabouts of any hard copy data collection instruments during data entry.
All data will be stored in secure areas accessible only to authorized staff members. Any computer-generated output containing identifiable information will be maintained under the same conditions.
Hard copies containing confidential information that are no longer needed will be shredded.
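As a purely illustrative sketch of the separation of identifying information from analysis data described in the list above, the code below assigns random, study-specific IDs and writes the contact crosswalk and the analysis frame to separate files. The respondent records and file names are hypothetical, and actual storage would also involve the access controls and password protection noted above.

```python
import csv
import secrets

# Hypothetical respondent contact records built from the teacher lists.
respondents = [
    {"name": "Jane Doe", "email": "jdoe@example.org", "school": "School 1"},
    {"name": "John Roe", "email": "jroe@example.org", "school": "School 2"},
]

# Assign each respondent a random, study-specific ID that has no
# connection to the identifying information.
crosswalk = {secrets.token_hex(8): person for person in respondents}

# File 1: the ID-to-contact crosswalk, stored separately under restricted access.
with open("contact_crosswalk.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["study_id", "name", "email"])
    for study_id, person in crosswalk.items():
        writer.writerow([study_id, person["name"], person["email"]])

# File 2: the analysis frame carries only the study ID and non-identifying
# variables; names and email addresses never enter the analysis file.
with open("analysis_frame.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["study_id", "school"])
    for study_id, person in crosswalk.items():
        writer.writerow([study_id, person["school"]])
```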
This study will include no questions of a sensitive nature.
The preliminary activities requested in this submission include notifying districts of their selection for the evaluation, identifying a District Coordinator to work with the evaluation team, and requesting a list of participating teachers in each sampled school in the district.
Table A-1 provides an estimate of burden for the data collection included in the current request, broken down by instrument and respondent. These estimates are based on the evaluation team’s prior experience collecting similar data from districts. For example, recent experience requesting teacher lists on other data collections suggests that a response rate of at least 85 percent from districts is realistic.
The number of targeted respondents is 141 and the expected number of responses is 120. The total burden is estimated at 240 hours or an average of 120 annual burden hours calculated across 2 years of data collection.
Table A-1. Estimated response time for preliminary activities
| Respondent/Data request | Number of targeted respondents | Expected response rate (%) | Expected number of responses | Unit response time (hours) | Annual total response time over 2-year data collection (hours/year) | Total burden (hours) |
|---|---|---|---|---|---|---|
| Coordinator - Teacher lists (winter 2021) | 64 | 85 | 54 | 2 | 54 | 108 |
| Coordinator - Teacher lists (winter 2022) | 77 | 85 | 66 | 2 | 66 | 132 |
| Total for current request (rounded) | 141 | | 120 | | 120 | 240 |
The total of 240 hours is based on the assumption that the evaluation team will reach out to an estimated 64 District Coordinators in the winter of the 2020-21 school year and 77 District Coordinators in the winter of the 2021-22 school year. The number of District Coordinators increases between the two years to reflect the expected additional IADA district participants in the 2021-22 school year. The evaluation team expects that 120 of the 141 District Coordinators (54 in Year 1 and 66 in Year 2) will ultimately respond (85%). It is expected that each response (collecting the list of participating teachers from the two or three sampled schools in the district) will take the coordinator an average of 2 hours.
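As a quick check, the arithmetic behind the burden figures in Table A-1 and the paragraph above can be restated as follows; the expected-response counts are taken directly from the table.

```python
# Expected responses and unit response time from Table A-1.
expected_responses = {"winter 2021": 54, "winter 2022": 66}   # roughly 85% of 64 and 77 Coordinators
hours_per_response = 2                                         # average time to compile the teacher lists

total_responses = sum(expected_responses.values())             # 120
total_burden_hours = total_responses * hours_per_response      # 240 hours
annual_burden_hours = total_burden_hours / 2                   # 120 hours per year over the 2-year collection

print(total_responses, total_burden_hours, annual_burden_hours)
```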
There are no annualized capital/startup costs or ongoing operation and maintenance costs associated with collecting the information.
The cost of designing the surveys, conducting the spring 2021 and spring 2022 survey data collections, and analyzing and reporting the data is $1,666,149. The annualized cost of these activities over the four and a half years of the project is $370,255.
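The annualized figure follows directly from the total cost and the project length, as the short calculation below shows.

```python
total_cost = 1_666_149       # design, survey administration, analysis, and reporting
project_years = 4.5
print(round(total_cost / project_years))  # about 370,255 per year
```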
This is a new collection. No changes apply.
The first report will describe participating states’ progress with their innovative assessment system and draw exclusively on extant documentation provided by states. The second report will present best practices, or lessons learned, on the development, implementation, and outcomes of innovative assessment systems. The second report will draw on all of the data collected for the evaluation, including the district, principal, and teacher survey results; the findings from the state interviews; and extant documents.
Responses to survey questions will be summarized with descriptive statistics (such as percentages) and compared with simple statistical tests (such as tests for differences between percentages). These tabulations will provide a snapshot of district, school, and teacher experiences at each time point, as well as aggregate changes over time. The study is descriptive and is not designed to estimate the impact of federal policies on state and local actions.
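As an illustration of the kind of simple statistical test mentioned above, the sketch below computes a two-proportion z-test on hypothetical counts; the actual analysis specifications, including any survey-weight adjustments, will be described in the follow-up clearance package.

```python
import math

def two_proportion_z_test(hits1, n1, hits2, n2):
    """Two-sided z-test for the difference between two percentages,
    using the pooled-proportion standard error."""
    p1, p2 = hits1 / n1, hits2 / n2
    pooled = (hits1 + hits2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical example: 60 of 100 sampled teachers agree with an item in
# spring 2021 versus 75 of 110 in spring 2022.
z, p = two_proportion_z_test(60, 100, 75, 110)
print(f"z = {z:.2f}, p = {p:.3f}")
```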
The Progress Report is expected to be published in 2022, and the Best Practices Report is expected in 2023. Both reports will be available on the IES website. Each report will be 15 pages, with a set of technical appendices, and will be written for an audience of policy makers and practitioners. The reports will follow the January 2020 IES Style and Report guidance and meet all Section 508 compliance requirements.
The Institute of Education Sciences is not requesting a waiver for the display of the OMB approval number and expiration date. The surveys and notification letters will display the expiration date for OMB approval.
This submission does not require an exception to the Certification for Paperwork Reduction Act Submissions (5 CFR 1320.9).
1 Improving America’s Schools Act of 1994, P.L. 103-382, 20 U.S.C. § 6301 et seq.
2 To date, ED has approved five states for the program: Louisiana and New Hampshire in 2018, North Carolina and Georgia in 2019, and Massachusetts in 2020.
3 If a district has fewer than two participating schools for spring 2021 or three participating schools for spring 2022, all participating schools in the district will be selected for the evaluation.