
Study of Schools Targeted for Improvement Using Title I Section 1003(g) Funds Provided Under ARRA (Study of School Turnaround)

OMB: 1850-0878


American Institutes for Research®





Study of School Turnaround





OMB Clearance Request

For Data Collection Instruments



Part B: Supporting Statement for Paperwork Reduction Act Submission



February 10, 2011




Prepared for:

United States Department of Education

Contract No. ED‑04‑CO‑0025/0022





Prepared by:

American Institutes for Research

Mathematica Policy Research

Decision Information Resources

Education Northwest

Contents

List of Appendices

Appendix A: Criteria for the Selection of States

Appendix B: Construct Matrix

Appendix C: Protocols and Consent Forms

Appendix C–1: Draft State Administrator Interview Protocol and Consent Form

Appendix C–2: Draft District Administrator Interview Protocol and Consent Form

Appendix C–3: Draft Elementary School Principal Interview Protocol and Consent Form

Appendix C–4: Draft High School Principal Interview Protocol and Consent Form

Appendix C–5: Draft Elementary School Teacher Interview Protocol and Consent Form

Appendix C–6: Draft High School Teacher Interview Protocol and Consent Form

Appendix C–7: Draft Instructional Coach Interview Protocol and Consent Form

Appendix C–8: Draft Union Representative Interview Protocol and Consent Form

Appendix C–9: Draft External Support Provider Protocol and Consent Form

Appendix C–10: Draft Elementary School Teacher Focus Group Protocol and Consent Form

Appendix C–11: Draft High School Teacher Focus Group Protocol and Consent Form

Appendix C–12: Draft School Improvement Team Focus Group Protocol and Consent Form

Appendix C–13: Draft Parent Community Focus Group Protocol and Consent Form

Appendix C–14: Draft High School Student Focus Group Protocol and Consent Form

Appendix C–15: Draft Elementary School ELL Teacher Interview Protocol and Consent Form

Appendix C–16: Draft High School ELL Teacher Interview Protocol and Consent Form

Appendix C–17: Draft District ELL Coordinator Interview Protocol and Consent Form

Appendix D: Teacher Surveys

Appendix D–1: Teacher Survey: Elementary Longitudinal Module

Appendix D–2: Teacher Survey: High School Longitudinal Module

Appendix E: Request for Documents and Files (RDF), District Director of Fiscal Services, 2008–09 and 2009–10 School Years

Appendix F: Walk Through Observation Guide

Appendix G: State, District, and School Notification



List of Exhibits

Exhibit 1. Conceptual Framework

Exhibit 2. Study of School Turnaround Evaluation Questions

Exhibit 3. Main Study Components, Proposed Sample, and Schedule of Data Collection Activities

Exhibit 4: Sample Indicators of a School-Level Analytic Rubric



Introduction

The Institute of Education Sciences (IES) of the U.S. Department of Education (ED) requests clearance for the data collection for the Study of School Turnaround (SST). The purpose of the study is to document over time the intervention models, approaches, and strategies adopted and implemented by a subset of schools receiving federal School Improvement Grant (SIG) funds. To this end, the evaluation will employ multiple data collection strategies. Clearance is requested for the study’s design, sampling strategy, data collection, and analytic approach. This submission also includes the clearance request for the data collection instruments.

This document contains three major sections with multiple subsections:

  • Study of School Turnaround

    • Overview

    • Conceptual Framework

    • Evaluation Questions

    • Sampling Design

    • Data Collection Procedures

    • Analytic Approach

  • Supporting Statement for Paperwork Reduction Act Submission

    • Description of Statistical Methods (Part B)

  • Appendices contain a 50‑state table detailing sampling criteria, the study’s construct matrix, site visit interview and focus group protocols, a teacher survey, state interview protocol, Request for Documents and Files, school observation guide, consent forms for all respondents, and notification materials for state and district participants.

Study of School Turnaround

Overview

The Study of School Turnaround (SST)1 will involve case studies to document over time the intervention models, approaches, and strategies adopted and implemented by a subset of schools receiving federal School Improvement Grant (SIG) funds. Authorized under Section 1003(g) of Title I of the Elementary and Secondary Education Act (ESEA) and supplemented by the American Recovery and Reinvestment Act (ARRA), SIGs will target $3.5 billion over the next three years toward the goal of turning around the nation’s lowest‑performing schools. Guidance issued by the U.S. Department of Education has defined both the criteria for selecting eligible schools and the permitted intervention models (School Improvement Grants, 2010). Eligible schools are defined as belonging to one of three categories:

  • Tier I, which includes any Title I school in improvement, corrective action, or restructuring that (1) is among the lowest‑achieving five percent of those schools in the state; or (2) is a high school that has had a graduation rate below 60 percent for a number of years.2

  • Tier II, which includes any secondary school that is eligible for, but does not receive Title I, Part A funds and (1) is among the lowest‑achieving five percent of such secondary schools in the state; or (2) has a graduation rate below 60 percent for a number of years.3

  • Tier III, which includes the remaining Title I schools in improvement, corrective action, or restructuring that are not Tier I schools.4

For each Tier I and II school identified in an LEA’s SIG subgrant application, the LEA must specify one of four improvement models to be implemented in an effort to turn around the school.

  • Turnaround model: replaces the principal and no less than 50 percent of the staff, introduces a new governance structure, institutes significant instructional reforms, increases learning time, and provides flexibility and support;

  • Restart model: reopens the school under the management of a charter school operator, charter management organization, or an education management organization;

  • School closure: closes the school and reassigns students to higher achieving schools; and

  • Transformation model: replaces the principal, introduces significant instructional reforms, increases learning time, and provides flexibility and support.

These models are consistent with those defined in other ARRA‑funded initiatives, including Race to the Top (RTT) and the State Fiscal Stabilization Funds (SFSF)‑Phase 2.

The SST will follow the experiences of 60 case study schools in “real time,” from the point at which they receive their SIG funding through a three‑year period thereafter. The study will involve the following data collection strategies: (1) site visits, (2) telephone interviews, (3) teacher surveys, and (4) collection of documents at the state, district, and school levels, including fiscal data and information on the school turnaround process.

The approach to this study’s design embraces three interrelated objectives:

Objective 1: To document the change process in a set of chronically low‑performing schools receiving SIG funds.

This study will describe the characteristics of 60 SIG schools, the decisions and strategies they undertake, and the constraints they face as they work to implement intervention models intended to improve student outcomes. Because the study will collect “real time” longitudinal information over the course of three years in a variety of school contexts, it will offer a unique window on how SIG implementation unfolds. In particular, the study team will seek to understand the school‑level processes associated with the planning, implementation, and sustainability of change strategies. School change is a dynamic process, requiring attention, motivation, and internal capacity on the part of school‑level stakeholders. In these 60 schools, the study team will examine the extent to which school‑level actors are engaged in school improvement processes and the level and quality of the implementation of their change strategies.5

The study team recognizes, however, that neither school improvement nor school failure occurs in isolation. Data will be collected from the states and districts in which the case study schools are located, examining school practices as the product of complex and interacting factors. These factors include decisions and practices at multiple levels of the system, characteristics of school populations and personnel, prior reform histories and resulting capacities, the actions of external providers and partners, and use of fiscal resources. Indeed, a particularly important aspect of the study will be the integration of data concerning resource allocation with information about other aspects of the change process.

Objective 2: To study leading indicators of school turnaround.

A second objective of the study is to examine factors that are hypothesized to promote the change process in SIG schools. Drawing on existing studies of school improvement and turnaround, the conceptual framework for this study delineates a set of potential leading indicators. The study will track these indicators for study schools over the course of the project.

Objective 3: To support schools undertaking actions to turn around student performance by sharing accumulating knowledge and lessons from study schools with SIG program staff and other key stakeholders.

Each year, the study team will produce reports and research briefs with annual study findings. The study also will share accumulating knowledge with program staff in ED, with the goal of informing ED’s management of the grant program and provision of technical assistance to states, districts, and schools. These knowledge‑sharing activities will enrich the study and its reach and will yield lessons for future evaluations.

Conceptual Framework

The conceptual framework for this study addresses the research questions in Exhibit 2, drawing on an understanding of the SIG program requirements and on the research literature concerning organizational change processes, policy implementation, and effective schools. Undergirding the framework and the design are several assumptions based on prior research:

  • The heart of the change process (and thus of this study) consists of people, activities, and relationships inside the school. At the same time, school performance is influenced by the systems in which these schools are situated; thus, systemic contributors to chronic low performance also must be considered.

  • The strategies that states, districts, and schools select and employ will reflect different “theories of action”—including different conceptions of the problem(s) to be addressed and different assumptions about how the chosen strategies will address that (those) problem(s). The study should seek to understand both what people do to turn around the lowest‑performing schools and why they do so.

  • Schools are complex social systems—the characteristics of the schools and the various improvement strategies they employ will interact and overlap, making it difficult to tease out causality or predict effects.

  • The quality of implementation is a critical determinant of the effect of any policy or program, and implementation takes shape as policies and practices are interpreted and acted on across multiple levels of the system.

  • Interventions and strategies have both descriptive characteristics that can be directly observed or measured and qualitative dimensions that can only be derived from analysis across multiple characteristics. For example, the literature on external support providers suggests that an important determinant of their effectiveness in a given school is the “fit” between the strategies they employ or promote and the needs of that school. Fit, however, cannot be measured directly but must be inferred analytically from data on the school, its past performance, potential contributing factors to that performance, and the actions and strategies of the support provider.

  • Policy interpretation and implementation are mediated by intervening contextual variables, which also will influence outcomes.

  • Implementation changes over time as effects accumulate and as individuals and units interpret results and modify practice.

Exhibit 1 depicts the conceptual framework that guides the study, reflecting these assumptions and study goals. Several aspects of the graphic are important to note, as discussed below.

Schools at the core: Highlighted in pale blue are the boxes labeled “School Implementation of SIG” and “Leading Indicators of School Improvement,” emphasizing that the core purposes of this study are to document the actions of the 60 study schools to “turn around” their chronic low performance and to track a set of leading indicators that are hypothesized to be associated with subsequent gains in student achievement. The study’s evaluation questions (in the next section) specify domains targeted by the SIG program guidance as likely to foster improved student outcomes; the study will attend to school actions in each of these domains. At the same time, study schools are likely to combine their actions in these domains differently, and the choices made across domains may be interrelated. For example, decisions about the choice of models or instructional improvement strategies (EQs 1 and 2) and staffing (EQ3) may be integral aspects of a change in governance (EQ4), such as a charter conversion. Also, depending on their own determination of needs and “theories of action,” schools will differ in their selection of “entry points” for turnaround efforts. For example, some schools may start the process by changing school leadership or staff, others by changing their governance (i.e., becoming a charter school), and still others by striving for a “quick win” (e.g., bringing order to a chaotic environment) to spur motivation and attention.

In addition to examining actions and strategies undertaken in each school, the study will examine the qualities of the schools’ approaches to turnaround. The degree of coherence across multiple strategies, their divergence from past practices, and the level of buy-in, for example, are some of the qualities likely to influence depth of implementation and eventual effectiveness; indeed, these qualities may spell the difference between success and failure across schools following the very same intervention “model.” In each box in Exhibit 1, therefore, the relevant descriptive characteristics of the strategies and the analytic qualities of the larger approach to turnaround in the school are indicated.

The actions and strategies undertaken by schools are expected to influence improvements in student outcomes through the changes they bring about in the behaviors and capacities of the staff and students. Because such changes are expected precursors to improved student achievement, they are referred to as “leading indicators.” For example, changes in school personnel or professional development efforts would be likely to improve student outcomes only if they result in staff with increased knowledge and skills. A culture of high expectations for students and of continuous improvement also are examples of likely precursors of changes in student outcomes. Tracking such leading indicators, and the strategies related to them, is an important focus of much of the turnaround literature and a central goal of this study.

Multi‑level implementation of SIG: As illustrated in the conceptual framework, district and state implementation of the SIG program shape schools’ implementation. Districts may have the primary role in selecting the intervention models to be used by SIG schools, or they may prescribe instructional approaches, provide additional flexibility to SIG schools, or provide technical assistance specifically for SIG schools. The study will examine the actions and strategies undertaken by the districts in which study schools are located. As with school implementation, the study will go beyond documentation of actions and strategies and attempt to understand qualities of district approaches that are likely to be associated with successful school implementation. For example, districts’ SIG‑related strategies may differ in their specificity, or in their emphasis on applying pressure vs. support for SIG schools; districts also will differ in the comprehensiveness and accessibility of data that schools can use to inform their improvement efforts. The study will examine how district actions contribute to (or impede) school implementation. Of course, state SIG policies such as the definition of eligible schools and SIG guidance set the parameters for district and school policies as illustrated in the conceptual framework, and the study will examine SIG policies for the states in which study schools and districts are located.

Exhibit 1. Conceptual Framework

[Conceptual framework graphic]

The role of external support providers: Most approaches to school turnaround recognize that chronically low‑performing schools are low‑performing in part because they lack the capacity to improve significantly on their own. An expected component of this study, therefore, and one of the research questions, will be to examine the role of external change agents and the assistance that they provide to schools. There are several sorts of external support providers, all of which will be considered in this study. One set of support providers (education management organizations and charter management organizations) consists of those that provide comprehensive support to chronically low‑performing schools and have tools and processes to guide the turnaround process. In some cases, this assistance may be combined with actual line authority over the schools. Another set consists of outside vendors and non‑profits that help schools with one or more aspects of their improvement strategies (such as professional development in mathematics, or the collection and management of classroom observation data). Finally, individuals contracted with the state (for example, those affiliated with the statewide system of support) may be providing direct, long‑term assistance to the case study schools. Given these diverse and prominent roles, the study will examine the work of external agents in the case study schools, as noted in the box in Exhibit 1 marked “External Partners.”

The role of context: The SIG program does not intervene in a vacuum. A key feature of the study’s conceptual framework, therefore, is the emphasis on contextual and systemic factors that mediate SIG‑specific actions and strategies, and their qualities, at the school, district and state levels. School change is embedded in a system. While many prior studies of school reform have focused exclusively on the school, this study will examine the systemic, historical, community and other contexts of school actions and how they influence school actions.

Time: Exhibit 1 demonstrates that the range and complexity of variables examined in Year 1 will be repeated over the three years of data collection. Thus, unlike prior, retrospective research, this study will capture the dynamics of the change process. Arrows in the conceptual framework illustrate a process of continuous feedback; one particular focus will be the extent to which student outcome data are used to revisit intervention models at the school, district and state levels. Because this is a longitudinal, real‑time evaluation, the study team will be in a better position to examine how the turnaround process begins and evolves, how the different levels of the education system interact with one another, and the extent to which existing contextual factors are related to subsequent decisions.









Evaluation Questions

To meet its three objectives, the SST will document the contexts, actions, strategies, and qualities of these strategies that are implemented in a subset of schools that receive SIG funds. The evaluation questions for this study address seven aspects of school turnaround relevant to the SIG grants: (1) selection of intervention models, (2) instructional improvement strategies, (3) human capital strategies, (4) approaches to school governance, (5) use and role of external support partners, (6) the allocation of SIG funds, and (7) contextual and systemic influences. Within each of these, the study team will focus a subset of analyses on issues related to English Language Learners (ELLs). Exhibit 2 presents the broad evaluation questions that will guide the data collection and analysis for each aspect.

Exhibit 2. Study of School Turnaround Evaluation Questions

EQ1: Intervention Models. Which of the four intervention models are districts and schools selecting for turning around the SIG‑funded schools in this study, and why? What roles are states, districts, schools and turnaround partners playing in the decision making and design of the intervention, and how do these roles change over time?


EQ2: Instructional Improvement Strategies. What specific actions are states, districts, and schools taking to improve instruction and outcomes in the SIG‑funded schools in this study? What are the rationales for these actions and how are the decisions made? How well are these strategies planned, implemented, refined, and sustained? How do they change over time?


EQ3: Human Capital. What strategies are states, districts, and schools using to improve the qualifications and effectiveness of teachers, principals, and other staff at the SIG‑funded schools in this study? To what extent do these strategies change over time?


EQ4: Approaches to School Governance and Flexibility. What new governance approaches are being adopted for study schools? Are states and districts providing significant new flexibility to enable implementation of SIG intervention models, instructional improvement strategies, or human capital strategies? To what extent do governance approaches change over time?


EQ5: External Support. What is the nature and quality of external support provided to SIG schools in this study? What roles do states, districts, and turnaround partners play in guiding the change process and helping schools implement improvement strategies? How does support for study schools change over time?


EQ6: Uses of Funds. How are states and districts in this study allocating school improvement funds provided under Section 1003(a) and 1003(g)? How are study states, districts and schools using these funds? To what extent do funds allocation and uses of funds change over time?


EQ7: Contextual and Systemic Influences. How do school, district, and state contexts shape the adoption, implementation, and changes over time of the strategies employed at each level of the system? How do prior school, district, and state interventions, human capital strategies, and governance approaches contribute to strategies employed in SIG schools?





Sampling Design

The main components of this study are presented in Exhibit 3 along with the proposed sample and schedule of data collection activities. A detailed discussion of the sampling design is provided in the Supporting Statement for Paperwork Reduction Act Submission, Part B of this package.

Briefly, the study will include a base sample of 60 schools and two nested subsamples. The first nested sample (the core case studies) will consist of 25 schools in which the study team will conduct in-depth case studies over three years of data collection. The second nested sample (special topics case studies) will consist of two sets of 10 schools in which the study team will explore focused topics of policy interest.



Exhibit 3. Main Study Components, Proposed Sample, and Schedule of Data Collection Activities

Base Sample: 60 Schools

  • State interviews: Fall 2010, Fall 2011, Fall 2012

  • Principal phone interviews: Fall 2010, Fall 2011, Fall 2012

  • Longitudinal teacher survey: Winter 2011, Winter 2012, Winter 2013

Core Case Studies: 25 Schools

  • Site visits: Winter 2011, Fall 2011, Spring 2012, Fall 2012, Spring 2013

  • Spring supplement teacher survey: once each spring (in addition to the fall administration, as part of the base sample)

Special Topics Case Studies: 2 Sets of 10 Schools

  • Site visits: Special Topic A in Spring 2011 and Spring 2012; Special Topic B in Fall 2012 and Fall 2013

  • Special topic survey supplement: once each year (twice for each set of schools)


Data Collection Procedures

The data collection for this study includes site visits, telephone interviews, teacher surveys, and document data collection. All of the study’s data collection instruments have been included in this submission.6 Exhibit 3 above presents a summary of the data collection activities. A more detailed discussion of these procedures is provided in the Supporting Statement for Paperwork Reduction Act Submission, Part B section of this package. Copies of the site visit interview and focus group protocols and state interview protocol are located in Appendices C–1 through C–17. Copies of the teacher survey, Request for Documents and Files, school observation guide, and state notification letter are included in Appendices D, E, F, and G, respectively.

Each year, the study team will collect data from a base sample of 60 schools through a teacher survey and principal interviews. In addition, the study team will interview state officials from the states in which the 60 schools are nested. Among these 60 schools, 25 will be selected as core case study schools, from which the study team will collect additional data through site visits and a survey supplement. In addition, the study team will identify two sets of 10 schools from the base sample of 60 schools, in which the study team will explore special topics of policy interest. The first set of 10 schools will include schools with a high proportion of ELLs; the focus of the second set of 10 schools will be determined in consultation with IES and the study’s Technical Working Group (TWG).

Analytic Approach

Site Visits

The most important element of the study is the site visit, which will consist of data collection in the 25 core case study schools and the two sets of special topic case study schools. For each school, study staff will conduct interviews or focus groups with the principal, teachers, support providers, and other stakeholders, as well as interviews with officials of the district in which the school is located. These interviews and focus groups will be guided by semi-structured interview protocols designed to ensure that discussion of specific topics of interest is consistent across respondents and that respondents are also able to describe school improvement processes and policies in their own words.

Analyses of the qualitative site visit data will enable the study team to answer the full gamut of evaluation questions at the school level, for example: how decisions are being made about which intervention models to use (EQ1); what new instructional practices schools are implementing (EQ2); what professional development approaches schools are implementing (EQ3); to what extent school governance approaches change over time (EQ4); what external support is being provided (EQ5); how schools use SIG funds (EQ6); and the role of contextual and systemic influences (EQ7).

Qualitative site visit data will be analyzed through a carefully structured five-step analytic process guided by the study’s evaluation questions and conceptual framework. The process is designed to build reliability and validity into the case study process, both by building a chain of evidence and by using triangulation to identify themes (Yin, 2003). Steps one through four will be conducted for each case study site visit, and step five will require an analysis across cases. In step one, preliminary data capture will occur immediately following each site visit and will require researchers to enter information into a data capture template in a web-based software platform. During step two, site visit interview and focus group data will be coded by researchers using a code book and a qualitative analysis software program such as ATLAS.ti, NVivo, or HyperRESEARCH. In step three, within-case data will be analyzed across interviews and focus groups within a single case. Analyses will be guided by structured rubrics that align with the study’s conceptual framework. In step four, a case narrative guided by a common outline template will combine the analyses from steps one through three and will also include context-specific observations that clarify critical school, district, and state dynamics. Finally, in step five, case narratives will be analyzed collectively using rubrics to determine cross-cutting themes.


Reliability and Validity Strategies. Research suggests there are several strategies that, if embedded into the data collection and analysis processes, can improve the reliability and validity of data analysis (e.g., Yin, 1992, 2003). First, all researchers will be trained and provided with guidance materials to improve consistency in data capture and analysis. Second, researchers conducting analyses will be convened at least every two weeks to discuss the data analysis process, questions about the coding of data, and other discrepancies. As a result of these meetings, additional trainings and revisions to guidance materials will be made and distributed to the team. Third, two lead researchers will review the data analyses of each team member on a weekly basis to improve consistency in reporting and analysis across cases. Discrepancies identified during these data analysis checks will be discussed and resolved by the team during the regular meetings. Fourth, the analytic process and sources of data collection allow for triangulation, which will allow researchers to verify observations in the data (Stake, 2000). Last, in each step of the analysis process, procedures for ensuring inter-rater reliability will be implemented, and discrepancies will be discussed and resolved at the team meetings (Armstrong, Gosling, Weinman, & Marteau, 1997).
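To illustrate how one such inter-rater reliability check might be operationalized, the following sketch computes Cohen’s kappa for two researchers who have double-coded the same interview segments. This is a minimal illustration only: the ratings, the helper function, and the 0.70 threshold are hypothetical and do not represent the study’s actual tooling.

    # Minimal sketch of an inter-rater agreement check (Cohen's kappa) for
    # double-coded data. Ratings and the review threshold are illustrative.
    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa for two equal-length lists of categorical codes."""
        assert len(rater_a) == len(rater_b)
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        counts_a, counts_b = Counter(rater_a), Counter(rater_b)
        # Agreement expected by chance, from each rater's marginal distribution.
        expected = sum((counts_a[c] / n) * (counts_b[c] / n)
                       for c in set(counts_a) | set(counts_b))
        return (observed - expected) / (1 - expected)

    # Example: two researchers' ratings (1-3) on the same ten segments.
    rater_1 = [1, 2, 2, 3, 1, 2, 3, 3, 2, 1]
    rater_2 = [1, 2, 3, 3, 1, 2, 3, 2, 2, 1]
    print(f"kappa = {cohens_kappa(rater_1, rater_2):.2f}")  # kappa = 0.70

Kappa values below an agreed-upon threshold (for example, 0.70) would flag a code for discussion and resolution at the regular team meetings.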

The following is a more in-depth discussion of each of the five steps of the analytic process. Within each step the analysis process is described followed by a brief discussion of step-specific validity and reliability measures.

  1. Preliminary Data Capture

In order to capture initial impressions about case study schools and respondents that can sometimes be lost in the time that elapses between site visits and the recording of field notes, the study team has designed a web-based data capture field workbook. The data capture field workbook will be aligned with the key dimensions and qualities identified in the study’s conceptual framework. The web-based platform will allow field researchers to catalog responses to interviews and focus groups immediately following each visit. First impressions regarding the development of turnaround strategies and the quality of implementation will be recorded consistently across all cases. This will ensure that all field teams are collecting data consistently and that no data sources or areas of interest are overlooked. Preliminary data capture will also allow the research team to identify emerging themes and to focus on these as the study progresses. Use of this capture template will be ongoing, and more detailed analysis of these data during steps three through five will produce quantified analysis on key indicators both within and across cases.

Researchers will be trained in the use of the data capture field workbook and will participate in weekly meetings to discuss site visits and issues related to the use of the workbook. Additionally, lead researchers will review the field workbooks weekly to identify possible gaps in data collection and to inform additional guidance provided during these meetings.
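As a purely illustrative sketch of what a single entry in such a web-based field workbook might look like, the following Python dictionary organizes a researcher’s notes by the dimensions of the study’s conceptual framework. All field names and values are hypothetical, not the study’s actual schema.

    # Hypothetical structure for one school's data capture entry, keyed to
    # the dimensions of the conceptual framework. Fields are illustrative.
    capture_entry = {
        "school_id": "SCH-001",          # illustrative identifier
        "visit_date": "2011-02-15",
        "respondent_type": "principal",  # e.g., principal, teacher, coach
        "dimensions": {
            "intervention_model":        {"notes": "", "preliminary_rating": None},
            "instructional_improvement": {"notes": "", "preliminary_rating": None},
            "human_capital":             {"notes": "", "preliminary_rating": None},
            "governance_flexibility":    {"notes": "", "preliminary_rating": None},
            "external_support":          {"notes": "", "preliminary_rating": None},
            "use_of_funds":              {"notes": "", "preliminary_rating": None},
            "context":                   {"notes": "", "preliminary_rating": None},
        },
        "emerging_themes": [],  # free-text themes flagged for follow-up
        "data_gaps": [],        # sources or topics missed during the visit
    }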

  2. Case Interview and Focus Group Coding

Site visit interview and focus group notes will be cleaned, reviewed for accuracy, and then coded. To guide the coding process, a coding book organized by the dimensions and qualities identified in the study’s conceptual framework and evaluation questions will be developed. The coding book will detail how each code should be applied and include examples from the data that illustrate each code. Researchers will code the data using a qualitative analysis software package. Qualitative analysis software programs such as ATLAS.ti facilitate the analysis of large quantities of qualitative data by enabling researchers to develop and test hypotheses using codes that are assigned to specific portions of the narrative. The software also allows the research team to organize and categorize data within a case or across cases on a year-to-year basis.

Researchers will be trained to use the coding book and qualitative analysis software. The coding process will be piloted, allowing the research team to gain consensus about the meaning of specific codes and thereby improving the reliability and consistency of the coding. Regular meetings of the researchers analyzing the data will ensure consistency in coding, and spot-checks of coding conducted by a lead researcher will improve inter-rater reliability.
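The simplified sketch below stands in for the kind of code assignment that such software supports. In the study itself, codes would be applied by trained researchers exercising judgment against the coding book; the keyword matching here illustrates only the underlying data structure (codes attached to text segments), and all codes and text are invented.

    # Conceptual sketch of codebook-based tagging, approximating what a
    # package such as ATLAS.ti or NVivo supports. Codes/text are invented.
    codebook = {
        "EQ2-instruction":      ["curriculum", "instruction", "pacing guide"],
        "EQ3-human-capital":    ["hiring", "professional development", "coach"],
        "EQ5-external-support": ["vendor", "partner", "technical assistance"],
    }

    def assign_codes(segment, codebook):
        """Return codes whose keyword lists match a (lowercased) text segment."""
        text = segment.lower()
        return [code for code, keywords in codebook.items()
                if any(kw in text for kw in keywords)]

    segment = ("The district brought in an external partner to lead "
               "professional development for our math coaches.")
    print(assign_codes(segment, codebook))
    # ['EQ3-human-capital', 'EQ5-external-support']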

  3. Within-Case Analysis

After all within-case data have been cataloged and coded using the web-based tools and qualitative analysis software, the research team will use rubrics and matrices to organize the data by theme and to quantify the evidence across subgroups. Rubrics will be organized by construct, and within construct by indicator. For each indicator, the rubric will delineate levels with concrete descriptors. Analysts will ascribe a rating to each indicator, based on coded data from each set of respondents. Exhibit 4 provides an example of two indicators in a rubric developed for the dimension of coherence at the school level. (The full rubric for coherence includes six indicators; this excerpt is included only as an example.) The corresponding matrices will embed direct evidence (in the form of quotes from interviews or documents) to support each rating. Finally, analysis using within-case rubrics will strengthen the reliability of the analysis by cataloging thematic data across multiple sources, ensuring that findings are triangulated and persistent across sources.

To ensure validity and reliability of the within-case analyses, researchers will be trained in the use of rubrics and matrices. An introductory training supplemented by a pilot within-case analysis will be guided by a lead researcher. Additionally, the research team will meet regularly to discuss and resolve discrepancies. Last, a lead researcher will review all coding and provide one-on-one and group feedback as needed.
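The sketch below suggests how rubric ratings of the kind shown in Exhibit 4 below might be stored and triangulated across respondent groups. The indicator names, ratings, and divergence rule are hypothetical.

    # Illustrative within-case rubric: each indicator carries a 1-3 rating
    # per respondent group. Names, ratings, and the flag rule are invented.
    rubric = {
        "coherence": {
            "leadership_strategies_consistent": {
                "principal_interview": {"rating": 3, "quote": "..."},
                "teacher_focus_group": {"rating": 2, "quote": "..."},
            },
            "staff_perceive_alignment": {
                "principal_interview": {"rating": 2, "quote": "..."},
                "teacher_focus_group": {"rating": 2, "quote": "..."},
            },
        },
    }

    def triangulated_rating(indicator):
        """Average ratings across respondent groups; flag wide divergence
        for resolution at team meetings."""
        ratings = [r["rating"] for r in indicator.values()]
        divergent = max(ratings) - min(ratings) > 1
        return sum(ratings) / len(ratings), divergent

    for name, indicator in rubric["coherence"].items():
        mean, divergent = triangulated_rating(indicator)
        print(name, round(mean, 1), "review" if divergent else "ok")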


Exhibit 4: Sample Indicators of a School-Level Analytic Rubric

School-Level COHERENCE

Indicator 1: School leadership adopts new strategies that are consistent with turnaround goals.

Level descriptors:

  1. There is limited or no connection between the strategies/actions being implemented and the school’s turnaround goals.

  2. While some strategies and actions are aligned with the school’s turnaround goals, there are still some strategies that have limited or no connection to the school’s turnaround goals.

  3. All or nearly all of the strategies/actions that have been adopted are consistent with the school’s turnaround goals.

Rating for principal interview*: [insert numeric rating]; [insert supporting quote]; [insert hyperlink to interview transcript]

Indicator 2: Staff believe that the strategies/actions implemented are aligned with the school’s turnaround goals.

Level descriptors:

  1. Staff describe fragmented strategies/actions that are disconnected from or in conflict with the school’s turnaround goals.

  2. Staff clearly articulate how some strategies/actions are aligned with the school’s turnaround goals, but also describe strategies/actions that are fragmented.

  3. Staff clearly articulate how most strategies/actions are aligned with the school’s turnaround goals.

Rating for principal interview*: [insert numeric rating]; [insert supporting quote]; [insert hyperlink to interview transcript]

*Note: The full rubric will include columns for different respondents (e.g., teacher interview, instructional coach interview). These columns have been omitted from this example because they would limit readability in this format.

  4. Case Narrative

The primary purpose of the case narrative is to develop a cohesive, comprehensive summary for each case that integrates the data from steps one through three but also includes important contextual data that would be difficult to measure using the analysis tools described above. These narratives will be 5- to 10-page summaries of each case that convey how the study’s conceptual framework has been operationalized within the school, paying close attention not only to the types of turnaround strategies being implemented but also to the quality of the implementation efforts. Additionally, because this is a large-scale, longitudinal study, case narratives will prove valuable for identifying changes in school context and quality of implementation from year to year and for capturing changes in school culture that may be affecting the turnaround process but that are sometimes difficult to quantify using rubrics. Each case narrative will be reviewed by a lead researcher to ensure consistency in reporting across cases.

  5. Cross-Case Analysis

The study team will conduct cross‑case analyses to identify emergent themes, associations, and processes. The analysis will include a comparison of topics across the schools, districts, and states in the case study sample. The primary data sources for these analyses will be the rubrics, as this quantified form of qualitative data facilitates cross-case comparisons and the identification of associations among practices and school characteristics. In addition, the case narratives will provide contextual information that will help to explain patterns and relationships evidenced by the rubric data.

Reliability measures for the cross-case analysis will focus on training, regular meetings, and inter-rater reliability checks. Researchers will be trained to use the rubrics and matrices for cross-case analyses. Regular meetings of researchers will be convened to discuss discrepancies, improve definitions of codes, and provide examples from the data to support a mutual understanding of the codes and analyses. Last, continual inter-rater reliability checks will be conducted.
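As a hypothetical illustration of such a cross-case comparison, the sketch below computes mean rubric ratings on a single indicator, grouped by intervention model. All schools, models, and ratings are invented.

    # Sketch of a cross-case comparison: mean rubric ratings on one
    # indicator, grouped by intervention model. Data are invented.
    from collections import defaultdict

    cases = [
        {"school": "A", "model": "transformation", "coherence": 2.5},
        {"school": "B", "model": "transformation", "coherence": 2.0},
        {"school": "C", "model": "turnaround",     "coherence": 3.0},
        {"school": "D", "model": "restart",        "coherence": 1.5},
    ]

    by_model = defaultdict(list)
    for case in cases:
        by_model[case["model"]].append(case["coherence"])

    for model, ratings in sorted(by_model.items()):
        mean = sum(ratings) / len(ratings)
        print(f"{model}: mean coherence {mean:.2f} (n={len(ratings)})")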

Fiscal Data

Although the fiscal data must be integrated into analyses for each of the 25 core case study schools and the two sets of special topic case study schools, they necessitate somewhat different analytic strategies. In addition to the qualitative fiscal information collected from interviews, which will be analyzed in the case study reports and the cross‑case analyses, the evaluation team also will analyze documents collected at the state and district levels. These will include consolidated applications for federal funds for districts with case study schools and expenditures for each case study school (extracted from its district’s full expenditure files). Site codes contained in the expenditure files will allow the study team to analyze expenditures at each case study school site individually and to observe changes in expenditure patterns in case study schools over time. Object and function codes will permit the documentation of changes over time in policy‑relevant expenditure categories such as personnel, contracted services, and technology. Fund codes will provide descriptive data on how SIG funds themselves were used and on how expenditures overall changed after receipt of the SIG grant. Information obtained through interviews with district and school officials will provide insight into the improvement strategies behind the expenditure decisions (e.g., whether increases in expenditures on personnel represent a reduction in class sizes, an increase in the number of specialized staff such as coaches, or other strategies). In addition, the study team will seek to determine whether there are unique features of the financial decisions in schools that are more successful (assuming the sample captures some such schools).
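The sketch below illustrates one way the expenditure-file analysis might proceed once the files have been loaded into a pandas DataFrame. The column names, codes, and amounts are hypothetical, not the districts’ actual accounting structures.

    # Sketch of the fiscal analysis using hypothetical site, function, and
    # fund codes for a single case study school.
    import pandas as pd

    expenditures = pd.DataFrame({
        "year":     ["2008-09", "2009-10", "2008-09", "2009-10"],
        "site":     ["0042"] * 4,  # site code for one case study school
        "function": ["instruction", "instruction", "prof_dev", "prof_dev"],
        "fund":     ["general", "SIG", "general", "SIG"],
        "amount":   [1_200_000, 1_450_000, 60_000, 140_000],
    })

    # Changes over time in policy-relevant categories at this school site.
    by_function = expenditures.pivot_table(index="function", columns="year",
                                           values="amount", aggfunc="sum")
    print(by_function)

    # How SIG funds specifically were used.
    print(expenditures[expenditures["fund"] == "SIG"]
          .groupby("function")["amount"].sum())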

State Interviews

The study team believes that school‑level turnaround processes are likely to be shaped by the historical and policy context of each state and its demographic and urban characteristics. Interviews with state officials in the six states in which the 60 schools are situated will provide needed insight into state‑level decisions with regard to a range of evaluation questions, including state funding (EQ6), state contexts (e.g., legal constraints and flexibility) (EQ4), and state actions and technical assistance (EQ5). The analysis will consist of coding text data, an iterative process that includes reading, reviewing, and filtering data to identify prevalent themes relating to each of these evaluation questions.

State Extant Analyses

To situate the 60 case study schools in a broader context, the study team will analyze and report on extant national data, including state and district SIG applications, and data from EDFacts and the Common Core of Data (CCD). The study team will use these data to describe (1) SIG policies and guidance provided by states and districts to SIG schools and (2) the characteristics of SIG‑eligible and SIG‑awarded schools. First, the review of state and district SIG applications will document critical background information about SIG policies and guidance, including, for example, evaluation criteria for reviewing and prioritizing district applications; planned processes for monitoring SIG implementation and reviewing districts’ annual goals for student achievement for SIG‑awarded schools; and planned activities related to technical assistance. Second, using data on the SIG‑eligible and SIG‑awarded schools, in conjunction with data from EDFacts and CCD, the study team will conduct analyses of the features of SIG schools, including grade level, size, urbanicity, funding levels, and characteristics of enrolled students. The study team will also report on school‑level AYP performance and accountability status of the SIG‑funded schools. In addition, the study team will use these data to address questions related to state strategies for targeting resources and variation in selected intervention models by state, region, school level, and urbanicity.
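A sketch of this descriptive analysis follows, merging a list of SIG-awarded schools with CCD-style school characteristics. The file names and column names are invented for illustration.

    # Sketch of the extant-data analysis: merge SIG-awarded schools with
    # CCD-style characteristics, then describe the SIG universe.
    # File and column names are hypothetical.
    import pandas as pd

    sig = pd.read_csv("sig_awarded_schools.csv")  # school ID, model, tier
    ccd = pd.read_csv("ccd_schools.csv")          # school ID, level, urbanicity

    merged = sig.merge(ccd, on="school_id", how="left")

    # Distribution of intervention models by school level and urbanicity.
    print(pd.crosstab(merged["model"], merged["level"]))
    print(pd.crosstab(merged["model"], merged["urbanicity"], normalize="columns"))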

Principal Interviews

Each fall, the study team will conduct telephone interviews with the principal in each of the 60 base sample schools. Collecting longitudinal data on the processes associated with SIG implementation in all 60 schools enables the study team to situate the core and focused case study schools in a larger sample and to provide necessary background data on each. In the first year of data collection, these telephone interviews will serve as a screening process for selecting the 25 core case studies. In subsequent years, the principal interview will also help the study team to identify schools to be included in the special topic case studies.

Teacher Survey

A brief Web‑based teacher survey (approximately 10 minutes response time) will be administered to all teachers in the 60 sampled SIG schools in the winter of the 2010–11, 2011–12, and 2012–13 school years. A spring supplement of similar length will also be administered annually for three years to teachers in the 25 schools that will be visited by the study staff. Finally, a “special topic” supplement will be administered annually for two years to the two subsets of 10 schools in which the study team will explore focused topics of policy interest. The purpose of these surveys will be (a) to collect longitudinal data from the larger set of 60 schools to inform case study analyses and the selection of focused samples, and (b) to collect data on topics for which feedback from all teachers is necessary and for which focus groups are not the optimal strategy.

Although a high response rate is expected, the study team does not anticipate a 100 percent response rate; thus, the study team will analyze whether the teachers who respond to the survey differ in observable ways from the full population of teachers in each school. The survey administration group will closely monitor response rates among subgroups of teachers and will target follow-up prompts to any subgroup for which the response rate is low.

Because the survey is intended to inform understanding of each school as a “case,” most analyses will be within-school. The study team anticipates that most of the analyses will involve univariate (means, frequencies, etc.) and bivariate (comparisons of means, cross‑tabulations, etc.) approaches to provide an overview of the data and investigate changes in knowledge, perspectives, and behaviors of respondents by independent background characteristics. The study team also will analyze data from the open‑ended survey items and will develop matrices to summarize data across respondents within a given school.
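The sketch below illustrates, with invented data, the two survey analyses just described: monitoring response rates by teacher subgroup and a simple within-school cross-tabulation.

    # Sketch of subgroup response-rate monitoring and a within-school
    # cross-tab. All data and column names are invented.
    import pandas as pd

    roster = pd.DataFrame({
        "teacher_id": range(8),
        "subject":   ["math", "math", "ELA", "ELA",
                      "science", "science", "ELA", "math"],
        "responded": [1, 1, 1, 1, 0, 0, 1, 1],
    })

    # Subgroups below the threshold would receive targeted follow-up prompts.
    rates = roster.groupby("subject")["responded"].mean()
    print(rates[rates < 0.70])  # here, science teachers would be re-prompted

    # Within-school bivariate analysis: a survey item by years of experience.
    responses = pd.DataFrame({
        "experience": ["0-3", "4-10", "0-3", "4-10", "11+"],
        "agrees_high_expectations": ["yes", "yes", "no", "yes", "no"],
    })
    print(pd.crosstab(responses["experience"],
                      responses["agrees_high_expectations"]))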

Supporting Statement for Paperwork Reduction Act Submission

Description of Statistical Methods (Part B)

  1. Sampling Design

To generate a sample of schools, districts, and states that will yield informative and varied data, the study team proposes to generate a purposive sample of 60 SIG schools in a set of states and districts with appropriate variation. The 60‑school base sample will include two nested subsamples, as described below.

  • Base Sample: The study team will identify a set of 60 SIG schools in six states which will constitute the base for the study. This sample is intended to provide data of great breadth and consistency over time, and to serve as a base from which the study team will select the core case study schools and schools in which to address special topics. A larger sample will ensure the study team has a set of schools that reflects a wide range of relevant policy topics and will enable the study team to situate the cases within a larger sample.

  • Core Case Studies: Of the initial 60 schools, the study team will sample 25 (on average, four to five in each state) to be the focus of one site visit in 2010–11 and two site visits in each of the following years: a more intense site visit (2.5 days) and one follow‑up visit (1 day). The purpose of the core case studies will be to document the change process, over time, with rich, detailed data. To select these schools, the contractors will draw on the first wave of principal interviews and teacher surveys, seeking a set of schools that reflects diversity in terms of intervention models, school levels, student characteristics, and district approaches to turnaround. In addition, the study team will seek sets of two or three schools nested within a given district.

  • Special Topics Case Studies: The study team will select two subsets of approximately 10 schools which will be the focus of case studies on two topics of particular policy interest. These focused case studies are likely to draw on the 35 schools of the base sample that are not part of the core case studies, but may also include some core case study schools, as warranted by the selected topic.

One set of focused case studies will address issues related to English Language Learners. To understand the change process in schools with high proportions of ELLs, the study team will identify the 10 schools (from the base 60 schools) with the highest percentage of ELLs and will conduct site visits in these schools in the spring of 2011 and spring of 2012.

The second special topic will be identified through the course of the study; the purpose of this component is to enable the study to be flexible and responsive, addressing issues of policy interest or unanticipated developments that merit further exploration. For example, the contractors and IES may choose to address turnaround issues in rural schools, schools that receive particularly intensive supports, or schools that seek to leverage new technologies, or may increase the study’s focus on turnaround processes in high schools.

Sample Selection: The selection of the base sample of 60 schools will be informed by the study team’s work on the baseline data report, in which the distribution and characteristics of the SIG‑eligible and SIG‑awarded schools will be analyzed nationally. This will inform the contractors’ understanding of the distribution of SIG schools in terms of intervention model, grade level, nesting within districts, and student characteristics. Following this review, the study team will focus on a subset of approximately 10 states (later to be narrowed to six states) with SIG‑awarded schools that are likely to meet the sampling criteria detailed below. The identification of the final sample will be an iterative process, through which the team will identify potential schools, consider the features of the districts and states in which they are nested, and re‑evaluate the candidate schools.

The study team will seek as much variation as possible among observable state, district, and school characteristics which may be associated with implementation patterns and turnaround success. To the extent possible, given limitations of the universe of SIG schools, the full set of 60 schools will reflect the following variation:

  • Tier I and Tier II schools. For the purposes of this study, the study team will focus on Tier I and Tier II schools only.

  • School intervention model. Initially, the study team had anticipated drawing a purposive sample of schools with roughly equal numbers of each of the different models (with the exception of school closure)—approximately one‑third of the case study schools representing each model. However, current information on SIG‑awarded schools suggests that most are undertaking the transformation model, while far fewer are adopting the restart model.7 The intent of the study team is that the final distribution of models within this sample of 60 schools should generally reflect the distribution of models implemented in SIG‑awarded schools. However, if a given model is under‑represented in the universe of SIG schools (as may be the case with the restart model), the study team may oversample this model to ensure a sufficient sample of such schools.

  • Grade level. In seeking appropriate variation across all sampling dimensions, the contractors have concluded that the study would be hard‑pressed to include all grade levels. If the sample were to include all three school levels (elementary, middle, and high school), it would be impossible to sample on all desired variables, and analyses by key dimensions would be compromised. Thus, the study team believes the study would be better served by focusing on only two school levels, sampling 30 elementary schools and 30 high schools (excluding middle schools). These two levels merit inclusion for several reasons. High schools warrant inclusion, in part, because of the challenges they entail. For example, high schools are frequently compartmentalized into academic departments, serve students with a diverse set of post‑secondary goals, and are populated by adolescents who often face adult responsibilities (Le Floch, Boyle, Therriault, & Holzman, 2010; Harvey & Housman, 2004; Siskin, 2003). High schools also are widely perceived to be the most resistant to improvement strategies (and thus merit further inquiry) and are the focus of current policy interest. Indeed, a convergence of efforts on the part of private foundations, researchers, and advocacy groups has focused attention on high schools (Yohalem, Wilson‑Ahlstrom, Ferber, & Gaines, 2006; Hill, 2006; Hess, 2005).8 Elementary schools, on the other hand, frequently are reported to be more amenable to improvement interventions. Studies of school turnaround provide more evidence of improved outcomes at the elementary school level (Herman, Dawson, Dee, Greene, Maynard, & Redding, 2008); thus, the study team may increase the odds of finding evidence of success—and hence lessons for the field—by including elementary schools in the sample.

  • School size. Understanding that school turnaround processes are likely to differ in schools of varying enrollment levels, the case studies will include a range of small, medium, and large schools, to the extent possible. That is, the study team will seek small, medium, and large high schools, as well as small, medium, and large elementary schools.

  • Characteristics of enrolled students. The study team will structure the case studies so that they reflect the students enrolled in such schools—recognizing that chronically low‑performing schools serve disproportionate numbers of students of color. Thus, at least half of the students in the sampled schools will likely be African‑American and Hispanic, but the study team will ensure that Asian students as well as white students are represented.

At the state level, the study team will seek to ensure variation on a small set of key variables. These include:

  • Right‑to‑work vs. highly unionized states: Right‑to‑work laws are labor statutes, enforced in 22 states, that restrict union activities. Because teacher unions can be a powerful force in shaping and enforcing tenure and other requirements, such state laws have implications for education employment practices. In states that do not have right‑to‑work laws, collective bargaining agreements and political pressures from powerful teacher and administrator unions may make it more difficult to replace faculty or other staff. Given that the replacement of staff (principals and/or teachers) is a requirement of three of the four intervention models, this is a critical dimension of variation for the study.

  • Level of spending on education: Adjusting for regional variation in costs, states vary a great deal in the financial resources allocated to public education. For example, in 2006, per‑pupil expenditures ranged from over $15,000 in Vermont to almost $6,000 in Utah (Editorial Projects in Education Research Center, 2009). Clearly these differences in spending have implications for the way in which chronically low‑performing schools attempt to ameliorate practices and the potential impact of the infusion of SIG funds.

  • The concentration of SIG schools in the state: Depending both on the number of schools identified for improvement, corrective action, and restructuring under ESEA and the ways in which states opt to prioritize SIG schools, the number of SIG schools can vary greatly across states. If states choose to restrict SIG schools to a fairly low number in relation to the amount of available funds, they will be able to concentrate financial resources as well as attention and support.

Sampling states based on these criteria will enable the study team to capture adequate variation on additional important dimensions. For example, the study team seeks to sample states in a way that represents different geographic regions of the country, which will likely be accomplished by factoring in the presence of right‑to‑work laws. Other dimensions for which the study team will seek appropriate variation include the capacity of states’ data systems, Race to the Top (RTT) status, and states’ capacity to support school improvement.9 In addition, the study team intends to represent states of different sizes; toward this end, selecting states with varying numbers of SIG schools is likely to result in a sample of five or six states, both large and small. (See Appendix A for a state‑by‑state overview of variables under consideration.)

Within the selected states, the study team will identify a set of districts (approximately three to four per state) that include SIG schools. Among these districts, the study team will seek a set that includes appropriate variation on the following dimensions:

  • District size (small, medium, and large districts)

  • Urbanicity (urban, suburban, and rural districts)

  • Level of per‑pupil expenditures (high, moderate, and low expenditures within each state)

  • Variation in student characteristics, including a substantial number of English Language Learners (high, moderate, and low concentration; also ensuring language diversity)

  • District enabling conditions: The study team also will take into consideration the reform strategies adopted by the districts that include SIG schools, favoring districts that have adopted innovative reform approaches. In addition, the study team may choose districts that have characteristics that contribute to school‑level success, such as stable district leadership. In doing so, the study team may be more likely to capture a sample of SIG schools that includes some that successfully turn around. Overall, the objective will be to have two‑thirds of the case study schools in districts with favorable enabling conditions, with the remaining third of schools in districts that do not distinguish themselves in terms of enabling conditions.

Within each school, the study team will solicit the assistance of a school-level administrator to identify teachers to participate in focus groups and interviews to ensure variation in the grades and subjects taught, years of teaching experience, and years within the case study school (see Exhibit 6 on page 22 for additional information on school-level respondents).

Based on the above criteria, the study team believes that the sample of 60 SIG‑funded schools is likely to be nested in 12 districts and five states. Considering the districts with the highest numbers of SIG‑awarded schools, and seeking regional variation, the study team believes the sample is likely to include schools in some districts in the ten states listed in Exhibit 5. Of course, to ensure appropriate variation within the full set of case studies (e.g., by including rural schools), the study team cannot restrict the sample to the districts below. Once all Tier I and Tier II schools receiving a SIG are identified, the study team will assemble a preliminary set of schools based on available extant data.

Exhibit 5. Probable Key Districts for the Case Study Schools10

State: Key Districts with SIG‑Awarded Schools

Alabama: Montgomery County (5 SIG schools), Lowndes County (4 SIG schools)

California: San Bernardino (11 SIG schools), San Francisco (10 SIG schools), Santa Ana (6 SIG schools)

Colorado: Denver (9 SIG schools), Pueblo (6 SIG schools)

Iowa: Des Moines (7 SIG schools), Waterloo (2 SIG schools)

Maryland: Baltimore (7 SIG schools), Prince George’s County (4 SIG schools)

Massachusetts: Boston (10 SIG schools)

Ohio: Cleveland (12 SIG schools), Columbus (7 SIG schools), Cincinnati (6 SIG schools)

Pennsylvania: Philadelphia (27 SIG schools), Pittsburgh (7 SIG schools)

Virginia: Richmond (3 SIG schools), Sussex (2 SIG schools)

Wisconsin: Milwaukee (11 SIG schools)



When the final set of SIG‑awarded schools is determined and the list available, the study team will select 60 schools for consideration by ED. The study team also will identify a set of 10 to 15 replacement schools in the event that some of the schools are inappropriate for inclusion in the case studies (e.g., substantial turnover in student population, closure in 2009–10) or decline to participate.

  2. Procedures for Data Collection

The data collection procedures of each of the main components of the study (site visits, principal interviews, teacher surveys, and state interviews) are discussed in detail below.

Site Visits

Protocol Development

Interview and Focus Group Protocols. Two contractors, AIR and Mathematica, have developed protocols to guide the case study interviews and focus groups. These protocols include both open‑ended and closed‑ended questions. The open‑ended questions will encourage in‑depth responses and incorporate flexibility so that the study team can follow threads of conversation to their logical conclusion, while the closed‑ended questions will ensure that similar types of data are collected across the case study schools. The study team recognizes the utility of opening with broad questions to initiate discussion, followed by focused probes to elicit insights in important areas. In developing protocol questions, the study team has sought to avoid language that may be loaded, leading, or likely to yield socially desirable responses.

Interview and focus group protocols have been designed to fully explore the evaluation questions without placing undue burden on the respondents. The contractors began the protocol development process by creating a matrix of key domains and constructs, grounded in the study’s conceptual framework, that covers the range of evaluation questions guiding this study (see Appendix B). The construct matrix, which maps out the important constructs for each data collection instrument, helped ensure that the full set of interview protocols, focus group protocols, and extant data sources will generate sufficient information to address all evaluation questions. Because the study includes a special focus on strategies to address the needs of ELLs, and on the ways in which schools with a high proportion of ELLs experience the change process, the data collection instruments give particular attention to these topics.

The contractors anticipate that site visitors will adapt the protocols to the context and the particular intervention models being implemented in the schools visited. To facilitate this process, the protocols include suggested probes, topics that the interviewer should “listen for,” and notes indicating where the interviewer should adapt the protocol to the particular respondent’s circumstances. Although each question contains many possible probes, the interviewer is not required (or expected) to ask each of these. Rather, the probes are intended to ensure that the interviewer is attuned to the full range of possible topics and variables to be addressed in the responses, and, in rare cases, to prompt a less communicative interviewee.

To enrich the understanding of SIG expenditures gained from school and district fiscal files, interviews with district and school administrators also include questions on SIG expenditure decision processes and strategies. Interview questions for school administrators address topics such as how funding decisions are made, the level of budgetary discretion at the school site, and how funds are used to support key turnaround strategies (e.g., to support coaches or to purchase technology). In particular, the study team will seek detailed data on specific uses of funds, rather than limiting the focus to the broad categories of expenditures often found in district fiscal files. For example, instead of simply learning that SIG funds pay for instructional staff, the study team will determine whether that spending supported class size reduction, pay for after‑school activities, performance bonuses, or incentives for teachers of hard‑to‑staff subjects.

School Observation Templates. The study team also will collect school‑level observational data. Using a walk‑through observation guide (see Appendix F), the study team will record behaviors, tangible resources, and other evidence of constructs related to the turnaround effort. These may include the level of observed technological resources, the presence of a parents’ center and evidence of its use (or neglect), orderly transitions between classes, positive interactions between adults and students, and well‑maintained school grounds. Over time, progress on the topics recorded through the school observation may serve as leading indicators of academic turnaround.

Request for Documents and Files (RFDF). The study team has developed a Request for Documents and Files (RFDF) for districts (see Appendix E). The RFDF will ask for electronic fiscal files covering prior school years (2007–08, 2008–09, and 2009–10 requested in fall 2010, and 2011–12 requested in fall 2012), including information on revenues and expenditures from all sources (federal, state, and local), including the SIG grants. These electronic files contain line items that report total expenditures by standard accounting codes; that is, figures may be broken down by fund, function, and object of expenditure for every school and central office site in the district. These data will allow the study team to track expenditures in the most policy‑relevant categories, including personnel and non‑personnel expenditures on instruction, administration, pupil support, technology, and other areas. The study team will examine expenditures from SIG funds in particular as well as from all revenue sources. In each year of data collection, the study team also will obtain a copy of the budget for each case study school; specifically, the study team will request a copy of the document provided to principals by the district. This document will help site visitors identify how the school uses funds to support its turnaround strategies and will help the study team interpret the district data files (in the years those files are analyzed).

Prior to conducting site visits, AIR will ensure that all data collection procedures (including informed consent forms, in Appendix C of this document) are approved by its Institutional Review Board (IRB).

Training and Preparation

Training for Site Visitors. The study team will train the site visit team so that each member brings to his or her on‑site work a consistent understanding of the relevant policies, the study, and the data collection needs. Prior to the first wave of data collection, all site visit staff from AIR and Mathematica will convene in Washington, DC, for a one‑day training session. In subsequent years, a half‑day training will be conducted via videoconference. The site visit team leads from AIR and Mathematica will jointly develop and conduct the trainings so that consistent messages and training content are delivered across organizations. The one‑day training will cover many topics, including relevant ESEA provisions and ED regulations; adequate yearly progress; Title I school improvement activities under sections 1003(a) and 1003(g); the American Recovery and Reinvestment Act of 2009; Race to the Top; the evaluation questions; the site visit approach and activities; data collection on site (including a brief review of interview protocols); and coordination of site visit work.

As part of the training, the site visit task leaders will develop a site visit checklist that outlines all tasks site visitors must perform before, during, and after each visit. All site visit team members will adhere to this checklist to ensure that visits are conducted efficiently, professionally, and consistently. The site visit task leaders from AIR and Mathematica will meet regularly, supplemented by weekly email updates and periodic larger meetings of the site visit staff.

Preparation for Site Visits. Prior to each site visit, the study team will compile and review all extant background information on the school. Although some document data must be collected on site, Web‑based resources (primarily state, district, and school Web sites) include important contextual information on the history, priorities, performance, community support, and staffing of each school. Staff will review these closely before each visit and enter preliminary information into a data capture template.

Selection of School‑Level Staff for Site Visits. To facilitate site visits, the case study team will work with a coordinator at each school site who will help schedule the visits. The study team will provide guidelines for the selection of participants in interviews and focus groups, but will not randomly select teachers from a roster. In addition to the specifications below (see Exhibit 6), the study team will explicitly ask the school to select participants with a variety of perspectives on the school’s history and current change strategy. Moreover, the sample of teachers may be guided by reform activities at the school: if a school is focusing the roll‑out of improvement strategies on specific grades or subjects (reading instruction in the early grades, for example), the study team will over‑represent such teachers in the data collection activities.

Exhibit 6. Sample of School‑level Staff to Participate in Case Studies


Number of interviews per school
  • Elementary school level: Three teachers (four in larger schools) from the first, third, and last grade level in the school (fifth or sixth)
  • High school level: Four teachers, one per grade

Number of focus groups per school
  • Elementary school level: One focus group of six participants
  • High school level: Two focus groups of 4–6 participants, one composed of department chairs

Subjects taught: Interviews will be conducted with teachers in core subjects (English language arts, mathematics, science, and social studies). The study team will seek guidance from school leaders to learn whether reform efforts are focused on specific subjects, so that teachers of those subjects are interviewed and, if necessary, interviews can be prioritized. Focus groups will be conducted primarily with teachers of core subjects, but will also incorporate other subject areas such as vocational education, arts, and foreign languages. Focus groups should also include teachers of English as a Second Language and teachers with primary responsibility for special education students.

Teacher experience: The study team will seek a balance of teachers who are new to the profession and experienced teachers, as well as those who are new to the school and those who have been with the school for several years. In follow‑up visits, the study team will seek to conduct interviews and focus groups with the same teachers as in prior years, while incorporating new teachers in schools with high turnover.



Administration of the Site Visits

Two‑person site visit teams will be instructed to be flexible in determining how to organize each school visit, being mindful of district and school context and the need to collect data that answer specific evaluation questions. However, the site visits will have a standard set of features.

For the 25 core case study schools, the primary site visit activities will include approximately three days on site per school, allowing adequate time for interviews and focus groups with district officials and school administrators, teachers, school improvement staff, a union representative, parents and a community leader, and students (high school only). During these visits, the study team also will conduct school‑level observations that will provide data on school resources and climate (see Exhibit 7 for an example of a fall site visit schedule, assuming two schools in the same district). The study team intends to conduct more focused follow‑up data collection in the spring of 2012 and 2013. The purpose of the spring data collection is to enhance the depth, comprehensiveness, and validity of the case study data. In addition, interviewing key stakeholders more than once during the school year will help validate or refute preliminary findings from fall data analyses, ensuring that the reports are as robust as possible. Each spring, the study team will interview district officials, principals, teachers, and external support providers.

For the special topic case studies, the study team will conduct site visits on issues related to English Language Learners in a subset of 10 schools in the spring of 2011 and 2012; site visits related to a second special topic, to be identified by IES and the contractors, will be conducted in a second subset of 10 schools in the fall of 2012 and 2013. These focused site visits will involve approximately two days on site per school and will consist of interviews and focus groups with district officials, school administrators, teachers, and students (high school only).

Exhibit 7. Sample Case Study Schedule

DAY 1

8:00–9:00 Principal/assistant principal

9:30–10:30 School improvement team focus group

11:00–12:00 Instructional coach [dependent on school]

12:30–1:15 Lunch

1:30–2:15 Teacher interview #1

2:30–3:30 Student focus group (in high schools only)

4:30–5:30 Teacher focus group #1



DAY 2

8:30–9:15 Teacher interview #2

10:00–10:45 Teacher interview #3

11:30–12:30 District Title I Director

12:30–1:15 Lunch

1:30–2:30 Other district representative (e.g., financial officer)

3:30–4:15 District superintendent

5:30–6:30 Parent/community focus group



DAY 3

9:00–9:45 Other TA providers [dependent on school]

10:30–11:15 Union representative [dependent on state]

11:30–12:15 Lunch

12:30–1:15 External support provider

2:00–3:00 Teacher focus group #2 [in larger elem/high schools]

4:00–5:15 Teacher interview #4 [in larger elem/high schools]



Each site visit team will consist of a lead researcher and a junior researcher from either AIR or Mathematica. Having the researchers work as a team during the site visits will enhance the consistency and reliability of the data gathered. The same pair of researchers will be responsible for scheduling and conducting the visits, and both site visitors will attend all interviews when possible. However, site visitors may conduct interviews separately if this is necessary to collect information from all respondents; to preserve a complete record in such cases, each interview and focus group will be audio recorded. In preparation for each visit, site visitors will review notes from interviews with state officials, review extant data on each site, consider the reasons each jurisdiction was selected for inclusion in the sample, review other relevant documentation, and annotate each section of the individual interview protocols accordingly. These notes will then guide the wording of each question. The use of experienced interviewers, coupled with careful preparation, will ensure that interviews are neither “canned” nor overly formal. The lead site visitor for each case will have ultimate responsibility for completing and submitting the specified set of standard deliverables on the checklist for each site visit or follow‑up.

Throughout the process of data collection and reporting, the contractors will make every effort to protect the privacy of respondents participating in the site visits. The study team will not identify any interviewee by name, nor will it attribute quotes. Although the study team will identify states by name in the final case study reports, districts and schools will be identified by pseudonyms.

Collection of Documents and Files

In fall 2010 and fall 2012, prior to the site visits to the districts in which the 25 core case study schools are located, the study team will send a Request for Documents and Files (RFDF) to the Chief Business Officer of each district. For the fall 2010 data collection, the study team will include a request for expenditure files from a base year and the year prior (e.g., 2008–09 and 2009–10). The study team is requesting the 2008–09 data in order to understand any recent budget cuts districts may have been forced to make during the current recession, so that the study team can later understand the extent to which SIG funds offset those cuts or provide additional resources. For the fall 2012 data collection, the study team will request expenditure files from the prior school year only (i.e., the 2011–12 school year). The study team will request data files for the entire district.

Along with the electronic files, the study team also will request the document describing the chart of accounts used to organize fiscal information within the district. In some cases, the chart of accounts documents may be obtained from the state. The study team will construct crosswalks between the codes in each chart of accounts and the desired expenditure reporting categories, permitting comparison of spending patterns across case study schools in different states.

District expenditure files also will allow the study team to develop a comprehensive understanding of how total spending may have changed in these schools during the turnaround process. The study team will request data files that include fund, function, object, and job codes; that is, the files should show funds allocated for personnel as well as expenditures such as instructional materials, professional development, and external contracts (including EMOs), so that the study team can isolate each. In addition to the district expenditure files, the study team will request school‑level budgets for the case study schools annually, as well as consolidated applications from districts. A sketch of how such coded files might be summarized appears below.
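
To make the crosswalk idea concrete, here is a minimal sketch, assuming hypothetical function codes, category labels, and record layouts; an actual crosswalk would be derived from each district’s own chart of accounts document.

```python
# Minimal sketch of a chart-of-accounts crosswalk. Function codes, category
# labels, and the record layout are hypothetical; real district files will
# differ, and the crosswalk would be built from each district's own chart
# of accounts.

from collections import defaultdict

# Hypothetical crosswalk: district function codes -> policy-relevant categories.
FUNCTION_TO_CATEGORY = {
    "1100": "instruction",
    "2100": "pupil support",
    "2300": "administration",
    "2600": "operations",
}

# Hypothetical expenditure line items (school, fund, function, object, amount).
line_items = [
    {"school": "School X", "fund": "SIG", "function": "1100", "object": "salaries", "amount": 120000},
    {"school": "School X", "fund": "SIG", "function": "2100", "object": "contracts", "amount": 30000},
    {"school": "School X", "fund": "Title I", "function": "1100", "object": "materials", "amount": 15000},
]

def summarize(items, fund=None):
    """Total expenditures by reporting category, optionally for one fund (e.g., SIG)."""
    totals = defaultdict(float)
    for item in items:
        if fund is not None and item["fund"] != fund:
            continue
        category = FUNCTION_TO_CATEGORY.get(item["function"], "other")
        totals[category] += item["amount"]
    return dict(totals)

print(summarize(line_items))              # all revenue sources
print(summarize(line_items, fund="SIG"))  # SIG expenditures only
```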

Finally, the study team will request both district and school policy and background documents. At the district level, this may include statements of district strategies and goals as well as information on current interventions. At the school level, such extant data may include documents pertaining to the schools’ turnaround processes.

Principal Interviews

Protocol Development

The protocols for the telephone interviews with principals were developed in conjunction with all protocols for the site visits, described above, and followed the same procedures.

Administration of Interviews

Each fall, primary research staff will conduct one‑hour telephone interviews with the principals of all 60 base sample schools. The telephone interview format allows some standardization across the questions asked, but provides adequate opportunity for respondents to elaborate on their responses. A note‑taker will join each interview, and each interview will be recorded digitally. Notes will be summarized following each interview; in the event that exact quotes or verification are needed, the audio file will be available as a backup.

Teacher Survey

The teacher survey will be administered to all teachers in the 60 base sample schools in the winter of each year of the study (the longitudinal module), while teachers in the 25 core case study schools will be surveyed twice a year during the three years of data collection (the case study spring supplement). Teachers in the two sets of special topic case study schools also will be surveyed twice a year (the special topics supplement).

Protocol Development

The study team has employed a well‑tested process for developing a brief online teacher survey that involves (1) defining the content areas to be addressed by the survey in the construct matrix; (2) adapting survey items from previous research studies and drafting new items as needed; (3) creating a first draft of the survey; (4) having the survey reviewed by internal survey experts; and (5) developing a final version of the survey. After the final version of the survey has been approved, it will be passed to a Web design specialist who will create a Web‑based version. This version will be thoroughly tested by staff before it is released to teachers.

The full set of teacher surveys will provide a better understanding of the strategies schools are implementing to improve instructional practices (EQ2), human capital and school climate (EQ3), and governance (EQ4). In addition, in the case study schools, the survey will enable the study team to triangulate qualitative data on important constructs, particularly those related to the quality and depth of implementation. Because school turnaround is associated with how well schools are implementing practices—not simply what they are implementing—using multiple data sources is necessary. In particular, the study team may need to quantify qualitative data, and survey data would lend credence to indices that rely, in part, on judgment. Moreover, because a goal of this study is to provide timely information to schools, the survey will yield data that can be transmitted back to the case study schools relatively easily.
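
As an illustration of how survey responses might be quantified into such an index, consider the following sketch; the item names, the 1–5 scale, and the simple averaging rule are assumptions for the example, not the study’s actual scoring method.

```python
# Illustrative sketch of turning survey responses into a school-level index
# that could be triangulated against qualitative ratings. Item names, the
# 1-5 scale, and the simple averaging are assumptions for the example only.

from statistics import mean

# Hypothetical teacher responses (1-5 Likert) on implementation-related items.
responses = {
    "School X": [
        {"coaching_useful": 4, "pd_aligned": 5, "leadership_clear": 4},
        {"coaching_useful": 3, "pd_aligned": 4, "leadership_clear": 5},
    ],
}

def implementation_index(school_responses):
    """Average each teacher's item mean, then average across teachers (1-5)."""
    teacher_means = [mean(r.values()) for r in school_responses]
    return mean(teacher_means)

for school, resp in responses.items():
    print(f"{school}: index = {implementation_index(resp):.2f}")
```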

One of the objectives of this study is to be flexible and nimble, responding to questions of policy interest and following up on topics that emerge in the course of the case studies. For this reason, this OMB package does not include all teacher surveys to be administered throughout the study. Appendix D includes the elementary longitudinal module and the high school longitudinal module. Supplementary modules for the core case studies and special topics case studies will be developed after the first set of telephone interviews and surveys has been administered; those questionnaires will then be submitted to OMB for clearance.

Administration of Teacher Surveys

Once data collection procedures have been approved by OMB, DIR staff will contact each of the case study schools to obtain teacher lists and to select a coordinator to assist in the administration of the teacher survey and other aspects of data collection. Each case study school will be sent a package containing (1) a cover letter; (2) promotional material describing the evaluation; (3) a copy of the teacher survey for review; and (4) instructions for completing the online version of the survey. After the school‑level coordinator has had an opportunity to review the materials, the survey team will send an email to all teachers in the school that includes a unique link to the Web‑based survey. The email will include a brief introduction to the study and the purpose of the survey, and the first full page of the Web‑based survey will include all human subject notifications, in accordance with AIR’s IRB review. As explained in the following section (page 27), teachers who do not respond to the survey will be prompted by email and telephone. As a last strategy, the survey team will send postage‑paid paper‑and‑pencil surveys11 to the school, or will ask site visit teams to deliver hard‑copy surveys during their school visits. Survey data will be stored in files containing only ID numbers, without any identifying information, as a further measure of confidentiality.
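
The mechanics of unique survey links and de‑identified storage could resemble the following sketch; the URL, token scheme, and file layout are illustrative assumptions rather than a description of DIR’s actual survey system.

```python
# Illustrative sketch of issuing unique survey links and storing responses
# keyed only by study ID. The token scheme, URL, and file layout are
# assumptions for the example, not DIR's actual system.

import csv
import secrets

BASE_URL = "https://example.org/sst-teacher-survey"  # placeholder URL

def issue_links(teacher_emails):
    """Assign each teacher a study ID and a unique, unguessable survey token.

    The email-to-ID linkage would be kept in a separate, access-controlled
    file; the response file stores only the study ID.
    """
    roster = []
    for i, email in enumerate(teacher_emails, start=1):
        study_id = f"T{i:04d}"
        token = secrets.token_urlsafe(16)
        roster.append({"study_id": study_id, "email": email,
                       "link": f"{BASE_URL}?t={token}"})
    return roster

def store_response(path, study_id, answers):
    """Append a response row containing only the ID and answers (no name/email)."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([study_id, *answers])

roster = issue_links(["a@school.example", "b@school.example"])
store_response("responses.csv", roster[0]["study_id"], [4, 5, 3])
```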

State‑Level Data Collection

Protocol Development

The interview protocol for state Title I directors or school improvement directors includes questions phrased in a clear, conversational manner; is designed to generate systematic data across the states in which case study schools are located; allows respondents to provide adequate contextual information on their state’s approaches; and includes, as appropriate, questions from the 2004 and 2006 Study of State Implementation under NCLB (SSI‑NCLB) interviews to examine change over time. Key topics include state‑level decisions with regard to intervention models, the allocation of SIG funds, actions and strategies to support chronically low‑performing schools, and the integration of SIG activities with accountability, assessment, and support systems. In preparation for these interviews, AIR staff will annotate sections of the protocol to identify areas for clarification and further elaboration, thereby maximizing the productivity of each interview.

Administration of Interviews

Each fall, primary research staff will conduct one‑hour telephone interviews with the Title I director or school improvement director with primary responsibility for the SIG program in each of the states where case study schools are located. If appropriate, the study team will also interview the state accountability director or federal program director to ensure that it fully understands each state’s approach and context. The telephone interview format allows some standardization across the questions asked, but provides adequate opportunity for respondents to elaborate on their responses. A note‑taker will join each interview, and each interview will be recorded digitally. Notes will be summarized following each interview; in the event that exact quotes or verification are needed, the audio file will be available as a backup.

Document Collection

At the state level, as at the district and school levels, the study team will collect documents related both to turnaround processes and to fiscal resource allocation. Many state policy documents are available on SEA Web sites, including descriptions of interventions and supports, evaluations commissioned by the SEA, data analysis guidelines, and planning documents.

Also during the state interview process, the study team will request each state’s consolidated applications for federal funds for each district with SIG case study schools. These applications outline the planned use of SIG funds at the state, district, and school levels and will enable the study team to determine how much SIG money (and other federal funding) will be used at the district level and how much is budgeted for each school.

  3. Methods to Maximize Response Rate

Data collection requires careful planning. The research team has developed interview protocols, focus group protocols, and questionnaires that are appropriately tailored to each respondent group and designed to place as little burden on respondents as possible. The team has also piloted the core data collection instruments to ensure they are user‑friendly and easily understandable, all of which increases participants’ willingness to take part in the data collection activities and thus increases response rates.

In addition to careful instrument design, a high response rate among the 60 case study sites will be promoted through carefully crafted recruitment materials. These materials will emphasize the social incentive for respondents by stressing the importance of the data collections as part of a high‑profile study that will provide much‑needed information to districts and schools. AIR’s experience in past evaluations has demonstrated the importance and value of building a consensus of support with participating districts, which leads to districts and schools that have the capacity, willingness, and commitment to cooperate fully with the research and data collection responsibilities. Investing in site development at the front end reduces problems at the back end, helping to ensure smooth implementation of the evaluation.

To further ensure a high response rate on the teacher survey, the study team will implement the following strategies (sketched below). After mailing the packages to schools, DIR staff will monitor and prepare a weekly log of all responses received. Approximately two weeks after the initial notification, DIR staff will begin survey follow‑up, which will include email reminders to all non‑respondents as well as targeted telephone calls and mailings to schools about specific non‑respondents. If teachers do not respond to the online survey after email and telephone prompts, they will be asked to complete a paper‑and‑pencil version of the survey.12 In addition, a small incentive will be provided to respondents in return for participating in the survey. For each wave of data collection, teachers who complete the survey will be mailed a thank‑you letter and a $10 gift card as a gesture of appreciation for their time and effort and as a means to create a positive association with the survey and with the research project as a whole.
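
A minimal sketch of the weekly monitoring logic described above follows; the two‑week threshold and escalation steps come from the narrative, while the roster fields and data structures themselves are hypothetical.

```python
# Minimal sketch of weekly response-rate monitoring and follow-up flagging.
# The two-week threshold mirrors the narrative above; roster fields and
# dates are hypothetical.

from datetime import date, timedelta

teachers = [
    {"id": "T0001", "school": "School X", "responded": True,  "notified": date(2011, 1, 10)},
    {"id": "T0002", "school": "School X", "responded": False, "notified": date(2011, 1, 10)},
    {"id": "T0003", "school": "School Y", "responded": False, "notified": date(2011, 1, 24)},
]

def weekly_log(roster, today):
    """Overall response rate plus non-respondents past the two-week mark."""
    rate = sum(t["responded"] for t in roster) / len(roster)
    overdue = [t for t in roster
               if not t["responded"] and today - t["notified"] >= timedelta(weeks=2)]
    return rate, overdue

rate, overdue = weekly_log(teachers, date(2011, 2, 1))
print(f"Response rate: {rate:.0%}")
for t in overdue:
    # Next steps per the plan above: email reminder, then a targeted phone
    # call, then a paper-and-pencil survey as a last resort.
    print(f"Follow up with {t['id']} at {t['school']}")
```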

Due to these practices, and given that states and subgrantees under this grant program are required to provide data for this and other evaluation purposes, the study team anticipates a 100 percent response rate for the case study interviews and focus groups and an 80 percent or greater response rate for the teacher survey.

  4. Expert Review and Piloting Procedures

Following publication of this OMB submission for public comment, the study team conducted piloting interviews with district officials, principals, and teachers in three states. Study team members asked these educators and administrators to react to (1) the overall organization, flow, and length of the interview, (2) the clarity of the interview wording and language, (3) specific questions that were unclear or difficult to answer, and (4) any recommended changes. The study team then used the responses from these pilots to modify the protocols to make them as suitable as possible for the actual site visits.

  5. Individuals and Organizations Involved in Project

AIR is the contractor for the Study of School Turnaround and, in collaboration with Mathematica Policy Research, Decision Information Resources, Education Northwest, and the Institute of Education Sciences, will carry out the study activities. Drs. Beatrice Birman, Jennifer O’Day, and Brian Gill (Mathematica) serve as co‑principal investigators, and Dr. Kerstin Carlson Le Floch is the project director. Dr. Margaret Goertz serves as senior consultant to the study. Case study site visits will be led by Dr. Susan Cole and fiscal data collection by Ms. Karen Manship, with support and guidance from Dr. Jay Chambers. Mr. Steve Fleischman (Education Northwest) leads communications and dissemination.

During data collection and particularly during the initial phase of analysis, the contractors will draw on the cross‑staffing of some key members of the study, including the project director, co‑principal investigators, and team leaders, to combine findings across these data sources to create the synergies that are at the core of the mixed‑methods design. Contact information for the individuals and organizations involved in the study is presented in Exhibit 8.

Exhibit 8. Organizations and Individuals Involved in the Project

Responsibility: Contact Name (Organization), Telephone Number

Co‑Principal Investigator: Dr. Beatrice Birman (AIR), 202 403–5318

Co‑Principal Investigator: Dr. Jennifer O’Day (AIR), 650 843–8166

Co‑Principal Investigator: Dr. Brian Gill (Mathematica Policy Research), 617 301–8962

Project Director: Dr. Kerstin Carlson Le Floch (AIR), 202 403–5649

Senior Consultant: Dr. Margaret Goertz (Consultant), 609 737–2464

Site Visit Task Leader: Dr. Susan Cole (AIR), 650 843–8187

Fiscal Data Collection Task Leader: Dr. Karen Manship (AIR), 650 843–8198

Senior Advisor, Fiscal Data Collection: Dr. Jay Chambers (AIR), 650 843–8111

Communication and Dissemination Task Leader: Mr. Steve Fleischman (Education Northwest), 503 275–9507

Research Scientist: Thomas Wei (Institute of Education Sciences), 202 208–0452

Education Research Analyst: Audrey Pendleton (Institute of Education Sciences), 202 208–7078

References

Armstrong, D., Gosling, A., Weinman, J., and Marteau, T. (1997). The place of inter‑rater reliability in qualitative research: An empirical study. Sociology, 31(3), 597–606.

Editorial Projects in Education Research Center. (2009). Education Week school finance data. Available at http://www.edweek.org/rc/articles/2009/01/21/sow0121.h27.html

Harvey, J., and Housman, N. (2004). Crisis or possibility: Conversations about the American high school. Washington, DC: National High School Alliance.

Herman, R., Dawson, P., Dee, T., Greene, J., Maynard, R., and Redding, S. (2008). Turning around chronically low‑performing schools: IES practice guide. Washington, DC: U.S. Department of Education, Institute of Education Sciences.

Hess, F. (2005). Inside the gift horse’s mouth: Philanthropy and school reform. Phi Delta Kappan, 87(2), 131–137.

Hill, P. (2006). A foundation goes to school: Bill and Melinda Gates shift from computers in libraries to reform in high schools. Education Next, 6(1), 44–51.

Le Floch, K.C., Boyle, A., Therriault, S., and Holzman, B. (2010). State efforts to support and improve high schools. AIR Research Brief. Washington, DC: American Institutes for Research.

School Improvement Grants—American Recovery and Reinvestment Act of 2009; Title I of the Elementary and Secondary Education Act of 1965, 75 Fed. Reg. 3375–3383 (2010).

Siskin, L. (2003). When an irresistible force meets an immovable object: Core lessons about high schools and accountability. In M. Carnoy, R. Elmore, and L. Siskin (Eds.), The new accountability: High schools and high‑stakes testing (pp. 175–194). New York: Routledge Falmer.

Stake, R. (2000). Case studies. In N. Denzin and Y. Lincoln (Eds.), Handbook of qualitative research (2nd ed.). Thousand Oaks, CA: Sage Publications.

Yin, R.K. (1992). Evaluation: A singular craft. Paper presented at the annual meeting of the American Evaluation Association, Seattle, WA.

Yin, R.K. (2003). Case study research: Design and methods (3rd ed.). Newbury Park, CA: Sage Publications.

Yohalem, N., Wilson‑Ahlstrom, A., Ferber, T., and Gaines, E. (2006). Supporting older youth: What’s policy got to do with it? New Directions for Youth Development, 111, 117–129.



1 The contractors for this study are the American Institutes for Research (AIR), Mathematica Policy Research, Decision Information Resources (DIR), and Education Northwest.

2 States have the option of identifying Title I eligible elementary schools that (1) are no higher achieving than the highest‑achieving school identified as a persistently lowest‑achieving school in Tier I; and that (2) have not made AYP for at least two consecutive years; or are in the state’s lowest quintile based on proficiency rates.

3 States may also identify as Tier II schools Title I eligible secondary schools that (1) are no higher achieving than the highest‑achieving school identified as a persistently lowest‑achieving school in Tier II; or that have a graduation rate of less than 60 percent over a number of years; and that (2) have not made AYP for at least two consecutive years; or are in the state’s lowest quintile based on proficiency rates.

4 States have the option of identifying as Tier III schools (1) Title I eligible schools that do not meet the requirements to be in Tier I or Tier II; and (2) have not made AYP for at least two consecutive years; or are in the state’s lowest quintile based on proficiency rates.

5 Please see page 10 for a preliminary discussion of the analytic processes for this study.

6 With the exception of the state interview protocol (which will be administered to fewer than nine respondents), all instruments require clearance.

7 Note that at the time of this submission, only 42 states had disbursed SIG funds to districts and schools; thus, the complete universe of SIG schools was not yet available.

8 Initiatives such as the Bill and Melinda Gates Foundation’s Small Schools Initiative and Early College High School Initiative and Achieve’s American Diploma Project, the Center for Research on the Education of Students Placed At Risk at Johns Hopkins University, and the National Governors Association’s Honor States are examples of national efforts that focus on improving high schools through private foundations, researchers, and advocacy groups.

9 The SEA capacity to support school improvement may be gauged through state SIG applications and some extant data sources, including data collected through the Study of State Implementation under NCLB (SSI‑NCLB).

10 This table includes only districts from states that had awarded SIG subgrants at the time of submission. Thus, some states that may ultimately be included in the sample are omitted.

11 The paper-and-pencil survey is included in this clearance package; the Web-based version will include identical content.

12 The paper-and-pencil survey is included in this clearance package; the Web-based version will include identical content.
