Final Agreement Form (School MOU)
School Roles and Responsibilities:
Early Warning and Intervention Monitoring System Study
Dear ,
On behalf of American Institutes for Research (AIR) (www.air.org) and its partners, we welcome
you to the Early Warning and Intervention Monitoring System (EWIMS) study. We are excited
about this project. Its purpose is to examine the impact of EWIMS on (1) student outcomes
including student risk status for dropout, scores on state assessments, persistence and progress in school, and likelihood of on-time graduation; and (2) school outcomes including how schools
allocate dropout prevention interventions for students and their data-use culture. This document
contains an overview of the study and a brief description of the intervention, followed by a
description of the roles and responsibilities for your school and for the study team, including the
benefits of participation, and the project timeline. Please review the contents of this document
and sign the last page to indicate your agreement to participate. Return the signed last page to Dr.
Nicholas Sorensen (nsorensen@air.org, fax 312-288-7601).
Overview
The primary goal of the EWIMS study is to evaluate whether implementing an early warning and
intervention monitoring system for identifying students at-risk of dropping out and using this
system to assign students to dropout prevention interventions will improve student outcomes
including student risk status for dropout, scores on graduation tests, persistence and progress in
school and likelihood of on-time graduation. In addition, this study will also examine the impact
of implementing EWIMS on school outcomes including how schools allocate dropout prevention
resources for students and their data-use culture. Study results will yield valuable information for
the state of and for districts and schools across the country about the viability and
benefits of using an early warning system to prevent high school dropout and help struggling
students get back on track for eventual graduation. The study is being funded by the U.S.
Department of Education, Institute of Education Sciences, and will be conducted from March
2014 through the spring of 2016.
We look forward to working with as a partner in this project!
The EWIMS Model
The EWIMS model, developed by the National High School Center at AIR, is a multistep
process intended to encourage systematic and comprehensive implementation within schools.
The process is based on a combination of research on data use in schools and the National High School Center's experience working with states, districts, and schools implementing early warning systems.
At the heart of the EWIMS process is an early warning data tool used to flag students as “at risk”
based on attendance, course performance (grades, credits, grade point average [GPA]), and
behavior indicators. The tool enables schools to identify students who are at risk of dropping out
of school, record assignments to available interventions, and monitor students’ response to those
interventions.
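For illustration only, the following sketch (in Python) shows the general shape of indicator-based flagging. The thresholds, field names, and student record used here are hypothetical placeholders invented for this example; they are not the EWS tool's actual indicators or cut points, which are configured within the tool itself.

```python
# Hypothetical sketch of indicator-based flagging. The thresholds and field
# names below are illustrative assumptions, not the EWS tool's actual rules.
from dataclasses import dataclass


@dataclass
class StudentRecord:
    student_id: str
    pct_days_absent: float     # share of instructional days missed so far
    core_course_failures: int  # failing grades in core courses this term
    gpa: float
    suspensions: int


def flag_indicators(student):
    """Return the early warning indicators this student trips."""
    flags = []
    if student.pct_days_absent >= 0.10:    # placeholder attendance cut point
        flags.append("attendance")
    if student.core_course_failures >= 1:  # placeholder course-performance cut point
        flags.append("course failure")
    if student.gpa < 2.0:                  # placeholder GPA cut point
        flags.append("GPA")
    if student.suspensions >= 1:           # behavior data are optional in the tool
        flags.append("behavior")
    return flags


# Example: a student missing 12 percent of days and failing one core course
# would be flagged on the attendance and course-performance indicators.
print(flag_indicators(StudentRecord("S001", 0.12, 1, 2.4, 0)))
```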
Beyond the development of the data tool, the National High School Center has devised a seven-step EWIMS implementation process. The process guides users to make informed decisions about how to use data to support at-risk students and how to continue
to monitor their progress over time. In addition to focusing on individual students, the process
guides users to examine the success of specific supports or interventions and to examine possible
systemic issues (e.g., school climate) that may relate to dropout trends.
Figure A-1. Early Warning Intervention Monitoring System Implementation Process
[Figure A-1 depicts the seven implementation steps as a cycle: Step 1—Establish Roles & Process; Step 2—Use the EWS tool; Step 3—Analyze EWS data; Step 4—Interpret EWS data; Step 5—Assign & Provide Interventions; Step 6—Monitor Students; Step 7—Evaluate & Refine EWIMS.]
As shown in Figure A-1, the steps are intended to be cyclical. At the core of this data-driven
decision-making process, the steps focus users on key indicators that identify which students are
showing signs of risk of dropping out of high school and guide users to go beyond the indicator data, drawing on other relevant information, to connect at-risk students to dropout prevention or academic
support interventions. The EWIMS model does not prescribe specific interventions for
schools to implement. Instead, the model is designed to allow schools flexibility to decide
which interventions they believe are most effective for their students’ needs. Ideally, the EWIMS
model allows users to identify students with accuracy and provide supports and interventions of your school's choosing to at-risk students, resulting in improved outcomes for students,
including higher attendance rates and improvement in academic performance leading toward
successful graduation.
This Study
Despite increasingly widespread implementation of early warning systems by states, districts,
and schools, there have been no rigorous studies testing the impact of using an early warning
system to improve student outcomes such as staying in school, progressing in school, and
graduating. There is also little research examining how using an early warning system can shape
a school’s culture for data use—increased data-driven decision making (assigning interventions
to students) and processes and professional development to support using data to improve
teaching and learning. This study will address these gaps and provide the first rigorous test of the
impact of an early warning system. The study will:
Identify a sample of eligible and interested schools in . The study team
will conduct outreach to schools that meet initial eligibility criteria to confirm eligibility
and discuss interest in participating in the EWIMS study. The study will include
approximately 70 high schools in the Midwest. To qualify, schools must (1) have at least 150 ninth-grade students; (2) have a graduation rate between 25 and 95 percent; and (3) not already be implementing an early warning system tool for using data to flag at-risk students.
Use a lottery to randomly assign half of the participating schools to implement
EWIMS in March of 2014 and the other half to implement in fall of 2015. Half of
participating schools will be randomly assigned to receive access to the EWIMS model in
March 2014 (including the tool and technical support for implementation) and the other
half of participating schools will be randomly assigned to conduct “business as usual” for
identifying at-risk students until the fall of 2015 when they will receive the same
resources and supports.
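As a rough, hypothetical illustration of such a lottery (not the study team's actual assignment procedure, which may involve additional steps such as stratification), a simple random split of participating schools might look like the following sketch; the school labels and the fixed random seed are invented for the example.

```python
# Illustrative sketch of a simple lottery: half of the participating schools
# implement EWIMS in March 2014, and half serve as "business as usual" control
# schools until fall 2015. This is not the study team's actual procedure;
# school labels and the fixed seed are hypothetical.
import random

schools = [f"School {i:02d}" for i in range(1, 71)]  # approximately 70 schools

rng = random.Random(2014)   # fixed seed so the example is reproducible
shuffled = list(schools)
rng.shuffle(shuffled)

half = len(shuffled) // 2
treatment = sorted(shuffled[:half])  # implement EWIMS beginning March 2014
control = sorted(shuffled[half:])    # implement EWIMS beginning fall 2015

print(f"{len(treatment)} treatment schools, {len(control)} control schools")
```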
Implement EWIMS. All schools will implement the seven-step EWIMS process either
in March 2014 or the fall of 2015 (depending on random assignment by lottery). The
seven steps for implementation are as follows:
o Step 1—Establish Roles and Process. In this step, the composition of the EWIMS team is established; team members then determine the frequency and duration of meetings and develop a shared vision or focus of the team's work. (The goal is not to create yet another "team" with functions that may or may not overlap with functions of other already existing teams; rather, integration with existing team structures, if functionally operational, is optimal.)
o Step 2—Use the EWS tool. In this step, the school-based EWIMS teams are
trained on the use and purpose of the tool itself. This step also includes initial customization of the tool settings and importing of student demographic and administrative data, as well as ongoing refreshing of the administrative data in the tool and running of the automated and custom lists and reports available within the tool.
o Step 3—Analyze EWS data. In Step 3, EWIMS teams focus their attention on
student- and school-level data, based on the indicators available in the tool. This
data review process is intended to identify areas of focus and further
investigation.
o Step 4—Interpret EWS data. Step 4 guides teams to bring in additional data
(external to the tool) to provide more context and a fuller picture to inform the
EWIMS team’s consideration of specific needs of individuals or groups of
flagged students. Unlike Step 3, which is focused on the indicator flags themselves (i.e., the data in the tool), this step addresses root causes of why students might be identified as at risk for one or more indicators. The
implementation team will provide training on identifying root causes, focused on acquiring additional formal (e.g., administrative records) and informal (e.g., from teachers, families, and students) input. This training will occur face to face and will
last for one hour.
o Step 5—Assign and Provide Interventions. In this step, EWIMS team members
make informed decisions about the allocation of available resources and strategies
to support students identified as at risk of dropping out of high school. The
EWIMS team matches individual students to specific interventions after having
gathered information about (1) potential root causes for individual flagged
students (Step 4) and (2) the available dropout prevention and academic and
behavioral support programs in the school, district, and community, which are
locally determined.
o Step 6—Monitor Students. In this step, EWIMS teams continue to examine
student indicators at regular intervals to continually identify students who show
signs of being at risk. The teams will use the same indicators to closely monitor
already-identified students who were assigned to interventions for progress in
school and risk status. This step provides critical ongoing feedback about
additional student- and school-level needs and apparent successes.
o Step 7—Evaluate and Refine the EWIMS. Through active and structured
reflection, EWIMS team members assess whether students are responding to
assigned interventions, revise their specific strategies or general approach as
needed, and determine how resources are allocated to improve support for
students. This step encourages EWIMS teams to make course corrections to all
parts of the EWIMS implementation. As implied by the cyclical depiction of the
seven-step process, this step (as well as the other six) reflects an ongoing process
of continual improvement.
Evaluate the effects of EWIMS on student and school outcomes. The study will
examine student outcomes for all students in grades 9 and 10 during the 2013–14 school
year and all students in grades 9 through 11 during the 2014–15 school year. All student
outcome data will be collected from school or district administrative data, the EWS tool
or the . In addition, all participating schools will be
asked to complete an annual Web-based survey about data use practices and schools
randomly assigned to implement EWIMS in March 2014 may be asked to participate in
interviews about their experiences using the tool.
School Roles and Responsibilities
We look forward to partnering with for this exciting project! More
detailed information on the responsibilities of participating schools follows. Final determination
of school eligibility requires willingness to adhere to the study guidelines and responsibilities.
Please note that your school’s participation in this project is voluntary.
will not be penalized in any way for not participating, and you may discontinue participation at any time without penalty.
Maintain a sustained commitment to participate in the study. To evaluate the impact
of EWIMS, it is critical that schools agree to adhere to the study guidelines and timelines
associated with random assignment by lottery to implement EWIMS in March 2014 or
fall of 2015. All schools are expected to implement the EWIMS model as designed
(including all seven steps of the implementation cycle).
Adhere to the results of the lottery/random assignment process. It is essential that the
groupings that result from the lottery remain intact over the course of the study. Schools
assigned by lottery to implement the EWIMS model in March 2014 will serve as the
treatment group. Their counterparts assigned to implement EWIMS in fall of 2015 will
serve as a control group from March 2014 through the spring of 2015. Schools randomly
assigned to the control group must continue with “business as usual” practices for
identifying at-risk students and assigning dropout prevention interventions until EWIMS
implementation begins in the fall of 2015. Students in control schools should continue to
receive any services that would be offered to them in the absence of the study. As for
students in the “treatment” group, no “typical” services should be withheld. Please refer
any questions or concerns from parents or school staff about this to the study team.
Participate in all data collection activities. Participating schools should provide school-level information, including high school graduation rates, average state achievement scores in reading and mathematics, and demographics (e.g., percentage of students receiving free or reduced-price lunch).
Administrative student-level data collection in participating schools will focus on all
students in grades 9 and 10 during the 2013–14 school year and all students in grades 9,
10 and 11 during the 2014–15 school year. The study team will obtain as much
administrative data as possible from the and the
school district. However, all participating schools should provide the study team with access to the following administrative data for students should these data be unavailable through other sources (an illustrative sketch of one possible file layout follows the list):
o Demographic information (e.g., race/ethnicity, gender, free or reduced-price lunch
[FRPL], individualized education program [IEP], and English language learner
[ELL] status, and parents’ education)
o Grade point average (GPA)
o State test scores
o Attendance rates
o Course grades in core academic courses by semester
o Credits earned by semester
o Disciplinary information (e.g., suspensions)
o Enrollment information (e.g., whether students are enrolled or have left school for
reasons other than transfer to another district, including dropping out)
o Grade promotion
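For illustration, one possible layout for a student-level extract covering elements like those above is sketched below. The column names, coding, and example row are assumptions made for this sketch; the actual file format, coding scheme, and secure transfer method would be agreed on with the study team.

```python
# Illustrative layout for a student-level administrative data extract.
# Column names, coding, and the single example row are hypothetical; the
# actual layout and secure transfer method would be agreed on with the
# study team.
import csv

COLUMNS = [
    "study_student_id", "grade_level", "race_ethnicity", "gender",
    "frpl", "iep", "ell", "parent_education", "gpa",
    "state_test_math", "state_test_reading", "attendance_rate",
    "credits_earned_sem1", "credits_earned_sem2", "suspensions",
    "enrollment_status", "promoted_on_time",
]

example_row = {
    "study_student_id": "S001", "grade_level": 9, "race_ethnicity": "Hispanic",
    "gender": "F", "frpl": 1, "iep": 0, "ell": 0,
    "parent_education": "high school diploma", "gpa": 2.4,
    "state_test_math": 412, "state_test_reading": 398,
    "attendance_rate": 0.88, "credits_earned_sem1": 3.0,
    "credits_earned_sem2": 2.5, "suspensions": 0,
    "enrollment_status": "enrolled", "promoted_on_time": 1,
}

with open("student_extract_example.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=COLUMNS)
    writer.writeheader()
    writer.writerow(example_row)
```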
In addition to administrative records, one administrator at each school should complete an
annual Web-based survey assessing how schools use data to allocate dropout prevention
resources to students.
Finally, all schools assigned by lottery to the treatment group (implementing EWIMS in
March 2014) should participate in data collection efforts focused on implementation.
Specifically, the project team will collect data on attendance and satisfaction with
EWIMS training sessions and meetings. EWIMS teams at schools assigned to the
treatment group will be required to submit their EWS tool securely to the study team and
may be asked to participate in interviews about their implementation experience.
EWIMS implementation (either in March 2014 or fall of 2015) will require the following:
Develop an EWIMS team within your school. A diverse, well-informed EWIMS team
within your school is essential to the success of this process. The EWIMS team may be
established as a new team or may build on or be integrated into existing teams (school
improvement team, response to intervention team, student support team). It is not
necessary to create an entirely new team for EWIMS work, but an existing team that
takes on the responsibility to use the tool for dropout prevention efforts should include a
broad representation of staff within the school (e.g., principals, representatives from
feeder elementary/middle schools, guidance counselors, teachers, specialists). The
EWIMS team is responsible for identifying students who are at risk and ensuring that
their individual needs are met through school-based interventions. In most cases, this
team is not directly responsible for applying interventions for students, but its focus should be on helping students navigate the school systems to access appropriate and
needed services.
Participate in EWIMS professional development. The EWIMS team will receive
professional development on the EWIMS process and tool capabilities, and subsequently
be given adequate time to implement the EWIMS process. The professional development
activities include the following:
o One two-hour training on how to use the tool (e.g., uploading data) for the
individual who will manage data entry (also a member of the EWIMS data team).
The project team anticipates that these training sessions will be held on site at
each participating school.
o Full-day in-person regional training (estimated to be no more than 100 miles from any participating school) on the seven-step EWIMS process and model. The project will cover mileage for up to five building faculty/staff and potentially $500 for substitute teachers if the school elects for one or two teachers to join the EWIMS data team.
o Two two-hour webinars:
  - Reviewing data and monitoring progress over time (all five team members)
  - Evaluating and refining the EWIMS process (all five team members)
o Monthly one-hour conference calls for a community of practice of all
participating schools (minimum one person per team must participate)
Import student data into the EWS tool. A member of the EWIMS team is responsible for entering, or importing, data into the EWS tool, which the EWIMS team then uses to initially flag students as at risk. Participating schools should upload attendance data at the 20- or 30-day mark and after every grading period. Course performance, GPA, and behavioral data (optional) should be uploaded after every marking period.
Produce reports of at-risk students and assign students to appropriate interventions
or services. The tool houses information about interventions assigned to each student and
documents students’ transition in and out of each intervention and their ultimate response
to the intervention(s) (i.e., for each student flagged, did the assigned intervention(s) have
a positive influence on the number of flags as calculated during subsequent grading
periods?). The EWS tool does not prescribe specific interventions for students based on
the type or number of indicators, but rather relies on EWIMS teams to make data-driven
decisions within their own local context of potential interventions available, to match
students with interventions.
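As a hypothetical illustration of the kind of check described above (whether a flagged student's number of flags declines in the grading periods after an intervention is assigned), consider the following sketch; the flag counts, grading-period labels, and simple decision rule are invented for the example, and the EWS tool records and reports this information itself.

```python
# Hypothetical sketch of monitoring a student's response to an assigned
# intervention by comparing indicator-flag counts across grading periods.
# The counts, period labels, and decision rule are invented for this
# example; the EWS tool performs its own calculations and reporting.
flag_history = {
    "Q1": 3,  # student flagged; assigned to an attendance intervention after Q1
    "Q2": 2,
    "Q3": 1,
}
assigned_after = "Q1"

periods = list(flag_history)
before = flag_history[assigned_after]
after = [flag_history[p] for p in periods[periods.index(assigned_after) + 1:]]

if after and all(count < before for count in after):
    print("Flags declined after assignment: apparent positive response.")
else:
    print("No consistent decline: revisit the intervention match (Steps 5-7).")
```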
Conduct EWIMS monthly team meetings that are organized and documented. An
agenda for each meeting should be prepared at the end of the prior meeting, and at least
some agenda items should be routine, such as a review of the data from the tool, actions
taken for individual or groups of students, a review of previous meetings’ action items
(ongoing or completed), new action items, and communication with staff and leadership.
Notes should be taken at each meeting and include action items assigned to specified
individuals to accomplish. Agendas, meeting notes, and a faculty/staff sign-in sheet should
be kept on file to provide a record of the team’s work.
Communicate with individuals and groups outside of the EWIMS team. Information
on flagged students, intervention effectiveness, and team-identified needs to support
students should be routinely reported to and discussed with school and district leadership.
Teachers should receive regular updates about students in their classes who are
displaying indicators of risk, as well as input about supports available to them to use with
these students. Last, students and their parents should be engaged in the conversation
about their risk status and the plans to ensure that they are able to get back on track for
graduation. Although the EWIMS team may not be directly responsible for meetings with
individual students and their parents (i.e., delivering the individual interventions), the
team should be in a position to prompt such meetings or to share information routinely
about student progress and the early warning signs of risk. Of critical note, the team
should share the knowledge of students’ risk with sensitivity, ensuring that identification
is used to prompt action and support, not to assign labels that carry stigma.
Solicit feedback from stakeholders. Feedback from administrators, teachers, staff,
students, and parents can help the EWIMS team uncover underlying causes for students
displaying indicators of risk. This information may help the EWIMS team match students
to appropriate interventions and supports.
Monitor progress. The EWIMS team should monitor progress as it strives to improve
educational outcomes for students during a single school year and over the course of
multiple school years. The team should be responsible for presenting progress reports to
key stakeholders, including principals, staff, district leadership, the local board of
education, and parents.
Study Team Role and Responsibilities
The project team is composed of researchers from the Midwest Regional Educational Laboratory
at AIR. The major responsibilities for the study team are as follows:
Obtain necessary approvals from review boards (federal, district, and organizational Institutional Review Boards) and comply with the research protocols in place.
Provide access to the EWS tool and training and technical support for implementation.
Collect data for the study. The majority of the data for this study will be administrative
records transmitted from the district, thus minimizing the data collection burden on
participating schools. The study team also will conduct an annual Web-based survey of
all schools and collect all implementation data from schools assigned to the treatment
group (EWS tool data, interviews with EWIMS team members).
Assure confidentiality. The study team will collect data only for the purposes of this
study and will not use or allow the use of the data for evaluating individual participants,
schools, or districts.
o Each participant will be assigned a study-specific identification number in place of their name. A data file that links each participant with their identification number will be kept in a password-protected file that only the study team can access (a hedged sketch of this kind of ID assignment follows this list).
o The published analysis of the results will aggregate results across all schools and will not include results that have been disaggregated by school or district.
o All members of the study team are required to complete a comprehensive training course that addresses current federal government standards and sign federal data confidentiality agreements.
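As noted above, a hedged sketch of this kind of study-specific ID assignment follows; the file names, fields, and example participants are invented for illustration and do not represent the study team's actual systems or security procedures.

```python
# Illustrative sketch of assigning study-specific identification numbers and
# keeping the name-to-ID crosswalk in a separate, restricted file. The file
# names, fields, and example participants are invented; the study team's
# actual procedures (e.g., password protection) are not shown here.
import csv

participants = ["Jane Doe", "John Smith"]  # hypothetical example names

# Assign each participant a study-specific ID (sequential here for simplicity).
crosswalk = {name: f"P{i:04d}" for i, name in enumerate(participants, start=1)}

# The crosswalk file is stored separately and accessible only to the study team.
with open("crosswalk_restricted.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["name", "study_id"])
    writer.writerows(crosswalk.items())

# Analysis files carry only the study-specific ID, never the participant's name.
with open("analysis_file.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["study_id", "survey_response"])
    for name in participants:
        writer.writerow([crosswalk[name], "..."])
```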
Analyze data and produce reports. The study team will be responsible for aggregating
information about the effectiveness of EWIMS on student and school outcomes. The
study team expects that the final report will be released in 2016, pending the federal
review process, and will ensure that participating schools receive this report.
Timeline
Table A-1 presents the major tasks of the project as they were described previously.
Table A-1. Major Tasks of the Project

EWIMS Implementation
  Treatment schools implement EWIMS with Grades 9 and 10: March 2014–June 2014
  Treatment schools implement EWIMS with Grades 9, 10, and 11: August 2014–June 2015
  Control schools implement EWIMS: August 2015–June 2016

Data Collection
  Collect administrative records from the state and district: March 2014–June 2015
  Collect EWS tool data from treatment schools: March 2014–June 2015
  Conduct annual Web-based survey: May 2014 and May 2015
  Conduct interviews with EWIMS team members in treatment schools: June 2014 and June 2015

Analysis and Reporting
  Draft and submit final report: December 2015
EWIMS implementation is aligned with the academic calendar. The school-based EWIMS teams
meet monthly, with other critical activities occurring prior to school beginning, after the first 20
or 30 days of school, shortly after the end of each grading period, and at the end of the academic
year. Table A-2 details expected key activities of EWIMS implementation over the course of an
academic year.
Table A-2. Schedule and Key Activities for Early Warning Intervention and Monitoring System Implementation
(Key activities are aligned to the Early Warning Intervention and Monitoring System [EWIMS] implementation steps.)

Year 1 Activities

March/April 2014
  Forming/designating an EWIMS team (Step 1)
  Setting up the early warning system (EWS) tool (Step 2)
  Begin convening monthly EWIMS team meetings (Step 1)
  Importing or entering students' absences, course failures, and behavior information (e.g., referrals and suspensions, by grading period) (Step 2)
  Reviewing and interpreting student- and school-level reports (Steps 3 and 4)
  Identifying and implementing student interventions (Step 5)
  Monitoring students' responses to existing interventions in which they are participating (Step 6)
  Revising students' intervention assignments, as needed (Steps 5 and 6)

At the end of the school year (~June 2014)
  Updating student roster to reflect new enrollees, transfers in and out, and so forth (Step 2)
  Importing or entering students' absences, course failures, and behavior information (e.g., referrals and suspensions), by grading period, if applicable (Step 2)
  Reviewing and interpreting student- and school-level data (Steps 3 and 4)
  Identifying and implementing new student interventions (Step 5)
  Monitoring students' responses to existing interventions in which they are participating (Step 6)
  Revising students' intervention assignments for summer and for the next academic year, if needed (Steps 5 and 6)
  Evaluating the EWIMS process, using student- and school-level reports, and revising as necessary (Step 7)
  Exporting student data to (1) prepare the EWS tool for the next school year and/or (2) share data with students' high school(s) for those students who are transitioning to high school

Year 2 Activities

At the beginning of the school year (~August 2014)
  Reconvening the EWIMS team meetings (Step 1)
  Importing or entering student information and, if available, incoming risk indicator data into the EWS tool (Step 2)
  Reviewing and interpreting student needs based on data from the previous year (e.g., review the Overage Student Report) (Steps 3 and 4)
  Verifying student information, especially enrollment status, and updating student roster to reflect new enrollees, transfers in and out, and so forth (Step 2)
  Reviewing incoming risk indicators or previous year data, including any additional information (e.g., bridge program participation, summer school participation, prior course performance), to review and interpret student needs (Steps 3 and 4)
  Identifying and implementing student interventions or supports based on incoming risk indicator information, if available (Step 5)

After the first 20 or 30 days of the school year (~October 2014)
  Updating student roster to reflect new enrollees, transfers in and out, and so forth (Step 2)
  Importing students' absences (Step 2)
  Reviewing and interpreting student- and school-level reports (Steps 3 and 4)
  Identifying and implementing student interventions (Step 5)
  Monitoring students' initial response to interventions (Step 6)
  Revising students' intervention assignments, as needed (Steps 5 and 6)

After the midyear grading period (~February 2015)
  Updating student roster to reflect new enrollees, transfers in and out, and so forth (Step 2)
  Importing or entering students' absences, course failures, and behavior information (e.g., referrals and suspensions, by grading period) (Step 2)
  Reviewing and interpreting student- and school-level reports (Steps 3 and 4)
  Identifying and implementing student interventions (Step 5)
  Monitoring students' responses to existing interventions in which they are participating (Step 6)
  Revising students' intervention assignments, as needed (Steps 5 and 6)

At the end of the school year (~June 2015)
  Updating student roster to reflect new enrollees, transfers in and out, and so forth (Step 2)
  Importing or entering students' absences, course failures, and behavior information (e.g., referrals and suspensions), by grading period, if applicable (Step 2)
  Reviewing and interpreting student- and school-level data (Steps 3 and 4)
  Identifying and implementing new student interventions (Step 5)
  Monitoring students' responses to existing interventions in which they are participating (Step 6)
  Revising students' intervention assignments for summer and for the next academic year, if needed (Steps 5 and 6)
  Evaluating the EWIMS process, using student- and school-level reports, and revising as necessary (Step 7)
  Exporting student data to (1) prepare the EWS tool for the next school year and/or (2) share data with students' high school(s) for those students who are transitioning to high school
Benefits to Participation
There are many benefits of participation for your high school. Critical indicators in ninth and
tenth grade that powerfully predict whether students are “on track” for high school graduation
can be used as part of an early warning system to flag at-risk students early, assign appropriate
interventions, and get students back on track. Participating in this high-profile, large-scale study
will give your school and district an opportunity to access the Early Warning and Intervention
Monitoring System (developed by the National High School Center at AIR) at no cost. The
EWIMS model, currently in use in 67 districts in six states, includes both an Excel-based tool and
training and technical support for implementation. Your participation in this study will also play
an important role in informing educational policy focused on dropout prevention in
and at the federal level.
Questions or Comments
If you have any questions or comments about the study or the opportunity it provides for your
school, please feel free to contact Dr. Nicholas Sorensen (nsorensen@air.org or 312-283-2318)
or Dr. Mindee O’Cummings (mocummings@air.org or 202-403-5254).
Signatures of Commitment
Return via fax (312-288-7601) or e-mail (nsorensen@air.org)
The following people have read this document detailing the study and agree to the roles,
responsibilities, and conditions of participation on behalf of and the study team.
District Representative Signature
Printed Name
Title
Date

Principal Signature
Printed Name
Title
Date

Jessica Heppen
Co-Principal Investigator
Signature
Date

Mindee O'Cummings
Co-Principal Investigator
Signature
Date

Ann-Marie Faria
Project Director
Signature
Date

Nicholas Sorensen
Deputy Project Director
Signature
Date
Per the policies and procedures required by the Education Sciences Reform Act of 2002, Title I,
Part E, Section 183, responses to this data collection will be used only for statistical purposes.
The reports prepared for this study will summarize findings across the sample and will not
associate responses with a specific district or individual. Any willful disclosure of such
information for nonstatistical purposes, except as required by law, is a class E felony.