CE 4.4.3 OMB Clearance Supporting Statement A 1-14-14


Evaluating the Retired Mentors for New Teachers Program

OMB: 1850-0908






EVALUATING THE RETIRED MENTORS FOR NEW TEACHERS PROGRAM



PAPERWORK REDUCTION ACT

CLEARANCE REQUEST


SECTION A





Prepared For:


Institute of Education Sciences

United States Department of Education

Contract No. ED IES-12C-0007





Prepared By:

REL Central Regional Educational Laboratory



September 16, 2013















Section A. Justification

The U.S. Department of Education (ED) requests OMB clearance for data collection related to the Regional Educational Laboratory (REL) program. ED, in consultation with REL Central under contract ED IES-12C-0007, has planned a study of a program that uses retired master educators to provide mentoring support to new teachers in high-need elementary schools in Colorado’s Aurora Public School District. This program is referred to as the “Retired Mentors for New Teachers Program.” OMB approval is requested for REL Central’s data collection for this project, including a REL Central teacher survey and a total of ten focus groups conducted by REL Central with mentee teachers and mentors in the program.

The study will also draw upon several types of data that the school district collects. These include teacher turnover rates in high-need elementary schools, data from school administrator evaluations of teacher performance, mentor records of support, and student assessment scores for students served by teachers in participating elementary schools. OMB approval is not being sought for this turnover, evaluation, and assessment data since these are collected by the district and not REL Central.

A1. Circumstances Necessitating Collection of Information

This data collection is authorized by the Education Sciences Reform Act (ESRA) of 2002 (see Appendix A). Part D, Section 174(f)(2) of ESRA states that as part of their central mission and primary function, each regional educational laboratory “shall support applied research by . . . developing and widely disseminating, including through Internet-based means, scientifically valid research, information, reports, and publications that are usable for improving academic achievement, closing achievement gaps, and encouraging and sustaining school improvement, to—schools, districts, institutions of higher education, educators (including early childhood educators and librarians), parents, policymakers, and other constituencies, as appropriate, within the region in which the regional educational laboratory is located.”

Statement of Need

The importance of teacher effectiveness has been well-supported by studies demonstrating that teachers vary in their ability to produce student achievement gains; all else being equal, students taught by some teachers experience greater achievement gains than those taught by other teachers (Konstantopoulos & Chung, 2010; Aaronson, Barrow, & Sander, 2007; Nye, Konstantopoulos, & Hedges, 2004; Wright, Horn, & Sanders, 1997). These findings have encouraged wide interest in identifying the most effective programs and practices to enhance teacher effectiveness.

Studies of induction and mentoring for beginning teachers also show positive impacts on student achievement, teacher retention, and teacher instructional practices (Ingersoll & Strong, 2011; Strong, 2006), including positive impacts for new or probationary teachers (Smith & Ingersoll, 2004). However, few studies have used experimental designs, and none has examined the impact of using retired master teachers to deliver mentoring support.

Understanding the impacts of such a program on students and teachers comes at a particularly critical time for many states and districts in the Central Region. Conversations with chief state school officers in the Region and research alliance members suggest strong receptivity to expanding teacher mentoring in high-need schools. Student achievement continues to lag in high-poverty schools, large numbers of teachers are expected to retire in the near future, and state and district budget shortfalls continue to preclude many districts from undertaking more expensive approaches to supporting their teachers.

While retirements may exacerbate existing teacher turnover issues in high-need schools, retirees also represent a growing, untapped, potentially lower-cost pool of talent to guide and support the next generation of teachers needed for schools and districts to succeed. In fact, one report indicates that “the nation’s teaching population is older than it has ever been. Almost half of the K-12 public school teaching force is over age 50, steadily approaching the average teacher retirement age of 59. During the next decade, at least 1.8 million K-12 teachers and school leaders will be eligible to retire” (Foster, 2010, p. 2). The report also indicates that surveys show many of these retirees want supplemental careers in which they can continue to contribute their expertise and skills to the field of education after retiring.

For these reasons, the results of this study will be of interest not only to large urban districts, such as Aurora Public Schools, but also to suburban and rural districts that grapple with an aging teacher workforce and teacher turnover challenges.

Overview of the Study Design

The purpose of the study is to evaluate the impacts of the Retired Mentors for New Teachers Program on teacher effectiveness and teacher retention. The program is based on the following theory of change:

By providing probationary teachers in high-need schools with regular coaching and support from a recently retired master educator, the Retired Mentors for New Teachers Program will enhance probationary teachers’ instructional practice, leading to increased student achievement on district assessments, improved teacher evaluation performance, and improved teacher retention.

This theory of change is supported by the logic model shown in Figure 1. This model outlines the current situation and needs in the target district, the major components of the intervention provided through the Retired Mentors for New Teachers Program, and the targeted outcomes.











Figure 1. MENTORING INTERVENTION LOGIC MODEL

Current Situation and Need Diagnosis

  • Children in high-need elementary schools continue to underperform in reading and math.

  • Probationary teachers require added support to deliver effective reading and math instruction.

  • Retaining teachers in high-need schools remains an ongoing challenge.

  • High teacher turnover forces higher percentages of probationary teachers into high-need schools.

Major Intervention Components

  • Pair probationary teachers in high-need elementary schools with recently retired master educators who have:

    • A history of success in the district.

    • Years of experience with district expectations and student challenges.

    • Flexibility to meet with probationary teachers before, during, or after school.

    • No input into teacher evaluations.

  • Mentoring provided over two school years.

  • Two half days of summer professional development for probationary teachers.

  • Individualized, one-on-one mentoring and classroom support.

  • Mentoring support with other probationary teachers.

  • Mentor meetings with principals to ensure common understanding of school priorities.

  • Quarterly mentor meetings to discuss and continually improve practice.

Improved Outcomes

  Students:

  • Improved instruction delivered by probationary teachers increases their students’ achievement as measured by reading and math assessments (confirmatory outcome measure).

  Teachers:

  • Improvement in administrator evaluations of probationary teachers (exploratory outcome measure).

  • Reduced probationary teacher turnover (exploratory outcome measure).

The study will randomly assign teachers to either receive added mentoring support from a retired master educator along with the district’s typical mentoring support, or to receive only the district’s typical mentoring support. The typical mentoring approach currently operated in Aurora can best be described as a “buddy” mentoring model that assigns each new teacher to work with a more experienced “buddy” teacher for one year. The district expects the “buddy” mentor to spend 15 contact hours with the mentee over the course of the year, but the degree of mentoring actually provided can vary significantly depending on the mentor’s willingness and availability to advise the mentee. Only district teachers in their first year in the profession receive the typical mentoring model; teachers with more than one year of teaching experience do not receive district mentoring.
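The random assignment described above can be sketched in a few lines. This is a minimal illustration of simple 50/50 randomization, not the study's actual procedure (the source does not specify the randomization mechanism or any blocking by school); the teacher IDs and seed are hypothetical.

```python
import random

# Hypothetical IDs for the roughly 100 probationary teachers in the study.
teachers = [f"teacher_{i:03d}" for i in range(100)]

rng = random.Random(2014)  # fixed seed so the illustration is reproducible
shuffled = teachers[:]
rng.shuffle(shuffled)

# Simple 50/50 split: added retired-mentor support vs. typical district support.
treatment, control = shuffled[:50], shuffled[50:]
```

In practice, an evaluation like this one might randomize within schools (blocking) so each school contributes teachers to both conditions, but that design detail is not stated in the text.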

In contrast to the typical mentoring approach, the intervention includes the following components:

  • The provision of highly experienced mentors to work directly with teachers in their first three years with the district, regardless of prior experience. Mentors will be assigned by school and will be selected by a team of district leaders, including instructional coordinators and the head of professional development, to ensure their quality.


  • Criteria for mentor selection will include substantial teaching or school leadership experience in APS (a minimum of five years), experience working in high-need schools, and a record of excellence in the district as reflected by student performance, performance reviews, and reputation as an effective educator. Selected mentors will meet quarterly to discuss and continually improve their practice.


  • Treatment group participants will receive individualized mentoring and classroom support. Treatment teachers will receive tailored mentoring throughout the course of each year. This tailored mentoring will be flexible, using the mentor’s experience to identify the key needs of each teacher they work with and to create a mentoring plan that matches each teacher’s individual strengths and weaknesses. It is anticipated that:

    • The first year of mentoring will focus on creating and maintaining an effective classroom environment for student learning. In this regard, mentors will draw upon content from California’s Continuum of Teaching Practice (BTSA, 2012). More details on this Continuum and the creation of an effective classroom environment for student learning are provided below.

    • Establishment of an effective classroom environment for student learning provides the foundation for establishing “conditions of learning” in the classroom, which will also be a focus of mentoring support. At APS, “conditions of learning” refers to the model of learning espoused by Brian Cambourne. This model contains seven key learning elements: 1) immersion; 2) demonstration; 3) expectation; 4) responsibility; 5) use; 6) approximation; and 7) response.1

    • Subsequent support will delve more deeply into:

      • Mastery of academic content,

      • Use of additional assessment techniques to guide instruction,

      • Use of data to inform instruction,

      • Development of capacity for reflective teaching and continuous improvement, and

      • Other improvement opportunities based on teacher, school, or district priorities.


    • Mentoring will draw on a number of strategies based on the needs of each teacher. The district does not believe that a mentor’s decision not to use a specific strategy would weaken the program, because the best mix of approaches for each mentee relies on the judgment of the mentor. This judgment is especially important because every mentee teacher comes to the district with a unique set of needs, skills, strengths, and weaknesses. The types of strategies used by mentors may include observation, co-teaching, lesson modeling, and visits to other classrooms or schools to observe model instruction.


    • Mentoring will include both individual and group support that involves groups of probationary treatment teachers within schools. Group work will be designed to build a community of learners among probationary teachers within the school that can be sustained past the life of the mentoring program.


  • Treatment group participants will be involved in group support. Mentors and the probationary teachers in each school receiving the treatment will meet 1-2 times per month as a group. While treatment teachers may teach different grade levels, the purpose of these treatment group meetings will be to: share common challenges, share strategies for addressing common challenges, identify any areas where additional mentoring focus is needed or desired, and build a network within the school that can support new teachers when the mentor is not present.


  • Retired mentors will meet with school principals. Each mentor will meet with the school principal to discuss a variety of topics designed to improve the mentor’s understanding of the current school culture and setting and to allow the principal to communicate key priorities and expectations for all new teachers at that school. Mentor meetings are not to be used to share information regarding individual mentee performance. An initial meeting with the principal will establish the expectations around the mentoring program, including that mentors do not serve in an evaluative capacity but solely as a support and critical friend to mentees. Subsequent meetings with the principal over the course of the year will be designed for the mentor to:

    • Gain input from the principal on potential areas of desired focus for new teacher support based on school- or district-level priorities and schoolwide benchmark testing data; and

    • Discuss appropriate alignment of mentoring support with other district coaching or induction initiatives.


The project will include two analyses: 1) an implementation analysis, designed to document the intervention’s implementation and to provide information regarding the contrast between the mentoring received by teachers in the treatment and control groups; and 2) an impact analysis, with confirmatory impact questions focused on student achievement in reading and math, and exploratory impact questions focused on teacher retention and on principal evaluations of teachers.

The primary purpose of this study is to provide an unbiased estimate of the mentoring program’s impact on student achievement in reading and math. The secondary purpose of the study is to provide an unbiased estimate of the program’s impact on teacher retention and teachers’ instructional quality. The impact analysis will be supplemented with the implementation analysis, which is designed to document the intervention’s implementation and to provide detailed information on the contrast between the mentoring received by teachers in the treatment and control groups. The implementation analysis will provide useful information needed to interpret the findings regarding the impact of the program.

Overview of the Specific Data Collection Plan

The impact analysis will use student test scores in reading and math on district-administered assessments, teacher turnover data in study schools, and teacher evaluation ratings. Assessments will be given in the fall and spring of each school year, providing pre- and post-intervention data with which to compare the performance of students served by teachers in the control group with the performance of students served by teachers in the treatment group. The district will also provide its data on teacher turnover (teachers leaving the district) across control and treatment groups, as well as scoring data from the district’s teacher evaluation process, which will allow comparison of evaluation ratings across control and treatment groups.

The implementation analysis requires collecting data using three researcher-developed instruments: (1) a teacher survey questionnaire; (2) a focus group protocol for teacher mentees; and (3) a focus group protocol for mentors. The implementation analysis also will make use of existing data, including detailed records of mentor support. Mentors will make preliminary notes for each mentee on an ongoing basis and submit completed records monthly. These records will document the dosage and type of mentoring provided to each teacher.

REL Central will administer the teacher survey electronically to all probationary teachers (control and treatment) at participating school sites once each spring during both years of the intervention. The surveys will gather data on teacher background and prior experience; the dosage and types of mentoring received (both typical mentoring and the added mentoring provided through the intervention); and lessons learned regarding the program. The survey instrument is included in Attachment A.

REL Central will conduct four 60-minute focus groups in the spring of each school year with probationary teachers who are receiving the treatment in participating schools, using a semi-structured Teacher Focus Group Protocol developed by REL Central. The purpose of these focus groups will be to gather input from the teachers regarding the quality, frequency, and intensity of mentoring support received and to validate data gathered through the mentor records of support regarding the content of mentoring received. The focus group protocol is included as Attachment B.

Focus groups will also be conducted once each spring with mentors. Focus groups will use a Mentor Focus Group Protocol developed by REL Central. The focus groups will address a variety of topics, including implementation fidelity and adherence to the program design, lessons learned in implementation, and key areas of mentoring focus. Focus groups will be limited to 60 minutes, with questions prioritized to aid focus group facilitators in the event of time constraints. The focus group protocol is included as Attachment C.

Records of mentor support will be filled out monthly by each mentor for work conducted with each mentee teacher to document topics of mentoring each month, and methods used to deliver support (such as in-class support, organizing site visits, modeling instruction, etc.). The completion of the record of support form is part of each mentor’s job as required by the district. This expectation is communicated to mentors by the district head of professional development prior to the start of the school year, including the expectation that record of support forms are to be completed and turned in monthly to the district. The district head of professional development will meet with mentors at least quarterly and will monitor any issues associated with the regular and timely completion of the Record of Support forms. Consistent completion of the forms will contribute to the head of professional development’s evaluation of mentor performance. Mentors will be compensated directly by the district for their time in filling out the records of support. The records will be provided to the district head of professional development who will share them with REL Central.

Table 1. Timeline for the Data Collection

Timeframe                 Data Collection
Spring 2014, 2015         Administer teacher survey; conduct mentee and mentor focus groups
July 2014, 2015           District provides data on teacher turnover as well as teacher evaluation data
May 2014, October 2014    District provides data from fall student assessment measure in reading and math
May 2014, 2015            District provides data from spring student assessment measure in reading and math

The implementation analysis data collection schedule assumes that OMB approval will be received for these data collections. If OMB approval is not received, only the data that the district collects will be analyzed.

A2. How, by Whom, and for What Purpose Information is to Be Used

Findings from this study will be used by state and district leaders to inform decisions regarding the types of support to provide new and probationary teachers. This study will result in a report intended for district and state leaders who are responsible for developing and implementing teacher mentoring and support systems and overseeing support for teachers’ professional growth and effectiveness. The report will provide the findings in an accessible format, describing possible interpretations and implications of the results. These findings will be helpful for states and districts in the Central Region who are exploring innovative ways to provide support to new teachers in their schools. Researchers will also facilitate discussions of the results with state and district leaders to examine and interpret the results from this study.

For the study’s impact analysis, student achievement data from the district’s MAP Reading and Math Assessments will be used to compare the achievement of children served by a teacher that received added mentoring support through the intervention to the achievement of children served by teachers in the control group. The MAP assessments are standardized tests given by the district that are aligned to standards across the nation. The district has purchased and used these assessments and will deliver them and collect the data in study schools. The district’s data on teacher turnover will be used to compare turnover rates (the proportion of teachers leaving the school district) between treatment and control groups of teachers. District-collected data on teacher evaluations will be used to compare teacher performance across treatment and control groups as measured by school administrative leader scoring on a district-created evaluation rubric.

For the study’s implementation analysis, data collected from teacher surveys and focus groups as well as mentor focus groups (both conducted by REL Central) will be used to assess the experiences of probationary teachers with mentoring and to compare these experiences across the treatment and control groups. These data sources will inform researchers about: whether the intervention is being implemented as planned; the types of support provided by mentors; and lessons learned through the mentoring process. Mentor records of support, which are already collected by the district, will be used to triangulate the survey and focus group input.

The table below provides an overview of the data collection methods that will be used to answer implementation questions.

Table 2. Connecting Data Collection Methods to Implementation Research Questions

Data collection methods: teacher survey; Records of Mentor Support; mentee focus groups; mentor focus groups

Implementation research questions:

  • Are the components of the mentoring program being implemented as planned?

  • Is the program reaching the intended target population with the appropriate services at the planned rate and “dosage”?

  • What are the background characteristics and experience of mentors and mentees?

  • How does the mentoring experience and dosage differ between the treatment and control group teachers?

  • What are key lessons learned that might improve the program over time?

A3. Use of Automated, Electronic, Mechanical or Other Technological Collection Techniques

The teacher survey will be administered online using survey software such as SurveyMonkey in order to reduce the burden on respondents. REL Central’s conversations with the district indicate that teachers are familiar and comfortable with SurveyMonkey and electronic surveys. Respondents will be sent a link via email that will lead them to the online survey. A progress bar will indicate the percentage of the survey instrument completed so teachers can see their progress as they respond to questions. Reminder emails will be sent electronically to facilitate higher response rates. Mentee and mentor focus group discussions will be captured electronically through note taking on laptop computers. The focus groups will be recorded to ensure accuracy. Records of Mentor Support will be collected by the district electronically and coded by the mentor.

A4. Efforts to Identify Duplication

The impact of the mentoring program will be examined using several sources of existing district data. The existing data includes teacher evaluation data, teacher turnover data, and data from student reading and math assessments. Mentor records of support will assist in tracking program implementation. However, implementation will also be examined using several new sources of data, including information gathered from surveys of treatment and control group teachers and focus groups of mentors and treatment group teachers. The district does not currently collect such implementation data on the program, and this type of information is not available from any other data sources.

A5. Sensitivity to Burden on Small Entities

The data collection does not involve schools or other small entities.

A6. Consequence to Federal Program or Policy Activities if the Collection is Not Conducted or is Conducted Less Frequently

If the proposed data were not collected, the goals of the IES Regional Educational Laboratory program may not be met. For instance, REL Central would not be able to contribute to key IES goals, including goals to “conduct and support high quality studies on key regional priorities” that “incorporate data and research into everyday decision-making.” If the data were not collected, decision makers in the region would not have access to the research needed to help meet these key goals.2 Furthermore, the district would like to use the research-based evaluation results to inform policy decisions about their mentoring program in the future. Because the district’s program is designed specifically as a two-year intervention, REL Central will assess ongoing implementation over two school years.

A7. Special Circumstances

There are no special circumstances.

A8. Federal Register Announcement and Consultation

Federal Register Announcement

A 60-day Federal Register notice was published on November 7, 2013. No public comments have been received to date. We will summarize and address any public comments that are received.

Consultations Outside the Agency

We have consulted with Aurora Public School District leaders on the availability of data, the clarity of the instructions, and the data gathering instruments. Additionally, we have consulted with content and methods experts to develop the study including technical working group members Dr. Bruce Randel, President of Century Analytics; Dr. Linda Damon, retired Director of Professional Learning; and Dr. Michelle Reininger, Assistant Professor at Stanford University and Executive Director of the Stanford Center for Education Policy Analysis. As appropriate, we will consult with representatives of those from whom information is to be obtained, such as research alliance members, to address any public comments received on burden.

Table 3. Individuals consulted on the statistical, data collection, and analytic aspects of this study

Michelle Reininger, Executive Director, Center for Education Policy Analysis at Stanford University; reininger@stanford.edu; (650) 736-1258

Bruce Randel, President, Century Analytics; bruce.randel@centuryanalytics.com; (720) 488-5503

Linda Damon, Retired (Former Director of Professional Learning), formerly Aurora Public Schools; damon.l@att.net

A9. Payment or Gift to Respondents

The school district is covering all payments to teachers and mentors who participate in surveys or focus groups. This compensation is not paid by REL funds. Teachers will be offered an incentive of $25 each to participate in a survey or focus group. Focus groups are expected to take approximately an hour, while surveys are expected to take no more than 20 minutes. Because the district is paying all incentives, the incentive amounts are determined based on district requirements and standard operating practice. The district rate for compensating teachers for work outside of regular contract hours or for participating in research activities is $25 per hour. The results of the Reading First Impact Study (RFIS) incentives study were considered, which found that incentives significantly increase response rates in surveys (Gamse, Bloom, Kemple, & Jacob, 2008). Teachers will receive the incentive upon completion of the survey. Maintaining use of current district stipend levels during the study has the added benefit of allowing the district to maintain consistency in how it compensates teachers if the district wishes to continue its own similar data gathering through surveys and focus groups past the life of this study.

A10. Confidentiality of the Data

REL Central will be following the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, which states: “All collection, maintenance, use, and wide dissemination of data by the Institute” are required to “conform with the requirements of section 552 of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provision Act (20 U.S.C. 1232g, 1232h).” These citations refer to the Privacy Act, the Family Educational Rights and Privacy Act, and the Protection of Pupil Rights Amendment. Subsection (c) of section 183 referenced above requires the Director of IES to “develop and enforce standards designed to protect the confidentiality of persons in the collection, reporting, and publication of data.” Subsection (d) of section 183 prohibits the disclosure of individually identifiable information, and makes any publishing or communicating of individually identifiable information by employees or staff a felony.

As permitted by law, REL Central will attempt to protect the confidentiality of information collected for the study and will use it for research purposes only. Information from participating institutions and respondents will be presented at aggregate levels in reports. Unique identification numbers will be assigned to each participating teacher and used to identify all survey responses. The ID number/name association files will be kept secure in a confidential file separate from the data analysis file. Data from the online survey software system will be downloaded and deleted from the online system after the survey window closes. These data files will then be stripped of identifying information. All identification lists will be destroyed at the end of the project. The research team is trained to follow strict guidelines for soliciting consent, administering data collection instruments, and preserving respondent confidentiality. All members of the research team have successfully completed the Collaborative Institutional Training Initiative (CITI) Protection of Human Research Subjects course through Liberty IRB.

REL Central will collect consent forms from study participants, which will also be stored in a locked file. Copies of the teacher and mentor consent forms are provided in Attachments D and E, respectively. Copies of the affidavits of non-disclosure for each researcher who will have access to the data are provided in Attachments I and J.

All study materials will include the following language:

Per the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific individual except as required by law. Any willful disclosure of such information for non-statistical purposes, without the informed consent of the respondent, is a class E felony.

A11. Additional Justification for Sensitive Questions

No questions of a highly sensitive nature are included in the survey or interview.

A12. Estimates of Hour Burden

The total burden associated with the two-year data collection is 274 hours (see Table 4).

Approximately 100 respondents will be invited to take the teacher survey. The target response rate is 80%. Of the 100 teachers, 50 will be randomly assigned to receive the added mentoring treatment from a retired master educator, and 50 will be randomly assigned to the control group and will receive the district’s business-as-usual mentoring support. While both treatment and control group teachers will be asked to take the survey, the survey will be considerably shorter for control group teachers (about half as many questions) because they will not be asked a set of questions on their experience with a retired mentor. Completing the survey will take approximately 10 minutes for control group teachers and approximately 20 minutes for treatment group teachers.

Up to three email reminders will be sent to individuals who do not complete the survey after the first notification. Each reminder email is estimated to take 2.5 minutes (0.04 hours) for a teacher to read.

Approximately 40 teachers will participate in one-hour focus groups once per school year (in the Spring), and approximately 8 mentors will participate in a one-hour focus group once per school year (in the Spring).

An in-person meeting to discuss the research project with 100 teachers will take approximately 30 minutes. At this meeting, teachers will sign consent forms and receive information about survey and focus group participation.

It is anticipated that district personnel will spend no more than 40 hours per school year providing required data for the study. The district already collects and compiles the data required for the study, so only a small amount of additional time is needed to provide these data to REL Central. The MAP assessment, which serves as the study’s student outcome measure, will be administered by the district, not REL Central. Collection of the assessment data will also be conducted by the district, which has agreed to share these data with REL Central.

The following table presents each data collection task, the number of individuals participating in each data collection task for each of the two years, the average time per participant for each task, and the total time to complete each task.

Table 4: Participation times

| Data collection task | Number of participants | Average time (hours) | Total burden (hours) |
| --- | --- | --- | --- |
| In-person meeting, Year 1 | 100 | 0.50 | 50 |
| Online survey, treatment group teachers, Year 1 | 40 | 0.33 | 13.2 |
| Online survey, control group teachers, Year 1 | 40 | 0.17 | 6.8 |
| Online survey, treatment group teachers, Year 2 | 32 | 0.33 | 10.56 |
| Online survey, control group teachers, Year 2 | 32 | 0.17 | 5.44 |
| Read survey reminders, Year 1 | 50 | 0.12 (0.04 x 3) | 6 |
| Read survey reminders, Year 2 | 50 | 0.12 (0.04 x 3) | 6 |
| Teacher focus groups, Year 1 | 40 | 1 | 40 |
| Mentor focus groups, Year 1 | 8 | 1 | 8 |
| Teacher focus groups, Year 2 | 40 | 1 | 40 |
| Mentor focus groups, Year 2 | 8 | 1 | 8 |
| Provision of data, Year 1 | 1 | 40 | 40 |
| Provision of data, Year 2 | 1 | 40 | 40 |
| Year 1 total | | | 164 |
| Year 2 total | | | 110 |
| Total burden hours | | | 274 |

Notes:

In-person meeting: Approximately 100 teachers will be invited to attend an in-person meeting in which they will learn about the study and sign consent forms.

Online surveys: Of the 100 teachers, 50 will be in the treatment group. With an 80% response rate in the first year, 40 are expected to complete the survey, with an estimated burden of 20 minutes. The 50 control group teachers, at the same 80% response rate, yield 40 expected completions with an estimated burden of 10 minutes. We expect a 20% attrition rate in survey completion from year 1 to year 2, leaving 32 participants in each group in year 2.

Survey reminders: Approximately half of the 100 teachers will require reminders to take the online survey each year. Each of the 3 email reminders is estimated to take about 2.5 minutes (0.04 hours) to read.

Focus groups: Forty teachers (4 focus groups with a maximum of 10 teachers each) will participate in a focus group each year. Eight mentors (1 focus group with a maximum of 8 mentors) will participate in a focus group each year.

Provision of data: District staff will spend a total of 40 hours per year providing required data to support the project.



The total burden in hours is 274 hours, which amounts to approximately 92 hours per year over the three year OMB clearance period. The total number of responses will be 442, which amounts to 148 responses per year over the three years of the clearance period.
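For transparency, the burden totals above can be reproduced with a short illustrative calculation (a sketch in Python; the task names are shorthand, the participant counts and average times come from Table 4, and the district's data provision is entered as a single 40-hour respondent):

```python
# Burden entries: (number of participants, average hours per participant).
burden_year1 = {
    "in_person_meeting": (100, 0.50),
    "survey_treatment": (40, 0.33),
    "survey_control": (40, 0.17),
    "survey_reminders": (50, 0.12),
    "teacher_focus_groups": (40, 1.0),
    "mentor_focus_groups": (8, 1.0),
    "district_data_provision": (1, 40.0),
}
burden_year2 = {
    "survey_treatment": (32, 0.33),
    "survey_control": (32, 0.17),
    "survey_reminders": (50, 0.12),
    "teacher_focus_groups": (40, 1.0),
    "mentor_focus_groups": (8, 1.0),
    "district_data_provision": (1, 40.0),
}

def total_burden(tasks):
    # Sum of participants x average time across all tasks in a year.
    return round(sum(n * hours for n, hours in tasks.values()), 2)

print(total_burden(burden_year1))  # 164.0
print(total_burden(burden_year2))  # 110.0
print(total_burden(burden_year1) + total_burden(burden_year2))  # 274.0
```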

As shown in Table 5, the annual cost is $3,360 in the first school year and $2,960 in the second school year, which the district has agreed to pay. The total cost over two years is therefore $6,320. The district will compensate mentees and mentors for any time associated with participating in focus groups or surveys. To reiterate, this compensation will not be paid by REL funds. The district will compensate teachers $25 for participation in each survey or focus group, based on current district policy. The district compensates its mentors based on flat daily rates which amount to an average cost of $45 per hour.

The following table presents the number of participants, the time to complete each data collection task and the associated monetary burden on the district to compensate participants (at the rate of $25 per focus group or completed survey for teachers and $45 per hour for mentors).

Table 5: Participation times and estimated monetary burden

| Data collection task | Number of participants | Participation cost | Estimated monetary burden district will pay |
| --- | --- | --- | --- |
| Online survey, treatment group teachers, Year 1 | 40 | $25 | $1,000 |
| Online survey, control group teachers, Year 1 | 40 | $25 | $1,000 |
| Online survey, treatment group teachers, Year 2 | 32 | $25 | $800 |
| Online survey, control group teachers, Year 2 | 32 | $25 | $800 |
| Teacher focus groups, Year 1 | 40 | $25 | $1,000 |
| Mentor focus groups, Year 1 | 8 | $45 | $360 |
| Teacher focus groups, Year 2 | 40 | $25 | $1,000 |
| Mentor focus groups, Year 2 | 8 | $45 | $360 |
| Year 1 total | | | $3,360 |
| Year 2 total | | | $2,960 |
| Total | | -- | $6,320 |

Notes:

Online surveys: Of the 100 teachers, there will be 50 in the treatment group. With an 80% response rate, 40 are expected to complete the survey. There will also be 50 in the control group; with an 80% response rate, 40 are expected to complete the survey.

Focus groups: Forty teachers will participate in focus groups each year. Eight mentors will participate in a focus group each year.



The total cost is $6,320, which amounts to approximately $2,107 per year over the three year OMB clearance period.
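The cost totals follow the same arithmetic (an illustrative Python sketch; the participant counts and compensation rates come from Table 5):

```python
# Cost entries: (task, number of compensated participants, rate in dollars).
cost_year1 = [
    ("survey_treatment", 40, 25),
    ("survey_control", 40, 25),
    ("teacher_focus_groups", 40, 25),
    ("mentor_focus_groups", 8, 45),
]
cost_year2 = [
    ("survey_treatment", 32, 25),
    ("survey_control", 32, 25),
    ("teacher_focus_groups", 40, 25),
    ("mentor_focus_groups", 8, 45),
]

def total_cost(rows):
    # Sum of participants x compensation rate across all tasks in a year.
    return sum(n * rate for _, n, rate in rows)

print(total_cost(cost_year1))  # 3360
print(total_cost(cost_year2))  # 2960
print(total_cost(cost_year1) + total_cost(cost_year2))  # 6320
```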

A13. Estimate for the Total Annual Cost Burden to Respondents or Record Keepers

There are no start-up costs for this collection.

A14. Estimates of Annualized Costs to the Federal Government

The total estimated cost for this study is $1,025,000 over five years, for an average yearly cost of $205,000. The costs include developing, administering, and analyzing the survey; conducting and analyzing the focus groups; gathering and analyzing district data (teacher turnover, teacher evaluations, and student reading and math performance); and report writing and development. The costs also include the personnel costs of several federal employees involved in project oversight and analysis, which amount to an annual federal labor cost of $5,000. The total annual cost for the evaluation is therefore the sum of the annual contracted evaluation cost ($200,000) and the annual federal labor cost ($5,000), or $205,000 per year.

A15. Reasons for Program Changes or Adjustments

This is a new study.

A16. Plans for Tabulation and Publication

The district will administer the mentoring intervention over two school years to the group of teachers receiving the treatment. Teachers will be randomly assigned to treatment and control groups and the school district will collect data on the characteristics of the sample of schools such as: school size, student-teacher ratio, principal turnover rate, and teacher turnover rate.

Impact Analysis

The impact analysis will address the following confirmatory questions:

  1. What is the impact of the Retired Mentors for New Teachers program on elementary students’ reading assessments?

  2. What is the impact of the Retired Mentors for New Teachers program on elementary students’ assessments in math?

The impact analysis will address the following exploratory questions:

  1. Does being a member of the treatment group in the Retired Mentors for New Teacher program impact teacher turnover?

  2. Does being a member of the treatment group impact principal evaluations of teachers?


Analyses will be conducted to provide descriptive statistics of the sample of schools. Descriptive statistics will be calculated and presented for the following school-level variables: school size, student-teacher ratio, principal turnover rate, and teacher turnover rate.

A second set of descriptive analyses will describe the characteristics of teachers assigned to the treatment and control groups, using the baseline characteristics obtained from the teacher survey. Statistics will be presented for gender, education level, number of years of total teaching experience, number of years teaching in Aurora, and grade-level assignment at the year 1 baseline.

A third set of descriptive analyses will compare the students in the treatment and control groups on grade level, gender, race/ethnicity, and eligibility for free or reduced-price lunch.

Data Collection Procedures

The table below summarizes the data collection instruments used in the impact analyses and indicates whether the district already collects the corresponding data.

Table 6. Data collection instruments for impact analyses

| Data collection instrument for confirmatory impact research questions | Data already collected by the district? |
| --- | --- |
| Student performance on grades 1–5 MAP Reading Assessments | Yes. The district currently has a contract with NWEA to use the MAP assessment. |
| Student performance on grades 1–5 MAP Math Assessments | Yes. The district currently has a contract with NWEA to use the MAP assessment. |

| Data collection instrument for exploratory impact research questions | Data already collected by the district? |
| --- | --- |
| Principal evaluation ratings of probationary and control teachers in participating schools, provided by the district | Yes. These ratings are currently generated by principals in all schools and are collected electronically by the district. |
| Teacher turnover data collected from the district human resources department | Yes. Teacher turnover data is routinely collected by the district. |


For data on student characteristics and performance, data collection is focused at the district level. For the two research questions addressing student reading and math achievement, REL Central will use the results of MAP assessments, developed by the Northwest Evaluation Association (NWEA). MAP is a standardized test given under standardized conditions. It is an adaptive, electronic assessment that provides RIT scores aligned on a common vertical scale. The adaptive nature of MAP allows the assessment to adjust the difficulty of each question based on how well a student has answered previous questions: as the student answers correctly, questions become more difficult; if the student answers incorrectly, the questions become easier. The assessment then produces final RIT scores in both reading and math (Northwest Evaluation Association, 2009). NWEA conducts RIT scale norming studies every three years. The latest study was conducted in 2011 and included data from over 5 million students taking the test in all 50 states (Northwest Evaluation Association, 2011).

For the research question on teacher turnover, REL Central will use data provided and electronically collected by the district’s human resources office. For this study, teacher turnover is defined as those study sample teachers who leave the Aurora Public School district before the start of the 2015–16 school year (that is, as of the July 2015 data collection point). We use this definition because one of the goals of the intervention is to reduce teacher turnover from the district. The impact on turnover will be estimated at the teacher level, which is the level of random assignment.

For the research question on principal evaluation of teachers, REL Central will use data gathered by the district using a teacher evaluation rubric. This rubric outlines the knowledge and skills required of an effective teacher within five major standards: pedagogical expertise (up to 24 points); establishing a safe and respectful learning environment (up to 24 points); planning and delivering effective instruction (up to 32 points); reflecting on practice (up to 12 points); and demonstrating leadership (up to 16 points). Teachers will receive a rating from their principal or other administrative supervisor on several elements under each of the five quality standards. The ratings are rigorously scored against a checklist on a five-point scale from “not evident” (0 points) to “exemplary” (4 points). All administrators in Aurora Public Schools who directly supervise teachers will be trained by the district to reliably administer the rubric for all teachers during the Spring of 2013.
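The point maxima above imply the number of rated elements per standard (each element is scored 0 to 4) and a maximum overall rubric score of 108 points. A brief illustrative sketch; the element counts are inferred from the point maxima, not stated in district documentation:

```python
# Maximum points per standard, from the district's evaluation rubric.
standard_max_points = {
    "pedagogical_expertise": 24,
    "safe_respectful_environment": 24,
    "effective_instruction": 32,
    "reflecting_on_practice": 12,
    "demonstrating_leadership": 16,
}
SCALE_MAX = 4  # each element is rated 0 ("not evident") to 4 ("exemplary")

# Inferred number of rated elements per standard (max points / scale max).
elements = {name: pts // SCALE_MAX for name, pts in standard_max_points.items()}

print(elements["pedagogical_expertise"])  # 6 elements
print(sum(standard_max_points.values()))  # 108 possible points overall
```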

Baseline Equivalency

We plan to test baseline equivalency for several reasons. First, teachers, not students, are randomly assigned to the experimental groups. Second, student mobility within the target district is relatively high, such that attrition rates may exceed those considered acceptable by the What Works Clearinghouse (U.S. Department of Education, n.d.). Establishing the baseline equivalency of the analytic samples will be critical to supporting a causal inference. These analyses will compare the treatment and control students on the baseline data for the confirmatory outcomes, using only those students included in the impact analysis sample, and will account for the clustering of students within classrooms.

We plan to include baseline measures of student achievement in all of our impact analyses. The study can meet What Works Clearinghouse standards if the difference in baseline achievement is no greater than 0.25 standard deviations. Should a baseline difference be greater than 0.25 standard deviations, we still plan to estimate the program’s impact because Aurora Public Schools is interested in the impact of the program despite any baseline differences. We will inform the district that findings from a study sample in which the baseline difference exceeds the What Works Clearinghouse standard should be interpreted with caution.
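As an illustration of this check, the standardized baseline difference can be computed as the difference in group means divided by the pooled standard deviation and compared with the 0.25 threshold. The sketch below uses hypothetical baseline RIT scores, not study data; the study's actual analyses will also account for the clustering of students within classrooms:

```python
import math

def pooled_sd(a, b):
    # Pooled standard deviation of two samples (unbiased sample variances).
    na, nb = len(a), len(b)
    va = sum((x - sum(a) / na) ** 2 for x in a) / (na - 1)
    vb = sum((x - sum(b) / nb) ** 2 for x in b) / (nb - 1)
    return math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))

def standardized_baseline_difference(treatment, control):
    # Difference in group means, in pooled standard deviation units.
    mean_t = sum(treatment) / len(treatment)
    mean_c = sum(control) / len(control)
    return (mean_t - mean_c) / pooled_sd(treatment, control)

# Hypothetical baseline RIT scores for students in each group.
treatment = [188, 192, 195, 201, 186, 199, 190, 194]
control = [187, 191, 198, 196, 189, 193, 200, 185]

d = standardized_baseline_difference(treatment, control)
print(abs(d) <= 0.25)  # True: within the 0.25-SD equivalence threshold
```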

Impact Analysis Models

Program impacts will be estimated for the confirmatory outcomes and each of the exploratory outcomes.

The analytic model described here pertains directly to the confirmatory student achievement outcomes and will serve as the benchmark impact model. The purpose of the benchmark confirmatory impact model is to provide an unbiased estimate of the mentoring program’s impact on the two confirmatory outcomes: student reading achievement and student math achievement. The impact will be estimated at the teacher/classroom level, which is the level of random assignment.

The purpose of the first exploratory research question will be to estimate the impact of the intervention on probationary teacher turnover. For this outcome, turnover is defined as those teachers who leave the district any time during the study period up to July 2015. The impact on turnover will be estimated at the teacher level, the level of random assignment.

The purpose of the second exploratory research question will be to estimate the impact of the intervention on principal evaluations of probationary teachers. The district will use an evaluation rubric that is based on five standards of teaching quality. Though each standard is evaluated on a five-point ordinal scale, combining the results across the five scales generates a value for each evaluated teacher that closely resembles an interval scale. For the purposes of our analysis, we will therefore treat the teacher evaluation data as an interval scale. The data resulting from these evaluations will be used as the dependent variable in the exploratory analysis of the impact of mentoring on teacher evaluations.

Table 7 is an example shell of a table that we will use to present the results of our impact analysis.





Table 7. Example table shell

| Measure | Regression-adjusted posttest mean, intervention group (standard deviation) | Regression-adjusted posttest mean, control group (standard deviation) | Estimated difference^a (standard error) | 95 percent confidence interval | p-value | Effect size^b |
| --- | --- | --- | --- | --- | --- | --- |
| Outcome A | () | () | () | | | |
| Outcome B | () | () | () | | | |
| Outcome C | () | () | () | | | |

a. Estimated difference may not equal the difference between means because of rounding.

b. Calculated as the estimated difference divided by the control group standard deviation.

Note: Numbers in parentheses are standard errors. All results are based on analysis using a mixed-model approach to account for the sources of variability in the data that result from the nested structure of the school environment.

Sensitivity Analyses

Sensitivity analyses will be conducted for the confirmatory impact analyses of the student achievement outcomes only. Sensitivity analyses will test the robustness of the benchmark impact estimates to: inclusion of covariates; treatment of missing pretest data; inclusion of the students’ year 1 teacher status variables; inclusion of the teacher transfer variables; possible moderation of the treatment effect by teacher grade level; and potential moderation of the year 2 treatment effect by student exposure levels to the treatment in year 1.

Addressing Multiple Comparisons

The program’s impact will be estimated for two confirmatory student achievement outcomes: English language arts scores and mathematics scores. Further, impacts will be estimated for these two outcomes using both the year 1 student sample and the year 2 student sample. The What Works Clearinghouse applies a correction for multiple hypothesis testing to studies that test a given outcome measure with multiple comparison groups (U.S. Department of Education, n.d.). The correction applies only to outcomes within the same domain, not to outcomes from different domains. This study’s two confirmatory outcomes fall into two domains as defined by the What Works Clearinghouse: student English language arts scores come under the general reading achievement domain, and student math scores come under the elementary school mathematics achievement domain (U.S. Department of Education, n.d.). The year 1 comparison group and the year 2 comparison group are different comparison groups. Given the above, we plan to apply the Benjamini-Hochberg correction separately to the results from the two comparison groups used for the reading outcome and to the results from the two comparison groups used for the mathematics outcome.
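For illustration, the Benjamini-Hochberg step-up procedure sorts the p-values within each outcome family and rejects all hypotheses up to the largest rank k whose p-value is at or below (k/m) x alpha. The sketch below uses hypothetical p-values, not study results:

```python
def benjamini_hochberg(p_values, alpha=0.05):
    """Return a list of booleans marking which hypotheses are rejected
    under the Benjamini-Hochberg false discovery rate correction."""
    m = len(p_values)
    # Sort p-values ascending, remembering their original positions.
    order = sorted(range(m), key=lambda i: p_values[i])
    # Find the largest rank k with p_(k) <= (k / m) * alpha.
    k = 0
    for rank, i in enumerate(order, start=1):
        if p_values[i] <= rank / m * alpha:
            k = rank
    # Reject the hypotheses with the k smallest p-values.
    rejected = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k:
            rejected[i] = True
    return rejected

# Hypothetical p-values for one outcome family: the reading outcome
# tested against the year 1 and year 2 comparison groups.
print(benjamini_hochberg([0.012, 0.030]))  # [True, True]
```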

Implementation Analysis

The primary purpose of the implementation analysis is to capture the extent to which the program was implemented as intended and to better understand any variations in implementation across school sites. A secondary purpose is to contribute to a greater understanding of the context and process of implementation, which will be of great interest to Aurora Public Schools and any other districts that might be interested in implementing this intervention in the future.

Key implementation research questions include:

  • Are the components of the mentoring program being implemented as planned?

  • Is the program reaching the intended target population (core subject elementary school teachers in their first three years at the district) with the planned “dosage”?

  • What are the background characteristics and experience of mentors and mentees?

  • How do the mentoring experience and dosage differ between the treatment and control group teachers?


Using data collected through surveys, mentor records of support, and focus groups, we will describe the following for the treatment stream:

  • Teacher and mentor background and prior experience

  • The extent to which mentoring met expected dosage levels

  • The extent to which mentoring met program fidelity expectations with regard to receiving a mix of mentoring support through individual and group sessions

  • The methods of mentoring support provided (for example, whether mentors used observation, co-teaching, or modeling of instruction)

  • Key topics addressed in mentoring

  • Whether there are any special circumstances and constraints surrounding the mentoring experience.


Qualitative findings will be summarized as needed using traditional qualitative data analysis methods. The approach used will follow methods explicated by Miles and Huberman (1994). This approach emphasizes well-defined study variables to ensure the comparability of cross-site data and reduction of data using data displays and matrices so that common themes can be identified.

We will analyze all data for key themes that characterize the implementation of the program, including the methods of support received by the intervention group. Such themes include a comparison of the average amount and intensity of mentoring activities received by the treatment and control group teachers. To ensure the validity of these comparisons, we will use uniform survey data collected annually from teachers in both research groups to conduct these analyses.

Publication Plans

All results of REL Central’s rigorous studies will be made available to the public through peer-reviewed evaluation reports published by IES. The datasets from these rigorous studies will be turned over to the REL’s IES project officer. REL Central will follow IES guidelines in creating restricted-use datasets and supporting documentation, which will be provided at the end of the project. Even the REL contractor would be required to obtain a restricted-use license to conduct any work with the data beyond the original evaluation.

In preparation for the production of a draft report in 2015, we expect to convene a number of meetings in the Spring and Fall of 2015 to review the report outlines and report. These meetings will include TWG members and other relevant experts. The interim and final reports will target a district-level policymaking audience and will be disseminated via the official IES channels, including the national REL website.

The study’s timetable for data collection and analysis is presented below:

Table 8: Schedule of activities

| Activity | Schedule |
| --- | --- |
| Implementation analysis | |
| Hold in-person consent meeting | Upon OMB approval |
| Administer teacher surveys | Spring 2014, Spring 2015 |
| Collect records of mentor support | Spring 2014–Spring 2015 |
| Conduct teacher focus groups | Spring 2014, Spring 2015 |
| Conduct mentor focus groups | Spring 2014, Spring 2015 |
| Analyze and report | June–November 2015 |
| Impact analysis | |
| District provides descriptive data on schools, teachers, and students | Spring 2014 |
| District provides student reading and math performance data | Spring 2014, Fall 2014, Spring 2015 |
| Collect teacher evaluation data | July 2014, July 2015 |
| Collect teacher turnover data | July 2014, July 2015 |
| Analyze and report | June–November 2015 |



A17. Approval to Not Display the Expiration Date for OMB Approval

We are not requesting approval to not display the expiration date for OMB approval.

A18. Exception to the Certification Statement

No exceptions to the certification statement are being sought.



References



Aaronson, D., Barrow, L., & Sander, W. (2007). Teachers and student achievement in the Chicago public schools. Journal of Labor Economics, 25(1), 95–135.

Enders, C. K., & Tofighi, D. (2007). Centering predictor variables in cross-sectional multilevel models: A new look at an old issue. Psychological Methods, 12(2), 121–138.

Foster, E. (2010). How boomers can contribute to student success: Emerging encore career opportunities in K-12 education. Washington, D.C.: National Commission on Teaching and America’s Future.

Gamse, B. C., Bloom, H. S., Kemple, J. J., & Jacob, R. T. (2008). Reading First impact study: Interim report (NCEE 2008-4016). Washington, D.C.: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.

Glazerman, S., Isenberg, E., Dolfin, S., Bleeker, M., Johnson, A., Grider, M., & Jacobus, M. (2010). Impacts of comprehensive teacher induction: Final results from a randomized controlled study (NCEE 2010-4027). Washington, D.C.: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.

Hedges, L. V. (2007). Effect sizes in cluster-randomized designs. Journal of Educational and Behavioral Statistics, 32(4), 341–370.

Hedges, L. V. (2009). Effect sizes in nested designs. In H. Cooper, L. V. Hedges, & J. C. Valentine (Eds.), Handbook of research synthesis and meta-analysis (2nd ed.). New York: Russell Sage Foundation.

Hedges, L. V., & Olkin, I. (1985). Statistical methods for meta-analysis. San Diego, CA: Academic Press.

Ingersoll, R., & Strong, M. (2011). The impact of induction and mentoring programs for beginning teachers: A critical review of the research. Review of Educational Research, 81(2). Retrieved from http://rer.sagepub.com/content/81/2/201

Konstantopoulos, S., & Chung, V. (2010). The persistence of teacher effects in elementary grades. American Educational Research Journal, 48(2), 361–386.

Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook. Newbury Park, CA: Sage.

Northwest Evaluation Association. (2005). RIT scale norms for use with achievement level tests and Measures of Academic Progress. Lake Oswego, OR: Author.

Northwest Evaluation Association. (2009). Technical manual for Measures of Academic Progress™ and Measures of Academic Progress for Primary Grades™. Lake Oswego, OR: Author.

Northwest Evaluation Association. (2011). 2011 RIT scale norms: Frequently asked questions. Lake Oswego, OR: Author.

Nye, B., Konstantopoulos, S., & Hedges, L. V. (2004). How large are teacher effects? Educational Evaluation and Policy Analysis, 26(3), 237–257.

Smith, T., & Ingersoll, R. (2004). What are the effects of induction and mentoring on beginning teacher turnover? American Educational Research Journal, 41(3), 681–714.

Strong, M. (2006). Does new teacher support affect student achievement? [Research brief]. Santa Cruz, CA: The New Teacher Center.

Wright, S. P., Horn, S. P., & Sanders, W. L. (1997). Teacher and classroom context effects on student achievement: Implications for teacher evaluation. Journal of Personnel Evaluation in Education, 11(1), 57–67.

2 IES goals for Regional Educational Laboratories can be found on the following website: http://ies.ed.gov/ncee/edlabs/about/.
