OMB: 1850-0909

THE IMPACT OF PROFESSIONAL DEVELOPMENT IN FRACTIONS FOR FOURTH GRADE TEACHERS ON STUDENT ACHIEVEMENT AND TEACHER KNOWLEDGE IN GEORGIA AND SOUTH CAROLINA



OMB Clearance Request

Supporting Statement Part B



April 18, 2014



Prepared for:

NCEE Contracting Officer’s Representative: Sandra Garcia

U.S. Department of Education

Institute of Education Sciences

555 New Jersey Ave., NW, Rm. 308

Washington, DC 20208

(202) 208-7078


Submitted by:

Regional Educational Laboratory Southeast at the Florida Center for Reading Research – Florida State University

2010 Levy Avenue, Suite 100

Tallahassee, FL 32310

(850) 644-9352



Lab Director:

Barbara Foorman, Ph.D.

Florida State University

2010 Levy Avenue, Suite 100

Tallahassee, FL 32310

(850) 644-9352

rel-se@fsu.edu

http://rel-se.fsu.edu

Principal Investigator:

Russell Gersten, Ph.D.

Instructional Research Group

4821 Katella Avenue, Suite 205

Los Alamitos, CA 90720

Phone: (714) 826-9600

Fax: (714) 826-9610

http://www.inresg.org




SUPPORTING STATEMENT PART B

FOR PAPERWORK REDUCTION ACT SUBMISSION



Contents

B. Data Collection Procedures and Statistical Methods

B1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection methods to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection

B2. Describe the procedures for the collection of information

B3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied

B4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information

B5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency





Appendices


Appendix A: Approved Teacher Consent & Demographic Form

Appendix B: Mathematical Knowledge for Teaching (MKT) Sample Items

Appendix C: Teacher Professional Development Survey

Appendix D: District/School Memorandum of Understanding (MOU)

Appendix E: Frequently Asked Questions

Appendix F: Approved Parent/Guardian Information Letter and Opt-out

Appendix G: Approved Student Assent Form

Appendix H: Process for Selecting DMI from Possible Programs

Appendix I: Placeholder for 60-Day and 30-Day Federal Register Notices

Appendix J: Details for Question A12, Estimates of Burden of DMI PD

Appendix K: Details for Question B1

Appendix L: Details for Question B2






SECTION B: DATA COLLECTION PROCEDURES AND STATISTICAL METHODS


B1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection methods to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


Full details of the recruiting and sampling plan can be found in the attached design document included in Appendix K. The study design was approved by NCEE/IES at the end of June 2013.


This study is not intended to formally generalize to any population beyond the sample included in the study itself. The study focuses on grade 4 teachers and their students in Georgia and South Carolina. The sample consists of a non-representative group of rural, small-town, and urban schools from four to five districts in each of the two states. Schools will be recruited from each of those three urbanicity categories. It is estimated that at least 82 schools, 246 teachers, and 6,150 students will be enrolled in the study.1
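These totals are internally consistent with roughly three participating teachers per school and about 25 tested students per classroom; the per-classroom figure is an implied working assumption rather than a number stated in this section:

82 schools × 3 teachers per school = 246 teachers
246 teachers × ~25 students per teacher ≈ 6,150 students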


Eligible schools will be restricted to those with at least two fourth-grade classes. By definition, eligible schools represent 50% of the schools in each of the two states. Local Education Agencies (LEAs) will be recruited based on meeting these basic criteria and on the perceived level of commitment of school administrators to the study. This commitment will be evidenced by the willingness of district and school administrators to sign MOUs (see Appendix D for the District/School Memorandum of Understanding). These schools (and their teachers and students) represent a convenience sample and are not intended to generalize beyond the sample itself.


In schools that have agreed to participate, all grade 4 teachers will be invited in May–June 2014 to participate in the study. Those returning a signed consent form will be enrolled in the study and be eligible for random assignment. For students, a waiver of informed consent was granted by the IRB, as the intervention is focused on teachers, the intervention represents a professional development activity typical of general LEA practice, and students will participate only in a short post-test at the end of the year. All students in classrooms where teachers have consented will be asked to participate, unless their parents opt out. Students who assent will be included in the study (see Appendix G for the Student Assent Form). Information forms will be distributed to all parents at the beginning of the school year with the opportunity to opt out (see Appendix F for the Parent/Guardian Information Letter and Opt-out). If parents return the signed form, their child will be excluded from the fractions assessment. The information and opt-out forms will also be translated into and made available in Spanish.


B2. Describe the procedures for the collection of information including:

Statistical methodology for stratification and sample selection,

Estimation procedure,

Degree of accuracy needed for the purpose described in the justification,

Unusual problems requiring specialized sampling procedures, and

Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


Full details of the sampling and analytic plan can be found in the attached design document included in Appendix L.


Design


ED intends to conduct a multi-level analysis (confirmatory) of student outcomes, comparing students whose teachers participated in DMI with students whose teachers participated in business-as-usual professional development activities. Multi-level models will appropriately account for the natural clustering of students within classrooms and of teachers within schools. Another multi-level analysis (exploratory) of teacher knowledge outcomes will compare teachers who participated in DMI with teachers who participated in business-as-usual professional development activities; this model will account for the clustering of teachers within schools. Matched school pairs will be created (i.e., blocking schools within pairs), and then one school in each pair will be randomly assigned to the treatment condition.
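As an illustrative sketch only (the notation below is not drawn from the study documents, and the full confirmatory model adds a classroom/teacher level between students and schools), a simplified two-level version of such a model can be written as:

Y_{ij} = \beta_{0j} + \beta_{1} X_{ij} + e_{ij}   (student i in school j)
\beta_{0j} = \gamma_{00} + \gamma_{01} T_{j} + \sum_{k} \delta_{k} B_{kj} + u_{0j}   (school j)

where Y_{ij} is the student fractions outcome, X_{ij} a pretest covariate, T_{j} an indicator for assignment to DMI (1) versus business-as-usual PD (0), B_{kj} indicators for the matched school pairs (blocks), and e_{ij} and u_{0j} the student- and school-level residuals. The treatment effect of interest is \gamma_{01}.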


A What Works Clearinghouse (WWC)-style literature review was conducted in the domain of mathematics professional development to determine an appropriate minimum detectable effect size (MDES)2 for the study to be designed to detect. Using other parameters from the research literature for the R² between pretest covariates and the outcome measure, and for intraclass correlations (ICCs) at the different levels (i.e., teachers and schools), a power analysis was conducted; it indicated that approximately 80 schools are required to achieve the desired MDES (g ≈ .12). However, at least 84 schools will be recruited to allow for some school-level attrition. Assuming 82 schools for the student outcome and three participating teachers per school (N = 246 teachers), an MDES of g ≈ .33 was estimated for the teacher outcome.
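For illustration only, the general structure of such an MDES calculation for a two-level cluster-randomized design is sketched below. The ICC, R², and cluster-size values are placeholder assumptions rather than the study's actual inputs, and the study's own power analysis additionally accounts for the classroom level and the matched-pair blocking.

# Illustrative MDES sketch for a two-level cluster-randomized design.
# All parameter values below are placeholder assumptions, not the study's inputs.
from scipy.stats import norm

def mdes_two_level(n_schools, n_students, icc, r2_school, r2_student,
                   p_treat=0.5, alpha=0.05, power=0.80):
    """Approximate MDES (in standard deviation units) using a
    normal-approximation multiplier for two-tailed alpha and target power."""
    multiplier = norm.ppf(1 - alpha / 2) + norm.ppf(power)  # about 2.80
    denom = p_treat * (1 - p_treat) * n_schools
    var_between = icc * (1 - r2_school) / denom
    var_within = (1 - icc) * (1 - r2_student) / (denom * n_students)
    return multiplier * (var_between + var_within) ** 0.5

# Example with placeholder values: 80 schools, ~75 tested students per school.
print(round(mdes_two_level(80, 75, icc=0.15, r2_school=0.80, r2_student=0.50), 2))  # ~0.12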


Full details about the multi-level model and the assumptions made for the power analysis, taken from the study proposal approved by NCEE/IES, are included in Appendix L.


B3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.

Districts and schools will be enrolled in the study based on perceived commitment and on completion of MOUs that explicitly delineate LEA commitments. Teachers will sign consent forms and be informed that the form represents a contract: a commitment to participate in an important scientific activity. In our experience, this requirement helps retain teachers at rates of over 90%. The IRB for this study has granted a waiver of active consent for students. Parents will be sent a letter informing them of the study, of its importance, and of the small amount of time (30-40 minutes) their child will be asked to spend participating. In previous professional development (PD) research conducted under similar circumstances, the research team has experienced a parent refusal rate of less than 5%.


Study staff will carefully monitor teacher engagement through the completion of monthly teacher logs. Methods to reduce attrition and nonresponse will include invitation e-mails prompting teachers to complete the monthly logs, use of an online format for the monthly PD survey to maximize convenience, and administration of the post-test to students in group settings. Follow-up e-mails and reminder calls will go to all non-respondents. Teachers in both the treatment and control groups will receive remuneration of $150 for completing the consent/demographic form, the fractions measures (a pretest and a post-test), and nine monthly PD surveys, which should help reduce attrition.


Additionally, teachers in the experimental group will be paid their typical hourly rate (which varies by state, district, and often seniority) for any time they spend attending PD sessions outside of their work day (i.e., on Saturdays). If teachers attend sessions on Saturdays, they will receive their hourly rate for the time they spend attending the session, completing the preparation assignment, and traveling to the PD site. (Note that the cost of paying teachers for the time they attend PD sessions or complete assignments has been included in the cost of the study and is not presented as an additional cost or incentive for this study.)



B4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


A small feasibility study was conducted with five grade 4 teachers to evaluate both the instructions and the testing time for the teacher measures.


Teacher Measures:


Five grade 4 teachers were recruited from Southern California school districts via a Craigslist posting. The teacher consent and demographic form and teacher PD survey were administered in sequence in a paper format. Teachers who piloted the forms were paid approximately $150 upon completion of the pilot.


Teachers attended proctored test sessions individually. With the exception of requesting timing information for the demographic form, proctors followed the protocol intended for the large-scale study. Teachers were first given the demographic form to complete and wrote their own start and completion times on the form. After noting the completion time, they were asked to write down any components they found confusing, along with any general comments. The demographic forms were then returned to the proctors.


The teacher professional development (PD) survey form was then distributed, and proctors noted the time of distribution. Teachers read the instructions on the form and completed the survey individually.


The MKT fractions measure was not piloted, as it has been used extensively in other studies and consistently takes under an hour to complete.


This information was used to estimate the burden hours reported in A12 (see Table A2). The burden on teachers for the data collection instruments is approximately 0.2 hours for the Teacher Consent Form with Demographics and 0.2 hours for each of the nine monthly PD surveys (1.8 hours total). For these measures, the total burden is 2 hours per teacher, or approximately 492 hours for the 246 teachers in the study. Annualized, the burden on teachers is approximately 164 hours per year.
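The arithmetic behind these figures, assuming the annualized total reflects a three-year clearance period (the period itself is not restated in this section):

0.2 hours (consent/demographic form) + 9 × 0.2 hours (monthly PD surveys) = 2.0 hours per teacher
2.0 hours per teacher × 246 teachers = 492 total burden hours
492 hours ÷ 3 years ≈ 164 hours per year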


Note that the MKT fractions measure is strictly an assessment and not included in the burden estimates. A sample of MKT items is provided in Appendix B.


Student Measure (TUF):


The student measure has not been piloted because the final TUF form will not be available from the Center for Improving Learning of Fractions (CILF) until June 2014. CILF has scheduled a pilot of its measure for spring 2014 as part of the development process, during which items, instructions, and timing will be revised and refined.


Note that the student TUF measure is strictly an assessment and not included in the burden estimates.


B5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


The following individuals were consulted on the statistical aspects of this proposal:

Dr. Sybilla Beckman, University of Georgia – ph# 706-542-2548

Dr. John Deke, Mathematica Policy Research – ph# 609-275-2230

Dr. Mike Garet, American Institutes for Research – ph# 202-403-5000

Dr. Russell Gersten, Instructional Research Group – ph# 714-826-9600

Dr. Nathan Jones, Boston University – ph# 617-353-3295

Dr. Yaacov Petscher, Florida State University – ph# 850-644-0327

Dr. Eric Rolfhus, Instructional Research Group – ph# 714-826-9600

Dr. Mengli Song, American Institutes for Research – ph# 202-403-5000


The following individual designed the data collection:

Dr. Russell Gersten, Instructional Research Group


The following individual will oversee the data collection:

Dr. Russell Gersten, Instructional Research Group


The following individual(s) will analyze the data:

Dr. Tran Keys, Instructional Research Group – ph# 714-826-9600

Dr. Rebecca Newman-Gonchar, Instructional Research Group – ph# 714-826-9600

Dr. Madhavi Jayanthi, Instructional Research Group – ph# 714-826-9600

Ms. Kelly Haymond, Instructional Research Group – ph# 714-826-9600

Dr. Mengli Song, American Institutes for Research – ph# 202-403-5000



1 Initial power estimates suggest that a minimum of 80 schools is necessary for an MDES of g = .12. However, 84 schools will be recruited to participate in the study to account for possible school-level attrition. We expect that 82 schools will remain in the study after attrition.

2 The minimum detectable effect size (MDES) is the smallest difference in means between two groups that the study is designed to detect, expressed in standard deviation units.


