Nov. 2006
OMB Comments on Study of Teacher Preparation in Early Reading Instruction
1850-NEW
General:
In the future, new collections that are tied to previously withdrawn collections should be built upon the ICR reference number of the withdrawn collection. This will link the two collections in ROCIS in perpetuity. If you do not know the ICR reference number for the withdrawn collection, please contact Pam Beverly or Rachel, who can give you that number.
RESPONSE FROM ED: We will use the reference number in future similar situations.
ED is proposing a $100 incentive to students for completion of the survey and assessment. The survey is expected to take approximately one hour to complete; how much time do you anticipate the assessment will require? This incentive seems high given that the respondents are college students who are used to taking exams and would not view the assessment as being as sensitive as employed teachers would.
RESPONSE FROM ED: As mentioned in section A9 of the supporting statement, the survey and assessment combined are expected to take two hours to complete. This is longer than most data collection burden expectations in other IES studies. Although the respondents are not yet employed teachers, assessing their knowledge of information that they may not have been presented with could be sensitive. In addition, the spring semester before graduation for pre-service teachers is an extremely busy time, most likely including their student teaching requirement (which is not always located close to campus). Thus, we propose an incentive payment of $100 per participant. This is consistent with the suggested “High Burden” incentive amount for teacher assessments in the NCEE memo “Guidelines for Incentives for NCEE Evaluation Studies,” where high burden is actually defined as one hour.
What statutory authority does ED have to promise confidentiality for this collection?
RESPONSE FROM ED: The Education Sciences Reform Act of 2002, Title I, Part E, Section 183 requires “All collection, maintenance, use, and wide dissemination of data by the Institute” to “conform with the requirements of section 552 of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provision Act (20 U.S.C. 1232g, 1232h).” These citations refer to the Privacy Act, the Family Educational Rights and Privacy Act, and the Protection of Pupil Rights Amendment.
In addition, for student information, “The Director shall ensure that all individually identifiable information about students, their academic achievements, their families, and information with respect to individual schools, shall remain confidential in accordance with section 552a of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provision Act.”
Subsection (c) of section 183 referenced above requires the Director of IES to “develop and enforce standards designed to protect the confidentiality of persons in the collection, reporting, and publication of data”.
Subsection (d) of section 183 prohibits disclosure of individually identifiable information and makes the publishing or communicating of individually identifiable information by employees or staff a felony.
Please provide more information on the program-level and institution-level variables that will be examined and the source(s) for this information.
RESPONSE FROM ED: The program-level variable is program type. There are six program types that will be included in the study: (1) General Education; (2) Elementary Education and Teaching; (3) Teacher Education, Multiple Levels; (4) Early Childhood Education and Teaching; (5) Reading Teacher Education; and (6) Multi/Interdisciplinary Studies - Other. The institution-level variables are school type (public versus private) and minority enrollment (high versus low, defined by the median). The source of this information is the fall 2004 IPEDS Completions data file.
Please provide a justification for the proposed sample size of 2,500 students; specifically, what level of precision is needed for overall national estimates or subgroups that necessitates this large of a sample?
RESPONSE FROM ED: Per ED’s requirements in the RFP, 100 institutions will comprise the institution sample. At each institution, 25 students will be selected at random from education programs that yield elementary school teachers. The total student sample size is therefore 2,500. A 92% response rate would produce a sample of 2,300 students.
To be conservative, precision estimates were calculated assuming a sample of 2,000 students. Although recruiters will be thoroughly trained and every effort will be made to achieve a 92% response rate for both institutions and students, it is possible that the student response rate will not be reached. The specifications were designed to ensure that the sample would be large enough that the 95 percent confidence interval associated with a sample of 100 institutions (92% response rate) will be .093 for a sample proportion of .5, and the 95 percent confidence interval associated with a sample of 2,000 students (80% response rate) will be .029 for a sample proportion of .5. Therefore, even if the study does not achieve its proposed response rates, estimates will still be precise enough to allow for accurate subgroup analyses.
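For reference, the half-widths quoted above can be checked against the standard formula for a 95 percent confidence interval on a sample proportion. The sketch below assumes simple random sampling, so it will not exactly reproduce the study’s design-based figures.

```python
from math import sqrt

def ci_half_width(p, n, z=1.96):
    """95% confidence interval half-width for a sample proportion,
    assuming simple random sampling."""
    return z * sqrt(p * (1 - p) / n)

# Sample proportion of .5 with 2,000 responding students:
print(round(ci_half_width(0.5, 2000), 3))   # 0.022
# Sample proportion of .5 with 100 responding institutions:
print(round(ci_half_width(0.5, 100), 3))    # 0.098
```

The study’s reported half-widths (.029 and .093) come from its own design-based calculations, which account for the two-stage (institution, then student) sample design, so they differ somewhat from these simple values.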
Please explain the circumstances under which an institution is replaced. The description of the strata for institutions is slightly different in B1 and B3; please clarify.
RESPONSE FROM ED: Each sampled institution will be assigned two replacement institutions in the sampling frame. However, a sampled institution will not be designated as a replacement institution, and a replacement institution will not be assigned to substitute for more than one sampled institution.
For each sampled institution, the next two institutions immediately following it in the sampling frame will be designated as its replacement institutions. The use of implicit stratification variables, and the subsequent ordering of the institution sampling frame by size, will ensure that any sampled institution’s replacements will have similar characteristics.
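The replacement rules above can be sketched as follows. The skip-over-already-used behavior when two sampled institutions are adjacent in the frame is an assumption on our part, but it is consistent with the rule that no replacement serves more than one sampled institution; the institution names are hypothetical.

```python
def assign_replacements(frame, sampled):
    """Assign each sampled institution the next two institutions in the
    ordered sampling frame that are neither sampled themselves nor already
    serving as a replacement for another sampled institution."""
    sampled_set = set(sampled)
    used = set()
    assignments = {}
    for inst in sorted(sampled, key=frame.index):
        reps = []
        i = frame.index(inst) + 1
        while len(reps) < 2 and i < len(frame):
            candidate = frame[i]
            if candidate not in sampled_set and candidate not in used:
                reps.append(candidate)
                used.add(candidate)
            i += 1
        assignments[inst] = reps
    return assignments

# Hypothetical ordered frame (implicit stratification and size ordering
# already applied), with two adjacent sampled institutions:
frame = ["Inst A", "Inst B", "Inst C", "Inst D", "Inst E", "Inst F"]
print(assign_replacements(frame, ["Inst A", "Inst B"]))
# {'Inst A': ['Inst C', 'Inst D'], 'Inst B': ['Inst E', 'Inst F']}
```

Because the frame is ordered by the stratification variables and size, each institution’s two replacements are its nearest available neighbors, and no replacement is shared.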
Institutions will be replaced in the sample once the assigned recruiter has made contact with the correct institutional representative and that individual declines to participate in the study. A new recruitment package will then be mailed to that institution’s replacement and the recruitment process will begin for the replacement institution.
Regarding the discrepancy between B1 and B3, B1 reflects the correct strata for the sample.
Please clarify the response rate goal of 85%: does that refer to 85% of institutions and then 85% of students within institutions, for a cumulative response rate of 72%? What is the basis for the expected response rates?
RESPONSE FROM ED: ED suggested a sample of 100 institutions and a response rate goal of at least 85% in the RFP for this study. The contractor plans to meet an overall study response rate of 85% (not 72%). This assumes a 92% institution response rate and a 92% student response rate (.92*.92=.85).
Based on these specifications, an institution sample of 120 institutions was proposed. Assuming 10% of these institutions will be deemed ineligible and therefore will not count against the response rate (leaving 108 eligible institutions), a 92% response rate yields a sample of 100 institutions. Ineligibility could result from an institution no longer offering programs that produce elementary education teachers or producing fewer than 25 graduates during spring 2007.
At each institution, 25 students will be selected at random from education programs that yield elementary school teachers. The total student sample size is therefore 2,500. A 92% response rate would produce a sample of 2,300 students.
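The arithmetic behind these figures, restated as a short sketch (treating 108 × 0.92 ≈ 99.4 as the 100 participating institutions described in the plan):

```python
institutions_sampled = 120
eligible = institutions_sampled * 0.90        # ~10% ineligible -> 108 eligible
institution_rr = student_rr = 0.92

overall_rr = institution_rr * student_rr      # 0.8464, i.e. ~85% overall
participating = eligible * institution_rr     # ~99.4, treated as 100 in the plan

students_sampled = 100 * 25                   # 25 students per participating institution
students_responding = students_sampled * student_rr   # 2,300 expected respondents

print(eligible, round(overall_rr, 2), students_responding)
# 108.0 0.85 2300.0
```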
Please provide more information on how students will be recruited and administered the survey and assessment. What procedures will be used to maximize student-level response rates?
RESPONSE FROM ED: Trained student recruiters will use a mix of electronic mail, regular mail and telephone calls to recruit sampled students. The plan is to handle as much of the recruitment and registration electronically as possible to make it less burdensome on potential participants. Sampled students will receive an email from the study team inviting them to participate in the study and to register for the study online. The email will include a letter of support from the Dean of the Institution to facilitate cooperation. The student recruiters will place telephone calls to sampled students not responding to email.
Given the two-hour burden of the data collection and the busy schedules of pre-service teachers in their last semester of school, the study team will provide multiple data collection session times (to be located on or near campus) to include evening and weekend options. Sampled students will have a choice of session times. If an otherwise willing sampled student cannot attend any of the scheduled sessions, the recruiter will work out these issues on a case-by-case basis.
A sampled student will be replaced if: the student is non-responsive after one letter, two emails, and three phone calls; the student was never reached and new contact information cannot be found; the student refuses to participate after the telephone conversation; or the student is determined to be ineligible.
Please provide a report of the results assessing the validity of the assessment conducted by NCES.
RESPONSE FROM ED: The pilot test report is attached.
How many respondents participated in the focus groups and cognitive interviews described in B4?
RESPONSE FROM ED: There were six respondents at the cognitive lab interview. Four different focus groups (focusing on different questions of interest) were held, each with six respondents.
Letter to the Dean:
Why does this letter only mention the student survey and not the assessment?
RESPONSE FROM ED: The public name for the instrument that will be administered to the students is “Teacher Preparation Program and Knowledge Survey.” The study team was concerned that labeling the instrument a survey and assessment would unnecessarily lead some students to feel it is something for which they must study, which may deter them from participating.
Language in the initial invitation letter that will be sent to students states: “As a participant, you are invited to take a survey that measures the following: (1) the early literacy-related courses that you have taken at <Name of Institution>, (2) your feelings of preparedness to teach early reading, and (3) your knowledge of early reading instruction techniques.”
This information encompasses all that is covered on the survey and assessment.
Will ED need to request administrative records from the schools in order to randomly select the 25 students? If so, please account for this information collection activity in your burden estimate.
RESPONSE FROM ED: To contact and recruit students for the survey and assessment, ED will collect student contact information (name, address, phone number, email address, degree candidacy, and program enrollment) from participating institutions. The degree candidacy and program enrollment information will be used to verify student eligibility; the contact information will be used to contact sampled students.
We expect that the student contact information will be compiled from one of two sources: internal records maintained by the appropriate college or department within an institution, or the institution’s registrar. We expect that the process of collecting and submitting student contact information will take no more than three hours per institution. This burden has been added to the A12 table as follows:
Instrument | Number of Respondents | Number of Responses per Respondent | Ave. Burden (Hours per Respondent) | Total Burden Hours | Total Burden Cost
Pre-service teacher survey | 2,500 | 1 | 1 | 2,500 | $0
Student contact information | 100 | 1 | 3 | 300 | $7,500
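The totals for the new student contact information row follow from simple arithmetic; the $25 hourly rate below is inferred from the table itself ($7,500 / 300 hours) rather than stated anywhere in the text.

```python
institutions = 100
hours_each = 3
total_hours = institutions * hours_each       # 300 total burden hours

assumed_rate = 25                             # $/hour, inferred from $7,500 / 300
total_cost = total_hours * assumed_rate       # $7,500

print(total_hours, total_cost)
# 300 7500
```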
Survey Instrument:
(Question 4 on race/ethnicity) Please split this into two questions, as follows.
Are you Hispanic or Latino?
Yes
No
Which of the following best describes you? Please select one or more.
American Indian or Alaska Native
Asian
Black or African American
Native Hawaiian or Other Pacific Islander
White
RESPONSE FROM ED: We will split Item 4 into the two items above.
(Questions 5-6) The responses are coded for GPA on a 4.0 scale. How should respondents answer if their GPA is not calculated on a 4.0 scale?
RESPONSE FROM ED: We will include letter-grade and numeric equivalents on the scale for clarification (e.g., a 4.0 is equivalent to an A or a 90-100). Also, trained proctors will be available to answer questions during each survey session.
(Questions 7-8) This survey is designed to assess the extent to which teacher education programs focus on the essential components of early reading instruction (question 1). Why is ED collecting students’ SAT/ACT scores? How do these scores relate to the content and focus of the teacher education programs?
RESPONSE FROM ED: We included SAT/ACT scores for descriptive purposes. We will report these data when describing the sample that completed the survey.
Why doesn’t this survey ask what grade level/subject the student intends to teach? Since the focus is on early reading instruction, would it make a difference to know that a student planned to teach math to sixth graders?
RESPONSE FROM ED: The focus of the survey is early reading instruction (i.e., K-3). Therefore, we propose to add two questions to the survey to address this issue.
Do you intend to teach as an elementary school teacher next year?
YES
NO
If you answered YES, what grade do you intend to teach?
K
1
2
3
4
5
other (please specify__________________)
File Type | application/msword |
Author | Rachel Potter |
Last Modified By | Rachel Potter |
File Modified | 2006-12-06 |
File Created | 2006-12-06 |