Office of Management and Budget Clearance Request: Supporting Statement
Part B: Collection of Information Employing Statistical Methods
(DRAFT)

Implementation Evaluation of the Title III National Professional Development Program

PREPARED BY:
American Institutes for Research®
1000 Thomas Jefferson Street NW, Suite 200
Washington, DC 20007-3835
202.403.5000 | TTY 877.334.3499
www.air.org

PREPARED FOR:
U.S. Department of Education
Institute of Education Sciences

December 2020
Contents

Collection of Information Employing Statistical Methods
B1. Respondent Universe and Sampling Design
B2. Procedures for Data Collection
    Statistical Methods for Sample Selection
    Unusual Problems Requiring Specialized Sampling Procedures
    Use of Periodic (Less Frequent Than Annual) Data Collection Cycles to Reduce Burden
B3. Methods to Maximize Response Rates
B4. Expert Review and Piloting Procedures
Collection of Information Employing Statistical Methods

This package requests clearance from the U.S. Office of Management and Budget (OMB) to conduct data collection activities associated with the Implementation Evaluation of the Title III National Professional Development (NPD) Program. The purpose of this evaluation is to better understand the strategies that NPD grantees use to help educational personnel working with English learners (ELs) meet high professional standards and to improve classroom instruction for ELs. The Institute of Education Sciences (IES), within the U.S. Department of Education (the Department), has contracted with the American Institutes for Research® (AIR®) to conduct this evaluation.
B1. Respondent Universe and Sampling Design

This study will include the following two samples, which will provide different types of data for addressing the study’s evaluation questions.
Grantee survey. The grantee survey will be administered to the universe of 2016 and 2017 NPD grantees.
Participant survey. The participant survey will be administered to a representative sample of preservice and in-service educators participating in activities provided by the 2016 and 2017 NPD grantees. The sample frame for the participant survey will be derived from participant rosters collected from the 91 grantees, which will identify all educators who participated in a grantee’s professional learning activities and indicate whether each is a preservice educator working toward initial certification or an in-service educator already serving in the classroom. Based on our analysis of the grantees’ NPD funding applications, we estimate that from summer 2019 onward, the grantees will collectively serve approximately 17,350 educators, including 3,350 preservice educators and 14,000 in-service educators. We expect to include approximately 1,400 preservice participants and 1,500 in-service participants in the survey sample. In Section B2, we provide additional details about our approach to selecting the sample.
The procedures for carrying out the grantee and participant survey data collection activities are described in the following section.
B2. Procedures for Data Collection

Statistical Methods for Sample Selection

For the grantee survey, the study aims to obtain responses from the full population of grantees from the NPD program’s 2016 and 2017 cohorts. A list of the 92 grantees from these two cohorts, compiled from the Department’s website, will provide the survey administration frame, and the AIR study team will collect contact information for grantee project directors (the primary respondents for this survey) from publicly available grantee applications.
The participant survey will be administered to a representative sample of preservice educators and a representative sample of in-service educators participating in activities provided by the 2016 and 2017 NPD grantees. However, we will sample only educators who participated in NPD activities in summer 2019 or later, to ensure adequate respondent recall of grant activities. For each sample, we will stratify the sampling frame by grantee. Within each stratum, we will systematically select educators from a master list of NPD participants (generated from the participant rosters collected from each grantee), sorted by any available educator characteristics so that the selected educators reflect the range of educator characteristics present in the population. We will allocate the sample proportionally, with a guaranteed minimum sample size of four for each stratum. We plan to select 1,400 preservice educators and 1,500 in-service educators for the survey sample.
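For illustration only, the sketch below shows one way proportional allocation with a guaranteed minimum of four selections per stratum, followed by systematic selection within each sorted stratum roster, could be implemented. The function names and data structures are hypothetical and are not part of the study’s specified procedures.

```python
import random

def allocate_sample(stratum_sizes, total_sample, min_per_stratum=4):
    """Proportionally allocate a total sample across grantee strata,
    enforcing a guaranteed minimum number of selections per stratum.
    `stratum_sizes` maps grantee ID -> number of eligible educators."""
    total_pop = sum(stratum_sizes.values())
    allocation = {}
    for grantee, size in stratum_sizes.items():
        proportional = round(total_sample * size / total_pop)
        # Guarantee at least four selections, capped at the stratum size.
        allocation[grantee] = min(size, max(min_per_stratum, proportional))
    return allocation

def systematic_sample(sorted_roster, n):
    """Systematically select n educators from a stratum roster that has been
    sorted by available educator characteristics (implicit stratification)."""
    if n >= len(sorted_roster):
        return list(sorted_roster)
    interval = len(sorted_roster) / n
    start = random.uniform(0, interval)  # random start within the first interval
    return [sorted_roster[int(start + i * interval)] for i in range(n)]
```

Because of rounding and the guaranteed minimum, the allocations may not sum exactly to the 1,400 and 1,500 targets; in practice the totals would be rebalanced across the larger strata.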
Prefield Activities. For successful recruitment and data collection, we will employ a cascading process using a case ownership model for consistent outreach and documentation of the recruitment effort and data collection. In this model, the same staff person will “own” specific grantees throughout recruitment and data collection. This consistency facilitates relationship building and helps us quickly identify and mitigate any concerns throughout the data collection cycle.
A toll-free telephone line and a project e‑mail account will be set up before any outreach to respondents to ensure effective communication. These accounts will be monitored in real time during typical business hours, and all inquiries will receive a response within 24 hours on business days.
NPD Grantee Notification and Survey Administration. To begin the recruitment process following OMB approval (expected early 2021), the Department will send a letter to the project directors for all 91 grantees that explains the study, the timeline of study activities, and the participation we will be requesting of grantees. This letter also will include contact information for study leadership in case grantees have questions.
The grantee survey will be administered online using the SurveyMonkey® platform. For security purposes, access to the survey will require a unique hyperlink for each respondent. As noted earlier, a toll-free line and a project e‑mail account will be set up before any outreach to grantees. If respondents encounter problems with the survey, a member of the survey administration team will respond within 24 hours.
Once the Department has issued a notification letter to grantee project directors, the AIR research team will send a follow-up e‑mail with information about the study along with instructions on how to complete the survey, including the survey web link and login information. Research staff will monitor the rates of completion, conducting nonresponse prompting efforts by telephone and e‑mail. Information regarding the status of individual cases will be monitored through a tracking sheet that logs each communication attempt with respondents. Specifically, staff will send weekly reminder e‑mails and will follow up with nonresponding grantees by phone at least twice to encourage their response.
NPD Participant Notification and Survey Administration. To identify NPD participants to be surveyed, we will contact the NPD grantee project director to collect participant roster data for each year in which their grant project delivered professional learning activities to educators. The project directors will be given a simple form to complete and will be asked to provide participants’ names, background characteristics (e.g., whether they are a preservice or in-service participant), and e‑mail addresses. Project directors will be asked to complete the form (a spreadsheet template) within 5 business days (see Appendix D for the Participant Roster Request form). Upon receipt of the roster information, the AIR study team will review the form for completeness. When clarification is needed, a grantee’s case manager will follow up with the project director. Once the rosters are finalized and the sample is drawn, the survey team will load survey participants’ e‑mail addresses into the secure survey software, generate unique identification numbers and survey links, and prepare survey notification e‑mails.
As with the grantee survey, the participant survey will be administered online through the SurveyMonkey® platform. Respondents will receive an e‑mail with information about the study along with instructions on how to complete the survey, including the survey web link and log‑in information. Correspondence to participants regarding their participation in the study will include a statement indicating that participation is voluntary but will also emphasize the importance of each response for the study’s findings. Research staff will monitor response rates and send regular reminder e‑mails. To help boost response rates, the study team may ask NPD project directors or partners to sign a letter of endorsement encouraging their participants to complete the survey.
We will describe implementation of the NPD program through descriptive analyses that draw on the application review data, grantee survey data, participant survey data, and grantee performance report data.
Grantee data. For grantee-level data from the survey, application review, and performance reports, we will primarily report unweighted means because we will have data from the universe of grantees. In some cases, we will disaggregate the data according to policy-relevant subpopulations, such as grantees who served particular types of participants (e.g., preservice teachers vs. only in-service teachers, EL specialists vs. general education or content area teachers), grantees who provided specific types of activities (e.g., coursework vs. job-embedded coaching), or grantees who provided professional development of varying levels of intensity.
Participant data. For the participant survey data, we will report weighted means. For some analyses, the participant survey data will be reported separately for preservice and in-service teacher participants. For instance, preservice teachers’ reports about their preparation experiences related to effective instruction of ELs will be compared with those from a large national sample of preservice teachers, based on data collected by the IES‑funded Study of Teacher Preparation Experiences and Early Teaching Effectiveness (Goodson et al., 2019).
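As a minimal sketch of what reporting weighted means could look like under the stratified design described above, the example below constructs base design weights as the inverse of each educator’s within-stratum selection probability (stratum population size divided by stratum sample size) and computes a weighted mean for a hypothetical survey item. The variable names and values are illustrative, and nonresponse adjustments are omitted.

```python
import pandas as pd

# Hypothetical respondent-level data: one row per participant survey respondent,
# with the grantee stratum, the stratum population size (N_h), the number of
# educators sampled in the stratum (n_h), and an example 0/1 survey item.
respondents = pd.DataFrame({
    "grantee": ["A", "A", "B", "B", "B"],
    "N_h": [120, 120, 300, 300, 300],
    "n_h": [4, 4, 10, 10, 10],
    "received_coaching": [1, 0, 1, 1, 0],
})

# Base design weight: inverse of the within-stratum selection probability.
respondents["weight"] = respondents["N_h"] / respondents["n_h"]

# Weighted mean (here, the weighted proportion reporting coaching).
weighted_mean = (
    (respondents["received_coaching"] * respondents["weight"]).sum()
    / respondents["weight"].sum()
)
print(f"Weighted proportion: {weighted_mean:.3f}")
```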
Analyses of the grantee survey, application review, and extant performance data will be based on information collected for all 2016 and 2017 NPD grantees, and we assume a 100 percent response rate. Because the analyses will be based on data from the universe of grantees, descriptive statistics will not be subject to sampling error.
With regard to the participant survey, our proposed sample sizes of 1,400 preservice participants and 1,500 in-service participants were selected to ensure a sufficient degree of accuracy in our reporting while promoting efficiency and minimizing burden on respondents. Assuming an 85 percent response rate, an estimate of a 50 percent prevalence rate from a sample of 1,400 preservice participants will have a margin of error of 0.023, and an estimate of a 50 percent prevalence rate from a sample of 1,500 in-service participants will have a margin of error of 0.026 (see Exhibit B1). Moreover, if we compare prevalence rates between the preservice and in-service participant groups, we will be able to detect a difference of at least 5 percentage points between the two groups with 95 percent confidence and 80 percent power (see Exhibit B2).
Exhibit B1. Margin of Error (1.96 * Standard Error), by Participant Type and Sample Size
| Participant Type | 1,000 | 1,100 | 1,200 | 1,300 | 1,400 | 1,500 | 1,600 | 1,700 | 1,800 | 1,900 | 2,000 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Preservice | 0.029 | 0.027 | 0.026 | 0.024 | 0.023 | 0.022 | 0.020 | 0.019 | 0.018 | 0.018 | 0.017 |
| In-service | 0.033 | 0.031 | 0.030 | 0.028 | 0.027 | 0.026 | 0.025 | 0.024 | 0.024 | 0.023 | 0.022 |
Note: Estimates assume an 85% response rate and a conservative prevalence rate of 0.5. A finite population correction is applied (preservice population of 3,350; in-service population of 14,000).
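For reference, the margins of error in Exhibit B1 follow from the standard formula for a proportion with a finite population correction, where the effective sample size is the expected number of respondents. The sketch below, with an illustrative function name, reproduces the 0.023 and 0.026 figures cited in the text under the stated assumptions (85 percent response rate, prevalence rate of 0.5).

```python
import math

def margin_of_error(sample_size, population, response_rate=0.85, p=0.5, z=1.96):
    """1.96 times the standard error of a proportion, with a finite
    population correction applied to the expected number of respondents."""
    n = sample_size * response_rate             # expected respondents
    fpc = (population - n) / (population - 1)   # finite population correction
    return z * math.sqrt(p * (1 - p) / n * fpc)

print(round(margin_of_error(1400, 3350), 3))    # preservice: 0.023
print(round(margin_of_error(1500, 14000), 3))   # in-service: 0.026
```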
Exhibit B2. Minimum Detectable Difference with 95% Confidence and a Power of 80% Between Preservice and In-service Participants, by Sample Size

Rows indicate the in-service sample size; columns indicate the preservice sample size.

| In-service Sample Size | 1,000 | 1,100 | 1,200 | 1,300 | 1,400 | 1,500 | 1,600 | 1,700 | 1,800 | 1,900 | 2,000 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1,000 | 0.062 | 0.061 | 0.059 | 0.058 | 0.057 | 0.056 | 0.055 | 0.054 | 0.053 | 0.053 | 0.052 |
| 1,100 | 0.061 | 0.059 | 0.057 | 0.056 | 0.055 | 0.054 | 0.053 | 0.052 | 0.051 | 0.051 | 0.050 |
| 1,200 | 0.059 | 0.057 | 0.056 | 0.054 | 0.053 | 0.052 | 0.051 | 0.050 | 0.050 | 0.049 | 0.048 |
| 1,300 | 0.058 | 0.056 | 0.054 | 0.053 | 0.052 | 0.051 | 0.050 | 0.049 | 0.048 | 0.048 | 0.047 |
| 1,400 | 0.057 | 0.055 | 0.053 | 0.052 | 0.051 | 0.050 | 0.049 | 0.048 | 0.047 | 0.046 | 0.046 |
| 1,500 | 0.056 | 0.054 | 0.052 | 0.051 | 0.050 | 0.048 | 0.047 | 0.047 | 0.046 | 0.045 | 0.044 |
| 1,600 | 0.055 | 0.053 | 0.051 | 0.050 | 0.049 | 0.047 | 0.046 | 0.046 | 0.045 | 0.044 | 0.043 |
| 1,700 | 0.054 | 0.052 | 0.050 | 0.049 | 0.048 | 0.047 | 0.045 | 0.045 | 0.044 | 0.043 | 0.042 |
| 1,800 | 0.053 | 0.051 | 0.050 | 0.048 | 0.047 | 0.046 | 0.045 | 0.044 | 0.043 | 0.042 | 0.041 |
| 1,900 | 0.053 | 0.051 | 0.049 | 0.048 | 0.046 | 0.045 | 0.044 | 0.043 | 0.042 | 0.041 | 0.040 |
| 2,000 | 0.052 | 0.050 | 0.048 | 0.047 | 0.046 | 0.044 | 0.043 | 0.042 | 0.041 | 0.041 | 0.040 |
Note: Estimates assume an 85% response rate and a conservative prevalence rate of 0.5. A finite population correction is applied (preservice population of 3,350; in-service population of 14,000).
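The minimum detectable differences in Exhibit B2 are consistent with a standard two-group calculation that combines the two groups’ sampling variances and multiplies by the sum of the critical values for 95 percent confidence and 80 percent power (1.96 + 0.84). The sketch below (illustrative function names; same response rate, prevalence, and finite population correction assumptions) reproduces the 0.050 value cited in the text for samples of 1,400 preservice and 1,500 in-service educators.

```python
import math

def minimum_detectable_difference(pre_n, in_n, pre_pop=3350, in_pop=14000,
                                  response_rate=0.85, p=0.5,
                                  z_alpha=1.96, z_beta=0.84):
    """Smallest difference in proportions detectable with 95% confidence
    and 80% power, applying a finite population correction to each group."""
    def variance(sample_size, population):
        n = sample_size * response_rate           # expected respondents
        fpc = (population - n) / (population - 1)
        return p * (1 - p) / n * fpc
    se_diff = math.sqrt(variance(pre_n, pre_pop) + variance(in_n, in_pop))
    return (z_alpha + z_beta) * se_diff

print(round(minimum_detectable_difference(1400, 1500), 3))   # approximately 0.050
```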
Unusual Problems Requiring Specialized Sampling Procedures

We do not anticipate any unusual problems that require specialized sampling procedures.
Use of Periodic (Less Frequent Than Annual) Data Collection Cycles to Reduce Burden

Data collection for this study will occur only once during the 2020–21 school year.
B3. Methods to Maximize Response Rates

Data collection is a complex process that requires careful planning. The team has developed survey instruments that are tailored to each respondent group and designed to place as little burden on respondents as possible. The team will use cognitive interviews with NPD program grantees and participants to pilot the surveys and ensure that they are user-friendly and easy to understand, which will increase respondents’ willingness to take part in the data collection activities and thus increase response rates.
Recruitment materials will include a letter from the Department followed by e‑mails and phone calls from the AIR study team. The materials will emphasize the social incentive for respondents by stressing the importance of the data collection in providing much-needed technical assistance and practical information to future grantees. As previously mentioned, we will employ a “case ownership” approach to enhance communication and rapport with grantees. The role of the grantee project manager (or designee) is critical, as he or she will serve as the point person for all study requests and encourage respondent participation in the study. To ensure efficient outreach, we will continue to use the tracking sheets developed for managing recruitment to provide continuity in staffing and tailored follow-up. The staff person who “owns” each grantee will monitor these tracking sheets regularly to identify nonrespondents early, enabling us to coordinate with the grantee project manager or another grantee representative to encourage full participation when needed. We will work with grantees and participants as necessary to accommodate their schedules. We have found that this flexibility increases participation because it acknowledges the burden on respondents.
The study team will develop text to be used as needed in nonresponse prompting. This text will be consistent in messaging and will encourage participation by underscoring the importance of the study. We also will create real-time reports (such as a “percentage complete report”) that will highlight grantees with the lowest percentage of surveys complete, such that targeted prompting can occur efficiently and effectively.
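As a simple illustration of such a report (the case-tracking data layout here is hypothetical), completion percentages can be summarized by grantee and sorted so that the lowest-completing grantees surface first for targeted prompting:

```python
import pandas as pd

# Hypothetical case-tracking data: one row per sampled participant.
cases = pd.DataFrame({
    "grantee": ["A", "A", "A", "B", "B", "C"],
    "completed": [True, False, True, False, False, True],
})

# Percentage of surveys complete by grantee, lowest first.
report = (
    cases.groupby("grantee")["completed"].mean().mul(100)
         .rename("percent_complete")
         .sort_values()
)
print(report)
```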
B4. Expert Review and Piloting Procedures

To ensure the quality of the data collection instruments, the AIR study team will pilot-test the draft instruments and convene a technical working group (TWG) to provide input. The study team will conduct cognitive interviews with a limited set of IHE administrators and NPD participants to pilot the survey items, observing the limits on the number of respondents permitted before OMB clearance. In addition to providing an estimate of respondent burden time, the cognitive interviews will include a debrief with respondents about survey items or instructions that were difficult to understand, poorly worded, or otherwise problematic. The cognitive interviews will be used to revise and improve the surveys.
AIR is the prime contractor for the Implementation Evaluation of the Title III National Professional Development Program. The project director, Dr. Kerstin Le Floch, is supported by an experienced team of researchers leading the major tasks of the project. Contact information for the individuals and organizations involved in the project is presented in Exhibit B3.
Exhibit B3. Organizations and Individuals Involved in the Project
| Responsibility | Contact Name | Organization | Phone Number |
| --- | --- | --- | --- |
| Project Director | Kerstin Le Floch | AIR | 202-403-5649 |
| Application Review and Extant Data Task Lead | Maria Stephens | AIR | 202-386-0863 |
| Grantee Survey Task Lead | Andrea Boyle | AIR | 650-376-6294 |
| Participant Survey Task Lead | Rebecca Bergey | AIR | 650-376-6419 |
In addition, the AIR study team will convene a TWG of researchers and practitioners to provide input on the data collection instruments developed for this study as well as on other methodological design issues. The TWG will consist of researchers with expertise in issues such as ELs and their acquisition of English, academic performance, and social-emotional health; evidence-based curricula and strategies in language instruction educational programs; and EL teacher preparation, credentialing, and professional development. The study team will consult the TWG throughout the evaluation.
References

Goodson, B., Caswell, L., Price, C., Litwok, D., Dynarski, M., Crowe, E., . . . Rice, A. (2019). Teacher preparation experiences and early teaching effectiveness (NCEE 2019-4007). Washington, DC: Institute of Education Sciences, U.S. Department of Education. Retrieved from https://files.eric.ed.gov/fulltext/ED598664.pdf