National Study to Inform the 21st Century Community Learning Centers Program
Part B: Collection of Information Employing Statistical Methods
Submitted to:
Institute of Education Sciences
National Center for Education Evaluation and Regional Assistance
Washington, DC 20202
Project Officer: Erica Johnson
Contract Number: 91990019C0056
Submitted by:
Mathematica
P.O. Box 2393
Princeton, NJ 08543-2393
Telephone: (609) 799-3535
Facsimile: (609) 799-0005
Project Director: Susanne James-Burdumy
Reference Number: 50846
PART B. Collection of Information Employing Statistical Methods
Introduction
B1. Respondent universe and sampling methods
B2. Statistical methods for sample selection and degree of accuracy needed
B3. Methods to maximize response rates and deal with nonresponse
B4. Tests of procedures and methods to be undertaken
B5. Individuals consulted on statistical aspects of the design
References
Appendix A: Confidentiality Agreement
Appendix B: Afterschool Center Coaching Log
Appendix C: Request for Student Afterschool Attendance Records
Appendix D: Parent/Guardian Questionnaire
Appendix E: Parent/Guardian Permission Form
Exhibits
Exhibit B.1. Respondent universe, sample, and expected response rate for study data sources
Exhibit B.2. Estimation methods for each study research question
Exhibit B.3. Precision of estimates for the national snapshot of afterschool centers’ strategies
Exhibit B.4. Statistical power of the evaluation of a continuous quality improvement system
Introduction

This package requests clearance for the initial data collection activities to support a study of afterschool strategies in the Nita M. Lowey 21st Century Community Learning Centers (21st CCLC) program. The Institute of Education Sciences (IES) at the U.S. Department of Education (ED) has contracted with Mathematica and its partners—the Forum for Youth Investment’s Weikart Center for Youth Program Quality, Pemberton Research, Research for Action, and Synergy Enterprises (together, “the study team”)—to conduct this study.
The study will have two components. The first is a national snapshot of strategies that afterschool centers in the 21st CCLC program use to serve their students and families. The national snapshot will complement and extend information from the program’s annual performance measures by providing an in-depth understanding of the outcomes centers aim to promote and the diverse ways their activities and services for students and families, supports for staff, and improvement strategies are designed to promote these outcomes. The second component is an evaluation of a continuous quality improvement system implemented in the program’s afterschool centers (referred to as the “National Study of Continuous Quality Improvement to Inform the 21st CCLC Program”). The evaluation will examine the implementation and effectiveness of a system focused on improving staff practices that promote students’ social and emotional skills.
This package discusses all data collection activities proposed for the two components of the study. However, the package only requests clearance for data collection activities that will occur before March 2022 and impose burden on respondents. These activities, all part of the evaluation of a continuous quality improvement system, involve collecting parent/guardian questionnaires and permission forms, afterschool center coaching logs, and student afterschool attendance records. A separate package will be submitted at a later date to request clearance for data collection activities that will occur in March 2022 or later.
B1. Respondent universe and sampling methods

For each administrative and primary data source proposed, Exhibit B.1 summarizes the respondent universe, sampling method, and expected response rate.
Exhibit B.1. Respondent universe, sample, and expected response rate for study data sources

| Data source | Respondent | Respondent universe | Type of sample | Sample size | Expected response rate |
| --- | --- | --- | --- | --- | --- |
| National snapshot of afterschool centers’ strategies | | | | | |
| Afterschool center director survey | Afterschool center directors | 6,700 | Stratified random sample | 250a | 85% |
| Evaluation of a continuous quality improvement system | | | | | |
| Afterschool center observations | All centers to be observed by study team (no burden) | 100 | Census | 100 | 100% |
| Afterschool center coaching log* | Afterschool center directors and co-leads in centers assigned to the treatment group | 100 | Census | 100 | 100% |
| Afterschool center director interview | Afterschool center directors | 100 | Census | 100 | 85% |
| Student afterschool attendance records* | One staff member at each center to provide records | 100 | Census | 100 | 100% |
| District administrative records | One staff member at each district to provide records | 18 | Census | 18 | 100% |
| Afterschool center staff survey | Afterschool center staff who lead activities in the study grades | 600 | Simple random sample | 300 | 85% |
| Student survey | Students who attend the study centers at any time in the first month of the 2021–2022 school year | 3,485 | Random sample based on oversampling students who attend the center more frequently | 1,300 | 85% |
| School-day teacher survey | School-day teachers of the students in the student survey respondent universe | 1,300 | Teachers of the random sample of students selected for the student survey | Determined by student survey sample | 85% |
| Parent/guardian questionnaire and permission form* | Parents or guardians | 4,100 | Census | 4,100 | 85% |

Note: Asterisks (*) mark the data sources for which clearance is being requested in this submission. Clearance will be requested in a future submission for the other data sources.
a The study will select 375 afterschool centers to answer a screening question that assesses eligibility for the survey. Of these 375 afterschool centers, the study team estimates that 250 will be eligible for the survey.
The remainder of this section provides details on the respondent universe and an overview of the sampling methods. Section B2 provides additional detail on the sampling methods, and Section B3 presents the expected response rates.
For the afterschool center director survey—the sole data collection planned for the national snapshot—the respondent universe will consist of the directors of afterschool centers funded by 21st CCLC grants within the 50 U.S. states, the District of Columbia, and the Bureau of Indian Education in spring 2022. Among those centers, the study will include only those that are receiving 21st CCLC grants as of fall 2020 (the beginning of the prior school year). Centers that receive their grant funding after fall 2020 may not have sufficient opportunity to put their desired strategies into place by spring 2022. To estimate the size of the respondent universe, the study team used the most recent available year (2018–2019) of data from the 21st CCLC program’s annual reporting system and identified 10,093 centers receiving funding. Because the typical length of grants is about 3 years, the study team estimates that about two-thirds of these centers—about 6,700—are in their second year of 21st CCLC grant funding or beyond, the best available approximation of the respondent universe. The study team will randomly select a nationally representative sample of 250 centers that receive 21st CCLC funding in fall 2020 and still receive it in spring 2022 (see Section B2 for additional details).
The evaluation of a continuous quality improvement system will be based on an experimental design. The study seeks to recruit 100 afterschool centers in the 21st CCLC program serving grades 3–6. To reach this target, the study team expects to recruit 18 grantees and, within those grantees, the 100 centers needed for the evaluation. Within each grantee, the study team will identify pairs of afterschool centers that are similar on characteristics likely to be correlated with staff practices or students’ social and emotional skills (such as baseline measures of proficiency on state reading and math assessments in the schools that the centers serve). Within each pair, the study team will randomly assign one of the centers to the treatment group that will implement the continuous quality improvement system during the 2021–2022 and 2022–2023 school years. Implementing the system over a two-year period will allow sufficient time for study-provided trainings and supports to influence the practices of afterschool center staff and for those practices to influence student outcomes. The study team will assign the other center in each pair to a control group that will not implement the system and, instead, will continue with its usual strategies for supporting staff and serving students. Within these grantees and centers, respondents will include the school districts that the grantees serve, afterschool centers, afterschool center directors and other center leaders, afterschool center staff, parents or guardians of the centers’ students, the centers’ students, and the students’ school-day teachers.
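For illustration only, the pair-level random assignment could be carried out as in the following sketch; the center IDs and random seed are hypothetical placeholders, not the study’s actual assignment procedure:

```python
import random

def assign_pairs(pairs, seed=20210901):
    """Randomly assign one center in each matched pair to the treatment group.

    `pairs` is a list of (center_a, center_b) tuples of center IDs; all IDs
    and the seed are hypothetical. Returns a dict mapping each center ID to
    "treatment" or "control".
    """
    rng = random.Random(seed)
    assignment = {}
    for center_a, center_b in pairs:
        treated = rng.choice([center_a, center_b])
        control = center_b if treated == center_a else center_a
        assignment[treated] = "treatment"
        assignment[control] = "control"
    return assignment

# Example with two hypothetical matched pairs within one grantee
print(assign_pairs([("C001", "C002"), ("C003", "C004")]))
```

Because assignment is randomized within matched pairs, each pair contributes one treatment and one control center, which is what supports the block-level effect estimates described in Section B2.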
For most data collection activities in this component of the study, the size of the respondent universe is based directly on the assumption of 18 grantees and 100 centers in the evaluation and does not require further estimation. Specifically, the study team will conduct observations of all 100 centers and collect their student attendance records; interview the directors of all 100 centers; and ask two center leaders (the afterschool center director and a co-lead) in each of the 50 centers assigned to the treatment group to complete afterschool center coaching logs, for a total of 100 center leaders in the respondent universe for the afterschool center coaching logs.
For the remaining data collection activities, the study team has estimated the size of the respondent universe as follows:
For the collection of district administrative records, the study team has estimated a respondent universe of 18 school districts. Although grantees could, in theory, serve more than one district, the study team expects that each of the 18 grantees recruited into the evaluation will serve only one district, for a total of 18 districts from which the study team will collect administrative records.
For the student survey, the study team has estimated a respondent universe of 3,485 students. The respondent universe will consist of students in grades 3–5 who attend a participating center at any time in the first month of the 2021–2022 school year and whose parents or guardians give permission for them to be surveyed. The study team will exclude from the respondent universe students who are in the final grade served by their center so the study can follow students in the participating centers for two school years (2021–2022 and 2022–2023). With on-time grade progression, these students are expected to be in grades 4–6 in the second year of the evaluation. Based on average center sizes and grade level distributions reported by James-Burdumy et al. (2005) and the U.S. Department of Education (2018b), the study team expects that a typical elementary center serves about 41 students in grades 3–5 at the beginning of the year, implying a total of 4,100 students in the study grades across the 100 centers in the evaluation. Of those students, the study team expects 85 percent (3,485) to have parental permission to be eligible for the survey.
For the parent/guardian questionnaire and permission form, the study team has estimated a respondent universe of 4,100 parents or guardians. The respondent universe will consist of one parent or guardian of each of the estimated 4,100 students in grades 3–5 who attend a participating center at any time during the first month of the 2021–2022 school year.
For the school-day teacher survey, the study team has estimated a respondent universe of 1,300 school-day teachers. The respondent universe will consist of school-day teachers who teach math or English language arts to the students in the evaluation. Based on average school sizes, grade level distributions, and class sizes reported by the U.S. Department of Education (2007 and 2013) and James-Burdumy et al. (2005), the study team estimates that 13 school-day teachers will teach math or English language arts to the students enrolled at each center in the study grades—a total of 1,300 school-day teachers across the 100 centers in the evaluation.
For the afterschool center staff survey, the study team has estimated a respondent universe of 600 afterschool center staff. The respondent universe will consist of afterschool center staff at the participating centers who lead activities for the grades in the evaluation. Using a ratio of one staff member per seven students (Smith et al. 2012), the study team estimates that 600 staff will lead activities for the 4,100 students at the participating centers within the grades in the evaluation.
For the evaluation, the study team will use sampling methods only to select (1) the sample of students who will take the student survey and whose teachers will take the school-day teacher survey and (2) the center staff who will take the afterschool center staff survey (see Section B2 for additional details).
B2. Statistical methods for sample selection and degree of accuracy needed

National snapshot of afterschool centers’ strategies
For the afterschool center director survey, the study team will select a nationally representative sample of 250 centers that receive 21st CCLC funding in fall 2020 and still receive it in spring 2022. The sample of centers will be randomly selected from a nationwide list of all centers receiving 21st CCLC funding in fall 2020, based on information in the program’s annual performance reporting system. The study team will stratify the center sampling frame into 16 strata based on grades served (elementary and secondary), census region (Northeast, South, Midwest, and West), and grantee type (local education agency [LEA] or non-LEA). The sampling rate will be identical across strata so that the sample is nationally representative.
Because the sampling frame will not have information on whether each center still receives 21st CCLC funding in spring 2022, the study team will initially select more than 250 centers and determine whether they still receive 21st CCLC funding through a single screening question. For centers that the study team does not successfully contact for screening (which may be due to the center not operating anymore), the study team will ask the center’s grantee to provide this information. Centers that the study team verifies are still receiving 21st CCLC funding in spring 2022 will be administered the full survey. The initial sample will consist of 375 centers, of which the study team expects 250 (two-thirds of the initial sample) will be eligible for the full survey.
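As a rough sketch of this stratified selection (the data frame and its column names are hypothetical; the actual frame comes from the program’s annual performance reporting system):

```python
import pandas as pd

def draw_initial_sample(frame, n_initial=375, seed=2022):
    """Draw the initial center sample with an identical rate in every stratum.

    `frame` is a hypothetical DataFrame of centers funded in fall 2020 with
    columns `grades` ("elementary"/"secondary"), `region` (four census
    regions), and `grantee_type` ("LEA"/"non-LEA"), which together define
    2 x 4 x 2 = 16 strata.
    """
    rate = n_initial / len(frame)  # identical sampling rate across strata
    groups = frame.groupby(["grades", "region", "grantee_type"], group_keys=False)
    # Sampling the same fraction of every stratum keeps the sample
    # self-weighting; per-stratum rounding can shift the realized total by a
    # few centers relative to n_initial.
    return groups.apply(lambda s: s.sample(frac=rate, random_state=seed))
```

The screened subset of this initial sample that is verified to still receive 21st CCLC funding in spring 2022 would then receive the full survey.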
Evaluation of a continuous quality improvement system
For most data collection activities for the evaluation, the study team will not use sampling methods but will collect data from the census of respondents in the respondent universe. The study team will use sampling methods in two cases: (1) to select the sample of students who will take the student survey and whose teachers will take the school-day teacher survey and (2) to select center staff who will take the afterschool center staff survey.
To select the students who will take the student survey and whose teachers will take the school-day teacher survey, the study team will select students from the population of students who attend a study center on any given day during the first month of the 2021–2022 school year. The study team will randomly select 18 students per center with probability proportional to the number of days they attended the center in the first month, thereby oversampling students who attend more frequently during the first month. The resulting sample will be representative of center attendees on a typical day in the first month of the 2021–2022 school year. Of those 18 students, the first 13 will be designated as the main sample and the next 5 will be designated as a backup sample. During spring 2022, the 13 students in the main sample will be given the student survey, and their teachers will be given the school-day teacher survey. Each student’s classroom teacher at the time the study team administers the student survey will be asked to complete the school-day teacher survey. If a student’s school is departmentalized, the study team will select a single teacher for each student. If the student’s homeroom teacher also teaches the student math or English language arts, that teacher will be in the sample. If not, the study team will randomly select either the student’s math or English language arts teacher.
The study team will not survey the backup sample of 5 additional students per center in spring 2022 but could survey them in spring 2023 if needed. In spring 2023, the study team will reassess the sample for each center to ensure that at least 10 of the original 13 students in each center’s sample are still enrolled at the same school. Only students who are still enrolled will be surveyed in spring 2023. If fewer than 10 students are still enrolled, the study team will randomly select students from the backup sample for that center until it has 10 enrolled students in the spring 2023 sample. The students who are surveyed in spring 2023 will be the same students whose classroom teachers are asked to complete the school-day teacher survey in spring 2023.
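A minimal sketch of the attendance-weighted student selection is below, assuming a hypothetical dictionary of first-month attendance counts; the sequential weighted draw shown here is a simple approximation of probability-proportional-to-size sampling, not an exact method:

```python
import numpy as np

def select_students(attendance_days, n_draw=18, n_main=13, seed=20211001):
    """Draw students with probability proportional to days attended.

    `attendance_days` maps student ID to days attended in the first month
    (hypothetical data). Returns the main sample (first 13 draws) and the
    backup sample (remaining draws).
    """
    rng = np.random.default_rng(seed)
    ids = list(attendance_days)
    days = np.array([attendance_days[s] for s in ids], dtype=float)
    # Successive weighted draws without replacement oversample frequent attendees.
    drawn = rng.choice(ids, size=min(n_draw, len(ids)), replace=False,
                       p=days / days.sum())
    return list(drawn[:n_main]), list(drawn[n_main:])
```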
To select staff for the afterschool center staff survey, the study team will randomly select afterschool center staff during each year of the evaluation. Based on staff rosters from early spring of the survey year (2022 or 2023), the study team will identify the staff at each center who lead activities with students who are in the grades of the evaluation (grades 3–5 in spring 2022, and grades 4–6 in spring 2023). Of those staff, the study team will randomly select three from each center to be included in the sample for that year and will administer the afterschool center staff survey to them.
Exhibit B.2 indicates which analysis methods will be used for each research question. To address the research questions, the study team will use three types of analysis methods across the two study components:
National snapshot of afterschool centers’ strategies
Descriptive analyses: The study team will calculate the average values of continuous variables. For categorical variables, the study team will calculate the percentage of sample members in each category. The study team will also report measures of the precision of these estimates, such as confidence intervals.
Comparative analyses: To compare groups, such as comparing characteristics of different types of centers, the study team will report the magnitudes of differences between groups and assess their statistical significance.
Evaluation of a continuous quality improvement system
Descriptive analyses: The study team will apply the same approach for descriptive analyses as in the national snapshot of afterschool centers’ strategies.
Comparative analyses: The study team will apply the same approach for comparative analyses as in the national snapshot of afterschool centers’ strategies.
Regression analyses: The study team will use regression analyses in two ways. First, the study team will use them to estimate the effect of the continuous quality improvement system on afterschool center staff practices and student outcomes. The study team plans to control for covariates that represent baseline characteristics of students and their families, including students’ demographic characteristics and baseline social and emotional skills. Second, the study team will use regression analyses to understand the circumstances under which the continuous quality improvement system may be most successful. To do so, the study team will examine how effects on student outcomes and staff practices relate to key factors, such as characteristics of the centers and the ways they implemented the continuous quality improvement system.
Exhibit B.2. Estimation methods for each study research question

| Research question | Descriptive analyses | Comparative analyses | Regression analyses |
| --- | --- | --- | --- |
| National snapshot of afterschool centers’ strategies | | | |
| RQ 1. What key outcomes for students and families do afterschool centers in the 21st CCLC program aim to promote, and what strategies do they use to promote them? | X | X | |
| | X | X | |
| | X | X | |
| Evaluation of a continuous quality improvement system | | | |
| RQ 2. To what extent did centers implement key aspects of the continuous quality improvement system in the first year of the system? What challenges did they encounter in implementing the system, and how did they address these challenges? | X | | |
| RQ 3. Across two years, what improvement strategies were used by centers that implemented the continuous quality improvement system? How did these differ from centers that did not implement the system? | X | X | |
| | X | X | |
| | X | X | |
| RQ 4. How did the system affect the practices of center staff, including practices to promote students’ social and emotional skills? How did the system affect students’ social and emotional skills? | | | X |
| | | | X |
| | | | X |
| RQ 5. What is the cost-effectiveness of the continuous quality improvement system? | X | | |
To estimate the effect of the continuous quality improvement system on student outcomes, the study team will estimate the following equation:

$$y_{icb} = \sum_{b'=1}^{B} \alpha_{b'} I_{icb}^{b'} + \sum_{b'=1}^{B} \delta_{b'} \left( I_{icb}^{b'} \times T_{cb} \right) + \beta' X_{icb} + \varepsilon_{icb} \qquad (1)$$

where $y_{icb}$ is the outcome of student $i$ in center $c$ within randomization block (pair of centers) $b$; $B$ is the number of randomization blocks; $I_{icb}^{b'}$ is an indicator for whether a center is part of randomization block $b'$ (in other words, equal to 1 if $b' = b$); $T_{cb}$ is an indicator of whether center $c$ within randomization block $b$ was assigned to the treatment group and was offered the continuous quality improvement system; $X_{icb}$ is a vector of covariates; and $\varepsilon_{icb}$ is a student-level error term. The study team will use the resulting estimates to form the overall effect of the continuous quality improvement system on outcomes. In Equation (1), $\delta_b$ is the effect of offering the continuous quality improvement system to a center that was assigned to the treatment group within randomization block $b$. The study team will estimate an overall treatment effect, $\delta$, as an average of the block-specific estimates $\hat{\delta}_b$ across all randomization blocks. The study team will calculate standard errors using methods that account for clustering of students in centers (Schochet et al. 2020) and will report statistical significance based on two-tailed t-tests.
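As a simplified illustration of how Equation (1) could be estimated (variable and file names are hypothetical, and the study’s design-based estimator from Schochet et al. 2020 would differ in its details):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical student-level analysis file with outcome y, treatment
# indicator treat, randomization block, center ID, and a baseline covariate.
df = pd.read_csv("student_analysis_file.csv")

# Block fixed effects plus block-specific treatment effects, as in Equation (1).
model = smf.ols("y ~ C(block) + C(block):treat + baseline_skills", data=df)

# Cluster-robust standard errors account for clustering of students in centers.
fit = model.fit(cov_type="cluster", cov_kwds={"groups": df["center"]})

# A simple unweighted average of the block-specific effects; the study team
# may weight blocks differently when forming the overall effect.
block_effects = fit.params.filter(like=":treat")
print("Overall effect estimate:", block_effects.mean())
```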
The study team will adapt this basic approach for other analyses to estimate the effect of the continuous quality improvement system. To estimate effects on staff practices, the study team will use a similar equation adapted for the center level. The study team will also estimate similar equations using subgroups of centers defined by the characteristics of centers, staff, and students.
The study team will conduct similar regression analyses to examine the circumstances under which the continuous quality improvement system may be most successful. To do so, the study team will regress the estimated block-level effects ($\hat{\delta}_b$) on other key factors. These analyses will explore the extent to which effects on students’ outcomes relate to effects on staff practices, whether centers with stronger implementation of the quality improvement system have larger effects, whether centers with greater alignment with the school day have larger effects, and whether centers that experience the greatest effects tend to be those that adopted certain improvement practices.
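A sketch of this second set of regressions, using a hypothetical block-level file of estimated effects and an illustrative implementation-fidelity moderator:

```python
import pandas as pd
import statsmodels.formula.api as smf

# One row per randomization block: estimated effect delta_hat plus block
# characteristics (all names hypothetical).
blocks = pd.read_csv("block_level_effects.csv")

# Do blocks with stronger implementation fidelity show larger effects?
print(smf.ols("delta_hat ~ fidelity", data=blocks).fit().summary())
```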
National snapshot of afterschool centers’ strategies
The analysis will describe the characteristics, activities, and strategies of centers in the program with a reasonable level of statistical precision. This precision depends in part on the characteristic being examined. For a relatively rare, binary characteristic—for example, the presence or absence of an activity that 10 percent of centers offer—the expected standard error from the sample would be 2.1 percentage points (Exhibit B.3). The corresponding 95 percent confidence interval of the estimate would be 5.9 to 14.1 percent. For a characteristic present in half of all centers in the population, the estimated confidence interval would be wider, at about 43.2 to 56.8 percent. Levels of precision for subgroups of half or one quarter of the sample would be correspondingly lower, with larger estimated confidence intervals. For continuous rather than binary characteristics, the sample will lead to an expected standard error of about 0.07 standard deviations. At this level of precision, the corresponding confidence interval for a characteristic with a mean of zero would be -0.14 to 0.14 standard deviations.
Exhibit B.3. Precision of estimates for the national snapshot of afterschool centers’ strategies

| Statistic | Full sample (250 centers): standard error | Full sample: 95% confidence interval | 50% subgroup (125 centers): standard error | 50% subgroup: 95% confidence interval | 25% subgroup (62 centers): standard error | 25% subgroup: 95% confidence interval |
| --- | --- | --- | --- | --- | --- | --- |
| Percentage of study centers with a rare characteristic (present in 10 percent of the population) | 2.1 | 5.9 to 14.1 | 2.9 | 4.2 to 15.8 | 4.1 | 1.7 to 18.3 |
| Percentage of study centers with a moderately common characteristic (present in 50 percent of the population) | 3.4 | 43.2 to 56.8 | 4.9 | 40.4 to 59.6 | 6.0 | 36.2 to 63.8 |
| Mean of a continuous variable with a mean of 0 and standard deviation of 1 | 0.07 | -0.14 to 0.14 | 0.10 | -0.19 to 0.19 | 0.14 | -0.28 to 0.28 |

Notes: The calculations assume an 85% response rate. The standard error was calculated using the formula $SE = \sigma / \sqrt{n \times r}$, where $\sigma$ is the standard deviation of the outcome, $n$ is the total number of individuals in the sample, and $r$ is the response rate.
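The note’s formula can be checked directly; the short sketch below reproduces the full-sample row of Exhibit B.3 for the rare (10 percent) characteristic:

```python
import math

def precision(sigma, n, r=0.85, z=1.96):
    """Standard error and 95% CI half-width using SE = sigma / sqrt(n * r)."""
    se = sigma / math.sqrt(n * r)
    return se, z * se

# Binary characteristic present in 10 percent of centers: sigma = sqrt(p(1-p)).
se, hw = precision(sigma=math.sqrt(0.10 * 0.90), n=250)
print(round(100 * se, 1))  # 2.1 percentage points, matching Exhibit B.3
# CI of roughly 6.0 to 14.0; the exhibit's 5.9 to 14.1 applies z to the rounded SE.
print(round(10 - 100 * hw, 1), round(10 + 100 * hw, 1))
```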
Evaluation of a continuous quality improvement system
The evaluation’s random assignment and sampling design will allow the study to detect policy-relevant and realistic effects of the continuous quality improvement system on the quality of staff practices and students’ social and emotional skills, the key outcomes of the study. With a target sample size of 100 centers, the study will be able to detect effects on staff practices as low as 0.45 standard deviations (Exhibit B.4). This level of statistical power would allow the study to detect the effect on the quality of staff practices that prior research found for a similar continuous quality improvement system after just one year of implementation (0.55 standard deviations; Smith et al. 2012). Given that this study will involve two years of implementation, the effects of the continuous quality improvement system could plausibly be larger than that found in the prior research. For a subgroup of up to half of centers participating in the evaluation, the design will allow the study to detect an effect as low as 0.65 standard deviations.
For the key student outcome of social and emotional skills, the design will allow the study to detect an effect of the continuous quality improvement system as small as 0.23 standard deviations for the full sample, and 0.28 for a sample consisting of half of all students. Although no prior research has directly estimated the effect of a similar system on students’ social and emotional skills, past evidence suggests that an effect this large is realistic. A meta-analysis by Durlak et al. (2010) found that students’ skills improved by 0.22 to 0.24 standard deviations more when they attended afterschool centers that used recommended practices for social and emotional learning instruction compared to centers that did not. Most of the studies included in this meta-analysis measured outcomes after less than one year of students’ participation. Given that the study’s continuous quality improvement system will be implemented for two years, its effects on social and emotional skills could realistically be at least as large as those found in the meta-analysis and, therefore, be detected by the study.
Exhibit B.4. Statistical power of the evaluation of a continuous quality improvement system

| Outcome (measured at the end of two years of implementation) | Data source | Minimum detectable effect size: full sample | Minimum detectable effect size: 50 percent subsample of students | Minimum detectable effect size: 50 percent subsample of centers |
| --- | --- | --- | --- | --- |
| Centers’ quality of staff practices | Afterschool center observations | 0.45 | n.a. | 0.65 |
| Students’ social and emotional skills | Student survey and school-day teacher survey | 0.23 | 0.28 | 0.33 |

Note: Minimum detectable effect sizes are in standard deviation units. Calculations assume (1) 80 percent power and a 5 percent significance level for a two-tailed test; (2) an average of 41 students in grades 3–5 per center (James-Burdumy et al. 2005; U.S. Department of Education 2018b), 10 of whom are in the sample for surveys of students and school-day teachers at the end of the second year of the study and 85 percent of whom the study team obtains permission to include in the study; (3) a response rate of 100 percent for afterschool center observations and 85 percent for surveys; and (4) a sample of 50 treatment and 50 control centers. Assumptions on center-level clustering of outcomes and explanatory power of covariates come from (1) Social and Character Development Research Consortium (2010) and Schochet (2008) for social and emotional learning skills and (2) Smith (2013) for center observations.
n.a. = not applicable
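For context, the following sketch computes a minimum detectable effect size using the standard formula for two-level cluster-randomized designs (see Schochet 2008). The intraclass correlation and other inputs below are illustrative assumptions chosen for this sketch, and the simplified formula ignores gains from pair matching, so it only approximates the study’s calculations:

```python
import math
from scipy import stats

def mdes_clustered(J, n, rho, r2_between=0.0, r2_within=0.0,
                   P=0.5, alpha=0.05, power=0.80):
    """MDES for a design randomizing J centers with n students per center.

    rho is the intraclass correlation; r2_between and r2_within are the
    shares of variance explained by covariates at each level; P is the
    fraction of centers assigned to treatment. All inputs are illustrative.
    """
    df = J - 2  # a common simplification; blocked designs adjust this further
    multiplier = stats.t.ppf(1 - alpha / 2, df) + stats.t.ppf(power, df)
    var = (rho * (1 - r2_between) / (P * (1 - P) * J)
           + (1 - rho) * (1 - r2_within) / (P * (1 - P) * J * n))
    return multiplier * math.sqrt(var)

# Illustrative inputs: 100 centers, about 8.5 students with outcome data per
# center (10 sampled x 85% response), and an assumed rho of 0.15.
print(round(mdes_clustered(J=100, n=8.5, rho=0.15), 2))  # ~0.23
```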
The study team does not anticipate any unusual problems that require specialized sampling procedures.
To minimize burden, the study team plans to collect data as infrequently as possible while fulfilling the study’s analytic requirements. The study team will collect district administrative records one time during fall 2023. Similarly, the study team will collect data on student afterschool attendance records only during the middle of the 2021–2022 school year and end of the 2022–2023 school year. The study team will administer the parent/guardian questionnaire and permission form a single time in fall 2021. By necessity, the study team will collect other data more frequently:
To obtain information on center and student outcomes in both study years, the study team must conduct the interviews and surveys once per year for two years. During spring 2022 and 2023, the study team will conduct the afterschool center director interview, the student survey, the school-day teacher survey, and the afterschool center staff survey.
To ensure accurate reporting and allow for continuous implementation monitoring, the study team needs to collect afterschool center coaching logs when afterschool center directors and co-leads complete them after each coaching session from fall 2021 through spring 2023. Less frequent collection could lead to errors in the logs, because afterschool center directors and co-leads might not remember the coaching sessions accurately. In addition, the study team will use the logs to monitor implementation fidelity, which it must assess frequently to ensure high levels of fidelity.
The study team will collect center observations on staff practices at baseline in fall 2021 and during spring 2022 and spring 2023. The baseline measure of staff practices is critical to the analyses, because it will allow the study to characterize baseline differences between treatment and control centers and conduct key subgroup analyses. The two years of follow-up data will allow the study to estimate effects on staff practices annually.
B3. Methods to maximize response rates and deal with nonresponse

As mentioned in Exhibit B.1, the study team expects a 100 percent response rate on the afterschool center observations, afterschool center coaching log, student afterschool attendance records, and district administrative records. The study team expects an 85 percent response rate on the afterschool center director survey (administered via web); the afterschool center director interview (conducted in person); and the afterschool center staff survey, student survey, school-day teacher survey, and parent/guardian questionnaire and permission form (all administered via hardcopy forms). Across all aspects of data collection, the study team will use strategies that have proven to be successful on other large-scale IES evaluations, including:
Developing and testing the web-based survey to maximize ease of completion and reduce respondent burden. The study team will follow processes proven successful for other web-based data collections. The study team will minimize the length of the instrument to gather only key, necessary information. To reduce item nonresponse, the web-based questionnaire may include programmed checks alerting respondents to out-of-range or inconsistent responses they enter. These checks will give respondents the choice of changing their response based on guidance provided on the pop-up screen, or leaving their answer and continuing on to the next question. The study team will thoroughly test the instrument for clarity, accuracy, length, flow, and wording.
Developing relationships with afterschool center and school front office staff. In the study team’s experience, data collection at afterschool centers and schools will require regular interaction with front office staff. Gaining access to the intended respondents of the afterschool center director interview, afterschool center staff survey, and school-day teacher survey will be vital to reaching the desired 85 percent response rate on these data collection activities. Experienced field staff will be trained to cultivate relationships with front office staff to assist in distribution and collection of completed surveys; gain access to the afterschool center staff and school-day teachers to conduct face-to-face nonresponse follow-up, if appropriate; and schedule interviews with afterschool center directors. Front office staff at afterschool centers may also be an essential part of assisting with the parent/guardian questionnaire nonresponse follow-up as well as scheduling the student survey administration.
Tailoring nonresponse follow-up communication. For the afterschool center staff survey and the school-day teacher survey, nonresponse follow-up communication may take several forms. Depending on access to staff and teachers at each center or school, distribution of nonresponse letters may be needed to encourage completed surveys. Field staff may attempt to gain face-to-face access to the respondents as a means of encouraging participation, or email correspondence may be deemed the more effective method. Study staff will discuss all possible methods and determine the best course of action based on their experiences with the centers and schools.
Specific methods for maximizing response rates and minimizing nonresponse in the collection of data in each study component are as follows:
National snapshot of afterschool centers’ strategies
Afterschool center director survey: The study team anticipates an 85 percent response rate using the following procedures. In advance of administering the web-based afterschool center director survey, the study team will mail a study introduction letter describing the study purpose and topics the survey will cover. The letter will also mention an upcoming invitation email that will include their login information. Throughout the data collection period, the study team will conduct nonresponse follow-up by email and mailed reminder letters.
Evaluation of a continuous quality improvement system
Afterschool center observations: The study team anticipates a 100 percent response rate because (1) grantees and their participating afterschool centers will commit to allowing observations as a condition for being in the evaluation and (2) there is no direct burden on afterschool center staff. In each round (fall 2021, spring 2022, and spring 2023), two observers will visit each center on different days. During visits, observers will use a validated instrument, the Program Quality Assessment, to watch and record notes on the staff’s practices and interactions with students, including those that promote social and emotional skills. Observers will not directly interact with the afterschool center staff or students at the center.
Afterschool center coaching log: The study team anticipates a 100 percent response rate because completion of coaching logs will be part of the continuous quality improvement system, for which the study team will provide training and support and monitor implementation closely. The study team will provide a full-day training to afterschool center directors and co-leads in the treatment group on methods for coaching their staff, and as part of the training, the study team will explain the importance of and procedures for completing the coaching logs. The study team will instruct them to complete the coaching logs after each coaching session with their staff members. The study team will monitor data collected through the coaching logs and will follow up with afterschool center directors and co-leads as needed to ensure completion of the logs.
Afterschool center director interview: The study team anticipates an 85 percent response rate using the following procedures. Afterschool centers will commit to participating in the afterschool center director interview as a condition for being in the evaluation. The study team will work with the afterschool center directors to schedule the interview at a time convenient to them.
Student afterschool attendance records: The study team anticipates a 100 percent response rate for the following reasons. Afterschool centers will commit to providing student afterschool attendance records as a condition for being in the evaluation. To help ensure full response, the study team will accept the records in whatever format (electronic or hard copy) each center would like to provide. At two points in time (the middle of the 2021–2022 school year and end of the 2022–2023 school year), Mathematica field staff will visit each center to collect records and/or answer the center’s questions on how to submit them.
District administrative records: The study team anticipates a 100 percent response rate for the following reasons. The study team will execute a memorandum of understanding with each district in the evaluation to ensure the study can obtain records on the students in the evaluation. Using approaches the study team has carried out successfully on prior evaluations for IES, the study team will designate one liaison from the study team for each district who will communicate with the district’s staff member responsible for providing the data. Through phone calls and written explanations, the liaison will describe the data fields requested, data security procedures, and procedures for districts to submit the data. For all students on whom the study requests records, the study team will have obtained parental permission to collect these data.
Afterschool center staff survey: The study team anticipates an 85 percent response rate using the following procedures. For the afterschool center staff survey, the study team will use experienced, local field staff to administer the hardcopy survey to the selected afterschool center staff. As mentioned above, a key to the success of this data collection is building a strong working relationship with the center’s front office staff who may be the direct link to the afterschool center staff. Field staff will drop off survey packets for each center’s selected staff and periodically visit the centers to collect any completed surveys and encourage survey completion by afterschool center staff who have not yet done so. Field staff will have $30 gift cards on hand for immediate distribution for completed surveys.
Student survey: The study team anticipates an 85 percent response rate using the following procedures. The student surveys will be administered in a group setting at each center by experienced field staff. Field staff will be trained on skills necessary to put the students at ease and on general group-administered survey procedures. Field staff will coordinate with the afterschool center front office staff to schedule the student survey administration, secure a location suitable for proper group administration, and conduct any additional visits to administer surveys to students who were not in attendance on the day of the original administration.
School-day teacher survey: The study team anticipates an 85 percent response rate using the following procedures. Following a similar approach to that used for the afterschool center staff survey, field staff will drop off survey packets to schools where identified school-day teachers work. Field staff will have $30 gift cards on hand for immediate distribution for completed surveys.
Parent/guardian questionnaire and permission forms: The study team anticipates an 85 percent response rate using the following procedures. Administration of the parent/guardian questionnaire and permission form will rely heavily on building a working relationship with afterschool center administrative staff as well as organizing and tracking outgoing and incoming questionnaires and permission forms. A questionnaire and permission form will be included in the application packet that parents and guardians normally complete to enroll their child in the afterschool center. Field staff will work with afterschool center front office staff to track returned questionnaires and permission forms. The study team will adapt strategies for nonresponse follow-up including, but not limited to, asking centers to communicate with parents/guardians during and after the application process or reaching out to parents/guardians by email.
For each respondent group, if response rates fall below 85 percent, the study team will conduct nonresponse analyses. The study will compare any available data on the baseline characteristics of individuals who completed the surveys and interviews to those of the original sample. If these analyses point to the possibility of nonresponse bias, the study team will construct and use nonresponse weights based on the observable baseline characteristics.
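One common way to construct such weights, shown here as a hedged sketch with hypothetical file and variable names (the study team’s actual weighting approach may differ):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical file: one row per sampled individual, with a 0/1 response
# indicator and baseline characteristics available for the full sample.
sample = pd.read_csv("sample_with_baseline.csv")

# Model response propensity from baseline characteristics, then weight each
# respondent by the inverse of the predicted response probability.
propensity = smf.logit("responded ~ grade + baseline_attendance",
                       data=sample).fit()
sample["nr_weight"] = 1.0 / propensity.predict(sample)
respondents = sample[sample["responded"] == 1]
```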
B4. Tests of procedures and methods to be undertaken

To inform data collection activities for which clearance is being requested in this submission, the study team pretested the parent/guardian questionnaire and permission form. The purpose of the pretests was to confirm the level of burden and identify problems that study respondents might have with providing the requested information. For example, the pretests assessed the content and wording of individual questions, organization and format of the instruments, the amount of time it took to respond, and potential sources of response error.
The parent/guardian questionnaire and permission form were pretested with eight parents or guardians of students in grades 3 and 4 attending a 21st CCLC program. The program director distributed packets with the questionnaire and permission form to the pretest respondents, collected the completed packets, and mailed them back to the study team. The study team reviewed the completed packets and conducted debriefing interviews by phone with each respondent to review any issues they may have encountered. Interviewers followed a protocol to probe on particular items to be sure the items were phrased clearly and collected accurate information. Respondent burden to complete the parent/guardian questionnaire and permission form averaged 10 minutes as reported by pretest respondents. The results of the pretest were used to revise and improve the parent/guardian questionnaire. The pretest feedback did not identify any issues or require any revisions to the permission form.
The study team did not pretest the afterschool center coaching log because afterschool center directors and co-leads will receive specific training on how to complete these logs as part of implementing the study’s continuous quality improvement system. The study team did not pretest the request for student afterschool attendance records because it was closely modeled on forms that have been effectively used for other studies.
To inform data collection activities for which clearance will be requested in a future submission, the study team will also pretest each of the following instruments with up to nine pretest respondents: (1) afterschool center director survey, (2) afterschool center director interview, (3) afterschool center staff survey, (4) student survey, and (5) school-day teacher survey. The study team will not pretest the request for district administrative records because it is closely modeled on forms that have been effectively used for other studies.
B5. Individuals consulted on statistical aspects of the design

The following individuals were consulted on the statistical aspects of the study:

| Name | Title | Telephone number |
| --- | --- | --- |
| Hanley Chiang | Associate Director, Mathematica | 617-674-8374 |
| Susanne James-Burdumy | Vice President, Mathematica | 609-275-2248 |
| Tim Kautz | Senior Researcher, Mathematica | 609-297-4544 |
| Philip Gleason | Associate Director and Senior Fellow, Mathematica | 202-264-3443 |
References

Allen, P.J., R. Chang, B.K. Gorrall, L. Waggenspack, E. Fukuda, T.D. Little, and G.G. Noam. “From Quality to Outcomes: A National Study of Afterschool STEM Programming.” International Journal of STEM Education, vol. 6, 2019.
Black, A., M. Somers, F. Doolittle, R. Unterman, and J. Grossman. “The Evaluation of Enhanced Academic Instruction in After-School Programs: Final Report.” NCEE 2009-4077. Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, September 2009.
Cunha, F., J. J. Heckman, and S. M. Schennach. “Estimating the Technology of Cognitive and Noncognitive Skill Formation.” Econometrica, vol. 78, no. 3, 2010, pp. 883–931.
Durlak, J. A., R. P. Weissberg, and M. Pachan. “A Meta-Analysis of After-School Programs That Seek to Promote Personal and Social Skills in Children and Adolescents.” American Journal of Community Psychology, vol. 45, nos. 3–4, 2010, pp. 294–309.
Dynarski, M., S. James-Burdumy, M. Moore, L. Rosenberg, J. Deke, and W. Mansfield. “When Schools Stay Open Late: The National Evaluation of the 21st-Century Community Learning Centers Program: New Findings.” Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, 2004.
Elliott, S. N., C. J. Anthony, J. C. DiPerna, P. Lei, and F. M. Gresham. “SSIS Brief Scales Series.” Scottsdale, AZ: SAIL CoLab, 2020.
James-Burdumy, S., M. Dynarski, M. Moore, J. Deke, W. Mansfield, and C. Pistorino. “When Schools Stay Open Late: The National Evaluation of the 21st Century Community Learning Centers Program: Final Report.” Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, April 2005.
Kautz, T., J. J. Heckman, R. Diris, B. Ter Weel, and L. Borghans. “Fostering and Measuring Skills: Improving Cognitive and Non-Cognitive Skills to Promote Lifetime Success.” Working Paper 20749. Cambridge, MA: National Bureau of Economic Research, 2014.
Naftzger, N. “A Summary of Three Studies Exploring the Relationships Between Afterschool Program Quality and Youth Outcomes.” Conference paper presented at 2014 Ready by 21 National Meeting. Washington, DC: American Institutes for Research, 2014.
Naftzger, N., S. Sniegowski, C. Smith, and A. Riley. “Exploring the Relationship between Afterschool Program Quality and Youth Development Outcomes: Findings from the Washington Quality to Youth Outcomes Study.” Naperville, IL: American Institutes for Research, 2018.
Penuel, W. R., and R. McGhee, Jr. “21st Century Community Learning Centers Descriptive Study of Program Practices.” Washington, DC: U.S. Department of Education, Office of Planning, Evaluation and Policy Development, 2010.
Social and Character Development Research Consortium. “Efficacy of Schoolwide Programs to Promote Social and Character Development and Reduce Problem Behavior in Elementary School Children.” NCER 2011–2001. Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Research, 2010.
Schochet, P. Z. “Statistical Power for Random Assignment Evaluations of Education Programs.” Journal of Educational and Behavioral Statistics, vol. 33, no. 1, 2008, pp. 62–87.
Schochet, P., N. Pashley, L. Miratrix, and T. Kautz. “Design-Based Ratio Estimators and Central Limit Theorems for Clustered, Blocked RCTs.” Working paper. Princeton, NJ: Mathematica, 2020.
Smith, C. “Moving the Needle on ‘Moving the Needle’: Next Stage Technical Guidance for Performance Based Accountability Systems in the Expanded Learning Field with a Focus on Performance Levels for the Quality of Instructional Services.” Ypsilanti, MI: Weikart Center for Youth Program Quality, 2013.
Smith, C., T. Akiva, S. Sugar, Y. Lo, K. Frank, S. Peck, K. Cortina, and T. Devaney. “Continuous Quality Improvement in Afterschool Settings: Impact Findings from the Youth Program Quality Intervention Study.” Washington, DC: Forum for Youth Investment, 2012.
U.S. Department of Education. “Table 95: Public Elementary Schools, by Grade Span, Average School Size, and State or Jurisdiction: 2005-06.” 2007. Available at https://nces.ed.gov/programs/digest/d07/tables/dt07_095.asp. Accessed November 11, 2020.
U.S. Department of Education. “Table 7. Average Class Size in Public Primary Schools, Middle Schools, High Schools, and Schools with Combined Grades, by Classroom Type and State: 2011–12.” 2013. Available at https://nces.ed.gov/surveys/sass/tables/sass1112_2013314_t1s_007.asp. Accessed November 11, 2020.
U.S. Department of Education. “21st Century Community Learning Centers.” March 2018a. Available at https://www2.ed.gov/programs/21stcclc/index.html. Accessed November 11, 2020.
U.S. Department of Education. “21st Century Community Learning Centers (21st CCLC) Analytic Support for Evaluation and Program Monitoring: An Overview of the 21st CCLC Performance Data: 2016–17.” 13th report. Washington, DC: U.S. Department of Education, 2018b.
U.S. Government Accountability Office. “Education Needs to Improve Oversight of its 21st Century Program.” GAO-17-400. Washington, DC: U.S. Government Accountability Office, 2017.
Vandell, D. L. “Afterschool Program Quality and Student Outcomes: Reflections on Positive Key Findings on Learning and Development from Recent Research.” In W. S. White and T. K. Peterson (eds.), Expanding Minds and Opportunities: Leveraging the Power of Afterschool and Summer Learning for Student Success, 2013. Available at https://www.expandinglearning.org/expandingminds/article/afterschool-program-quality-and-student-outcomes-reflections-positive-key. Accessed December 8, 2020.