International Computer and Information Literacy Study (ICILS 2018) MAIN STUDY
OMB# 1850-0929 v.6
Supporting Statement Part B
Submitted by:
National Center for Education Statistics (NCES)
Institute of Education Sciences (IES)
U.S. Department of Education
Washington, DC
November 2017
The respondent universe for the ICILS main study is all students enrolled in grade 8, provided that the mean age at the time of testing is at least 13.5 years. The teacher target population consists of all teachers who have been teaching regular school subjects to grade 8 students (regardless of the subject or the number of hours taught) during the ICILS testing period and since the beginning of the school year. For the field test, the universe for the selection of schools was all types of schools in approximately six populous states; for the main study, it is all types of schools in the United States having grade 8. A sample of 38 schools was selected for the field test, with the goal of obtaining participation from a minimum of 32 schools. A sample of 353 schools will be selected for the main study, with the goal of obtaining participation from a minimum of 300 schools.
For the field test, a minimum of 20 students was selected across classes in each sampled school. If fewer than 20 students could be found in a school, all of them were included in the survey; and if the number of eligible students was greater than 20 but less than or equal to 25, all students were selected, to avoid a situation in which a small number of students would be the only ones left out of the survey. For the main study, a minimum of 30 students will be selected across classes. If fewer than 30 students can be found in a school, they will all be included in the survey; and if the number of eligible students is greater than 30 but less than or equal to 32, all students will be selected, again to avoid leaving out only a small number of students.
Within schools, the minimum sample size requirements for teachers are identical in the field test and the main study. In each sampled school, a minimum of 20 teachers will be selected. If fewer than 20 eligible teachers can be found in a school, they will all be included in the survey; and if the number of eligible teachers is greater than 20 but less than or equal to 22, all teachers will be selected, so that a small number of teachers are not the only ones left out of the survey.
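The within-school selection rules for students (field test minimum 20, take all up to 25; main study minimum 30, take all up to 32) and for teachers (minimum 20, take all up to 22) share the same structure. The following minimal Python sketch illustrates the rule; the function name and arguments are hypothetical and are not part of the operational sampling software.

```python
def within_school_sample_size(n_eligible, minimum, cushion):
    """Number of units (students or teachers) to sample in one school.

    Take everyone when the eligible count is at or below minimum + cushion;
    otherwise sample exactly the minimum. The cushion prevents a situation
    in which only a handful of eligible units would be left unsurveyed.
    """
    if n_eligible <= minimum + cushion:
        return n_eligible  # take all eligible units with certainty
    return minimum

# Main-study students: minimum 30, cushion 2 (take all up to 32 eligible).
assert within_school_sample_size(28, 30, 2) == 28  # fewer than minimum: all
assert within_school_sample_size(31, 30, 2) == 31  # within the cushion: all
assert within_school_sample_size(45, 30, 2) == 30  # otherwise: the minimum
# Teachers in both the field test and the main study: minimum 20, cushion 2.
assert within_school_sample_size(22, 20, 2) == 22
```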
The administrator of each sampled school will also be asked to complete a school questionnaire.
Main Study Sampling Plan and Sample
The school sample design for the main study must be more rigorous than that for the field test: it must be a probability sample of schools that fully represents the entire United States. At the same time, to ensure maximum participation, it must be designed to minimize overlap with other NCES studies involving student assessment that will be conducted around the same time.
The main study will take place in the spring of 2018, at the same time as the Trends in International Mathematics and Science Study (TIMSS) 2019 field test at grades 4 and 8. Some NAEP testing will also be taking place, but 2018 is not a main NAEP year. The number of schools overlapping between ICILS and other studies will be kept to a minimum. The Program for International Student Assessment (PISA) 2018 main data collection, for 15-year-olds, will take place in the fall of the school year following ICILS. PISA will include a small number of grade 8 schools in its sample, but very few grade 8 students will be assessed in PISA, given that very few grade 8 students meet the PISA age requirement of 15 years. Thus, overlap control between the ICILS and PISA samples will not be necessary. Overlap control procedures in studies such as this, where stratified probability proportional to size (PPS) samples of schools are selected, can be implemented via a procedure that applies Bayes' Theorem to modify the conditional probability of selection of a given school for one study, depending upon its selection probability for a second study and whether or not it was selected for that study. This approach was first documented in a survey sampling application by Keyfitz (1951)1. The principles involved can be extended to more than two studies simultaneously, and a procedure for doing this is described by Chowdhury et al. (2000)2.
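For the two-study case, the Keyfitz adjustment can be illustrated with a minimal Python sketch. The function name is hypothetical, and the operational procedure (per Chowdhury et al., 2000) handles stratified PPS samples and more than two studies; this sketch shows only the core idea of preserving each school's marginal selection probability while minimizing the expected overlap.

```python
import random

def study2_prob_given_study1(p1, p2, selected_in_study1):
    """Conditional probability of selecting a school for study 2, given its
    study 1 outcome, chosen so that the marginal probability stays p2 while
    the probability of selection for both studies is driven to its minimum,
    max(0, p1 + p2 - 1).
    """
    joint = max(0.0, p1 + p2 - 1.0)   # smallest achievable overlap
    if selected_in_study1:
        return joint / p1 if p1 > 0 else 0.0
    return (p2 - joint) / (1.0 - p1) if p1 < 1 else 0.0

# A school with p1 = 0.3 and p2 = 0.4 is never taken for both studies
# (0.3 + 0.4 < 1), yet its marginal study 2 probability is preserved:
# 0.3 * 0.0 + 0.7 * (0.4 / 0.7) = 0.4.
p1, p2 = 0.3, 0.4
in_study1 = random.random() < p1
in_study2 = random.random() < study2_prob_given_study1(p1, p2, in_study1)
```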
The minimum sample size for the ICILS main study will be 300 schools. For each original sample school, two replacement schools will also be identified. The sampling frame will be obtained from the most current versions of NCES's Common Core of Data (CCD) and Private School Universe Survey (PSS) files, restricted to schools having grade 8, and excluding schools in Puerto Rico, U.S. territories, and Department of Defense overseas schools.
The sample will be stratified according to school characteristics such as public/private status, Census region, and poverty status, as measured by the percentage of students in the school eligible for the national free or reduced-price lunch (FRPL) program. This will ensure an appropriate representation of each type of school in the selected sample of schools.
School eligibility, student eligibility, and student sampling will be determined as described below.
Schools will be selected with probability proportional to the school's estimated grade 8 enrollment. A minimum of 30 students will be selected within each school with equal probability (unless there are 30 or fewer grade 8 students, in which case all grade 8 students will be taken with certainty). The use of a probability proportional to size (PPS) sample design ensures that all students have an approximately equal chance of selection.
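A minimal Python sketch of systematic PPS school selection follows; the names are hypothetical, and the operational design additionally sorts the frame by the strata described above before selection. Because a school's selection probability is proportional to its grade 8 enrollment and a fixed number of students is then drawn per school, each student's overall selection probability is approximately constant, which is what makes the design roughly self-weighting.

```python
import random

def pps_systematic_sample(enrollments, n_schools):
    """Systematic probability-proportional-to-size (PPS) selection: cumulate
    the measures of size (grade 8 enrollments), step through the cumulated
    list at a fixed interval from a random start, and select the school whose
    cumulative range contains each selection point.
    """
    total = float(sum(enrollments))
    interval = total / n_schools
    start = random.uniform(0, interval)

    selected, cumulated, i = [], 0.0, 0
    for k in range(n_schools):
        point = start + k * interval
        while cumulated + enrollments[i] < point:
            cumulated += enrollments[i]
            i += 1
        selected.append(i)  # school i covers this selection point
    return selected

# Each school's selection probability is n_schools * enrollment / total;
# a school larger than one interval is a "certainty" school and can be hit
# more than once, which operational designs handle by removing it first.
frame = [random.randint(20, 400) for _ in range(2000)]  # hypothetical frame
sample = pps_systematic_sample(frame, 353)
```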
Student and teacher sampling will be accomplished by selecting a minimum of 30 grade 8 students and a minimum of 20 grade 8 teachers per school. Each selected school will be asked to prepare a list of its grade 8 students and a list of its grade 8 teachers. As described above, schools will submit these student and teacher lists via secure e-filing. Students will be selected from the comprehensive list of all target-grade students using a systematic random sample, and teachers will be randomly selected from the teacher list.
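The within-school student draw can be sketched as an equal-probability systematic sample. The helper below is hypothetical (teacher selection would instead use an ordinary random draw from the teacher list).

```python
import random

def systematic_student_sample(students, n=30):
    """Equal-probability systematic sample of n students from the e-filed
    list: compute the interval len(students) / n, pick a random start within
    the first interval, and take every interval-th student thereafter.
    """
    if len(students) <= n:
        return list(students)  # small school: take all students
    interval = len(students) / n
    start = random.uniform(0, interval)
    return [students[int(start + k * interval)] for k in range(n)]

# e.g., 30 of 118 listed grade 8 students, each with probability 30/118
roster = [f"student_{i:03d}" for i in range(118)]
sampled = systematic_student_sample(roster)
assert len(sampled) == 30
```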
Nonresponse Bias Analysis, Weighting, and Sampling Errors
It is inevitable that nonresponse will occur at both the school and student levels. We will analyze the nonrespondents and provide information about whether and how they differ from the respondents along dimensions for which we have data for the nonresponding units, as required by NCES standards. After the calculation of weights, sampling errors will be calculated for a selection of key indicators, incorporating the full complexity of the design, that is, clustering and stratification (see Appendix C for details).
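As a sketch of how clustering and stratification enter the variance calculation, IEA studies typically use a replication method such as jackknife repeated replication (JRR), in which sampled schools are paired within strata and the statistic is re-estimated once per replicate. The names and values below are hypothetical.

```python
import math

def jrr_variance(full_estimate, replicate_estimates):
    """Jackknife repeated replication (JRR) sampling variance: the sum of
    squared deviations of the replicate estimates (each computed with one
    school pair perturbed via its replicate weights) from the full-sample
    estimate. Clustering and stratification are reflected in how the
    replicate weights are built, not in this final formula.
    """
    return sum((r - full_estimate) ** 2 for r in replicate_estimates)

# Hypothetical mean achievement score and three replicate estimates:
full_mean = 512.4
replicate_means = [511.2, 513.1, 512.9]
standard_error = math.sqrt(jrr_variance(full_mean, replicate_means))
```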
B.3 Maximizing Response Rates
The most significant challenge in recruitment for international assessments has been engaging schools and gaining their cooperation. Historically, assessments such as TIMSS, which select classrooms for student sampling, have had student participation rates that never fall below 90 percent (see Table 1). However, student participation may be more of a challenge for ICILS than for other international assessments because ICILS requires a random student sample within schools rather than a selection of classrooms. In addition, it is important to U.S. ICILS that students are engaged and try to do well on the assessment.
Table 1. Historical TIMSS school and student participation rates
Year | Grade | School participation rate, before replacement (%) | School participation rate, after replacement (%) | Overall student participation rate (%)
2015 | 8 | 78 | 84 | 94
2011 | 8 | 87 | 87 | 94
2007 | 8 | 68 | 83 | 93
2003 | 8 | 71 | 78 | 94
1999 | 8 | 83 | 90 | 94
Our approach to school recruitment is to:
Obtain endorsements about the value of schools’ participation in ICILS from relevant organizations;
Work with NAEP state coordinators;
Inform Chief State School Officers and test directors about the sample of schools in their state, enclosing a sample letter of endorsement they can send to schools;
Send letters and informational materials to schools and districts. These letters will be customized by the type of school;
Train experienced school recruiters about ICILS;
Implement strategies from NAEP's Private School Recruiting Toolkit, which includes well-honed techniques for recruiting a historically challenging type of school;
Follow up mailings with telephone calls to explain the study and the school's involvement, including placing the ICILS assessment date on school calendars;
Offer schools $200 for participation;
Maintain continued contact until schools have built a relationship with the recruiter and fully understand ICILS;
Offer a $100 incentive to the individual at the school identified to serve as the school coordinator; and
Make in-person visits to some schools, as necessary.
B.4 Purpose of Field Test and Data Uses
An ICILS field test was conducted in spring 2017 across all participating education systems in order to evaluate items, study operations, and the electronic delivery method. The IEA delayed the field test data collection period for all participating countries from the originally planned window of March through May 2017 to late May 2017 through January 2018, because the IEA's electronic delivery method, called the "eAssessment system," for administering the ICILS Player (the ICILS electronic assessment) was not ready to be fielded in March. Only data collected in May and June 2017 are eligible to be included in the field test data analysis that informs the main study test items; field testing after June 2017 will be used to evaluate study operations only.
NCES attempted to recruit all schools from the original sample for field test data collection at the end of May, but recruiting in this timeframe was very difficult, as many schools were in the last few days of the school year or had already closed before the end of May. Thus, a total of 13 schools in the U.S. participated in the 2017 field test. In addition to the burden of this schedule delay, the IEA's ICILS Player still had a number of glitches and freezes, and the data capture for scoring of large tasks did not work properly. The U.S. team scored some items as planned, and the IEA had to score some of the items that did not capture properly. No issues were found with U.S.-specific operations, such as the use of Surface Pros for test administration. Internationally, the IEA collected enough data for a solid field test of the items and was able to perform reliable item analyses. Each country's data were examined for item issues, but only in relation to the whole set of international data. In September 2017, at an international meeting of all participating countries, the field test analyses were discussed and decisions were made on fixes to items.
In addition, given the difficulty with the IEA's eAssessment system for the ICILS Player, the IEA decided to change back to the previous version of the ICILS Player that was used in the 2013 administration of ICILS, the SoNET system (developed by SoNET contractors). Currently, SoNET is working to move the new items into the SoNET system ICILS Player. Given this change, the IEA is allowing countries to continue field testing from a study operations standpoint only, to test and pilot this system, and it has extended the field test of study operations into January 2018. The U.S. is conducting pretesting using the SoNET system (OMB# 1850-0803 v.207). The U.S. will also go into one field test school in January 2018 to test operations using the field test materials, in order to test the SoNET system in a real setting. Additionally, the electronic delivery method will be fully tested internally and through pretesting prior to the main study.
B.5 Individuals Consulted on Study Design
Overall direction for ICILS in the United States is provided by Lydia Malley, ICILS National Research Coordinator at NCES, within the U.S. Department of Education, with support from Stephen Provasnik, the TIMSS National Research Coordinator at NCES.
The IEA studies are developed as a cooperative enterprise involving all participating countries. An international panel of computer and information literacy and measurement experts provides substantive and technical guidance for the study, and National Research Coordinators participate in extensive discussions concerning the projects, usually with advice from national subject matter and testing experts.
The majority of the consultations (outside NCES) involve the Australian Council for Educational Research (ACER), the international study center for ICILS. ACER staff are responsible for designing and implementing the study in close cooperation with the IEA Secretariat, the IEA Data Processing and Research Center, and the national centers of participating countries. Key staff from ACER include Dr. John Ainley (project coordinator), Mr. Julian Fraillon (research director), and Dr. Wolfram Schulz (assessment coordinator), all of whom have extensive experience in developing and operating international education surveys (especially ICILS). Key staff from the IEA include Sabine Tieck (sampling statistician) and Michael Jung (IEA Data Processing and Research Center).
1 Keyfitz, N. (1951). Sampling with Probabilities Proportional to Size: Adjustment for Changes in Probabilities. Journal of the American Statistical Association, 46, 105-109.
2 Chowdhury, S., Chu, A., & Kaufman, S. (2000). Minimizing overlap in NCES surveys. Proceedings of the Survey Research Methods Section, American Statistical Association, 174-179. Retrieved from http://www.amstat.org/sections/srms/proceedings/papers/2000_025.pdf