International Computer and Information Literacy Study (ICILS 2018) MAIN STUDY
OMB# 1850-0929 v.8
Supporting Statement Part A
Submitted by:
National Center for Education Statistics (NCES)
Institute of Education Sciences (IES)
U.S. Department of Education
Washington, DC
November 2017
revised April 2018
PREFACE
A. JUSTIFICATION
A.1 Importance of Information
A.2 Purposes and Uses of Data
A.3 Improved Information Technology (Reduction of Burden)
A.4 Efforts to Identify Duplication
A.5 Minimizing Burden for Small Entities
A.6 Frequency of Data Collection
A.7 Special Circumstances
A.8 Consultations outside NCES
A.9 Payments or Gifts to Respondents
A.10 Assurance of Confidentiality
A.13 Total Annual Cost Burden
A.14 Annualized Cost to Federal Government
A.15 Program Changes or Adjustments
A.16 Plans for Tabulation and Publication
SUPPORTING STATEMENT PART B
APPENDICES
A: Main Study Recruitment and Parent Materials (approved April 20, 2017, OMB# 1850-0929 v.5)
B.1 – Changes from ICILS U.S. Field Test Questionnaires to U.S. Main Study Questionnaires
B.2 – ICILS U.S. Main Study Questionnaires
C: Non-response Bias Analysis Plan
PREFACE
The International Computer and Information Literacy Study (ICILS) is a computer-based international assessment of eighth-grade students’ computer and information literacy (CIL) skills. ICILS was first administered internationally in 2013 in 21 education systems and will be administered again in 2018. The United States will participate for the first time in the 2018 administration. U.S. participation in this study will provide data on students’ skills and experience using technology to investigate, create, and communicate, and will provide a comparison of U.S. student performance and technology access and use with those of their international peers. This study will also allow the U.S. to begin monitoring the progress of its students compared to that of other nations and to provide data on factors that may influence student computer and information literacy skills. The data collected through ICILS will provide valuable information with which to understand the nature and extent of the “digital divide” and has the potential to inform understanding of the relationship between technology skills and experience and student performance in other core subject areas.
ICILS is conducted by the International Association for the Evaluation of Educational Achievement (IEA), an international collective of research organizations and government agencies that develops the assessment framework, the assessment, and the background questionnaires. The IEA establishes a common set of standards and procedures for collecting and reporting ICILS data, and defines the study timeline, all of which must be followed by all participating countries. As a result, ICILS is able to provide a reliable and comparable measure of student skills in participating countries. In the U.S., the National Center for Education Statistics (NCES) conducts this study and works with the IEA and Westat to ensure proper implementation of the study and adoption of practices in adherence to the IEA’s standards. Participation in ICILS will allow NCES to meet its mandate of acquiring and disseminating data on educational activities and student achievement in the United States compared with foreign nations [Education Sciences Reform Act of 2002 (ESRA 2002), 20 U.S.C. §9543].
The ICILS Assessment Framework defines computer and information literacy (CIL) as an “individual’s ability to use computers to investigate, create, and communicate in order to participate effectively at home, at school, in the workplace, and in the community” (Fraillon, Schulz, & Ainley, 2013, p. 18). ICILS reports on eighth-grade students’ abilities to collect, manage, evaluate, and share digital information, as well as their understanding of issues related to the safe and responsible use of electronic information. Achievement scores are reported across four proficiency levels of computer and information literacy (CIL). ICILS also collects a variety of data to provide context and investigate student access to, use of, and engagement with ICT at school and at home, school environments for teaching and learning CIL, and teacher practices and experiences with ICT.
In preparation for the ICILS 2018 main study, all countries were asked to implement a field test in 2017. The purpose of the ICILS field test was to evaluate new assessment items and background questions, to ensure practices that promote low exclusion rates, and to ensure that the classroom and student sampling procedures proposed for the main study are successful. The U.S. ICILS main study will be conducted from March 5 through May 25, 2018, and will involve a nationally representative sample of at least 9,000 eighth-grade students from a minimum of 300 schools.
This request is to conduct the ICILS 2018 main study data collection. The request for the field test and for recruitment for the main study (also referred to as the full-scale data collection) was approved in August 2016 (OMB# 1850-0929 v.1), with the latest change request approved in April 2017 (OMB# 1850-0929 v.1-4). The materials to be used in the main study are based upon those approved most recently in April 2017. With that submission, NCES justified the need for and overall practical utility of the main study, proposed an overarching plan for the phases of the data collection over the next 3 years, and provided as much detail on the measures to be used as was available at the time of the submission. OMB approved all aspects of the initial phase of this collection, and NCES has now published a notice in the Federal Register allowing a 30-day public comment period on the details of the subsequent phase of this collection (the ICILS 2018 main study) described in this submission.
This submission describes the overarching plan for all phases of the data collection, including the 2018 main study. The Supporting Statements Parts A and B are largely the same as the approved versions (OMB# 1850-0929 v.1-5), with an updated burden table to reflect main study data collection and an updated cost to the federal government (Part A.14). Please note that the field test data collection window was extended to January 2018 to allow countries to test the operation of a change in the electronic delivery method from the field test to the main study, and the U.S. will conduct additional pretesting (approved in September 2017; OMB# 1850-0803 v.207; see Part B.4 for more details). In addition to the supporting statements Parts A and B, Appendix A provides main study recruitment materials consisting of letters to state and district officials and school principals, text for an ICILS field test brochure, “Frequently Asked Questions,” a “Summary of Activities,” parental notification and consent materials, and student and teacher listing instructions. Appendix A was first approved in August 2016 (OMB# 1850-0929 v.1) and updated materials were approved in April 2017 (OMB# 1850-0929 v.5).
Because ICILS is a collaborative effort among many parties, the United States must adhere to the international schedule set forth by the IEA, including the availability of draft and final questionnaires. The content of Appendix B.2 has been updated with the U.S. versions of the 2018 ICILS main study questionnaires. The main study questionnaires are a subset of the field test questionnaires, which were approved in an earlier submission. Additionally, a description of the modifications made to the previously approved U.S. versions of the field test instruments (approved in April 2017, OMB# 1850-0929 v.5) is included in this submission (Appendix B.1). Lastly, the Non-Response Bias Analysis Plan has been added as Appendix C.
A. Justification
Benchmarking of U.S. student achievement against other countries continues to be of high interest to education policymakers, and informs policy discussions of economic competitiveness and workforce and post-secondary preparedness. ICILS provides a unique opportunity to compare U.S. eighth-grade students’ computer and information literacy skills and access to and use of technology with that of their peers in countries around the world. ICILS was developed internationally as a response to the increasing use of information and communication technology (ICT) in modern society and the need for citizens to develop relevant skills in order to participate effectively in the digital age.
Moreover, many international assessments and the National Assessment of Educational Progress (NAEP) are transitioning from paper-based to technology-based formats. The Trends in International Mathematics and Science Study (TIMSS) is making this transition for its next administration in 2019 (eTIMSS). An important question not currently addressed by any existing national data collection is the extent to which students’ computer skills, experience with digital devices, and exposure to technology-based instruction matter for their performance on technology-based assessments like eTIMSS. To support these transitions, ICILS will provide NCES with information on eighth-grade students’ computer skills, in an effort to better understand the possible “digital divide” that may affect U.S. student performance in other subject areas, such as mathematics and science as measured by eTIMSS.
ICILS identifies the strengths and weaknesses of U.S. students’ computer and information literacy skills relative to those of students in participating countries around the world. It also provides valuable benchmarking information about educational policies enacted in other countries that could inform U.S. educational practices.
Based on other similar international assessment data releases (such as TIMSS and the Progress in International Reading Literacy Study (PIRLS)), it is likely that the results of this study will draw great attention in the United States and elsewhere. It is therefore expected that ICILS will contribute to ongoing national and international debates and efforts to improve computer and information literacy and support access to and use of technology.
ICILS assesses computer and information literacy knowledge and skills at grade 8 cross-nationally. ICILS asks how well students are prepared for life in the information age, exploring several key questions about student CIL and its contexts: (1) How does student computer and information literacy vary within and between countries? (2) What factors influence students’ computer and information literacy? (3) What can education systems and schools do to improve students’ computer and information literacy? In order to gather data to explore such questions, ICILS also collects background information on students, teachers, schools, and official education policies.
Data compiled and collected from ICILS 2018 will allow for evidence-based decisions to be made for the purposes of educational improvement. The study will provide policymakers and education systems with an important data source on the contexts and outcomes of ICT-related education programs, and the role of schools and teachers in supporting students’ computer and information literacy achievement.
Through participation in ICILS and other international assessment programs, NCES is able to provide comparative indicators on student performance and school practices across countries in order to benchmark U.S. student performance, and to suggest hypotheses about the relationship between student performance and factors that may influence performance as well as areas in which students have strengths or weaknesses. The international studies identify differences among countries that can inform discussions about how to improve educational contexts and outcomes.
NCES’s mandate [Section 406 of the General Education Provisions Act, as amended (20 U.S.C. 1221e-1)] specifies that "The purpose of the Center [NCES] shall be to collect and analyze and disseminate statistics and other information related to education in the United States and in other nations," and the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543) specifies that NCES shall collect, report, analyze, and disseminate statistical data related to education in the United States and in other nations, including acquiring and disseminating data on educational activities and student achievement in the United States compared with foreign nations. ICILS is essential for an international perspective on students’ computer and information literacy, and U.S. participation in ICILS is aligned with both the national and international aspects of NCES’s mission.
ICILS consists of a computer-based student assessment and background questionnaires for students, teachers, and school principals.
The CIL student assessment framework for 2018 is based on the structure of the 2013 framework with some modifications. Under the 2018 framework, CIL will be assessed as two dimensions with the working titles: computational thinking (dimension 1) and digital information (dimension 2). Each dimension is divided into strands, which refer to the overarching conceptual categories for framing the skills and knowledge addressed by the CIL instruments. Under computational thinking, two strands will be measured: conceptualizing problems and operationalizing solutions. Computational thinking is a new dimension added to the framework for 2018 and is optional for participating countries; the United States will participate in this dimension. Under the digital information dimension, four strands will be measured: understanding computers, gathering information, producing information, and digital communication. The digital information dimension is largely equivalent to the CIL construct reported in ICILS 2013, and trend analysis will be based on this dimension.
Assessment Instruments
The student CIL assessment is composed of modules that include three types of tasks: (i) multiple-choice or constructed-response items based on realistic stimulus material; (ii) software simulations of generic applications requiring students to complete an action in response to an instruction; and (iii) authentic tasks that require students to modify and create information products using 'live' computer software applications. The CIL assessment will include 3 trend modules (used in the 2013 administration of ICILS) and 4 new modules (2 of which fall under the computational thinking dimension); each module is expected to take 30 minutes to complete. To reduce burden on individual students, each student will complete 4 of the 7 assessment modules according to a module rotation design, for a total of 120 minutes.
The ICILS assessment will be administered to students on a digital device. The type of device and specific model – computer or tablet with a keyboard and mouse attached – is determined by each country. Based on usability testing by the international contractor, countries were provided specifications and each chose a single device that meets the requirements and works within their country for both the field test and the main study. The United States used the Windows-based Surface Pro tablet with a mouse and keyboard attached to administer the field test and will use the same device for the main study.
Questionnaires
The background questionnaires for ICILS 2018 were developed to address the issues outlined in the ICILS context questionnaire framework. In accordance with international study procedures, the United States will use the international questionnaire, but will adapt some questions to fit the U.S. education context, as appropriate, and will add a few questions, such as student race/ethnicity.
School Questionnaire. The school questionnaire consists of two parts, one of which is completed by the principal and is expected to take 15 minutes, and one of which is intended to be completed by the ICT-coordinator (or school administrator indicated by the principal who is familiar with ICT in the school) and is also expected to take 15 minutes to complete (thus, 30 minutes total). The questionnaire will provide information on computer use, ICT resources, and relevant policies and practices in the school context. The school questionnaire will be offered online, with a paper-and-pencil backup.
Teacher Questionnaire. A teacher questionnaire will be administered to a random sample of 8th-grade teachers in each school and is expected to take about 30 minutes to complete. This questionnaire will provide information on teachers’ computer use, the ICT resources available to them, and relevant policies and practices in the school context. The teacher questionnaire will be offered online, with a paper-and-pencil backup.
Student Questionnaire. The student questionnaire is computer-based, is expected to take 30 minutes, and is completed by each student after the student assessment. It gathers information about computer use in and outside of school, attitudes toward technology, self-reported computer proficiency, and background characteristics such as gender and race/ethnicity.
A.3 Improved Information Technology (Reduction of Burden)
The ICILS 2018 design and procedures are prescribed internationally and data collection involves computer-based student assessments and questionnaires, as well as online or paper-and-pencil questionnaires for schools and teachers. Each participating nation is expected to adhere to the internationally prescribed design. In the United States, the school and teacher questionnaires will be made available to school administrators and teachers online as the main mode of administration, with a paper-and-pencil backup to facilitate user preference for participation.
A communication website will be used for ICILS 2018 to provide a simple, single source of information to engage schools and maintain high levels of school involvement. This portal will be used throughout the assessment cycle to inform schools of their tasks and to provide them with easy access to information tailored to their anticipated needs. We plan to gather eighth-grade student and teacher lists from participating schools electronically using a secure electronic filing (e-filing) process, a system for submitting lists of student information, including student background information from school records. Instructions to school coordinators on how to submit student and teacher lists are included in Appendix A. E-filing has been used successfully in NAEP for more than 10 years, and was used in TIMSS 2015 and in the Program for International Student Assessment (PISA) 2012 and 2015 assessments. The e-filing system provides efficiency gains and built-in data quality checks.
A.4 Efforts to Identify Duplication
In the United States, the National Assessment of Educational Progress (NAEP) technology and engineering literacy (TEL) assessment was administered to a nationally representative sample of eighth-grade students in 2014. TEL refers to the capacity to use, understand, and evaluate technology as well as to understand technological principles and strategies needed to develop solutions and achieve goals. NAEP TEL was completely computer-based and included interactive scenario-based tasks. ICILS does not duplicate the NAEP TEL assessment, as the focus on computer and information literacy differs from the focus on engineering literacy.
ICILS 2018 is part of a program of international cooperative studies of educational achievement supported and funded, in part, by the U.S. Department of Education. These studies represent U.S. participation in international studies involving a broad range of countries. As part of international cooperative studies, the United States must collect the same information at the same time as the other nations in order to make valid comparisons both with other countries and with potential future ICILS data collections. While some studies in the United States may collect similar, though not identical, kinds of information (e.g., NAEP TEL), the data from those studies cannot be substituted for the information collected in ICILS because they do not allow for comparisons outside the United States. Furthermore, the data collected through ICILS are based on a unique framework that is not shared by any other state, national, or international data collection effort. In order to participate in these international studies, the United States must agree to administer the same core instruments that are administered in the other countries. Because the items measuring computer and information literacy have been developed with intensive international coordination, any changes to the instruments require international coordination and approval.
A.5 Minimizing Burden for Small Entities
The school samples for ICILS contain small-, medium-, and large-size schools, including private schools, selected with probability proportionate to their size. All school sizes are needed to ensure appropriate representation of each type of school in the selected sample. Burden will be minimized wherever possible. In addition, national contractor staff will bring laptops or tablets to the schools, will conduct all test administrations, and will assist with parental notification, consent forms, sampling, and other tasks as much as possible within each school.
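Probability-proportionate-to-size (PPS) selection, mentioned above, is commonly implemented as systematic sampling over cumulated size measures, so that larger schools have proportionally higher selection probabilities. The sketch below is a generic illustration using invented enrollment counts; it is not the ICILS sampling specification, whose frames, stratification, and replacement rules are described in Part B.

```python
import random

def pps_systematic_sample(sizes, n, seed=12345):
    """Select n units with probability proportional to size via
    systematic sampling over the cumulated size measures."""
    rng = random.Random(seed)
    total = sum(sizes)
    interval = total / n                 # sampling interval
    start = rng.uniform(0, interval)     # random start in first interval
    points = [start + k * interval for k in range(n)]
    selected, cum, idx = [], 0.0, 0
    for p in points:
        # Advance until the current unit's cumulated range contains p.
        while cum + sizes[idx] <= p:
            cum += sizes[idx]
            idx += 1
        selected.append(idx)
    return selected

# Hypothetical frame: enrollment counts for 10 illustrative schools.
enrollments = [120, 800, 450, 60, 300, 950, 200, 520, 75, 640]
sample = pps_systematic_sample(enrollments, n=3)
```

A school whose enrollment exceeds the sampling interval can be hit by more than one selection point; in practice such schools are taken into the sample with certainty.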
A.6 Frequency of Data Collection
The IEA delayed the field test data collection period for all participating countries from the originally planned collection period of March through May 2017 to the end of May 2017 through January 2018. NCES’s preference was to collect data from schools at the end of May/early June in order to have more time to analyze the data, evaluate the study operations, and make any needed adjustments before the 2018 main study. Field testing after June 2017 is to be used to evaluate study operations only and will not contribute to analysis of field test data. The main study data collection will take place as originally scheduled, from March through May 2018. This timeline is prescribed by the international contractor for ICILS, and adherence to this schedule is necessary to establish consistency in survey operations among participating countries as well as to maintain potential trend lines. Future ICILS data collections are not yet determined.
A.7 Special Circumstances
None of the special circumstances identified in the Instructions for Supporting Statement apply to the ICILS study.
A.8 Consultations outside NCES
Consultations outside NCES have been extensive and will continue throughout the life of the project. The IEA studies are developed as a cooperative enterprise involving all participating countries. An international panel of computer and information literacy and measurement experts provides substantive and technical guidance for the study, and National Research Coordinators participate in extensive discussions concerning the projects, usually with advice from national subject matter and testing experts.
The majority of the consultations (outside NCES) involve the Australian Council for Educational Research (ACER), the international study center for ICILS. ACER staff are responsible for designing and implementing the study in close cooperation with the IEA Secretariat, the IEA Data Processing and Research Center, and the national centers of participating countries. Key staff from ACER include Dr. John Ainley (project coordinator), Mr. Julian Fraillon (research director), and Dr. Wolfram Schulz (assessment coordinator), all of whom have extensive experience in developing and operating international education surveys (especially related to ICILS).
A.9 Payments or Gifts to Respondents
In order to achieve acceptable school response rates in international studies, schools in the U.S. are usually offered $200 to thank them for their participation and for the time they invest and the space they make available for the international assessments. High response rates are required by both the IEA and NCES, and are difficult to achieve in school-based studies; the U.S. has historically had difficulty achieving sufficient participation levels. As in other international assessments, such as PIRLS and TIMSS, schools will be offered $200 for their participation in ICILS. State education agencies were contacted in April 2017 and school districts in May 2017 to let them know that the ICILS 2018 sample includes schools in their jurisdiction and to ask for their support in achieving a 100% school participation rate. School recruitment began in May 2017 and will continue until the end of data collection in May 2018. Data collection will begin in March 2018 and end in May 2018. If by April 2018, a month into the 3-month data collection window, we find that we are not able to secure school cooperation at the level required by the IEA for U.S. results to be included in international comparisons, we propose to initiate a second-tier school incentive of $800, as has been effective in PISA and PIRLS. The second-tier incentive would be offered, only after all other refusal conversion methods have been exhausted, to schools that: (a) have accepted participation but are not following through, (b) are pending or have not responded, and (c) are not final refusals. In addition, if by April 2018 the unweighted school response rate including replacement schools is below 85%, a sufficient number of additional replacement schools will be selected to meet the minimum response rate requirements for U.S. data to be reported (i.e., an unweighted school response rate, including replacement schools, of at least 85%, with at least 50% of originally sampled schools participating).
Given that these newly selected replacement schools will be approached late in the short data collection window, they will be offered the second-tier $800 incentive.
The school staff serving as School Coordinators will receive $100 for their time and effort. The School Coordinator serves a critical role in data collection, functioning as the central school contact and facilitating arrangements for the assessments. They are asked to file class and student listing forms; arrange the date, time and space for the assessment; and disseminate information and consent forms to parents and students.
A check will be mailed to each school in the amount of $200 (or $800) and to each school coordinator in the amount of $100, once the ICILS assessment has been conducted in their schools.
Consistent with other international assessments, as a token of appreciation for their participation, students will receive a small gift valued at approximately $4. In the ICILS field test, each participating student received a small cloth backpack. Students will also receive a certificate with their name thanking them for participating in ICILS and representing the United States. Some schools also offer recognition parties with pizza or other treats for students who participate; however, these are not reimbursed by NCES or the contractor.
Teachers will be offered $20 for completing the ICILS teacher questionnaire to encourage their participation. To avoid sending up to 20 checks to each school for the school coordinator to distribute to teachers who complete the questionnaire, electronic Amazon gift cards in the amount of $20 will be used. Teacher email addresses are not collected prior to the assessment. Teacher invitation cards that provide information about how to access the online teacher questionnaire are distributed to selected teachers by the school coordinator. The card will include instructions for the teacher to email the ICILS Staff Help Desk upon completion of the questionnaire and to provide his or her email address. Once completion of the questionnaire is confirmed, the code to access the Amazon electronic gift card will be emailed to the teacher. In this way, teachers will see the direct link between completing the questionnaire and receiving the $20 thank-you token, and will receive the incentive very quickly after survey completion. Amazon gift cards will be used because they have no associated fees, unlike other cash card programs.
If teacher response rates are below 85% when the second-tier school incentives are activated, teachers in the schools offered the second-tier incentive described above, where teacher recruitment will begin late in the data collection window, will be offered $40 for participation. The 85% teacher response rate follows international guidelines for sufficient response rates for U.S. data to be reported. These teachers will have less time in the school year to complete the survey, given that they will be recruited later than other teachers, and at that point we will need them to participate at a high rate to meet the 85% requirement. Because teachers in the second-tier schools will not yet have been contacted by their schools to complete the survey when the second-tier school incentives are initiated in April 2018, the $40 incentive will be the first offer they receive. Although this is not a randomized experiment, it will allow us to compare response rates under the $40 versus the $20 incentive and thus inform the design of potential teacher incentive experiments in future ICILS administrations.
Historically, participation is high among school administrators without offering incentives; therefore, no incentive will be offered for completion of the school administrator questionnaire.
A.10 Assurance of Confidentiality
Data security and confidentiality protection procedures have been put in place for ICILS to ensure that Westat and its subcontractors comply with all privacy requirements, including:
The statement of work of this contract;
Privacy Act of 1974 (5 U.S.C. §552a);
Family Educational Rights and Privacy Act (FERPA) of 1974 (20 U.S.C. §1232g);
Privacy Act Regulations (34 CFR Part 5b);
Computer Security Act of 1987;
U.S.A. Patriot Act of 2001 (P.L. 107-56);
Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9573);
Confidential Information Protection and Statistical Efficiency Act of 2002 (CIPSEA);
E-Government Act of 2002, Title V, Subtitle A;
Cybersecurity Enhancement Act of 2015 (6 U.S.C. §151);
The U.S. Department of Education General Handbook for Information Technology Security General Support Systems and Major Applications Inventory Procedures (March 2005);
The U.S. Department of Education Incident Handling Procedures (February 2009);
The U.S. Department of Education, ACS Directive OM: 5-101, Contractor Employee Personnel Security Screenings;
NCES Statistical Standards; and
All new legislation that impacts the data collected through the inter-agency agreement for this study.
Furthermore, Westat will comply with the Department’s IT security policy requirements as set forth in the Handbook for Information Assurance Security Policy and related procedures and guidance, as well as IT security requirements in the Federal Information Security Management Act (FISMA), Federal Information Processing Standards (FIPS) publications, Office of Management and Budget (OMB) Circulars, and the National Institute of Standards and Technology (NIST) standards and guidance. All data products and publications will also adhere to the revised NCES Statistical Standards, as described at the website: http://nces.ed.gov/statprog/2012/.
The laws pertaining to the use of personally identifiable information are clearly communicated in correspondence with states, districts, schools, teachers, students, and parents. Letters and information materials will be sent to parents and school administrators describing the study, its voluntary nature, and the extent to which respondents and their responses will be kept confidential (see copies in Appendix A):
NCES is authorized to conduct this study under the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543). All of the information you provide may only be used for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).
The following statement will appear on the login page for ICILS and the front cover of the printed questionnaires (the phrase “search existing data resources, gather the data needed” will not be included on the student questionnaire):
The National Center for Education Statistics (NCES), within the U.S. Department of Education, conducts ICILS in the United States as authorized by the Education Sciences Reform Act of 2002 (20 U.S.C. §9543). All of the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).
According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such collection displays a valid OMB control number. The valid OMB control number for this voluntary information collection is 1850-0929. The time required to complete this information collection is estimated to average [XX] minutes per [respondent type], including the time to review instructions [, search existing data resources, gather the data needed,] and complete and review the information collection. If you have any comments or concerns regarding the accuracy of the time estimate(s), suggestions for improving the form, or questions about the status of your individual submission of this form, write directly to: International Computer and Information Literacy Study (ICILS), National Center for Education Statistics, PCP, 550 12th St., SW, 4th floor, Washington, DC 20202.
OMB No. 1850-0929, Approval Expires xx/xx/20yy.
The ICILS confidentiality plan includes the signing of confidentiality agreements and notarized nondisclosure affidavits by all contractor and subcontractor personnel and field workers who will have access to individual identifiers. The plan also includes personnel training on the meaning of confidentiality, particularly as it relates to handling requests for information and providing assurance to respondents about the protection of their responses; controlled and protected access to computer files under the control of a single database manager; built-in safeguards concerning status monitoring and receipt control systems; and a secured, operator-manned in-house computing facility. Data files, accompanying software, and documentation will be delivered to NCES at the end of the project. Neither names nor addresses will be included on any data file.
NCES understands the legal and ethical need to protect the privacy of the ICILS respondents and has extensive experience in developing data files for release that meet the government’s requirements to protect individually identifiable data from disclosure. The contractor will conduct a thorough disclosure analysis of the ICILS 2018 data when preparing the data files for use by researchers, in compliance with 20 U.S.C. §9573. Schools with high disclosure risk will be identified and, to ensure that individuals may not be identified from the data files, a variety of masking strategies will be used, including swapping data and omitting key identification variables (i.e., school name and address) from both the public- and restricted-use files (though the restricted-use file will include an NCES school ID that can be linked to other NCES databases to identify a school); omitting key identification variables such as state or ZIP Code from the public-use file; and collapsing or developing categories for continuous variables to retain information for analytic purposes while preserving confidentiality in public-use files.
A.11 Sensitive Questions
The questionnaires do not include items considered to be of a sensitive nature.
A.12 Estimates of Burden
This package shows the estimated burden to respondents for all ICILS 2018 activities and requests approval of the burden associated with the main study data collection. Burden estimates are shown in Table A.1.
Some districts, known as “special handling districts,” require completion of a research application before they will allow schools under their jurisdiction to participate in a study. Based on an initial assessment of data collections from similar studies such as TIMSS, we have estimated the number of special handling districts in the main study sample (shown in Table A.1). Contacting special handling districts begins with updating district information from online sources. Calls are then placed to verify where to send the completed research application forms and, if necessary, to collect contact information for this process. During the call, we also ask how much time the district spends reviewing similar research applications. The estimated number of such districts represents those with particularly detailed application forms and lengthy approval processes. This operation began in March 2017 to allow sufficient time for special handling districts’ review processes. We will continue to work with these districts until we receive a final response (approval or denial of the request), up to the end of data collection in May 2018.
For the main study, the target sample sizes for the United States are 300 schools and 9,000 students. The minimum sample size requirements for the field test were 32 schools and 570 students. The burden table assumes exceeding the minimum requirements and is based on a sample of 11,250 students in the main study. The time required for students to respond to the assessment (cognitive items) portion of the study and its associated directions is shown in gray font and is not included in the totals because it is not subject to the PRA. Student, administrator, and teacher questionnaires are included in the requested burden totals. Recruitment and pre-assessment activities include the time to review study plans by the school districts that require research application and approval before contacting their schools, and the time involved in a school deciding to participate, completing teacher and student listing forms, distributing parent notification and consent materials, and arranging assessment space. Burden estimates for the main study data collection are also provided for information purposes in Table A.1.
The hourly rates for teachers/instructional staff, noninstructional staff/coordinators, and principals ($28.75, $22.58, and $45.86, respectively) are based on Bureau of Labor Statistics (BLS) May 2016 National Occupational and Employment Wage Estimates1, assuming 2,080 hours per year. The federal minimum wage of $7.25 is used as the hourly rate for students. For the ICILS main study, a total of 9,451 burden hours are anticipated, resulting in an estimated burden time cost to respondents of approximately $177,264.
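The cost estimate above can be reproduced from the burden hours in Table A.1 and the hourly rates just described. The following sketch illustrates the arithmetic; the mapping of each respondent category to a wage rate is our assumption for illustration, not stated explicitly in the document.

```python
# Sketch reproducing the estimated respondent burden cost from the Table A.1
# burden hours (black-font rows only) and the stated hourly rates.
# The assignment of respondent categories to wage rates is an assumption.
RATES = {"principal": 45.86, "coordinator": 22.58, "teacher": 28.75, "student": 7.25}

rows = [
    (405 + 45, RATES["principal"]),   # school recruitment (original + replacement)
    (1200, RATES["coordinator"]),     # school coordinator
    (80 + 240, RATES["principal"]),   # district IRB staff and panel approval
    (4781, RATES["student"]),         # student questionnaire
    (150, RATES["principal"]),        # school administrator questionnaire
    (2550, RATES["teacher"]),         # teacher questionnaire
]

total_hours = sum(hours for hours, _ in rows)
total_cost = sum(hours * rate for hours, rate in rows)
print(total_hours)          # 9451
print(round(total_cost))    # 177262, i.e., approximately $177,264 as stated
```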
Table A.1. Burden estimates for ICILS 2018 Main Study

Data collection instrument | Sample size | Expected response rate | Number of respondents | Number of responses | Minutes per respondent | Total burden hours
School Recruitment (Original Schools) | 353 | 0.76* | 270 | 270 | 90 | 405
School Recruitment (Replacement Schools) | 120 | 0.25* | 30 | 30 | 90 | 45
School Coordinator | 353 | 0.85 | 300 | 300 | 240 | 1,200
District IRB Staff Study Approval | 40 | 1 | 40 | 40 | 120 | 80
District IRB Panel Study Approval | 240 | 1 | 240 | 240 | 60 | 240
Student Directions | 11,250 | 0.85 | 9,562 | 9,562 | 15 | 2,391
Student Assessment | 11,250 | 0.85 | 9,562 | 9,562 | 120 | 19,124
Student Questionnaire | 11,250 | 0.85 | 9,562 | 9,562 | 30 | 4,781
School Administrator Questionnaire | 300 | 1 | 300 | 300 | 30 | 150
Teacher Questionnaire (20 teachers per school) | 6,000 | 0.85 | 5,100 | 5,100 | 30 | 2,550
Total Burden Main Study | -- | -- | 15,842 | 15,842 | -- | 9,451

Note: Total Burden Requested in this Submission includes the already approved burden associated with ICILS 2018 recruitment and pre-assessment activities, as well as the data collection burden (items in black font). Total student burden does not include the time for the cognitive assessment and its associated instructions (in gray font). Minutes per respondent for School Recruitment include time for the second-tier recruitment effort.
* The satisfactory sampling participation rate includes a final unweighted school response rate of at least 50% of original schools and at least 85% of original plus replacement schools, with the original school sample as the denominator.
A.13 Total Annual Cost Burden
No cost to respondents is anticipated beyond the estimated burden cost described in Section A.12.
A.14 Annualized Cost to Federal Government
The cost to the federal government for conducting ICILS 2018, including the 2018 main study data collection and scoring, is estimated to be $4,614,626 over a 2-year period (see the table breakdown below). The cost of the main study data collection is $2,837,231. These figures include all direct and indirect costs.
Components with breakdown | Estimated costs
FIELD TEST (2017) |
Recruitment | 125,000
Preparations (e.g., adapting instruments, sampling) | 307,650
Data collection, scoring, and coding | 712,643
MAIN STUDY (2018) |
Recruitment | 149,885
Preparations (e.g., adapting instruments, sampling) | 451,217
Second-tier recruitment | 31,000
Data collection, scoring, and coding | 2,837,231
Current package components | 2,868,231
Grand total | $4,614,626
A.15 Program Changes or Adjustments
The apparent increase in respondent burden reflects the scope of this request: the previous approval covered ICILS main study recruitment and the field test, whereas this request covers burden for ICILS 2018 main study recruitment and data collection.
A.16 Plans for Tabulation and Publication
Based on the data collected in the main study, ACER will prepare an international report to be released in November 2019. As has been customary, NCES will also release a report at the same time as the international reports are released, interpreting the results for the U.S. audience. NCES reports on initial data releases are generally limited to simple bivariate statistics. There are currently no plans to conduct complex statistical analyses of either dataset. An example of a similar report on another international assessment can be found at http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2013010rev. In the spring of 2020, ACER will also release the international technical report, describing the design and development of the assessment as well as the scaling procedures, weighting procedures, missing value imputation, and analyses. After the release of the international data, NCES plans to release the national data and an accompanying User’s Guide for the study.
Electronic versions of each publication are made available on the NCES website. Schedules for tabulation and publication of ICILS 2018 results in the United States are dependent upon receiving data files from the international sponsoring organization. With this in mind, the expected data collection dates and a tentative reporting schedule are provided in table A.2.
Table A.2. Schedule of Activities for ICILS 2018 Field Test and Main Study

Dates | Activity
April 2016–December 2016 | Prepare data collection manuals, forms, assessment materials, and questionnaires
October 2016–December 2016 | Contact and gain cooperation of states, districts, and schools for field test
December 2016–March 2017 | Select student samples
May/June 2017 (and through January 2018 for operations testing only) | Collect field test data
July 2017–October 2017 | Deliver raw data to international sponsoring organization
August 2017–November 2017 | Review field test results
March 2017–May 2018 | Prepare for the main study/recruit schools
April 2017 | Begin to contact and gain cooperation of state education agencies with sampled schools
May 2017 | Begin to contact and gain cooperation of school districts with sampled schools
October 2017 | Resume school contacts
March 2018–May 2018 | Collect main study data
June 2019 | Receive final data files from international sponsors
June 2019–November 2019 | Produce report
A.17 Display OMB Expiration Date
The OMB expiration date will be displayed on all data collection materials.
A.18 Exceptions to Certification Statement
No exceptions to the certifications are requested.
1 The average hourly earnings of teachers/instructional staff in the May 2016 National Occupational and Employment Wage Estimates sponsored by the Bureau of Labor Statistics (BLS) is $28.75 (hourly rate of Middle School Teachers), of noninstructional staff is $22.58, and of principals/education administrators is $45.86. If the mean hourly wage was not provided, it was computed assuming 2,080 hours per year. The exception is the student wage, which is based on the federal minimum wage. Source: BLS Occupational Employment Statistics, http://data.bls.gov/oes/; occupation codes: Middle School Teachers, Except Special and Career/Technical Education (25-2022); Education, Training, and Library Workers, All Other (Elementary and Secondary Schools) (25-9099); and Education Administrators, Elementary and Secondary Schools (11-9032); accessed on October 6, 2017.