Program for International Student Assessment 2021 (PISA 2021) Main Study Recruitment and Field Test
Supporting Statement Part B
OMB# 1850-0755 v.23
National Center for Education Statistics (NCES)
U.S. Department of Education
Institute of Education Sciences
Washington, DC
August 2019
revised October 2019
TABLE OF CONTENTS
B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
B.2 Procedures for the Collection of Information
B.2a Statistical Methodology
B.3 Maximizing Response Rates
B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
PISA 2021 assesses students nearing the “end of their compulsory school experience” and, as in all prior administrations of PISA, is conducted in the United States by the National Center for Education Statistics (NCES) within the U.S. Department of Education. For international comparability, the target population is defined as students who are 15 years old and in grade 7 or higher. A range of exact birthdates is specified by the international coordinating committee based on the months in which the data will be collected; students must be between the ages of 15 years and 3 completed months and 16 years and 2 completed months at the beginning of the testing period. In the U.S., the universe for the selection of schools is all types of schools in the 50 states and the District of Columbia. Within sampled schools, students will be selected for participation by drawing a random sample of the 15-year-old students.
This section presents information on the PISA international standards and describes the school and student sampling, recruitment, and data collection procedures for the PISA 2021 main study. Gaining schools’ and students’ cooperation in voluntary research is increasingly challenging, and employing effective strategies for gaining the cooperation of schools is central to the data collection effort. PISA 2021 main study states, districts, and schools will be recruited beginning in October 2020, and data collection will be conducted from September through November 2021.
The Technical Standards for PISA 2021 main study established by the international governing board include the following:
Standard 1.8 The student sample size for the computer-based mode is a minimum of 6,300 assessed students, and 2,100 for additional adjudicated entities, or the entire PISA Defined Target Population where the PISA Defined Target Population is below 6,300 and 2,100 respectively. The student sample size of assessed students for the paper-based mode is a minimum of 5,250. The minimum student sample size for financial literacy in the national sample is an additional 1,650 students, for a total of a minimum of 6,900 students that need to be assessed in PISA 2021 in the United States. If individual states participate in the U.S. to obtain state-level estimates, each state administering financial literacy would add approximately 550 students.
Standard 1.9 The school sample size needs to result in a minimum of 150 participating schools, and 50 participating schools for additional adjudicated entities, or all schools that have students in the PISA Defined Target Population where the number of schools with students in the PISA Defined Target Population is below 150 and 50 respectively. Countries not having at least 150 schools, but which have more students than the required minimum student sample size, can be permitted, if agreed upon, to take a smaller sample of schools while still ensuring enough sampled PISA students overall.
Standard 1.10 The minimum acceptable sample size in each school is 25 students per school (all students in the case of schools with fewer than 25 eligible students enrolled).
Standard 1.11 The final weighted school response rate is at least 85 percent of sampled eligible and non-excluded schools. If a response rate is below 85 percent, then an acceptable response rate can still be achieved through agreed upon use of replacement schools.
Standard 1.12 The final weighted student response rate is at least 80 percent of all sampled students across responding schools.
Standard 1.13 The final weighted sampling unit response rate for any optional cognitive assessment is at least 80 percent of all sampled students across responding schools. In addition, NCES has a standard in which student response rate should be at least 85 percent, and the sampling design described below is based on that rate.
The design for this study will be a self-weighting, stratified, two-stage design using probability proportional to size (PPS) sampling. There will be no oversampling of schools or students. Schools will be selected in the first stage with PPS, and students will be sampled in the second stage, yielding overall equal probabilities of selection.
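The self-weighting property of the two-stage design can be illustrated with a short sketch (the school sizes and sample counts below are hypothetical, not the actual PISA frame): when schools are drawn with probability proportional to their estimated number of 15-year-olds and a fixed number of students is then sampled within each school, the two stage probabilities multiply to the same overall selection probability for every student.

```python
# Illustration of the self-weighting two-stage PPS design (hypothetical data).
# Stage 1: schools drawn with probability proportional to size (PPS).
# Stage 2: an equal-probability sample of a fixed number of students per school.

schools = {"A": 400, "B": 150, "C": 500, "D": 250}  # estimated 15-year-olds per school
n_schools = 2   # schools to select at stage 1
cluster = 52    # students sampled per school (the main-study target)

total = sum(schools.values())
overall = {}
for name, size in schools.items():
    p_school = n_schools * size / total   # stage-1 inclusion probability
    p_student = cluster / size            # stage-2 within-school probability
    overall[name] = p_school * p_student  # overall selection probability

print(overall)  # the size measure cancels: every value is n_schools * cluster / total
```

Because the size measure cancels in the product, each student’s overall probability reduces to n_schools × cluster / total, which is what makes the design self-weighting.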
Target Populations
The national PISA target population is 15-year-old students attending education institutions located within the U.S. in grades 7 and higher. The target population for any participating state is the same. The plan is to implement the main survey in the fall of 2021, with a field test in the spring of 2020. The specific definition of age eligibility that will be used in the survey is “…between 15 years and 3 (completed) months to 16 years and 2 (completed) months at the beginning of the testing window.”
Sampling Frame of Schools
The population of schools for PISA 2021 is defined as all schools containing any 15-year-olds in grades 7 through 12. As in previous PISA cycles, the school sampling frame for the PISA 2021 main study sample will be developed from the most up-to-date NCES Common Core of Data (CCD) and Private School Universe Survey (PSS) datasets. For the PISA 2021 field test, we will use the school sampling frame prepared for the National Assessment of Educational Progress (NAEP) 2019, which uses the 2017-2018 CCD and the 2017-2018 PSS school data. We will minimize overlap, to the degree possible, with NAEP and the Progress in International Reading Literacy Study (PIRLS), which will be collecting data in schools during the 2019-2020 (NAEP LTT 2020 and PIRLS 2021 Field Test) and 2020-2021 (NAEP 2021 and PIRLS 2021 Main Study) school years.
The grade structure of the school is a key stratification variable; stratification is designed to reduce sampling error, and this is especially important in PISA because data analyses have shown that achievement is highly related to grade. Other stratification variables may include public/private control, region of the country, location (urban/suburban/town/rural, etc.), and enrollment by race/ethnicity.
Field Test Sampling
International standards do not require a formal probability sample of schools for the PISA field test; it is sufficient that the sample of schools be representative of a broad range of schools from across the U.S. The national field test requires a minimum student sample of 2,400 students. The U.S. plans to select a sample of 50 schools, each with two substitute schools, with the expectation that 50 schools will ultimately participate, to provide an adequate participating student sample. Among the 50 schools, 45 will be public schools and 5 will be private schools. This allows for school and student nonresponse and also for school-level and within-school exclusions.
To obtain a school sample that is broadly representative of schools across the U.S., we will target a convenience sample of schools with grade 9 and above and enrollment of at least 60 students in grades 9 and 10 (where most 15-year-olds are found) excluding schools with grades 7 and 8 only, small public schools, and schools sampled by NCES for other educational studies in 2019 and 2020. We will use the sample stratification characteristics used in previous PISA cycles including census region, locality (city/urban fringe/town/rural MSA), school type (public/private), grade span, and minority enrollment. The sample will be a stratified systematic sample, with sampling probabilities proportional to measures of size, where the measure of size is the estimated number of 15-year-olds.
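The stratified systematic PPS selection described above can be sketched as follows (the frame, measures of size, and sample size are hypothetical): schools are listed in stratum order, a sampling interval is computed from the total measure of size, and a school is selected wherever the running cumulative size crosses one of the evenly spaced target points.

```python
import random

# Sketch of systematic PPS selection over a sorted (implicitly stratified)
# frame (hypothetical schools). The measure of size (MOS) is the estimated
# number of 15-year-olds; larger schools are proportionally more likely to
# contain a target point and thus to be selected.

frame = [("Sch01", 300), ("Sch02", 120), ("Sch03", 450),
         ("Sch04", 200), ("Sch05", 380), ("Sch06", 150)]  # (school, MOS)
n = 3  # number of schools to select

total_mos = sum(mos for _, mos in frame)
interval = total_mos / n                      # sampling interval
start = random.uniform(0, interval)           # random start in [0, interval)
targets = [start + k * interval for k in range(n)]

selected, cum, t = [], 0.0, 0
for school, mos in frame:
    cum += mos
    while t < n and targets[t] <= cum:        # target point falls in this school
        selected.append(school)
        t += 1

print(selected)  # exactly n schools from the frame
```

Sorting the frame by the stratification variables before drawing the sample is what makes the systematic draw implicitly stratified.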
The Maple sampling software provided by the PISA International Consortium will be used to select the student sample in each school. For the field test, the target cluster size of students per school will be 60 students with the goal of assessing at least 48 students per school (after refusals and student ineligibility). For the PISA 2021 main study, we expect to have a target cluster size of 52, the same as it was in PISA 2018.
Both in the field test and the main study, each school will have a school coordinator to assist with arranging for the study and to prepare a list of all eligible students in the school using a standardized Student Listing Form. Each completed list will be submitted to Westat and the information entered into the Maple sampling software. The collected data will be used only to select the student sample in each school. Once no further follow-up with sampled students is necessary, all student listing data will be destroyed.
Field Test Instrumentation and Design
The PISA field test has the following goals:
evaluation of the invariance of item parameters compared to previous PISA cycles for the 2021 cycle (CBA and PBA);
evaluation of the invariance of item characteristics (in preparation for adaptive testing) when delivered in fixed unit order vs. variable unit order;
estimation of initial item parameters to evaluate the quality of the new items in preparation for the selection of main study items and for the implementation of the adaptive design for mathematics in the main study;
evaluation of sampling and survey operations aspects; and,
assessment of how well the computer platform functions within and across participants.
Cognitive Design. The PISA 2021 design is planned to follow closely that of PISA 2018, in that the field test will contribute information to construct a multistage adaptive testing (MSAT) design for mathematics, as was done for reading in 2018. The reading MSAT will be carried forward in PISA 2021.
For the PISA 2021 field test, there are a total of 66 forms containing a combination of clusters in trend mathematics and science, trend mathematics and reading MSAT, trend mathematics and new mathematics, trend mathematics and financial literacy, financial literacy and reading, or only new mathematics. Each student will receive one form, administered in a 2-hour session, with the combination of clusters depending on the form. The forms are combined and organized in three distinct groups in order to ensure adequate coverage of newly developed items and to examine the psychometric properties of the items. MSAT was successfully developed and administered for reading as a major domain in PISA 2018, and MSAT will be used in 2021 for reading as a minor domain. As stated above, one of the goals to be met by the field test is collecting information in preparation for the planned introduction of MSAT for the major domain of mathematical literacy, the use of a reduced reading MSAT, and the use of previously used nonadaptive designs for the other minor and innovative domains. The field trial design will thus include variable unit positioning within clusters and will investigate the effects of variable unit positioning versus fixed positions in preparation for the main study, the hypothesis being that item parameter invariance is only supported when using intact clusters.
Cognitive items to be administered in the field trial consist of the following subjects and number of clusters (groups of items/units):
Mathematics = 6 intact trend clusters and 12 new clusters;
Reading = a reduced version of the 2018 reading MSAT;
Science = 6 intact trend clusters; and
Financial Literacy = 2 clusters assembled from new and trend items.
The field test assessment design utilizes 6 trend clusters and 12 new clusters of mathematics items, 6 trend clusters of science, 2 clusters combining new and trend items for financial literacy, and for reading, a reduced version of the reading MSAT. These clusters are organized in a rotation within three groups of students. Within a school, sampled students will be assigned to each of the three groups.
Group 1 will receive 4 trend clusters in combinations of science and mathematics, mathematics and reading, mathematics and financial literacy, or reading and financial literacy. These clusters will be administered in fixed unit order. Group 1 is expected to yield 137 responses per mathematics and science item and 80 responses per reading item.
The approach from 2015 and 2018 to reduce the distinction between major and minor domains has been supported by analytic approaches that utilize data from multiple cycles. Data collected during the major domain cycle provide a basis for the analyses of the two subsequent minor domain cycles. In terms of the field test data analysis, Group 1 forms can be directly linked to the existing data from prior cycles. The variability in the psychometric characteristics of the 2021 field test data relative to the 2018 analysis results gives a baseline for the magnitude of error expected across data collections when moving to adaptive testing in mathematics. The variability in Group 1, given fixed unit order within a cluster with full construct coverage, is the lower bound that can be used to evaluate the variability of psychometric characteristics observed for Group 2 (variable unit order) and Group 3 (fixed unit order) of the new mathematics clusters. The same variability will also be used for comparisons of the mathematics trend clusters of Group 1 (fixed unit order) and Group 2 (variable unit order).
Group 2 will receive new and trend items in mathematics, with variable unit ordering applied within clusters. The design provides variations in unit ordering within clusters that can be examined relative to Group 1. Each of the 24 Group 2 forms contains a combination of one of the 6 trend mathematics clusters and 3 of the 12 new mathematics clusters. Every trend cluster will be paired with each new cluster once and appears once in each position. The design is expected to yield 103 responses per trend item and 154 responses per new item.
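The form-count arithmetic implied here can be checked with a short sketch: 24 forms, each carrying one trend cluster and 3 of the 12 new clusters, give 24 × 3 = 72 trend-new slots, exactly enough to cover each of the 6 × 12 = 72 trend-new pairs once. The construction below is illustrative only; it does not model the positional balancing of units within clusters.

```python
# Illustrative check of the Group 2 pairing combinatorics (not the actual
# form assembly): 6 trend clusters x 4 forms each = 24 forms; each form
# carries 3 of the 12 new clusters, so every trend-new pair occurs once.

trend = [f"T{i}" for i in range(1, 7)]    # 6 trend mathematics clusters
new = [f"N{j}" for j in range(1, 13)]     # 12 new mathematics clusters

forms = []
for t in trend:
    for k in range(4):                    # 4 forms per trend cluster
        forms.append((t, new[3 * k:3 * k + 3]))

pairs = {(t, n) for t, ns in forms for n in ns}
print(len(forms), len(pairs))  # 24 forms covering 72 distinct trend-new pairs
```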
Group 3 contains new mathematics clusters and is based on a fixed order of units to provide a basis of comparison to the varying unit orders in Group 2. Each form will be administered to 757 students total.
Financial Literacy Design. The U.S. is again participating in the optional financial literacy assessment in 2021. The 2021 assessment design is the same as the one used in PISA 2018, with an expanded student sample used to assess financial literacy in the same session in which mathematics, science, and reading are assessed. Approximately 12 students will be assigned financial literacy assessment items, while the remaining 48 students will receive combinations of mathematics, reading, and science items. Each student sampled for financial literacy will receive two clusters of mathematics or reading and two clusters of financial literacy. The design for financial literacy is based on a yield of 384 assessed students. These students will be selected separately from the core assessment sample but will be administered the assessment in the same session as the students taking the core assessment and will be included in Group 1 (see explanation above). The financial literacy instrument will contain 2 clusters with trend items from 2012, 2015, and 2018 as well as new interactive items.
Background Questionnaire Instruments. The questionnaires have been developed to address the questionnaire framework developed for PISA 2021. The framework defines 19 modules across the school and student questionnaires comprising student background characteristics, teaching and learning practices, school governance, and non-cognitive/metacognitive constructs dealing with reading-related outcomes, attitudes, and motivational strategies. In addition, the questionnaires include items that have been included in multiple cycles of PISA, allowing the investigation of patterns and trends over time.
School questionnaire. The principal or designate from each participating school will be asked to provide information on basic demographics of the school population and more in-depth information on one or more specific issues (generally related to the content of the assessment in the major domain, which is mathematics in 2021). Basic information to be collected includes data on school location; measures of socio-economic context of the schools’ student population, including location, school resources, facilities, and community resources; school size; staffing patterns; instructional practices; and school organization. The in-depth information is designed to address a very limited selection of issues that are of particular interest and that focus primarily on the major content domain, mathematics. For both the field test and main study, it is anticipated that the school questionnaire will take approximately 45 minutes. It will be available to respondents online. The intent is that the school principal will respond to the questionnaire, but they may designate someone to complete the questionnaire. Principals, or their designate, access the questionnaire through a secure website with a username and password provided to them in their invitation (see Appendix A). Responding to the instrument is flexible. Respondents may break off and return to continue responding to the questionnaire by logging in again with their credentials. In addition to completing questions, respondents may review and change their responses to previously answered questions.
Student questionnaires. In the 2021 cycle, the U.S. will administer three student questionnaires that will be completed in a single student questionnaire session following the assessment session: the core student questionnaire, the financial literacy questionnaire, and the information and communication technology (ICT) familiarity questionnaire. The content of these instruments is described below. Students access the questionnaires through the Student Delivery System (SDS), in the same way they access the assessment, approximately 15 minutes after the assessment session has ended. The three questionnaires are designed to look like a seamless one-hour questionnaire, with no hard breaks between them. Unlike the school questionnaire, the student questionnaires are administered only from the SDS and only during the in-school questionnaire session; there is no access to the student questionnaires outside of school hours.
Student core questionnaire. Participating students will be asked to provide information pertaining primarily to the major assessment domain in 2021, mathematics. Information to be collected includes demographics (e.g., age, gender, language, race, and ethnicity); socio-economic background of the student (e.g., parental education, economic background); the student’s education career; and access to educational resources and their use at home and at school, which have been standard questions in PISA since the earliest rounds. Domain-specific information will include instructional experiences and time spent in school, as perceived by the students, and student attitudes towards mathematics. The goal is for the student questionnaire to take approximately 30 minutes to complete in the field test and the main study. Although two forms of the student questionnaire were attempted in the PISA 2018 field test, the PISA 2018 main study reverted to fielding a single form. The PISA 2021 field test will implement a matrix sampling design in which different respondents receive different sets of items to reduce student burden while maintaining content coverage across relevant areas. This approach is viable for PISA 2021 due to the limited time available for the questionnaire and the large student sample sizes in large-scale assessments. This design is expected to be carried forward to the main study.
The approach being proposed for PISA 2021 will utilize an alternative matrix sampling design that rotates questions within constructs instead of across constructs. In the PISA 2021 proposed within-construct matrix sampling design, every student will receive questions on all constructs but only answer a subset of all questions for each construct, thus resulting in a complete database in terms of construct-level indices. This approach will be implemented for a select number of scales in the field test. Using within-construct matrix sampling is an innovation to the questionnaire design in PISA 2021 that has not been used in previous PISA cycles. A decision on the use of this design for the main study will be made based on an empirical evaluation of the PISA 2021 field test results.
Financial Literacy (FL) questionnaire. The FL questionnaire aims to examine students’ experience with money matters, such as having savings accounts, debit or prepaid cards, as well as whether they have experienced financial-related lessons in their school careers. Many of the items in the FL questionnaire were previously administered in 2012, 2015, and 2018, with a handful of new items being piloted in the field trial. The FL questionnaire for students is expected to take approximately 15 minutes to complete.
Information and Communication Technology (ICT) Familiarity questionnaire. The ICT questionnaire aims to examine students’ ICT activities and domain-specific attitudes including access to and use of ICT at home and at school, students’ attitudes towards and self-confidence in using computers, self-confidence in doing ICT tasks and activities; and navigation indices extracted from log-file data (number of pages visited, number of relevant pages visited). The ICT questionnaire for students is expected to take approximately 15 minutes to complete. The U.S. successfully administered the ICT questionnaire in the 2018 cycle of PISA.
Main Study Sampling
For the core computer-based assessment in mathematics, reading, and science, the internationally required minimum number of completed assessments is 6,300 students in 150 schools. An additional 1,650 assessed students are required for education systems assessing financial literacy, for a total of 7,950 assessed students. In past PISA rounds up until 2018, the U.S. typically assessed between 5,600 and 5,900 students in 165 schools and sampled 42 students per school. However, as was done in the PISA 2018 main study, in PISA 2021, in order to achieve the larger number of assessed students required for financial literacy, and to account for anticipated nonparticipation and student ineligibility, wherever possible we will sample 52 students per school (42 students for the core assessment + 10 students for financial literacy). Assuming the same response level as in PISA 2018, the initial target is a total sample of about 288 schools, with an estimated 256 schools eligible, to yield about 218 participating schools (assuming a total 85 percent participation rate among schools, after replacement). As allowed under the international sampling standards, to achieve the target final school response rate, we will use replacement schools to complete the sample.
The student-per-school target for the core assessment is at least 42 completed student assessments per school. Assuming a within-school student assessment response rate of 85 percent (rates were 85 percent in 2000, 82 percent in 2003, 91 percent in 2006, 86 percent in 2009, 89 percent in 2012, 89 percent in 2015, and 85 percent in 2018), the original sample size of students within schools will be 52.1 In schools that do not have 52 PISA-eligible students, all eligible students will be sampled. Should any states participate in the 2021 assessment, each state would have a sample of 54 schools and 2,808 students to yield approximately 2,330 assessed students. As the main study plans for states and subnational jurisdictions are finalized, this information will be updated in the respondent burden table in the Supporting Statement Part A.
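The yield arithmetic above can be verified with a short calculation (the rounding conventions here are an assumption; the counts and the 85 percent response rates are those stated in the text):

```python
# Check of the main-study yield arithmetic stated above.
# School level: 288 sampled -> ~256 eligible -> ~218 participating at 85%.
# Student level: 52 sampled per school at an assumed 85% response rate
# covers the target of 42 completed assessments per school.

sampled_schools = 288
eligible_schools = 256
school_response_rate = 0.85
participating = round(eligible_schools * school_response_rate)

students_sampled_per_school = 52
student_response_rate = 0.85
assessed_per_school = round(students_sampled_per_school * student_response_rate, 1)

print(participating, assessed_per_school)  # 218 schools, 44.2 assessed per school
```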
Nonresponse Bias Analysis, Weighting, Sampling Errors
It is inevitable that nonresponse will occur at the school and student levels. We will analyze the nonrespondents and provide information about whether and how they differ from the respondents along dimensions for which we have data for the nonresponding units, as required by NCES statistical standards. After the international contractor calculates weights, sampling errors will be calculated for a selection of key indicators, incorporating the full complexity of the design, that is, clustering and stratification.
Based on recruitment knowledge gained in PISA 2018 and other NCES studies, states and districts will be notified about the study before the sampled schools in their jurisdictions are contacted. School staff often want to be assured that their school district knows about the study before they agree to participate. The planned PISA 2021 approach to state, district, and school recruitment is described in this section, and all of the respondent recruitment materials for the field test are provided in Appendixes A and B.
State Recruitment. For the field test, in December 2019, state education agencies (SEAs) in states that contain schools sampled for the PISA 2021 field test will be mailed a package that includes the state letter and the PISA 2021 field test advance materials. We are working with the NAEP State Coordinators (NSCs) in this state recruitment effort. Some NSCs will send the state letter and personally follow up to answer any questions; otherwise, Westat school recruiters will mail the package and follow up with the districts and schools being recruited. The main study contact will begin in October 2020 and follow the same procedures.
District Recruitment. Also in December 2019, shortly after the state mailing, advance packages will be mailed to district superintendents. Each package contains an introductory letter, including a list of sampled schools in the district’s jurisdiction, and the PISA 2021 field test advance materials. The district mailings will come from the NSC or Westat, depending on each NSC’s preference. Shortly after the mailing, the district superintendent will be contacted by phone to inform him or her of the study, ensure the PISA 2021 field test package was received, and answer any questions. Any issues with approaching schools in the district will also be discussed at that time. The PISA 2021 main study will follow the same procedures, with district contact beginning in October 2020, shortly after the state contact.
The PISA 2021 study staff responds to districts’ requirements such as research applications or meetings to provide more information about the study. If a district chooses not to participate, the recruiter documents all concerns listed by the district so that a refusal conversion strategy can be formulated. As in PISA 2018 and previous NCES studies, some NSCs talk to district staff themselves, others mail the package but do not further contact districts, while still others do not want to be involved in district recruitment at all. In cases where the NSCs do not wish to follow-up, Westat’s recruiters will work directly with the districts.
Based on prior recruitment experience on a variety of NCES studies, some districts are designated as “special handling districts.” Contacting special handling districts begins with updating district information from online sources, followed by calls to verify where to send completed research application forms and, if necessary, to collect contact information for this process. During the call, inquiries are also made about the amount of time the district spends reviewing similar research applications.
School Recruitment. After each district of sampled public schools has been informed of the study and has confirmed receipt of the PISA 2021 field test package, the school mailings will be triggered. All of the school mailings, taking place from December 2019 through February 2020, contain an introductory letter offering each sampled school a $250 check as a thank you for participation (to be mailed with a thank-you letter after the end of data collection); the PISA brochure and timeline; and school administrator and student Frequently Asked Questions (FAQ) sheets. Shortly after the mailing, the NSCs or Westat recruiters phone the school administrator to discuss the study, gain cooperation, and identify a school staff person to serve as the PISA 2021 field test school coordinator, who will work with the PISA staff to manage the data collection in the school. The school coordinator will act as the liaison between study staff and their school.
In cases where recruitment proves more difficult, school recruiters consult with the PISA home office to develop a conversion plan for each school. Typically, the general issues involve principals who are difficult to reach, school staff who are considering participation but have not provided a final decision, principals and/or staff who express a concern and need follow-up, and principals who may require additional appreciation for agreeing to participate. The PISA 2021 main study contact of schools will follow the same procedures, with contact anticipated to begin in late October or early November 2020 and continue through the winter and spring of 2021.
School Coordinator Contact. Shortly after permission has been granted from the school administrator, Westat emails the school coordinator the MyPISA website registration information. The MyPISA website (described in more detail below) contains the e-filing instructions and templates, and other logistical information for the assessment. Each school will receive a unique ID that will support multiple school users. Each user must then provide their own contact information and set up their own unique account. This registration process has been used across all of NCES’ international studies and in NAEP.
For the field test, beginning in February 2020, school coordinators of participating schools will receive a handbook detailing the procedures for administering the PISA 2021 survey in the school, and providing timelines and instructions for submission of the list of students via MyPISA. Westat PISA staff will also call the school coordinator to discuss the PISA activities at the school, including when to begin constructing and e-filing the student list. School coordinators of participating schools in the main study will receive these materials beginning in August of 2021.
Student sampling. Beginning in February 2020 for the field test, student lists will be collected to draw a student sample in each school. The student lists are designed to collect students’ names, grade, gender, and month and year of birth. Edit checks will be run during sampling to check column mappings and completeness of data to ensure that all student listing forms are constructed properly for sampling (see Appendix A).
After student sampling has been completed, student tracking forms will be generated in Maple and distributed to the school coordinator via MyPISA. These forms will also be used by the PISA field staff for administering the assessment and recording student participation status.
After students are sampled and school coordinators are notified of the sampled students, each school coordinator will notify and provide consent materials to parents of sampled students using the materials provided in Appendix B. Prior to the assessment, the test administrator assigned to each school will collect a copy of the notification and consent materials the school sent to parents as proof that the school carried out parental notification and consent. The process for the main study is the same as described for the field test. The main study student sampling will begin in August 2021, closer to the start of the main study data collection in October-November 2021.
Role of the MyPISA Website. The central purpose of MyPISA is to provide a way for schools to securely upload a list of students and to deliver the tracking forms identifying the sampled students to the school coordinator and PISA field staff. MyPISA is also used as a source for information about PISA, such as copies of advance materials, descriptions of survey activities and school actions, and instructions and templates for submitting lists. School registrants are encouraged to update their contact information and access information about the study in MyPISA.
PISA 2021 field test data collection will occur in March–April 2020. Prior to data collection, Westat PISA staff will complete various pre-survey activities.
Pre-survey Activities
In February 2020, the school coordinators will receive a handbook and instructions for assembling a student list. The lists will be submitted to Westat via MyPISA and the samples will be drawn during February 2020. Beginning in February 2020, school coordinators will be asked to do the following activities:
Create and submit a list of eligible students via MyPISA;
Receive study materials from the PISA Home Office and distribute them to sampled students;
Encourage the principal to complete their questionnaire; and
Meet with the PISA test administrator to review assessment logistics, and hold a brief meeting with the sampled students to show a PISA video presentation explaining the study and the students’ role and contribution, and to answer questions about PISA.
Data Collection
The school questionnaire will be available electronically, with a hard copy available upon request. The principal will be given a link to the questionnaire as part of the invitation to PISA and will also be emailed a personalized link to the questionnaire. The student questionnaire will be administered as part of the SDS provided to countries by the International Consortium. The SDS includes the PISA main assessment, the financial literacy assessment, and the student questionnaires. The field test PISA questionnaires are provided in Appendix C.
The PISA assessments are administered to students by trained PISA test administrators hired and trained by Westat. The PISA test administrators will bring all assessment equipment to the school including student laptops and peripheral equipment (power cords and mice). They are responsible for set-up and breakdown of equipment and administration of the assessment to the students. All that is required from the school is an adequate space to set up the equipment and hold the assessment.
Students begin the data collection activities by entering a room containing desks and PISA laptops. Upon entering, each student is handed a slip of paper by the PISA test administrator, which contains that student’s unique log-in information. The first screen that students see is the SDS launch screen (see Appendix C, p. 83), which acts as a portal for all PISA student data collection activities. The PISA test administrator gives the students the verbal instruction to select the assessment, and students then click through and use their log-in information to begin. The log-in information is saved and does not need to be re-entered. Students complete the cognitive assessment, take a short (about 15 minutes) break, and return to complete the student questionnaires, which are designed to be experienced as a single questionnaire and which end the student data collection.
Throughout the data collection period, PISA staff and the school coordinator will monitor the return of school questionnaires and, working in conjunction, will follow up with non-responders as needed. A ‘window-is-closing’ non-response follow-up effort will be utilized, gently reminding principals to complete their questionnaires. Near the end of data collection, the final two reminders are designed to establish a deadline effect and will be followed by an extension email. This campaign-style approach is designed to provide soft reminders across the data collection window, while creating a sense of urgency to respond toward the end.
School and school coordinator incentive checks will be distributed after the assessment is completed. Incentives will be mailed to schools on a weekly basis throughout the data collection period. Student incentives will be distributed to the students at the end of the questionnaire session.
Our approach to maximizing school and student response rates in the main study includes the following:
Use of a fall test administration, to avoid major conflicts with state testing;
Selecting and notifying schools at least a year in advance;
Communicating with state and district officials early in the process and applying a more proactive approach with states by coordinating with NAEP State Coordinators to gain assistance with sampled schools;
Assigning personal recruiters for specific schools and use of personal visits to districts and/or schools early in the contact process;
Monetary incentives for schools, school coordinators, and students (see Section A.9 and below);
School report incentive for schools;
A certificate of 4 hours of volunteer service from the U.S. Department of Education as an incentive for students;
Contact with schools and school coordinators at set intervals throughout the year preceding the assessment;
Use of an informational video about PISA 2021 to motivate student participation and full effort during the assessments; and
Use of individually tailored refusal conversion strategies to encourage participation (as detailed above in the section entitled “Respondent Recruitment”).
Our approach to gaining cooperation from respondents is multipronged and takes into consideration feedback and advice that NCES has received over the years. Beyond monetary incentives, we offer feedback to the school (in the form of the school report) and service credit for student participants (in the form of volunteer service time). In addition, we try to engage districts and schools in the study by informing them of findings and results from previous rounds in the form of short videos and focused findings offered by OECD and NCES.
PISA 2021 main study schools that meet the criteria for receiving a school report (see section A.9 of Supporting Statement Part A) will be provided school-level PISA 2021 results. While individual-level scores cannot be produced from PISA data, a school-level report showing comparative data for the school can be produced when the school has a participation rate of 85 percent or better and at least 10 assessed students. The results in the school-level report will be comparative results that do not provide actual school scores, but rather indicate how the school performed compared to country averages and to other U.S. schools with similar demographic characteristics. For PISA 2021, we are also attempting to design a second, alternate report to provide information from the contextual questionnaires for schools that do not meet the requirements for receiving the standard school report. This second “B” report will need to be reviewed and approved by the NCES chief statistician prior to implementation.
These approaches are based on recommendations from an NCES panel, experience with previous PISA administrations, as well as extensive discussions with NCES’ chief statistician.
Participation in the field test is an international requirement for participating in the PISA 2021 main study. The main focus of the field test is to collect enough assessment data to perform reliable tests of the items, to evaluate newly developed assessment items and new or revised multi-stage adaptive test designs, and to test the survey operations. The field test is also used to evaluate recruitment, data collection, and data management procedures in preparation for the main study, including recruitment methods for obtaining school and student participation. The results of the field test are analyzed by OECD. In the U.S., NCES uses the field test results to: (a) determine the final main study design and decide in which international options the U.S. will participate, (b) improve recruiting strategies and materials for the main study in the U.S., and (c) finalize all study procedures.
Many people at OECD, ETS, and other organizations around the world have been involved in the design of PISA. Some of the lead people are listed in section A8. Overall direction for PISA conducted in the U.S. is provided by Patrick Gonzales, the PISA National Project Manager, and other staff at NCES.
1 For the national main study sample, we expect to draw an initial sample of 288 schools. Taking into account closed, merged, and ineligible schools (historically, around 11% of sampled schools), as well as the historical school-level response rate, we anticipate interacting with/recruiting about 256 of these schools, of which, we estimate, 218 will participate in the PISA 2021 main study. To obtain the required number of students, we will ask to sample up to 52 students in each school. However, some of the smaller schools will not have 52 students available. We estimate: 218 schools x 52 students sampled x 0.945 (a factor used to account for variations in student population sizes across the schools) = 10,713 starting student sample size that we will work to recruit. Based on historical student assessment rates (~83%), we estimate that, in the end, we will assess about 8,892 students (10,713 x 0.83), which will ensure that we meet the minimum required 7,950 assessed students.
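The sample-size arithmetic in the footnote can be verified with a short sketch. The constants are taken directly from the text; rounding each intermediate figure to the nearest whole student is an assumption about how the published totals were obtained.

```python
# Sketch of the PISA 2021 main study sample-size arithmetic.
# All constants come from the footnote above; the rounding
# convention (nearest whole student) is an assumption.

PARTICIPATING_SCHOOLS = 218       # expected participating schools
STUDENTS_PER_SCHOOL = 52          # maximum students sampled per school
SMALL_SCHOOL_FACTOR = 0.945       # adjusts for schools with fewer than 52 students
STUDENT_ASSESSMENT_RATE = 0.83    # historical student assessment rate

starting_sample = round(
    PARTICIPATING_SCHOOLS * STUDENTS_PER_SCHOOL * SMALL_SCHOOL_FACTOR
)
expected_assessed = round(starting_sample * STUDENT_ASSESSMENT_RATE)

print(starting_sample)    # 10713 starting student sample
print(expected_assessed)  # 8892 expected assessed students (minimum required: 7,950)
```

Both figures match the totals stated in the footnote, and the expected 8,892 assessed students exceeds the international minimum of 7,950.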