National Center for Education Statistics
National Assessment of Educational Progress
Volumes I and II
Supporting Statement
NAEP Science Interactive Computer Task and Hands-on Task Tablet Study
OMB# 1850-0803 v.91
November 27, 2013
1 Submittal-Related Information
2 Background and Study Rationale
3 Recruitment and Sample Characteristics
4 Study Design and Data Collection
5 Consultations Outside the Agency
6 Justification for Sensitive Questions
7 Paying Respondents
8 Assurance of Confidentiality
9 Estimate of Hourly Burden
10 Cost to Federal Government
11 Project Schedule
Appendix A: Sample School Contact Script
Appendix B: Sample Parent/Guardian Notification Letter
Appendix C: Sample School Debriefing Script
1 Submittal-Related Information

This material is being submitted under the generic National Center for Education Statistics (NCES) clearance agreement (OMB #1850-0803) that provides for NCES to conduct various procedures (such as field tests, cognitive interviews, and usability studies) to test new methodologies, question types, or delivery methods to improve survey and assessment instruments.
2 Background and Study Rationale

The National Assessment of Educational Progress (NAEP) is a federally authorized survey of student achievement at grades 4, 8, and 12 in various subject areas, such as mathematics, reading, writing, science, U.S. history, civics, geography, economics, and the arts. NAEP is administered by NCES, part of the Institute of Education Sciences, in the U.S. Department of Education. NAEP’s primary purpose is to assess student achievement in the various subject areas and to collect survey questionnaire (i.e., non-cognitive) data to provide context for the reporting and interpretation of assessment results.
NAEP has been assessing science at grades 4, 8, and 12 since 1969. The first assessments were administered via paper and pencil; the assessments have since evolved to include interactive computer tasks (ICTs) and hands-on performance tasks (HOTs). ICTs challenge students to solve scientific problems, often through simulations. They allow students to demonstrate the skills involved in solving problems through scientific investigations without the logistical constraints associated with a natural or laboratory setting. Through HOTs, students have the opportunity to physically manipulate objects and perform scientific investigations. Each hands-on task allows students to demonstrate how well they can plan and conduct scientific investigations, reason through complex problems, and apply their knowledge in real-world contexts.1
As digital tools continue to play an important role in today’s classroom, NCES is exploring new ways to use technology—including tablets or other touch-enabled devices—to gauge what students know and can do. This study will help inform how NCES can use and maximize touch-enabled devices to capture how students respond to scientific challenges in ICTs and HOTs. It will also allow NCES to explore the logistical challenges related to administering NAEP assessments on touch-enabled devices.
Previously, NAEP science ICTs were administered on a computer without touch-screen capabilities. For this study, ICTs will be administered on touch-enabled devices. In addition, NAEP has previously administered science HOTs in which students worked with laboratory materials and other equipment to perform scientific investigations, while receiving instructions and recording their responses using paper-and-pencil booklets. For this study, students will continue to complete HOTs by working with laboratory materials and performing scientific investigations, but they will use a touch-enabled device rather than a paper-and-pencil booklet to receive directions, record results, and respond to questions. Finally, NAEP has previously administered ICTs and HOTs to separate samples of students. For this study, students will respond to both types of tasks in one sitting.
Another NAEP study2 is examining the usability implications of administering assessments on touch-enabled devices, e.g., the impact of changes in screen size and input device (touch screen vs. mouse) on students’ ability to interact with assessment items. In this study, NCES will:

- examine the administration implications of administering ICTs and HOTs on touch-enabled devices;
- examine the administration implications of administering HOTs that combine interactions with laboratory materials and touch-enabled devices;
- test a new assessment delivery system optimized for touch-enabled devices; and
- collect student response information for the ICTs and HOTs that will inform future item development.
For the study, each student will be administered two ICTs and one HOT on a touch-enabled device.3 Students will not be asked to complete any survey questions as part of the assessment, but NAEP field administration staff will conduct a short 5–10 minute group debriefing session with students to ask for their feedback on the assessment experience (see Volume II).
Results from this study will not be released, but they will be used to inform future NAEP assessments, including an administration of the ICTs and HOTs in 2015.
Volume I of this submittal contains descriptions as well as design, sampling, burden, cost, and schedule information for the study. Volume II contains the student debriefing questions, while the appendices contain sample scripts and notification documents.
3 Recruitment and Sample Characteristics

The study requires a minimum of 200 students per grade at grades 4, 8, and 12. Depending on the number of schools and students who are able to participate, up to 400 students per grade may participate, for a maximum total of approximately 1,200 students. Only schools that have not been selected to participate in NAEP assessments in the 2013–2014 school year will be selected for this study, so that students will not have had recent experience with NAEP assessments. A sample parental notification letter (see appendix B) will be provided to the schools for use in notifying parents or guardians of students in the study. The school principal may edit it; however, the information regarding confidentiality and the appropriate law reference must remain unchanged.
Up to 27 schools will participate in the study. NAEP State Coordinators (see section 5) will leverage their relationships within the states to contact schools and identify those schools willing to participate in the study. The NAEP State Coordinators will forward the contact information for participating schools to Westat, the NAEP data collection contractor (see section 5). Westat field administration staff will contact each school to make the arrangements for students from that school to participate (see appendix A for sample school contact script).
From each school, approximately 40–50 students will be asked to participate. To ensure a sample that is diverse with respect to gender, race, and familiarity with technology, Westat will request a random mix of students. However, schools will also have the option of providing an intact class, such as a homeroom, for convenience. The parents/guardians of the participating students will be sent a notification letter in advance of the study (see appendix B for a sample notification letter).
4 Study Design and Data Collection

Prior to the study (see timeline in Table 3), Westat field administration staff will contact cooperating schools to confirm student sampling and make logistical arrangements. Westat field administration staff who are familiar with technology-based administrations will conduct the study. They will bring all necessary materials, including the touch-enabled devices, to the schools on the assessment days.
The study will be conducted from April through May 2014. On the day of the assessment, a field administration team will administer the science ICT and HOT tablet study, via touch-enabled devices, to two groups of up to 25 students each, scheduled consecutively.
Students will be asked to complete one 15-minute ICT, one 30-minute ICT, and one 30-minute HOT. The ICTs are extended performance tasks that embed multiple items in a scenario, providing context and motivation. Scenarios will be administered as interactive computer-based tasks, allowing students to, for example, use simulated laboratory equipment to conduct scientific investigations. ICTs come in long and short varieties that differ in length and complexity; students will be allowed 30 minutes to complete the long ICT and 15 minutes to complete the short ICT. For the 30-minute HOT, each student will receive a kit of laboratory materials from the field administration staff. Students will use the touch-enabled device to check the kit of laboratory materials, receive instructions for how to work with the laboratory materials, and record their responses to questions and results from their experiments.
The study will require approximately 100 minutes (15 minutes for getting students situated and logged on to the touch-enabled devices, 75 minutes of assessment time, and another 10 minutes for a short classroom debriefing session).4
During the study administration, field administration staff will observe student interactions with the tasks and equipment and record student questions and issues that may arise during the sessions. After all students have completed the two ICTs and one HOT, they will engage in a 5–10 minute verbal classroom debriefing session with the field administration staff. The field administration staff will ask the students questions about their experiences working with the equipment, kit of laboratory materials, and instructions (see Volume II for the classroom debriefing questions). The field administration staff will record the students’ responses to the questions. In addition, after each assessment, the field administration staff will telephone the school coordinator for a debriefing interview. The purpose of this interview is to obtain feedback on how well the assessment went in that school and any issues that were noted. (See appendix C for the debriefing interview script.)
In addition to the touch-enabled device, students will be provided with all other necessary equipment, including an attached keyboard, a stylus, and earbuds, as well as the laboratory materials for the HOT.
The response data for the study will be processed and scored for statistical analysis. Analyses will be performed at the overall task level as well as on items within individual tasks in order to assess difficulty, reliability, and common and uncommon response patterns.
Responses from the classroom debriefing session will be used, in combination with the school debriefing responses and feedback collected from field administration staff, to inform logistical planning for future test administrations.
5 Consultations Outside the Agency

Educational Testing Service (ETS)
ETS serves as the NAEP Item Development contractor (including development of the science HOTs) and as the NAEP Science ICT Item Development contractor. In addition, ETS serves as the Design, Analysis, and Reporting (DAR) contractor. ETS staff will be involved in item development, scoring, and analysis activities for the science ICT and HOT tablet study.
Fulcrum IT LLC (Fulcrum IT)
Fulcrum IT is the NAEP contractor responsible for the development and ongoing support of NAEP technology-based assessments for NCES, including the system to be used for the science ICT and HOT tablet study.
NAEP State Coordinators
The NAEP State Coordinator serves as the liaison between the state education agency and NAEP, coordinating NAEP activities in his or her state. As previously noted, NAEP State Coordinators will work with schools within their states to identify participating schools.
Westat

Westat is the Sampling and Data Collection (SDC) contractor for NAEP. Westat will administer the science ICT and HOT tablet study.
6 Justification for Sensitive Questions

Throughout the item and debriefing question development processes, effort has been made to avoid asking for information that might be considered sensitive or offensive.
7 Paying Respondents

The study will take place during regular school hours, and thus there will not be any monetary incentive for the student participants. However, students will be permitted to keep the earbuds used during the study.
8 Assurance of Confidentiality

The study will not collect any personally identifiable information. Prior to the start of the study, participants will be notified that their participation is voluntary and that their answers may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002 (20 U.S.C. §9573)].
Written notification will be sent to parents/guardians of students before the study is conducted (see appendix B). Participants will be assigned a unique identifier (ID), which will be created solely for data file management and used to keep all participant materials together. The participant ID will not be linked to the participant name in any way or form.
9 Estimate of Hourly Burden

School administrators and personnel provide pre-assessment information and help with the logistics of student and room coordination and other related duties. The school administrator burden is estimated at 10 minutes for the pre-assessment contact. The school personnel burden is estimated at one hour for administration support and 10 minutes for the post-assessment debriefing interview.
Parents of participating students will receive a letter explaining the study, for which the parent’s burden is estimated at 3 minutes. An additional 15 minutes of burden is estimated for a small portion of parents (up to 120) who may write to refuse approval for their child or may research information related to the study.
Up to approximately 1,200 students from 27 schools will participate in the study. Student burden is calculated based on up to 10 minutes for the classroom debriefing questions and 15 minutes for setup, out of the total study time of 100 minutes.5
Estimated hourly burden for the participants is described in Table 1, below.
Table 1. Estimate of Hourly Burden

| Person | Task | Number of Individuals | Individual Participant Burden | Total Burden |
|---|---|---|---|---|
| School Administrator | Initial Contact by NAEP State Coordinator | 27 | 10 minutes | 5 hours |
| School personnel | Scheduling & Logistics | 27 | 70 minutes | 32 hours |
| Parents | Initial Notification | 1,320 | 3 minutes | 66 hours |
| Parents* | Refusals or Additional Research | 120* | 15 minutes | 30 hours |
| Students | Tablet Study | 1,200 | 25 minutes | 500 hours |
| Total (2,694 responses) | | 2,574 | NA | 633 hours |

* These parents are a subset of those who were initially notified.
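The burden totals above follow from multiplying each respondent count by its per-person time and converting to hours. As an illustrative check only (not part of the official submittal), the sketch below reproduces the figures, assuming each row is rounded up to the next whole hour, which matches the values shown:

```python
import math

# (group, number of individuals, minutes per individual) taken from Table 1
rows = [
    ("School Administrator", 27, 10),
    ("School personnel", 27, 70),
    ("Parents (notification)", 1320, 3),
    ("Parents (refusals/research)", 120, 15),
    ("Students", 1200, 25),
]

def burden_hours(individuals, minutes_each):
    """Total group burden in hours, rounded up to the next whole hour
    (an assumption that matches the rounded figures in Table 1)."""
    return math.ceil(individuals * minutes_each / 60)

totals = {name: burden_hours(n, m) for name, n, m in rows}
print(totals)               # per-group totals: 5, 32, 66, 30, 500 hours
print(sum(totals.values())) # 633 hours overall
```

For example, the school personnel row works out to 27 × 70 / 60 = 31.5 hours, which rounds up to the 32 hours reported.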
10 Cost to Federal Government

Table 2, below, provides the overall project cost estimates.

| Activity | Provider | Estimated Cost |
|---|---|---|
| Design, preparation of tasks, scoring and analysis of data | ETS | $105,000 |
| Recruitment and data collection activities | Westat-SDC | $100,000 |
| Development and support of technology-based system delivery activities | Fulcrum IT | $155,600 |
| Total | | $360,600 |
11 Project Schedule

Table 3, below, provides the overall schedule.

Table 3: Schedule

| Date | Event |
|---|---|
| December 2013–April 2014 | Task and System Development and Tablet Preparation |
| December 2013–March 2014 | Recruiting |
| April–May 2014 | Data Collection |
| June–July 2014 | Scoring and Data Analysis |
1 The science framework is available at the following link: http://www.nagb.org/content/nagb/assets/documents/publications/frameworks/science-2011.pdf
2 Usability Study for Use of Touch-Screen Tablets in NAEP Assessments (OMB #1850-0803 v.87, October 2013).
3 The science ICT and HOT tablet study will be administered on Microsoft Surface Pro 2 touch-enabled devices with attachable keyboards.
4 Communications to schools and parents indicate 120 minutes to allow for transition time to and from the study classroom.
5 Similar to main NAEP assessments, the cognitive item portions of the study are not included in the burden calculation.