

National Center for Education Statistics

National Assessment of Educational Progress



Volume I

Supporting Statement



The National Assessment of Educational Progress (NAEP)

2018 Assessment Delivery Study - Revised



OMB# 1850-0803 v.210

(revised v.203)







revised September 2017





  1. Submittal-Related Information

This material is being submitted under the generic National Center for Education Statistics (NCES) clearance agreement (OMB# 1850-0803), which provides for NCES to conduct various procedures (e.g., focus groups, cognitive interviews, usability testing) to test new methodologies, question types, or delivery methods to improve survey and assessment instruments and study procedures, in order to improve the resulting data quality, utility, and study participant experience.

  2. Background and Study Rationale

The National Assessment of Educational Progress (NAEP) is a federally authorized survey (the National Assessment of Educational Progress Authorization Act; 20 U.S.C. §9622) of student achievement at grades 4, 8, and 12 in various subject areas, such as mathematics, reading, writing, science, U.S. history, civics, geography, economics, and the arts. NAEP is conducted by NCES, which is part of the Institute of Education Sciences, within the U.S. Department of Education. NAEP’s primary purpose is to assess student achievement in the different subject areas and collect survey questionnaire (i.e., non-cognitive) data from students, teachers, and principals to provide context for the reporting and interpretation of assessment results.

NAEP is exploring options for how the assessment will be delivered in the future, including what types of digital devices might be used. One option under consideration is transitioning from NAEP-provided digital devices (currently tablets) to the equipment available in schools (e.g., desktops, laptops, or tablets). Under the current NAEP administration model, field staff bring in all equipment needed for the administration, including the tablets and a router. However, as a cost-saving measure and to allow students to test on the devices with which they are most familiar, NAEP is researching the feasibility of using school-based equipment. Another option under consideration is transitioning to a less expensive NAEP-provided digital device, such as Chromebooks. Regardless of whether school equipment or NAEP-provided equipment is used, NAEP field staff will continue to come into each sampled school to set up and administer the assessment.

A special study is planned for 2018 in which a sample of schools in Virginia will be tested using three different administration models:

  • Model 1: School-owned equipment running Pearson’s TestNav delivery system1,

  • Model 2: School-owned Chromebooks running the eNAEP test delivery system, and

  • Model 3: NAEP-provided Chromebooks running the eNAEP test delivery system.

The purpose of the three different models is to explore their feasibility and to begin to determine the pros and cons of each model. Model 1 will allow NAEP to gain logistical experience with preparing for and using the wide range of devices currently in schools. Model 2 will allow NAEP to make a first step in running the current delivery system (eNAEP) on non-NAEP-provided devices. Model 3 will allow NAEP to explore the possibility of continuing to provide digital devices to schools using a less expensive device than the current tablets NAEP provides.

In this study, a sample of fourth, eighth, and twelfth grade students will be given NAEP content on a range of digital devices. The assessment will consist of cognitive blocks covering mathematics content and a survey questionnaire consisting of (1) debrief questions about students’ experience in this special study, (2) core questionnaire items, and (3) subject-specific items (see Volume II for survey questionnaire items).

The purpose of the study is to understand the logistical implications of transitioning to a new administration model, not to perform statistical analysis on student results. Key areas of focus for the study will be to:

  • Understand the support needed to work with schools during the pre-assessment window to ensure that school equipment and infrastructure are ready to deliver NAEP during the administration;

  • Troubleshoot the diverse challenges we expect to encounter using non-uniform equipment and infrastructure;

  • Train field staff for a varied administration experience; and

  • Collect information about the challenges encountered during a multi-mode administration.

Results from this study will not be publicly released, but they will be used to inform future decisions about the NAEP administration model.

  3. Recruitment and Sample Characteristics

Virginia schools that are not part of the main NAEP 2018 sample will be asked to participate on a voluntary basis. The Virginia NAEP State Coordinator (see section 5) will leverage relationships within the state to contact schools (see Appendices A and B for sample school division and school notification letters) and identify those willing to participate in the study. The NAEP State Coordinator will forward the contact information for participating schools to Westat, the NAEP data collection contractor (see section 5). Westat field administration staff will contact each school to make arrangements for students from the school to participate (see Appendices D and E for sample school contact scripts).

A total of 99 schools will be contacted during recruitment about participating in the study. The schools will complete a technology survey (see Volume II) to gather data about the equipment and infrastructure in each school. Based on the results of the survey, a final sample of 69 schools will be selected to participate in the study (see Appendices F and G). Of the 69 selected schools, 3 will be recruited for round 1 testing and 3 for round 2 testing (neither of which includes students), and 63 will be recruited for round 3 testing (which will include 50 students per school). The sample will target participation from schools with various demographic characteristics, including a mix of urban/suburban/rural locations, socioeconomic backgrounds, and racial/ethnic compositions.

  4. Study Design and Data Collection

The study will include three rounds of testing. The first two rounds will consist of preliminary testing of school infrastructure and will not include students (only Model 2 will be included in rounds one and two).2 In round one, NAEP staff will make an on-site visit to a total of three schools (one school each at grades 4, 8, and 12) to do a test run of administering NAEP on school-owned Chromebooks using the school’s infrastructure (e.g., Internet). The purpose of this test is to do a dry run to detect any issues we might anticipate in other schools (e.g., bandwidth limitations, firewalls, unexpected software updates, etc.). We will then take the lessons learned from round one testing and complete a second round of the same type of testing in a total of three schools (one school each at grades 4, 8, and 12). The purpose of round two is to mitigate any challenges encountered during round one and to test solutions and/or workarounds. Students will not be included in either round one or two. The third round of testing will include the administration of all three models with students following normal NAEP administration procedures. See table 1 for a summary of the three rounds of testing.

Table 1. Summary of participation in rounds one, two, and three

|                                                                     | Model 1 | Model 2 | Model 3 |
| Round One (testing of school infrastructure/no student involvement) | N       | Y       | N       |
| Round Two (testing of school infrastructure/no student involvement) | N       | Y       | N       |
| Round Three (test administration with students)                     | Y       | Y       | Y       |


Schools that participate in the study will complete a school technology survey in advance of the assessment so that NAEP can gather information about the technology currently in schools. The information collected from these surveys will be used to prepare for the administration of Model 2 of this study and also to inform NCES about the types of technology in schools participating in Models 1 and 3 (see Volume II for school technology surveys).

Prior to the study, Westat field administration staff will contact cooperating schools to make logistical arrangements (see Appendices D and E for the contact scripts). Westat field administration staff will be trained on all three administration models so that they can report on their experience with each. In Models 1 and 2, the field staff will make a pre-assessment visit to work with school staff to ensure that school equipment and infrastructure are ready for the administration. In Model 3, field staff will bring all necessary materials, including digital devices and a router.

Students will be provided a tutorial on the test delivery system they will be using (eNAEP or TestNav, depending on the condition to which they are assigned) and then asked to complete two 30-minute cognitive blocks of NAEP mathematics content and one 20-minute questionnaire.

The study session will require approximately 95 minutes: 15 minutes for getting students situated and logged on to the digital devices, plus 80 minutes of assessment time (60 minutes of cognitive time and 20 minutes for the questionnaire). Please note that communications to schools and parents indicate 110 minutes of student time to allow for transitions to and from the study classroom.

In addition, after each assessment, the field administration staff will conduct a debriefing interview with school personnel who participated in pre-assessment and assessment activities. The purpose of this interview is to obtain feedback on how well the assessment went in that school and any issues that were noted (see Volume II for the school debriefing script).

Qualitative data about the logistics of administering NAEP using these different models will be reported and evaluated to inform decision-making about how to most effectively deliver NAEP in the future. Student response data will not be subject to any formal psychometric analyses due to the small sample size and given that the purpose of the study is to gather information on the logistics of three administration models, not on the psychometric implications for changing the administration model.

  5. Consultations Outside the Agency

The NAEP State Coordinator in Virginia will serve as the liaison between the state education agency and NAEP, coordinating NAEP activities in the state. The NAEP State Coordinator will work with schools within the state to identify participating schools. Westat is the Sampling and Data Collection (SDC) contractor for NAEP. Westat will administer the 2018 Assessment Delivery Study. Fulcrum IT Services, LLC is the NAEP contractor for the development and ongoing support of NAEP digitally based assessments, including the test delivery system used for Models 2 and 3 of this study. Pearson is the printing and scoring contractor for NAEP and will, for the purposes of this study, be preparing and programming the content for delivery in Model 1.

  6. Justification for Sensitive Questions

Throughout the item and debriefing question development processes, effort has been made to avoid asking for information that might be considered sensitive or offensive.

  7. Paying Respondents

Schools will be given a $50 gift card to an office supply store (e.g., Staples or Office Depot) for completing the school technology survey. In addition, schools selected to participate in the assessment will receive an additional $150 gift card to encourage participation and to thank them for their time and effort. The study will take place during regular school hours, and thus there will not be any monetary incentive for the student participants. However, students will be permitted to keep the earbuds or headphones used during the study.

  8. Assurance of Confidentiality

The study will not retain any personally identifiable information. Prior to the start of the study, students will be notified that their participation is voluntary. As part of the study, students will be notified that the information they provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).

Written notification will be sent to parents/guardians of students before the study is conducted (see Appendix C). Participants will be assigned a unique identifier (ID), which will be created solely for data file management and used to keep all participant materials together. The participant ID will not be linked to the participant name in any way or form.

  9. Estimate of Hourly Burden

School administrators and personnel will provide pre-assessment information and will help with the logistics of student and room coordination and other related duties. The school administrator burden is estimated at 20 minutes for the pre-assessment contact. For schools not selected for the study, the school personnel burden is estimated at 30 minutes to complete the technology survey. For Models 1 and 2, the school personnel burden is estimated at three hours, including pre-assessment and assessment-day support and participation in a post-assessment debriefing interview. For Model 3, the school personnel burden is estimated at only 1.5 hours, because no pre-assessment visit is needed: NAEP will bring in equipment and routers and will not use school equipment or infrastructure.

Parents of participating students will receive a letter explaining the study, for which the parent’s burden is estimated at 3 minutes. An additional burden (15 minutes) is estimated for a small portion of parents (up to 60) who may write to refuse approval for their child or may research information related to the study.

Approximately 3,150 students from 63 schools will participate in the study. Student burden is calculated based on 15 minutes for setup and the tutorial and 20 minutes to respond to the survey questionnaire (35 minutes in total), out of a total study session time of 95 minutes.3

Estimated hourly burden for the participants is described in Tables 2, 3, 4, 5, and 6.
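As an arithmetic cross-check, the Table 6 totals follow directly from the per-row figures reported above. The following is an illustrative sketch (not part of the study materials); the row labels are shorthand for the table rows:

```python
# Illustrative cross-check of the Table 6 burden arithmetic (not part of
# the study materials). Each row: (label, count, hours per respondent,
# whether the row adds new respondents). The "refusals" row is a subset
# of the notified parents, so it adds responses and burden hours but no
# new respondents.
rows = [
    ("School administrator: initial contact",              99,   0.33, True),
    ("School personnel: technology survey (not selected)", 30,   0.50, True),
    ("School personnel: Model 2, rounds 1 and 2",           6,   1.50, True),
    ("School personnel: Models 1 and 2, round 3",          42,   3.00, True),
    ("School personnel: Model 3, round 3",                 21,   1.50, True),
    ("Parents: initial notification",                    3150,   0.05, True),
    ("Parents: refusals or additional research",           60,   0.25, False),
    ("Students: assessment delivery study",              3150,   0.58, True),
]

respondents = sum(n for _, n, _, new in rows if new)    # 6,498
responses   = sum(n for _, n, _, _ in rows)             # 6,558
burden      = sum(round(n * h) for _, n, h, _ in rows)  # 2,215 hours
```

Each per-row product is rounded to whole hours before summing, which reproduces the rounded row totals shown in Tables 2 through 6.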

Table 2. Estimate of Hourly Burden for Schools that Complete the School Technology Survey but are not Selected to Participate in the Study

| Person               | Task                                | Number of Respondents | Number of Responses | Hours per Respondent | Total Burden (in hours) |
| School administrator | Initial Contact by Westat           | 30                    | 30                  | 0.33                 | 10                      |
| School personnel     | Completing School Technology Survey | 30                    | 30                  | 0.50                 | 15                      |
| Total                |                                     | 60                    | 60                  | N/A                  | 25                      |

Note: Numbers have been rounded, which may affect totals.

Table 3. Estimate of Hourly Burden for Model 2 Schools in Rounds 1 and 2 Testing

| Person               | Task                                            | Number of Respondents | Number of Responses | Hours per Respondent | Total Burden (in hours) |
| School administrator | Initial Contact by Westat                       | 6                     | 6                   | 0.33                 | 2                       |
| School personnel     | Scheduling, Logistics, and Debriefing Interview | 6                     | 6                   | 1.50                 | 9                       |
| Total                |                                                 | 12                    | 12                  | N/A                  | 11                      |

Note: Numbers have been rounded, which may affect totals.

Table 4. Estimate of Hourly Burden for Model 1 and Model 2 Schools in Round 3 Testing

| Person               | Task                                            | Number of Respondents | Number of Responses | Hours per Respondent | Total Burden (in hours) |
| School administrator | Initial Contact by Westat                       | 42                    | 42                  | 0.33                 | 14                      |
| School personnel     | Scheduling, Logistics, and Debriefing Interview | 42                    | 42                  | 3.00                 | 126                     |
| Parents              | Initial Notification                            | 2,100                 | 2,100               | 0.05                 | 105                     |
| Parents*             | Refusals or Additional Research                 | 40*                   | 40                  | 0.25                 | 10                      |
| Students             | NAEP Assessment Delivery Study                  | 2,100                 | 2,100               | 0.58                 | 1,218                   |
| Total                |                                                 | 4,284                 | 4,324               | N/A                  | 1,473                   |

* These parents are a subset of those who were initially notified.

Note: Numbers have been rounded, which may affect totals.

Table 5. Estimate of Hourly Burden for Model 3 Schools in Round 3 Testing

| Person               | Task                                            | Number of Respondents | Number of Responses | Hours per Respondent | Total Burden (in hours) |
| School administrator | Initial Contact by Westat                       | 21                    | 21                  | 0.33                 | 7                       |
| School personnel     | Scheduling, Logistics, and Debriefing Interview | 21                    | 21                  | 1.50                 | 32                      |
| Parents              | Initial Notification                            | 1,050                 | 1,050               | 0.05                 | 53                      |
| Parents*             | Refusals or Additional Research                 | 20*                   | 20                  | 0.25                 | 5                       |
| Students             | NAEP Assessment Delivery Study                  | 1,050                 | 1,050               | 0.58                 | 609                     |
| Total                |                                                 | 2,142                 | 2,162               | N/A                  | 706                     |

* These parents are a subset of those who were initially notified.

Note: Numbers have been rounded, which may affect totals.

Table 6. Total Hourly Burden

| Person                                                  | Task                                            | Number of Respondents | Number of Responses | Hours per Respondent | Total Burden (in hours) |
| School administrator                                    | Initial Contact by Westat                       | 99                    | 99                  | 0.33                 | 33                      |
| School personnel in schools not included in final selection | Completing School Technology Survey         | 30                    | 30                  | 0.50                 | 15                      |
| School personnel - Model 2, Rounds 1 & 2                | Scheduling, Logistics, and Debriefing Interview | 6                     | 6                   | 1.50                 | 9                       |
| School personnel - Models 1 & 2, Round 3                | Scheduling, Logistics, and Debriefing Interview | 42                    | 42                  | 3.00                 | 126                     |
| School personnel - Model 3, Round 3                     | Scheduling, Logistics, and Debriefing Interview | 21                    | 21                  | 1.50                 | 32                      |
| Parents                                                 | Initial Notification                            | 3,150                 | 3,150               | 0.05                 | 158                     |
| Parents*                                                | Refusals or Additional Research                 | 60*                   | 60                  | 0.25                 | 15                      |
| Students                                                | NAEP Assessment Delivery Study                  | 3,150                 | 3,150               | 0.58                 | 1,827                   |
| Total                                                   |                                                 | 6,498                 | 6,558               | N/A                  | 2,215                   |

* These parents are a subset of those who will be initially notified.

Note: Numbers have been rounded, which may affect totals.

  10. Cost to Federal Government

Table 7 (below) provides the overall project cost estimates.

Table 7: Estimate of Costs

| Activity                                                                     | Provider | Estimated Cost |
| Data collection activities                                                   | Westat   | $607,000       |
| Development and support of the test delivery system used for Models 2 and 3  | Fulcrum  | $250,000       |
| Development and support of the test delivery system used for Model 1         | Pearson  | $900,000       |
| Total                                                                        |          | $1,757,000     |


  11. Project Schedule

Table 8 (below) provides the overall schedule.

Table 8: Schedule

| Date                           | Event                                           |
| August 2017 – December 2017    | Task and System Development and Preparation     |
| September 2017 – November 2017 | Recruitment                                     |
| September 2017 – October 2017  | Round 1 testing of Model 2                      |
| October 2017 – December 2017   | Round 2 testing of Model 2                      |
| January 2018                   | Pre-assessment school visits for Models 1 and 2 |
| January 2018 – March 2018      | Round 3 Data Collection (Models 1, 2, and 3)    |


1 Pearson currently holds the state testing contract in Virginia and administers the state tests on the TestNav system. NAEP is using TestNav in this study since VA schools are already set up to use TestNav, and this will allow NCES to explore the feasibility of using school equipment without needing to do extensive development and programming work in the current NAEP test-delivery system.

2 Models 1 (TestNav on school-owned equipment) and 3 (eNAEP on NAEP-provided Chromebooks) do not require preliminary testing: Model 1 has already been tested in VA schools through their state testing program, and Model 3 is similar to the current NAEP administration model wherein field staff bring in all equipment and routers.

3 Similar to main NAEP assessments, the cognitive item portions of the study are not included in the burden totals, because they are not subject to the Paperwork Reduction Act (PRA).
