

National Center for Education Statistics

National Assessment of Educational Progress



Volume I

Supporting Statement



National Assessment of Educational Progress (NAEP) Pretesting of Scenario-based Tasks (SBTs) for Pilot in 2019 Grades 4 and 8



OMB# 1850-0803 v.193









April 2017




Submittal-Related Information

This material is being submitted under the generic National Center for Education Statistics (NCES) clearance agreement (OMB# 1850-0803), which provides for NCES to conduct various procedures (such as pilot tests, cognitive interviews, and usability studies) to test new methodologies, question types, or delivery methods to improve survey and assessment instruments.

Background and Study Rationale

The National Assessment of Educational Progress (NAEP) is a federally authorized survey of student achievement at grades 4, 8, and 12 in various subject areas such as mathematics, reading, writing, science, U.S. history, civics, geography, economics, and the arts. NAEP is conducted by the National Center for Education Statistics (NCES), which is part of the Institute of Education Sciences, within the U.S. Department of Education. NAEP’s primary purpose is to assess student achievement in the different subject areas and collect survey questionnaire (i.e., non-cognitive) data to provide context for the reporting and interpretation of assessment results.

This request is to conduct, as part of the NAEP assessment development process, pretesting activities, including playtesting, cognitive interviews, and tryouts, to collect data on newly developed scenario-based tasks (SBTs) for the 2021 grades 4 and 8 reading and mathematics assessments. These methods are used to obtain data about new digitally enhanced items, tasks, and stimuli during the NAEP development process. Pretesting is intended to make the development of assessment instruments more efficient: by identifying and eliminating problems with items and tasks before piloting, it can reduce challenges in scoring and analysis and raise pilot item survival rates. This request also covers testing of two wording versions, referring to the Cybersecurity Enhancement Act of 2015, that are to follow the Confidential Information Protection and Statistical Efficiency Act (CIPSEA) confidentiality pledge cited in NAEP national studies.

The overall focus of pretesting is to determine whether items and tasks elicit the targeted knowledge and skills, and whether any item or task content or presentation causes confusion or introduces construct-irrelevant variance. Having a range of pretesting methods allows the approach to be tailored to the purpose at hand. For example, some mathematics SBTs will go through playtesting at the storyboard stage to provide early data on how well students grasp the tasks, to ensure there are no points of confusion, and to learn how engaged students are by the task scenarios. Playtesting with draft programmed builds of reading SBTs has shown that useful information can also be gathered later in the development process about whether the task purpose is clear and engaging, and whether items are eliciting the intended responses. Cognitive labs have been used successfully to gather information about how students process items and tasks at the draft build stage. The larger samples and timed testing conditions of tryouts are especially useful for gathering quantitative data about task timing and item performance. Tryouts have proved useful for exploring possible effects of including video in reading discrete block passages, and avatars in reading SBTs.

Recruitment and Data Collection

PLAYTESTING

Overview

In playtesting, an innovation adapted from the game‐design industry, a diverse set of students working individually or in small teams of two to four will work through and discuss storyboards or draft builds of tasks with a facilitator/observer and/or with one another. Playtesting may take place early in the process using storyboards or programmed builds. The main purpose of playtesting is to find out how diverse students react and respond to, and interact with, assessment instruments. Playtesting allows assessment developers to identify construct-irrelevant features, such as inaccessible language in item stems, uninteresting or unfamiliar scenarios, or unfamiliar interactions. Playtesting early in the development cycle can allow for refinements that can be incorporated into later item and task stages, and schedule permitting, for testing of those refinements in subsequent pretesting activities such as cognitive laboratories or tryouts.

During playtesting, students will be encouraged to talk about items and issues they confront, while observers note reactions to and potential problems with content or format. Observers may query students to draw them out, facilitate deeper reactions, or probe areas of possible confusion.

Sampling and Recruitment Plan

ETS, the contractor carrying out this pretesting for NCES, will recruit students from a range of demographic groups. Students will be recruited from areas near the ETS Princeton, New Jersey campus for scheduling efficiency and flexibility. A minimum of five students will participate in storyboard playtesting sessions for each task. For playtesting using storyboards, five students have been shown to be sufficient given that the key purpose is to identify students’ reactions to early content and signs of construct-irrelevance1. For later playtesting using programmed tasks, it has proven desirable to have 10-12 students. Playtesting sessions will be scheduled for 60 minutes for 4th grade and 90 minutes for 8th grade. Based on prior experience with similar studies, it is anticipated that some students may return to participate in multiple sessions. Playtesting is expected to involve a maximum of 136 students across mathematics and reading. Playtesting group sizes are too small to reflect a nationally representative sample; however, we will make every effort to include a diverse group representing a mix of gender, race/ethnicity, socioeconomic background, and urban and suburban students.

ETS will recruit students using existing ETS contacts with teachers and staff at local schools and afterschool programs. Email or letters will be used to contact these teachers/staff; paper flyers and consent forms for students and parents will be distributed through these teachers/staff. During this communication, the parent/legal guardian will be informed about the objectives, purpose, and participation requirements of the data collection effort, as well as the activities that it entails. Confirmation emails will be sent to participants (see Appendices A-I). Only after ETS has obtained written consent from the parent/legal guardian (see Appendix T) will the student be allowed to participate in the playtesting session. While the tasks in development will be administered to students in grades 4 and 8, students in fifth, seventh, and ninth grade will be considered for inclusion in the study because recruiting students is challenging, and there will be a greater likelihood of success in meeting recruitment goals if students in these other grades participate.

Data Collection Process

Playtesting will take place in a range of locations so that staff can maximize opportunities to work with students. Depending on scheduling and participants, sessions could take place at ETS, schools, and organizations from which students will be drawn (e.g., Boys and Girls Clubs).

Students will first be welcomed and introduced to the facilitator/observers. Students will be reassured that their participation is voluntary and their responses will be kept confidential (in accordance with the Confidential Information Protection provisions of Title V, Subtitle A, Public Law 107-347).

Staff will then give an overview of the activity to students and provide guidance on what they should reflect on while looking at the task. Assessment specialists and other staff (e.g., cognitive scientists or task designers) from ETS will act as facilitators and observers, taking notes on what students say and possibly interjecting occasional questions aimed at eliciting students’ reactions, including places of confusion, and ways of thinking about the answers to the questions in the tasks. Each observer may choose to stay with an individual or one group of two to three students looking at and working through the task, or they may choose to move around to observe several groups or other individual students.

For the most part, students will be allowed to explore the storyboard or programmed tasks by themselves with little intrusion on the part of the interviewer. However, at a few strategic points, or after students complete work on a task, the interviewer may introduce questions meant to explore students’ reactions, such as:

  • Did you find this question interesting—why or why not?

  • Are there any questions or words that seem confusing here? Did you understand that part?

  • How would you answer this question?

  • How could this part of the task be improved? How could it be clearer?

Prior to each playtesting session, ETS staff may identify some key focus areas for each task. If students do not provide sufficient comments on targeted issues, a staff member may ask a group of students if they had any thoughts about a particular item, section in the task, or issue using questions such as those described above. Student responses may be recorded through audio or screen capture as appropriate.

Student feedback from a playtesting session is immediate and can be evaluated after the session. Tasks can then proceed with development with little interruption.

Documentation Plan

Notes from observers in each session will be aggregated for understanding task success and potential useful revisions. If draft builds are used, student responses will be collected and evaluated for the same purpose. Since playtesting is a relatively informal process that generates somewhat unstructured information, no formal analyses of these data will be performed.

COGNITIVE INTERVIEWS

Overview

In cognitive interviews (often referred to as a cognitive laboratory study or cog labs), an interviewer uses a structured protocol in a one-on-one interview drawing on methods from cognitive science. In NAEP studies to date, two methods have been combined: think-aloud interviewing and verbal probing techniques. With think-aloud interviewing, respondents are explicitly instructed to "think aloud" (i.e., describe what they are thinking) as they work through questions. With verbal probing techniques, the interviewer asks probing questions, as necessary, to clarify points that are not evident from the “think-aloud” process, or to explore additional issues that have been identified a priori or during the process as being of particular interest. This combination of allowing the students to verbalize their thought processes in an unconstrained way, supplemented by specific and targeted probes from the interviewer, has proven to be flexible and productive.

Cognitive interview studies produce largely qualitative data in the form of verbalizations made by students during the think-aloud phase and/or in response to interviewer probes. The main objective is to explore how students are thinking and what reasoning processes they are using as they work through items and tasks. Some informal observations of behavior and verbalizations are also gathered; behavioral observations may include nonverbal indicators of affect, suggesting emotional states such as frustration or engagement, and interactions with tasks, such as prolonged time on one item or ineffectual or repeated actions suggesting misunderstanding.

Cognitive interviews may be conducted for draft programmed builds, based on processes developed at ETS. EurekaFacts, under a subcontract to ETS, will carry out the interviews. The general approach will be to have a small number of participants work individually through tasks. Data will then be synthesized in the form of lessons learned about students’ thinking, observed student behaviors, and whether the task items appear to be eliciting the constructs of interest. These lessons will then inform ongoing task development.

Sampling and Recruitment Plan

Existing research and practice do not offer a methodological or practical consensus regarding the minimum or optimal sample size necessary to provide valid results for cognitive interviews and similar small-scale activities2. Nonetheless, a sample size of five to fifteen individuals has become the standard. Several researchers have confirmed the standard of five as the minimum number of participants per subgroup for analysis for the purposes of exploratory cognitive interviewing3.

Accordingly, seven to ten students per task should be sufficient given that the key purpose of the cognitive interview is to identify qualitative patterns in how students are reasoning at different points when doing tasks. At grade 4, cognitive interview sessions are 60 minutes, so it is anticipated that one task may be tested in a cog lab session. Grade 8 cognitive interviews are 90 minutes and will allow testing of up to two tasks in a session. In total, cognitive interviewing is expected to involve a maximum of 60 students for mathematics and reading SBTs (20 students at grade 4, and 10 at grade 8, for each subject). As with playtesting, while the tasks in development will be administered to grades 4 and 8 students, students in fifth, seventh, and ninth grade will be included in the cognitive interviews.

For the cognitive interviews, students will be recruited by EurekaFacts staff from the following demographic populations:

  • A mix of race/ethnicity (Black, Asian, White, Hispanic);

  • A mix of socioeconomic background; and

  • A mix of urban/suburban/rural.

Although the sample will include a mix of student characteristics, the results will not explicitly measure differences by those characteristics.

EurekaFacts will perform the recruiting for cognitive interviews from the District of Columbia, Maryland, Virginia, Delaware, and Southern Pennsylvania. EurekaFacts will also administer interviews in other venues besides their Rockville, Maryland site, such as after-school activities organizations or community-based organizations. This allows them to accommodate participants recruited from areas other than Rockville, MD and will help to obtain a sample population including different geographical areas (urban, suburban, and rural).

While EurekaFacts will use various outreach methods to recruit students, the bulk of recruitment will be conducted by telephone, based on acquired targeted mailing lists containing residential addresses and landline telephone listings. EurekaFacts will also use a participant recruitment strategy that integrates multiple outreach/contact methods and resources, such as newspaper/Internet ads, outreach to community organizations (e.g., Boys and Girls Clubs, Parent-Teacher Associations), and social media and mass media recruiting (such as postings on the EurekaFacts website).

Interested participants will be screened to ensure that they meet the criteria for participation (e.g., their parents/legal guardians have given consent and they are from the targeted demographic groups outlined above). When recruiting participants (see Appendix R), EurekaFacts staff will first speak to the parent/legal guardian of the interested minor before starting the screening process. During this communication, the parent/legal guardian will be informed about the objectives, purpose, and participation requirements of the data collection effort, as well as the activities that it entails. After confirmation that participants are qualified, willing, and available to participate in the research project, they will receive a confirmation email/letter and phone call. Informed consent from parents/legal guardians will be obtained for all respondents who are interested in participating in the data collection efforts.

Data Collection Process

Cognitive interviews will take place at a range of suitable venues. In all cases, a suitable environment such as a quiet room will be used to administer the interviews, and there will be more than one adult present.

Participants will first be welcomed by staff, introduced to the interviewer and the observer, and told they are there to help answer questions about how students do mathematics or reading tasks. Students will be reassured that their participation is voluntary and their responses will be kept confidential (in accordance with the Confidential Information Protection provisions of Title V, Subtitle A, Public Law 107-347). Interviewers will explain the cognitive interview process and, to the extent that the think-aloud process is used, conduct a practice session with a sample question. An interviewer will ask students what they were thinking as they completed the questions, and whether they believe the questions are clear and understandable. EurekaFacts staff may record audio and screen activity for analysis and take notes about students’ reactions to these questions to revise and refine test content. No personal identifying information will be recorded or retained.

Protocols for cognitive interviews (see Volume II) will include probes for use as students work through item sets and probes for use after students finish answering items. Probes will include a combination of pre-planned questions and ad hoc questions that the interviewer identifies as important from observations during the interview, such as clarifications or expansions on points raised by the student. For example, if a student paused for a long time over a particular item, appeared to be frustrated at any point, or indicated an ‘aha’ moment, the interviewer might probe these kinds of observations further, to find out what was going on. To minimize the burden on the student, efforts will be made to limit the number of verbal probes that can be used in any one session or in relation to any set of items. The welcome script, cognitive interview instructions, and hints for the interviewers are provided in Volume II.

Analysis Plan

For the cognitive interview data collections, documentation will be grouped at the task level. Task items and components will be analyzed across participants.

The types of data collected about task items and components will include

  • think-aloud verbal reports;

  • process data (e.g., time spent on items, keystrokes, tools utilized);

  • behavioral data (e.g., signs of frustration or interest);

  • responses to generic questions prompting students to think out loud;

  • responses to targeted questions specific to the item(s);

  • additional volunteered participant comments; and

  • answers to debriefing questions.

The general analysis approach will be to compile the different types of data to facilitate identification of patterns of responses for specific items or task components; for example, patterns of responses to probes or debriefing questions, or types of actions observed from students at specific points in a task. This overall approach will help to ensure that the data are analyzed in a way that is thorough, systematic, and that will enhance identification of problems with items and provide recommendations for addressing those problems.

SMALL-SCALE TRYOUTS

Overview

In tryouts, students will work uninterrupted through selected draft programmed tasks. These studies will be carried out by EurekaFacts, which will recruit participants, administer and observe the sessions, record interactions as appropriate, and report the results to ETS. Tryouts provide a small-scale snapshot of the range of responses and actions tasks elicit, which can be gathered much earlier in the assessment development process and with fewer resource implications than piloting. As noted above, previous experience, for example with the Reading assessment, shows that tryout-based insights are very informative.

Sampling and Recruitment Plan

EurekaFacts will use the same recruitment methods for tryouts as described in the cognitive interview section.

Tryout sessions will be scheduled for 60 minutes at grade 4 and 90 minutes at grade 8. Given the session time frames, it is expected that students at grade 4 will try out one task per session, and students at grade 8 can try out two tasks per session. Our target is for 60 students to participate per task, with a maximum of 360 students to be recruited for small-scale tryouts across mathematics and reading (120 students at grade 4 and 60 students at grade 8 per subject). While the tasks in development will be administered to grades 4 and 8 students, fifth-, seventh-, and ninth-grade students will be included in the tryouts.

Data Collection Process

EurekaFacts will administer tryouts in small groups at their Rockville, Maryland site or another suitable venue (e.g., after-school activities organization, or community-based organization). Because during tryouts students complete tasks on their own without any interruption, it is possible and most efficient to have several students complete tasks at the same time. A proctor will be present during the session and will follow a strict protocol to provide students with general instructions, guide the group through the tryout, administer debriefing questions, and assist students in the case of any technical issues. The proctor will take notes of any potential observations or issues that occur during the tryout session. Finally, it may be desirable once students have completed their work, and time allowing, for proctors to present students with follow-up verbal or written probes (see Volume II). This has been done successfully in mathematics and social sciences, and questions typically ask students about their reactions, areas of confusion, and background knowledge.

Analysis Plan

The focus of tryout data may vary, as noted above. However, score data and time to complete tasks will certainly be captured and analyzed. Student responses to task items will be compiled into spreadsheets to allow quantitative and descriptive analyses of the performance data. Completion times and non-completion rates will also be quantified and entered into the spreadsheets.
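The compilation described above might look like the following minimal sketch; the record fields and values are hypothetical placeholders, not the study's actual data layout.

```python
import statistics

# Hypothetical per-student tryout records (fields are illustrative).
records = [
    {"student": "S01", "minutes": 42.0, "completed": True, "score": 7},
    {"student": "S02", "minutes": 55.5, "completed": True, "score": 5},
    {"student": "S03", "minutes": 60.0, "completed": False, "score": 3},
]

times = [r["minutes"] for r in records]
summary = {
    "n": len(records),
    "mean_minutes": round(statistics.mean(times), 1),
    "median_minutes": statistics.median(times),
    "non_completion_rate": round(
        sum(1 for r in records if not r["completed"]) / len(records), 2
    ),
    "mean_score": round(statistics.fmean(r["score"] for r in records), 2),
}
print(summary)
```

In practice the compiled spreadsheet would carry one row per student per item, but the same descriptive quantities (completion times, non-completion rates, score distributions) fall out of it.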

Consultations outside the agency

Educational Testing Service (ETS) is the Item Development, Data Analysis, and Reporting contractor for NAEP; ETS will develop the scenario-based tasks, analyze results, draft a report of the results, and administer playtesting activities. EurekaFacts, a research and consulting firm based in Rockville, Maryland, and a subcontractor to ETS, will administer cognitive interviews and tryouts.

Justification for Sensitive Questions

Throughout the user survey development processes, effort has been made to avoid asking for information that might be considered sensitive or offensive.

Paying Respondents

To encourage participation, a $25 gift card from a major credit card company will be offered to each student who participates in each pretesting session as a thank you for their time and effort. For sessions that take place in locations other than schools, a parent or guardian of each student will also be offered a $25 gift card to thank them for bringing their participating student to and from the testing site.

Assurance of Confidentiality

The passage of the Cybersecurity Enhancement Act of 2015 (6 U.S.C. §151) requires the installation of the Department of Homeland Security’s Einstein cybersecurity protection system on all Federal civilian information technology systems. Consequently, NCES can no longer pledge under CIPSEA that respondents’ data will be seen only by a statistical agency’s employees or sworn agents. NCES collaborated with other federal statistical agencies to cognitively study two revised versions of the pledge (approved in August 2016; OMB# 1850-0803 v.162), and has updated the versions of the wording pertaining to the Cybersecurity Enhancement Act of 2015 to reflect the results of that cognitive testing. To be compliant with the Cybersecurity Enhancement Act of 2015, NCES will test the two versions of the confidentiality pledge with parents in this study. During the initial phases of recruitment, materials for all audiences across all three test methods (playtesting, cognitive interviews, and tryouts) will include a single brief reference to confidentiality and CIPSEA. At the screening phase of recruitment, parents/guardians of potential participants will be randomly divided into two groups, with one group being read version A of the full CIPSEA pledge and the other being read version B (see Appendix R). If the parent/guardian still wants their child to participate in the study after the screening is complete, they will sign a consent form with the full CIPSEA pledge matching the version that they were read during the screening process (see Appendix T). The two versions of the statement being tested are:

Version A: The information your child provides will be used for statistical purposes only. In accordance with the Confidential Information Protection provisions of Title V, Subtitle A, Public Law 107-347 and other applicable Federal laws, your child’s responses will be kept confidential and will not be disclosed in identifiable form to anyone other than employees or agents. By law, every NCES employee as well as every agent, such as contractors and NAEP coordinators, has taken an oath and is subject to a jail term of up to 5 years, a fine of $250,000, or both if he or she willfully discloses ANY identifiable information about your child. Electronic submission of your child’s information will be monitored for viruses, malware, and other threats by Federal employees and contractors in accordance with the Cybersecurity Enhancement Act of 2015.

Version B: The information your child provides will be used for statistical purposes only. In accordance with the Confidential Information Protection provisions of Title V, Subtitle A, Public Law 107-347 and other applicable Federal laws, your child’s responses will be kept confidential and will not be disclosed in identifiable form to anyone other than employees or agents. By law, every NCES employee as well as every agent, such as contractors and NAEP coordinators, has taken an oath and is subject to a jail term of up to 5 years, a fine of $250,000, or both if he or she willfully discloses ANY identifiable information about your child. Electronic submission of your child’s information will be monitored for viruses, malware, and other threats by Homeland Security in accordance with the Cybersecurity Enhancement Act of 2015.

The results will be analyzed to examine whether the different versions affect rates of participation at two points in time (after the screening questions and after being presented with the consent form). The expectation is that approximately 1,500 parents/legal guardians will complete the screener across all three test methods (for a sample size of approximately 750 per treatment), as seen by summing the “flyer and consent form review” line items in Tables 1, 2, and 3, below. Screener participation rates will be calculated as the number of parents/legal guardians who agree that their child may participate in the NAEP research interviews (playtesting, cognitive interviews, and tryouts) divided by the total number of parents/legal guardians who went through the screener. These rates will be calculated for each treatment separately, and a difference of proportions test will be conducted to determine whether parents are more likely to agree to participate under one treatment than the other. Consent form participation rates will be calculated as the number of parents/legal guardians who actually sign a consent form divided by the number of parents who were presented with the consent form for signature. A difference of approximately 10 percentage points between participation rates in the two treatments is detectable with a sample size approaching 1,500, with power of 0.7 at the 0.05 significance level. The results of these analyses will aid in the selection of the best performing pledge for the national NAEP collections.
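As a rough check on the detectable-difference claim above, the power of a two-proportion comparison can be sketched with a normal approximation; the 50 percent baseline participation rate used here is an illustrative assumption, not a figure from the study.

```python
from math import erf, sqrt

def two_prop_power(p1, p2, n_per_group):
    """Approximate power of a two-sided two-proportion z-test at the
    0.05 significance level (normal approximation, equal group sizes)."""
    z_crit = 1.96  # two-sided critical value for alpha = 0.05
    p_bar = (p1 + p2) / 2
    se_null = sqrt(2 * p_bar * (1 - p_bar) / n_per_group)
    se_alt = sqrt(p1 * (1 - p1) / n_per_group + p2 * (1 - p2) / n_per_group)
    z = (abs(p1 - p2) - z_crit * se_null) / se_alt
    return 0.5 * (1 + erf(z / sqrt(2)))  # standard normal CDF

# Assumed 50% baseline rate vs. a rate 10 percentage points higher,
# with roughly 750 parents per treatment (1,500 screeners split in two).
power = two_prop_power(0.50, 0.60, 750)
print(round(power, 2))  # comfortably above the 0.7 target
```

Under these assumed rates, a 10-percentage-point difference is detectable with power well above 0.7; the achieved power in the study will depend on the actual baseline participation rate, which is not fixed in advance.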

For all participants, written consent will be obtained from legal guardians (of minor students) before interviews are administered. Participants will be assigned a unique student identifier (ID), which will be created solely for data file management and used to keep all participant materials together. The participant ID will not be linked to the participant name in any way or form. The consent forms, which include the participant name, will be separated from the participant interview files, secured for the duration of the study, and will be destroyed after the final report is released. Pretesting activities may be recorded using audio or screen capture technology. The only identification included on the files will be the participant ID. The recorded files will be secured for the duration of the study and will be destroyed after the final report is completed.

Estimate of Hourly burden

The estimated burden for recruitment assumes attrition throughout the process. Assumptions for approximate attrition rates are 50 percent from initial contact to consent form completion and 25 percent from submission of consent form to participation. Playtesting, cognitive interviews, and tryout sessions are expected to take 60 minutes for grade 4 students and 90 minutes for grade 8 students.
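The attrition assumptions above determine the recruitment line items in Tables 1 through 3; working backward from a participant goal, the arithmetic can be sketched as:

```python
import math

def recruitment_targets(participants_needed):
    """Back out consent and initial-contact counts from a participant
    goal, under the stated assumptions: 25% attrition from consent
    submission to participation, and 50% attrition from initial
    contact to consent form completion."""
    consents = math.ceil(participants_needed / 0.75)  # survive 25% attrition
    contacts = consents * 2                           # survive 50% attrition
    return contacts, consents

# Playtesting goal of 136 students (Table 1):
contacts, consents = recruitment_targets(136)
print(contacts, consents)  # 364 182, matching Table 1's parent line items
```

The same function reproduces the cognitive interview figures (160 contacts, 80 consents for 60 students) and the tryout figures (960 contacts, 480 consents for 360 students).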

Table 1. Estimate of Hourly Burden for Playtesting

| Respondent | Number of respondents | Number of responses | Hours per respondent | Total hours |
| --- | --- | --- | --- | --- |
| Student Recruitment via Teachers and Staff | | | | |
| Initial contact with staff: e-mail, flyer distribution, and planning | 19 | 19 | 0.33 | 7 |
| Parent or Legal Guardian | | | | |
| Flyer and consent form review | 364 | 364 | 0.08 | 30 |
| Consent form completion and return | 182* | 182 | 0.13 | 24 |
| Confirmation to parent via email or letter | 182* | 182 | 0.05 | 10 |
| Recruitment Totals | 383 | 747 | | 71 |
| Student | | | | |
| Grade 4 | 68 | 68 | 1.0 | 68 |
| Grade 8 | 68 | 68 | 1.5 | 102 |
| Interview Totals | 136 | 136 | | 170 |
| Total Burden | 519 | 883 | | 241 |

*Subset of initial contact group

Note: numbers have been rounded and therefore may affect totals


Table 2. Estimate of Hourly Burden for Cognitive Interviews

| Respondent | Number of respondents | Number of responses | Hours per respondent | Total hours |
| --- | --- | --- | --- | --- |
| Student Recruitment via Teachers and Staff | | | | |
| Initial contact with staff: e-mail, flyer distribution, and planning | 8 | 8 | 0.33 | 3 |
| Parent or Legal Guardian | | | | |
| Flyer and consent form review | 160 | 160 | 0.08 | 13 |
| Consent form completion and return | 80* | 80 | 0.13 | 11 |
| Confirmation to parent via email or letter | 80* | 80 | 0.05 | 4 |
| Recruitment Totals | 168 | 328 | | 31 |
| Student | | | | |
| Grade 4 | 40 | 40 | 1.0 | 40 |
| Grade 8 | 20 | 20 | 1.5 | 30 |
| Interview Totals | 60 | 60 | | 70 |
| Total Burden | 228 | 388 | | 101 |

*Subset of initial contact group

Note: numbers have been rounded and therefore may affect totals



Table 3. Estimate of Hourly Burden for Tryouts

| Respondent | Number of respondents | Number of responses | Hours per respondent | Total hours |
| --- | --- | --- | --- | --- |
| Student Recruitment via Teachers and Staff | | | | |
| Initial contact with staff: e-mail, flyer distribution, and planning | 48 | 48 | 0.33 | 16 |
| Parent or Legal Guardian | | | | |
| Flyer and consent form review | 960 | 960 | 0.08 | 77 |
| Consent form completion and return | 480* | 480 | 0.13 | 63 |
| Confirmation to parent via email or letter | 480* | 480 | 0.05 | 24 |
| Recruitment Totals | 1,008 | 1,968 | | 180 |
| Student | | | | |
| Grade 4 | 240 | 240 | 1.0 | 240 |
| Grade 8 | 120 | 120 | 1.5 | 180 |
| Interview Totals | 360 | 360 | | 420 |
| Total Burden | 1,368 | 2,328 | | 600 |

*Subset of initial contact group

Note: numbers have been rounded and therefore may affect totals


Table 4. Total Burden across All Pretesting Activities

| Pretesting Activity | Number of respondents | Number of responses | Burden hours |
| --- | --- | --- | --- |
| Playtesting | 519 | 883 | 241 |
| Cognitive Interviews | 228 | 388 | 101 |
| Tryouts | 1,368 | 2,328 | 600 |
| Total | 2,115 | 3,599 | 942 |


Cost to federal government

The total cost of the pretesting is $1,402,231, as detailed in Table 5.

Table 5: Cost to the Federal Government

| Activity | Provider | Estimated Cost |
| --- | --- | --- |
| Design, prepare for, and administer playtesting sessions (including recruitment, incentive costs, data collection, and summary of findings) | ETS | $262,679 |
| Design, prepare for, conduct analyses on findings, and prepare report for cognitive interviews | ETS | $209,280 |
| Prepare for and administer cognitive interviews (including recruitment, incentive costs, data collection, analysis, and reporting) | EurekaFacts | $319,435 |
| Design, prepare for, conduct scoring and analysis, and prepare report for task tryouts | ETS | $234,641 |
| Prepare for and administer task tryouts (including recruitment, incentive costs, data collection, and reporting) | EurekaFacts | $376,196 |
| Total | | $1,402,231 |


Project Schedule

Table 6 provides the overall schedule.

Table 6: Schedule

| Activity (each activity includes recruitment, data collection, and analyses) | Dates |
| --- | --- |
| Playtesting | April-November 2017 |
| Cognitive interviews | September-December 2017 |
| Small-scale tryouts | September-December 2017 |
| Pretesting reports submitted | January 2018 |




1 See Nielsen, J. (1994). Estimating the number of subjects needed for a thinking aloud test. International Journal of Human-Computer Studies, 41, 385-397. Available at: http://www.idemployee.id.tue.nl/g.w.m.rauterberg/lecturenotes/DG308%20DID/nielsen-1994.pdf

2 See Almond, P. J., Cameto, R., Johnstone, C. J., Laitusis, C., Lazarus, S., Nagle, K., Parker, C. E., Roach, A. T., & Sato, E. (2009). White paper: Cognitive interview methods in reading test design and development for alternate assessments based on modified academic achievement standards (AA-MAS). Dover, NH: Measured Progress and Menlo Park, CA: SRI International. Available at: http://www.measuredprogress.org/documents/10157/18820/cognitiveinterviewmethods.pdf

3 See Van Someren, M. W., Barnard, Y. F., & Sandberg, J. A. C. (1994). The think-aloud method: A practical guide to modeling cognitive processes. San Diego, CA: Academic Press. Available at: ftp://akmc.biz/ShareSpace/ResMeth-IS-Spring2012/Zhora_el_Gauche/Reading%20Materials/Someren_et_al-The_Think_Aloud_Method.pdf

