National Center for Education Statistics
Middle Grades Longitudinal Study of 2017-18 (MGLS:2017) Eye-tracking Cognitive Laboratory
Volume I
Supporting Statement
OMB# 1850-0803 v. 148
November 2015
The Middle Grades Longitudinal Study of 2017–18 (MGLS:2017) is the first study conducted by the National Center for Education Statistics (NCES), within the Institute of Education Sciences (IES) of the U.S. Department of Education, to follow a nationally representative sample of students as they enter and move through the middle grades (grades 6–8). In preparation for the main study, the data collection instruments and procedures must be field tested.
This package requests clearance to conduct cognitive laboratory testing concurrent with the MGLS:2017 Item Validation Field Test (IVFT), whose data collection is scheduled to begin in January 2016. The primary purpose of the IVFT is to determine the psychometric properties and predictive potential of assessment and survey items so that valid, reliable, and useful assessment and survey instruments can be developed for the main study. The cognitive laboratories are designed to supplement the information gathered in the IVFT: students will complete the MGLS:2017 assessments and student survey on tablets equipped with eye-tracking technology. This technology captures the point of gaze (where one is looking) and the motion of the eyes as information is processed. Analyzing students' eye movements as they interact with the assessments and survey will provide information on assessment item functioning, clarity of instructions, navigability, and the clarity of the survey items.
Volume 1 of this submission presents information on the basic design of the IVFT cognitive interviews. Volume 2 presents the student questionnaire items that will be used in the interview process. Volume 3 provides descriptions of the protocols to be used during the cognitive laboratories. Volume 4 provides recruitment materials and scripts.
Study Description
MGLS:2017 is the first study conducted by NCES to follow a nationally representative sample of students as they enter and move through the middle grades (grades 6–8). A study of the middle grades will complement NCES's plans for implementing a multi-cohort sequence for its longitudinal studies series: the Early Childhood Longitudinal Study, Kindergarten Class of 2010–11 (ECLS-K:2011), the MGLS:2017, and the High School Longitudinal Study of 2020 (HSLS:2020) will be coordinated within a given 10-year span so that, together, they collect the full range of data on students' school experiences from elementary school through high school. The federal government is uniquely positioned to undertake this needed comprehensive, large-scale longitudinal study of a nationally representative sample of middle grade youth, one that includes measures of known critical influences on adolescents' academic and socioemotional trajectories. NCES is authorized to conduct the MGLS:2017 under the Education Sciences Reform Act of 2002 (20 U.S. Code, Section 9543).
MGLS:2017 will be conducted with a nationally representative sample of students enrolled in sixth grade during the 2017–18 school year, with the baseline data collection taking place from January through June of 2018. Annual follow-ups are planned for the spring of the 2018–19 and 2019–20 school years, when most of the students in the sample will be in grades 7 and 8, respectively. The MGLS:2017 will provide a rich descriptive picture of the academic experiences and development of students during these critical years and will allow researchers to examine associations between contextual factors and student outcomes. A wealth of research highlights the importance of mathematics and literacy skills for success in high school and their association with later education and career opportunities. The study will therefore focus on student achievement in these areas, along with measures of student socioemotional well-being, executive function, and other outcomes. The study will also include a sample of students with different types of disabilities (with a focus on students with a specific learning disability, autism, and/or emotional disturbance) that will provide descriptive information on their outcomes, educational experiences, and special education services.
MGLS:2017 will rely on a set of longitudinal and complementary instruments to collect data from several types of respondents, providing information on the outcomes, experiences, and perspectives of students across grades 6, 7, and 8; their families and home lives; their teachers, classrooms, and instruction; and the school settings, programs, and services available to them. At each wave of data collection in the main study, students' mathematics and reading skills, socioemotional development, and executive function will be assessed. Students will also complete a survey that asks about their engagement in school, out-of-school experiences, peer group relationships, and identity development. Parents will be asked about their background, family resources, and involvement with their child's education and school. Students' mathematics teachers will complete a two-part survey: in part 1, they will be asked about their background and classroom instruction; in part 2, they will be asked to report on the academic behavior, mathematics performance, and classroom conduct of each study child in their classroom. For students receiving special education services, their special education teacher or provider will also complete a questionnaire similar in structure to the two-part mathematics teacher instrument, consisting of a teacher-level questionnaire and a student-level questionnaire, but with questions specific to the special education experiences of and services received by the study child. School administrators will be asked to report on school programs and services, as well as on school climate.
In short, the MGLS:2017 will provide data on the development and learning that occur during students' middle grade years (grades 6–8) and that are predictive of future success, along with the individual, social, and contextual factors that are related to positive outcomes. A key goal of the study is to provide researchers and policymakers with the information they need to better understand the school and nonschool influences associated with mathematics and reading success, socioemotional health, and positive life development during the middle grade years and beyond. To support the development of the study, the MGLS:2017 will conduct two field tests: the IVFT, beginning in January 2016, followed by the Operational Field Test (OFT), beginning in January 2017.
The study's success depends on the development of reliable, valid measures. The goal of the IVFT is to collect data to support examination of the mathematics assessment, reading assessment, executive function assessment, student survey, parent survey, and school staff surveys. The IVFT will provide the data needed to determine the psychometric properties and predictive potential of assessment and survey items so that valid, reliable, and useful instruments can be developed for the main study. Because the focus of the IVFT is the analysis of the psychometric properties of the survey items and assessments, the IVFT requires a large, diverse field test sample, though not a nationally representative one.
Gaining schools' cooperation in voluntary research is increasingly challenging. The OFT will be used to test materials and procedures revised based on the results of the IVFT and to gain a deeper understanding of effective recruitment strategies that lead to higher response rates and thus better data quality. The OFT will include a responsive design approach for nonresponding parents. The OFT is also an opportunity to finalize standard protocols for test administration; it will allow NCES to tighten assessment and survey timing so as to maximize the overall functionality of the assessments and surveys while minimizing the time it takes respondents to complete them. With its focus on recruitment strategies, tactics for retaining the sample within the study, and the operational administration of the surveys and assessments, the OFT will give the MGLS team small-scale practice in obtaining a nationally representative sample.
Purpose of the Cognitive Laboratory Work
Eye-tracking is the observation and recording of eye behavior, such as fixation and movement. A fixation occurs when a respondent's gaze stops moving and lingers long enough for the respondent to process what is being viewed. The movement of a respondent's eyes between fixation points provides information on item processing and understanding. The eye-tracking data, overlaid with participant behavior and subjective comments, will provide key insights about a participant's experience, attention, and message processing (e.g., a lack of organization leading to an inefficient search for key information) that participants may not be aware of or cannot explicitly discuss. Eye-tracking data are collected using a high-speed camera and an LED infrared light that illuminates the face (no more powerful than typical sunlight). The infrared camera captures eye movements by acquiring an image of the eyes and calculating gaze location in real time. The remote system requires no physical contact with the student.
Cognitive laboratory work, including the use of eye-tracking, will be conducted to examine the performance and differential functioning of the student assessment items. Eye-tracking data may provide insight into differential item functioning through the number and type of fixations that occur. Fixation patterns can also identify instructions or items that are unclear or that prevent efficient self-administration of the instrument. Examples of measures to be used for these analyses include the following (an illustrative sketch of how such measures might be computed appears after the list):
Number of fixations
Whether participants fixated on an area of interest during the task (such as the instructions or answer options). A lack of fixations on important elements may indicate that the information is not salient enough or is not interpreted as important. In addition, a lack of fixations on important language (or fixations directed at white space or off screen) while completing the item is an indicator of inattention to the task.
Time from first fixation to interaction with an assessment element
The interval between a participant's first fixation on an important area of interest and his or her selection of a survey element. Longer intervals between fixating on an element and interacting with that element can indicate uncertainty about how to proceed.
Repeat fixations
Whether participants visually re-enter a particular area of interest (e.g., instructions) when completing the assessment. Fixating repeatedly on an area of interest can indicate confusion with the language or visual elements.
Order of fixations
Whether participants visually interacted with the information linearly (i.e., left to right and top to bottom). A nonlinear pattern of fixations can indicate difficulty in getting started and a lack of clarity about how to get oriented to the item.
Fixation duration
The average fixation duration during interaction with individual items. Longer fixation duration can be an indicator of difficulty with information processing.
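The sketch below is a minimal, purely illustrative Python example of how fixation-based measures like those above might be computed from exported eye-tracking data. The record format, area-of-interest labels, and function names are assumptions made for illustration; they do not represent the Tobii export schema or the study's actual analysis code.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Fixation:
    """One fixation event for a single item (hypothetical record format)."""
    aoi: str           # area of interest, e.g., "instructions", "answer_options", "off_screen"
    start_ms: float    # fixation onset, in milliseconds from item start
    duration_ms: float # fixation duration, in milliseconds

def fixation_count(fixations: List[Fixation], aoi: str) -> int:
    """Number of fixations that landed on a given area of interest."""
    return sum(1 for f in fixations if f.aoi == aoi)

def time_to_interaction(fixations: List[Fixation], aoi: str,
                        interaction_ms: Optional[float]) -> Optional[float]:
    """Interval from the first fixation on an area of interest to the participant's
    interaction with that element (e.g., selecting an answer), if any."""
    onsets = [f.start_ms for f in fixations if f.aoi == aoi]
    if not onsets or interaction_ms is None:
        return None
    return interaction_ms - min(onsets)

def repeat_visits(fixations: List[Fixation], aoi: str) -> int:
    """Number of separate visits (re-entries) to an area of interest."""
    visits, previous = 0, None
    for f in fixations:
        if f.aoi == aoi and previous != aoi:
            visits += 1
        previous = f.aoi
    return visits

def mean_fixation_duration(fixations: List[Fixation]) -> float:
    """Average fixation duration on an item, a rough index of processing difficulty."""
    if not fixations:
        return 0.0
    return sum(f.duration_ms for f in fixations) / len(fixations)
```

Measures such as these, computed per item and per participant, could then be compared across the focal disability groups and the general education group to flag items whose instructions attract few fixations, many repeat visits, or unusually long fixation durations.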
This work will be conducted concurrently with the IVFT data collection and will inform any instrument modifications in advance of the 2017 OFT.
A total of 120 students will be tested: 30 students in each of three focal disability groups (autism, emotional disturbance, and specific learning disability) and 30 students who do not have an Individualized Education Program (IEP). Cognitive laboratory participants will be asked to complete the MGLS:2017 IVFT assessments and student survey. During the cognitive laboratory work, the cognitive laboratory interviewer will remain in the room with the participant. To collect eye-tracking information, a Tobii X2-60 eye tracker will be used to capture participants' eye movements while they interact with the assessments and the survey. Participants will be seated in a chair in front of the tablet computer on which the assessments and survey are administered. The eye tracker requires calibration: prior to beginning the assessments and the survey, the cognitive laboratory interviewer will ask participants to follow, with their eyes, circles that appear in different positions on the screen. This calibration process takes about ten seconds to complete. Once the eye tracker is calibrated, the cognitive laboratory interviewer will instruct participants to use the tablet computer from a comfortable position but to try to refrain from making any large head movements while completing the assessments and the survey. The entire setup and calibration process takes less than five minutes per participant. The collection of eye-tracking data does not require that any equipment be placed directly on the cognitive laboratory participants.
Administration of Assessments and Survey Components
For this cognitive laboratory work, participants will be administered the IVFT versions of the MGLS:2017 assessments and student survey. The assessments and surveys are estimated to take approximately 90 minutes. The cognitive interview is estimated to take approximately 30 minutes. Therefore, the total cognitive laboratory time is estimated to be about 120 minutes.
Student Assessments and Student Survey. Students will participate in a combination of math, reading, and executive function assessments and a student survey, designed to take a total of approximately 90 minutes per student.
Student Mathematics Assessment. The MGLS:2017 mathematics assessment, like all of the instruments, will be taken on a tablet computer. The focus will be on the domains of mathematics that are most likely to be the central focus of middle school learning now and in the future: the Number System, Ratios and Proportional Relationships, Expressions and Equations, and Functions. To ensure that the study is sensitive to the variation in students' mathematics ability, the assessment will include items with appropriately varying cognitive demand. The MGLS:2017 mathematics assessment will provide valuable information about the development of middle grade students' knowledge of mathematics and their ability to use that knowledge to solve problems, moving toward stronger reasoning and understanding of more advanced mathematics.
Student Reading Assessment. The MGLS:2017 reading assessment will use a two-stage adaptive design consisting of a brief routing block (first stage: approximately 10 minutes) followed by a skill-based block (second stage: approximately 20 minutes), for a total of 30 minutes. The routing block will include items that measure foundational components of reading that are important for comprehension: Vocabulary, Morphological Awareness, and Sentence Processing. Performance on the routing block will direct students to one of three types of skill-based reading blocks (reading components, basic comprehension, or scenario-based comprehension) within the second stage. The second-stage skill blocks will be used to gather more information on foundational reading component skills, students' efficiency at basic reading comprehension and ability to comprehend short passages, and students' ability to comprehend informational text and reason more deeply about text.
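To make the two-stage logic concrete, a simplified sketch of the routing step is shown below. The cut scores and block identifiers are placeholders chosen for illustration only; the operational routing rules will be defined by the IVFT psychometric analyses.

```python
def route_to_second_stage(routing_score: float) -> str:
    """Map a first-stage (routing block) score in [0, 1] to one of three hypothetical
    second-stage reading blocks. The 0.35 and 0.70 cut scores are illustrative only."""
    if routing_score < 0.35:
        return "reading_components"            # foundational component skills
    if routing_score < 0.70:
        return "basic_comprehension"           # efficiency and short-passage comprehension
    return "scenario_based_comprehension"      # informational text and deeper reasoning
```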
Student Executive Function Assessment. Executive function (EF), a set of capacities and processes originating in the prefrontal cortex of the brain, permits individuals to self-regulate, engage in purposeful and goal-directed behaviors, and conduct themselves in a socially appropriate manner. Self-regulation is needed for social success, academic and career success, and good health outcomes. Executive function includes capacities such as shifting (cognitive and attentional flexibility), inhibitory control, and working memory. Four different executive function measures will be included in the field tests: Stop Signal (inhibitory control), 3-Back with verbal stimulus (working memory), 2-Back with nonverbal stimulus (working memory), and the Hearts and Flowers task (shifting, or cognitive flexibility). The functioning of the directions and practice screens for each task is of particular interest here, as they will affect the examinee's understanding of and performance on the particular EF task.
Student Survey. The purpose of the student survey is to collect information on students’ attitudes and behaviors; out-of-school time use; and family, school, and classroom environments. The student survey will also serve as a source for information about socioemotional outcomes having to do with social relationships, support, and school engagement.
The IVFT employs a spiral design in which not all students receive the same assessments and survey items; Table 1 below summarizes the student assessment and survey booklet spiral design, which has been approved for the IVFT. As with the IVFT in-school sessions, to keep the assessment to approximately 90 minutes while gaining information on as many assessment and survey items as possible, the cognitive laboratories will use the same spiral approach. The 30 participants in each subgroup will be divided equally across the different booklets, with approximately 20 students taking each booklet version across the full sample (an illustrative assignment sketch follows Table 1).
Table 1. Item Validation Field Test (IVFT) Student Assessment and Spiral Design
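As a minimal illustration of the spiral assignment described above, the sketch below divides a 30-student subgroup evenly across booklet versions. The number of versions (six here, which would yield roughly 120 / 6 = 20 students per booklet across the full sample) is an assumption for illustration; the actual booklet structure is defined by the IVFT spiral design summarized in Table 1.

```python
import random

def assign_booklets(participant_ids, n_booklets=6, seed=2016):
    """Evenly divide one subgroup's participants across booklet versions.

    The six-booklet assumption is illustrative: with four subgroups of 30 students,
    each version would be taken by about 120 / 6 = 20 students overall.
    """
    ids = list(participant_ids)
    random.Random(seed).shuffle(ids)  # randomize order so assignment is not systematic
    return {pid: f"booklet_{(i % n_booklets) + 1}" for i, pid in enumerate(ids)}

# Example: assign one focal disability subgroup of 30 participants
assignments = assign_booklets(range(1, 31))
```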
All cognitive activities will be conducted in English by researchers from the MGLS:2017 study team, who have developed the cognitive laboratory protocols. The professional staff comprising the cognitive interviewing team will include an instrument lead and up to three interviewers. Staff overseeing the work have extensive experience using cognitive interviewing techniques and eye-tracking methodologies for federal studies. All cognitive interview staff will participate in a specialized training prior to the commencement of the cognitive laboratory work. Training will involve interactive activities as checks of understanding. Weekly debriefings will be held among the design team during cognitive laboratory work.
Assessments will be conducted in laboratory settings at either the Fors Marsh Group (FMG) offices or RTI International's Research Triangle Park location. No cognitive laboratory activities will be conducted in a school setting.
The screen of the tablet computer will be recorded using the eye-tracking software for eye-movement analysis. This recording will document the location of students' eye-movement fixations during the interview. Cognitive laboratory interviewers will be trained to set up the equipment appropriately, record the assessment, and review the recording to supplement their notes on student performance. The interviewer will take notes during the cognitive laboratory session and will pay special attention to whether students have any trouble getting started with the tasks or seem to have difficulty understanding the directions.
In addition to the data provided by the eye-tracking software, the student will be interviewed following a standard retrospective protocol and debriefing to identify any concerns with the administration of the assessments and the survey, the procedures, or the directions for the tasks (Volume III). This interview portion does not concern the eye-tracking measurement itself; rather, it is meant to provide further information on the functionality of the math assessment, reading assessment, executive function assessment, and student survey.
The length of each individual section varies. Completion of the assessments and survey takes between 80 and 90 minutes and, together with the retrospective verbal protocol and debriefing questions, the cognitive interview activity will last about 120 minutes.
As part of the MGLS:2017 design contract, content experts were consulted in the development of the assessments and questionnaires. These experts are listed by name, affiliation, and expertise in table 2.
Table 2. Members of the MGLS:2017 Content Review Panels
Name | Affiliation | Expertise
Mathematics Assessment Content Review Panel (June 18–19, 2013)
Tom Loveless | Brookings Institution | Policy, math curriculum
Linda Wilson | Formerly with Project 2061 | Math education, math assessment, middle school assessment, author of NCTM Assessment Standards for School Mathematics and NAEP math framework, teacher
Kathleen Heid | University of Florida | Math education, use of technology, teacher knowledge, NAEP Grade 8 Mathematics Standing Committee member
Edward Nolan | Montgomery County Schools, Maryland | Math curriculum and standards, large-scale assessment of middle grade students
Lisa Keller | University of Massachusetts, Amherst | Psychometrics, former math teacher
Paul Sally | University of Chicago | Math education, mathematics reasoning, mathematically talented adolescents
Margie Hill | University of Kansas | Co-author of Kansas mathematics standards, former NAEP Mathematics Standing Committee member, former district math supervisor
Executive Function Content Review Panel (July 18, 2013)
Lisa Jacobson | Johns Hopkins University; Kennedy Krieger Institute | Development of executive functioning skills, attention, neurodevelopmental disorders, and parent and teacher scaffolding
Dan Romer | University of Pennsylvania | Adolescent risk taking
James Byrnes | Temple University | Self-regulation, decision making, cognitive processes in mathematics learning
Socioemotional-Student-Family Content Review Panel (July 25–26, 2013)
James Byrnes | Temple University | Self-regulation, decision making, cognitive processes in mathematics learning
Russell Rumberger | University of California, Santa Barbara | School dropouts, ethnic and language minority student achievement
Tama Leventhal | Tufts University | Family context, adolescence, social policy, community and neighborhood indicators
Susan Dauber | Bluestocking Research | School organization, educational transitions, urban education, parent involvement and family processes
Scott Gest | Pennsylvania State University | Social networking, social skills, longitudinal assessment of at-risk populations
Kathryn Wentzel | University of Maryland | Social and academic motivation, self-regulation, school adjustment, peer relationships, teacher-student relationships, family-school linkages
Richard Lerner | Tufts University | Adolescent development and relationships with peers, families, schools, and communities
School Administrator Content Review Panel (August 16, 2013)
Susan Dauber | Bluestocking Research | School organization, educational transitions, urban education, parent involvement and family processes
George Farkas | University of California, Irvine | Schooling equity and human resources
Jeremy Finn | State University of New York at Buffalo | School organization, school dropouts
Edward Nolan | Montgomery County Schools, Maryland | Large urban school system administrator
Tom Loveless | Brookings Institution | Policy, math curriculum
Reading Assessment Content Review Panel (April 14, 2014)
Donna Alvermann | University of Georgia | Adolescent literacy, online literacy, codirector of the National Reading Research Center (funded by the U.S. Department of Education)
Joseph Magliano | Northern Illinois University | Cognitive processes that support comprehension, the nature of memory representations for events depicted in text and film, strategies to detect and help struggling readers
Sheryl Lazarus | University of Minnesota | Education policy issues related to the inclusion of students with disabilities in assessments used for accountability purposes, student participation and accommodations, alternate assessments, technology-enhanced assessments, teacher effectiveness, large-scale assessments, school accountability, research design (including cost analyses), data-driven decision making, rural education, the economics of education
Disabilities Content Review Panel (April 29, 2014)
Jose Blackorby | SRI International | Autism, specific learning disabilities, special education, curriculum design, alternate student assessment, large-scale studies of students with disabilities, codirector of the Special Education Elementary Longitudinal Study (SEELS)
Lynn Fuchs | Vanderbilt University | Specific learning disabilities, student assessment, mathematics curriculum, psychometric models
Mitchell L. Yell | University of South Carolina | Autism, emotional and behavior disorders, specific learning disabilities, pre-K–12 instruction and curriculum, special education, evidence-based intervention
Sheryl Lazarus | University of Minnesota | Special education policy, inclusion of students with disabilities in assessments, accommodations, alternate assessments, technology-enhanced assessments, large-scale assessments, school accountability, research design (including cost analyses)
Martha Thurlow | University of Minnesota | Specific learning disabilities, reading assessment, alternate student assessment, early childhood education, special education, curriculum, large-scale studies
Diane Pedrotty Bryant | University of Texas, Austin | Educational interventions for improving the mathematics and reading performance of students with learning disabilities, the use of assistive technology for individuals with disabilities, interventions for students with learning disabilities and who are at risk for educational difficulties
Expert Meeting, Middle Grades Experts (January 23, 2015)
Nancy Flowers | University of Illinois at Urbana-Champaign | Program evaluation, large-scale data collection, research methods
Deborah Kasak | National Forum to Accelerate MG Reform | Education policy, school reform, Schools to Watch
Doug MacIver | Johns Hopkins University | School reform, adolescent engagement, learning and achievement
Margaret McLaughlin | University of Maryland | Special education policy, students with disabilities
Steve Mertens | Illinois State University | Teacher preparation, school reform, evaluation
Karen Swanson | Mercer University | Curriculum and instruction, transformative education, faculty professional development
Expert Meeting, Students with Disabilities (April 2, 2015)
Jose Blackorby | SRI International | Autism, specific learning disabilities, special education, curriculum design, alternate student assessment, large-scale studies of students with disabilities, codirector of the Special Education Elementary Longitudinal Study (SEELS)
Jacqueline Buckley | Institute of Education Sciences, National Center for Special Education Research | Large-scale studies of students with disabilities
Richelle Davis | Special Education and Rehabilitative Services, Office of Special Education Programs | Large-scale studies of students with disabilities
Lindsey Jones | National Council for Learning Disabilities | Large-scale studies of students with disabilities
Margaret McLaughlin | University of Maryland | Special education policy, students with disabilities
Kim Sprague | Institute of Education Sciences, | Large-scale studies of students with disabilities
Jim Weindorf | National Council for Learning Disabilities | Large-scale studies of students with disabilities
Recruiting
The RTI team will recruit participants using networks of professional contacts, community organizations, and disability communities to obtain the desired sample of students in the general education population and in each of the three focal disability groups (autism, emotional disturbance, and specific learning disability).
Potential participants will be identified by contacting community organizations and disability organizations, such as the Council for Exceptional Children and the Autism Society District of Columbia. These organizations provide advocacy support and networking opportunities for parents of children with disabilities and can connect the recruiting team to networks of parents, local to FMG or RTI, whose children may be interested in and willing to participate in the laboratory work. The recruitment team will use multiple outreach strategies, such as:
requesting contact information for families of students in the target age and disability groups;
offering to visit the organization to address and mail flyers on its behalf, so that the recruitment team does not receive contact lists; and
posting flyers in locations recommended by the organizations where they may be seen by the target populations.
Once a list of possible participants is compiled, recruiters will contact parents to request the participation of the student. Recruitment contacts will take place by telephone. During the recruitment phone call, a brief screener survey will be administered to interested participants to ensure students meet grade and language eligibility requirements. This screener survey will also help to ensure that the appropriate number of students in each disability category is recruited. After confirmation that student participants are eligible, willing, and available to participate in the research project, the cognitive laboratory sessions will be scheduled for a day and time convenient to the parent and child. Parents will receive a confirmation e-mail for this appointment. Parents will be asked to complete a written permission form upon arrival for the cognitive laboratory session, prior to the beginning of work with the student. To ensure that sample size targets are met, recruiters will recruit five extra students in each category in case any parents or students who originally agree to participate change their minds about doing so. These extra students will still be asked to participate, with the expectation that others will drop out.
See Volume 4 for recruitment, consent forms, confirmation, and thank you letters.
Incentives
To attract participants for the cognitive laboratory activities and to thank them for their time, incentives will be offered. General education students will receive a $30 gift card for participating. This incentive amount acknowledges the unusually long length of the sessions. Students in the three focal disability groups who participate will receive a $45 gift card. This increased incentive addresses the difficulty in recruiting students with disabilities, as well as the fact that the task itself will be more difficult for these students. In addition, all parents who bring their child to and from the FMG or RTI offices to enable the student to participate will receive $25 as a thank you for their time and effort.
NCES is authorized to conduct this study under the Education Sciences Reform Act of 2002 (20 U.S. Code, Section 9543). By law, the data provided by parents and students may be used only for research purposes and may not be disclosed or used in identifiable form for any other purpose except as required by law (20 U.S. Code, Section 9573). The laws pertaining to the collection and use of personally identifiable information will be clearly communicated to students and parents.
The confidentiality plan developed for the MGLS:2017 requires that all contractor and subcontractor personnel and field workers who will have access to individual identifiers sign confidentiality agreements and notarized nondisclosure affidavits. The plan also requires that all personnel receive training regarding the meaning of confidentiality, particularly as it relates to handling requests for information and providing assurance to respondents about the protection of their responses. NCES understands the legal and ethical need to protect the privacy of the MGLS:2017 respondents.
Table 3 shows the expected burden for the cognitive laboratory activities. The desired yield for the cognitive lab work is 120 participants: 30 from each of the three focal disability groups (specific learning disability, autism, and emotional disturbance) and 30 general education students who do not have an IEP. As noted above, recruiters will recruit five extra students in each category; therefore, we are seeking consent for 140 potential participants. To recruit the 140, we anticipate initially contacting 178 parents during recruitment. This assumes that, of the 178 who express an interest in participating, approximately 78 percent will be eligible and still willing to participate when contacted.
The estimated burden for recruitment is 10 minutes on average, for a total of approximately 30 burden hours. Total burden for the MGLS:2017 cognitive laboratory activities is approximately 310 hours (see Table 3).
Table 3. Estimated Respondent Burden for MGLS:2017 Eye-tracking Cognitive Laboratory Activities
Activity | Sample size | Number of responses & respondents | Length (minutes) | Total burden (hours) | Respondent hourly wage* | Estimate of respondent labor cost
Recruitment (parent) | | 178 | 10 | 30 | $22.71 | $681
Student Cognitive Laboratory | 178 | 140 | 120 | 280 | $7.25 | $2,030
Study Total | | 318 | | 310 | | $2,711
* Student hourly rate based on Federal minimum wage as of 10/30/2015. Hourly rate for parents is based on average hourly rate of all occupations: http://www.bls.gov/oes/current/oes_nat.htm#00-0000
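The burden hours and labor costs in Table 3 follow directly from the respondent counts, per-respondent lengths, and hourly rates shown. The short check below simply reproduces that arithmetic (hours are rounded to whole hours, as in the table).

```python
# Reproduce the burden-hour and labor-cost arithmetic behind Table 3.
rows = [
    ("Recruitment (parent)",         178, 10,  22.71),
    ("Student Cognitive Laboratory", 140, 120, 7.25),
]

total_hours, total_cost = 0, 0.0
for activity, respondents, minutes, hourly_wage in rows:
    hours = round(respondents * minutes / 60)  # burden hours, rounded to whole hours
    cost = hours * hourly_wage                 # estimated respondent labor cost
    total_hours += hours
    total_cost += cost
    print(f"{activity}: {hours} hours, ${cost:,.0f}")

print(f"Study Total: {total_hours} hours, ${total_cost:,.0f}")  # 310 hours, $2,711
```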
Recruitment for the interviews will begin upon receiving OMB approval. All cognitive laboratory activities will be completed no later than May 2016. The results will be summarized in an August 2016 report, as well as in interim reports submitted during the cognitive laboratory process and presented at the Technical Review Panel (TRP) meeting in April 2016.
The estimated cost to the federal government for the cognitive lab eye tracking activities is $289,791. This cost includes design, planning, recruitment of students and their parents, cognitive testing, incentives, interpretation/analysis of results, discussion of findings, and report preparation.