National Center for Education Statistics
Middle Grades Longitudinal Study of 2016-2017 (MGLS:2017) Cognitive Interviews
Volume I
Supporting Statement
OMB# 1850-0803 v. 97
February 27, 2014
The Middle Grades Longitudinal Study of 2016–17 (MGLS:2017) will be the first study sponsored by the National Center for Education Statistics (NCES) within the Institute of Education Sciences (IES) of the U.S. Department of Education (ED) to follow a nationally representative sample of students as they enter and move through the middle grades (grades 6, 7, and 8). The purpose of this submission is to conduct cognitive interviews to refine the assessments and questionnaires for the MGLS:2017 field test. Prior to cognitive interviews, all items have been reviewed by topical experts. The primary goal for the cognitive laboratory work is to ensure assessments and questionnaire items are appropriate for the middle grades population, clearly written, and not biased against subgroups. We aim to gain a clear understanding of participants’ comprehension, retrieval, judgment, and response strategies when asked selected items, and to use this information to improve item wording and format.
Study Description
The data collected for MGLS:2017 through repeated measures of key constructs will provide a rich descriptive picture of the experiences and lives of all students during these critical years and will allow researchers to examine associations between contextual factors and student outcomes. Because mathematics and literacy skills are important for preparing students for high school and are associated with later education and career opportunities, the study places a particular focus on student growth in these areas and on their instruction. MGLS:2017’s emphasis on inclusiveness involves oversampling students in several of the Individuals with Disabilities Education Act (IDEA) categories. A key goal of the study is to better understand the supports students need for academic success, high school readiness, and positive life outcomes, including high school graduation, college and career readiness, and healthy lifestyles. The study will track the progress students make in reading and mathematics and their developmental trajectories as they transition from elementary to high school; it will identify factors in their school, classroom, home, and out-of-home experiences that may help to explain differences in achievement and development and may contribute to their academic success and other outcomes both during the middle grade years and beyond. Baseline data will be collected from a nationally representative sample of approximately 15,000 to 20,000 6th-grade students in spring of 2017, with annual follow-ups in spring 2018 and spring 2019, when most of the students in the sample will be in grades 7 and 8, respectively. An additional sample of more than 3,000 students in at least five disability categories will be obtained in order to provide researchers with adequate power to analyze results for those groups.
At each wave of data collection in the national study, students’ reading achievement, mathematics achievement, and executive function will be assessed. Students will also complete a survey that asks about their engagement in school, out-of-school experiences, peer group, and identity and socioemotional development. Parents will be interviewed about their background, family resources, and parental involvement. Students’ teachers will complete a two-part survey that asks about their background and classroom instruction and then asks teachers to report on each student’s academic behaviors, mathematics performance, and conduct. School administrators will be asked to report on school supports and services, as well as school climate. Student information will be abstracted from school records and field staff will complete an observation checklist on the school physical plant and resources.
The study design requires a set of longitudinal and complementary instruments across multiple participants that provides information on the outcomes, experiences, and perspectives of students across grades 6, 7, and 8; their families and home lives; their teachers, classrooms, and instruction; and the school settings, supports, and services available to them. The National Center for Education Statistics (NCES), Institute of Education Sciences, U.S. Department of Education has contracted with Decision Information Resources, Inc. (DIR) and its subcontractors, Mathematica Policy Research and Educational Testing Service (ETS), to design and field test the instruments and procedures for MGLS:2017.
The MGLS:2017 design project is developing a set of direct student assessments and student, parent, teacher, and school administrator questionnaires; developing and evaluating procedures for selecting the study sample; gaining the cooperation of schools, teachers, parents, and students; and administering and evaluating the study instruments. The MGLS:2017 design project has three phases of activity. In Phase 1 (Instrument Development), the design team is working with NCES to identify and develop measures for the study’s key constructs. A critical component of this activity is soliciting and incorporating input from content and technical experts. The products from Phase 1 will be a large mathematics item pool, a draft set of executive function assessments, drafts of student, parent, teacher, and school administrator questionnaires, and a draft facilities checklist. Phase 2 (Testing and Evaluation) will begin with cognitive laboratories for each study instrument and conclude with a large field test in spring 2015 to obtain psychometric information. The field test will be conducted in 50 schools, with more than 4,100 students in grades 5 through 8, as well as 450 mathematics teachers and 819 parents. Phase 3 (Analysis and Revision) will use psychometric analyses to identify items, scales, and measures for the mathematics, reading, and executive function assessments, along with the nonassessment instruments for the national study. The procedures used in the field test to select and recruit student participants, work with school personnel, and administer each of the instruments will be scrutinized, and recommendations for the national study will be finalized.
This request for clearance is submitted under the National Center for Education Statistics (NCES) generic clearance agreement (OMB #1850-0803) to conduct cognitive laboratory activities to inform the design of the direct assessment and nonassessment instruments being developed for MGLS:2017. DIR and its partner Mathematica Policy Research will conduct all cognitive laboratory activities.
Requests to recruit schools and field test the instruments in February 2015 will be submitted to OMB at a later date. The MGLS:2017 collections are authorized by law under Title 20, USC, Section 9543a, 2006.
Purpose of the Cognitive Laboratory Work
Cognitive laboratory work will be conducted to ensure that clear instructions and well-designed items are used in the following instruments: the Student Mathematics Assessment, the Student Executive Function Assessment, the Student Questionnaire, the Parent Questionnaire, the Mathematics Teacher Questionnaire, the Mathematics Teacher Student Report (TSR), and the School Administrator Questionnaire. Students, parents, and school administrators will participate in one-on-one interviews, while teachers will participate in focus groups.
Several criteria were followed for selecting items for cognitive laboratory work:
Constructs requiring new items to be developed
Items from prior studies that have been modified
Items that have not been used on diverse samples
New topic areas requiring exploration to determine the level of item development needed
Although many of the items in the MGLS:2017 instruments will be drawn from existing NCES surveys, the item pool development process identified assessment domains or questionnaire constructs for which existing items were lacking. For some constructs, we identified relevant existing items that had not been used with the target population. For example, we found items for the Mathematics Teacher Questionnaire that were previously administered as part of a survey for guidance counselors. Some student assessment items had been previously administered to elementary or high school students, but not specifically to middle grade students. In other instances, we identified items from existing instruments that fit within one of the constructs; however, the study sample on which they had been used was not as ethnically, geographically, and socioeconomically diverse as the MGLS:2017 sample will be. All items selected for cognitive laboratory work will be tested with a diverse group of students, parents, and school staff to ensure that they are comprehensible and carry the same meaning across a range of participants. We also received feedback during the Content Review Panel (CRP) meetings on constructs or aspects of items to clarify, which has led us to include broader topics (as opposed to exact items), for which we will seek feedback from participants on how they think about these particular areas (such as whom they include when asked about “parent(s)”). By testing draft items and getting feedback about topic areas, we will be able to refine the items for the field test, where they will undergo further evaluation using larger and even more diverse samples. Below, we highlight particular goals for the cognitive laboratory work for each survey instrument.
Student Mathematics Assessment. Although many items in the pool have been previously tested, the cognitive laboratory work aims to confirm that items are appropriate for middle grade students from a variety of educational backgrounds and to ensure the readability and understanding of items. In particular, the items selected for cognitive laboratory work will be those that contain more text or representations of concepts that seem unique to specific curricula (for example, use of mixed numbers versus improper fractions at different grade levels). We have drawn from existing item pools, such as the Early Childhood Longitudinal Study–Kindergarten Class of 2010–2011 (ECLS-K:2011), the High School Longitudinal Study of 2009 (HSLS:09), the National Assessment of Educational Progress (NAEP), and the Asia Pacific Economic Cooperation (APEC), that may not have been administered in grades 6, 7, or 8. Additionally, we are collaborating with MGLS:2017 team members at Educational Testing Service (ETS) and with Professor William Schmidt, a leading expert in mathematics and mathematics curriculum, to develop new items as needed to strengthen measurement across these grades in the selected areas of mathematics.
The assessment items will undergo two types of cognitive laboratory work. First, students will complete a subset of assessment items and be interviewed one-on-one. The cognitive interview will help to ensure that the items selected have instructions that are clearly articulated, that the mathematical task the item is intended to assess is understandable to the student, and that the format in which the item is presented does not introduce any unintended challenges. Second, teachers will be asked to review the assessment items, making hard-copy comments with specific attention to item clarity, appropriate use of terminology, item format, and appropriateness for grade level. The teachers will then participate in one of three focus groups (grade 6 teachers, grade 7 teachers, and grade 8 teachers) with teachers recruited from different regions of the country. These two cognitive laboratory activities are described in more detail in subsequent sections. Focus groups will provide us with varied perspectives for discussion to ensure the items selected are appropriate for middle grade students across different mathematics ability levels and in keeping with instructional approaches.
Student Executive Function Assessment. The primary goal of cognitive laboratory work for the executive function tasks is to ensure that the selected tasks and directions are clear and comprehensible to middle grade students across diverse backgrounds and literacy levels. We will also collect information on the timing of the task administration.
Student Questionnaire. The cognitive laboratory work for the student questionnaire will focus on modified and newly developed items, as well as items that have been used in smaller-scale studies. The cognitive interviews will help inform us about how these items may work in a large-scale study with a broad sample of students, such as the MGLS:2017. For example, student interests or “sparks” are salient to this age group; however, there are no existing large-scale or prior NCES-fielded items for this topic. Therefore, we will test items derived from smaller-scale studies on middle grade students, gaining a better understanding of such things as item clarity, understandability, and appropriateness of response options. Another goal for the cognitive laboratory work is to examine students’ time use (in particular, hanging out or socializing with friends and paid work). Therefore, the cognitive interviews will specifically inform the study about students’ understanding of the types of activities they consider. Additionally, as certain items have been previously fielded in studies of high school students, we will also obtain information on how well these items capture students’ academic expectations and their perceptions of parenting behaviors. Finally, the cognitive laboratory work for the student questionnaire will help detect any sensitivity issues for special groups, such as whether students with different linguistic backgrounds have difficulties understanding specific terms or phrases in questionnaire items.
Parent Questionnaire. The cognitive interviews for the parent questionnaire are also designed to inform item clarity, understandability, and appropriateness of response options. For example, in our cognitive laboratory work for the parent questionnaire, we will focus on questions related to parenting processes. These items in particular use terminology that may be more subjective, potentially resulting in unintended variation in response. In particular, we will examine how parents think about the questions on parenting and what referent group they use when the item wording refers to “parents” generally. We will explore these broader issues by including questions on parental involvement and on the conversations parents have with their children about future planning for course taking, schooling, and work. We will also examine whether parents have difficulty with item wording or interpret it differently (notably across cultural or linguistic groups) and whether parents appear to be sensitive to reporting on their own childrearing behaviors. Together, these topics will help to inform the final item selection and wording.
Mathematics Teacher Questionnaire. The cognitive laboratory work for the Mathematics Teacher Questionnaire will focus on items from non-NCES sources and items previously fielded in other NCES studies with high school teachers. We will test new items on mathematics content coverage and the amount of time teachers spend on the content they cover (tied to Common Core State and Practice Standards) to gain a clear understanding of participants’ comprehension of how lessons are linked to the standards as well as their ability to recall and anticipate coverage across the school year. Cognitive laboratory work will also ensure that prior NCES items used for high school teachers remain appropriate for middle grade teachers and will ascertain whether any wording changes we propose to these items remain clear and easy to understand.
Mathematics Teacher Student Report (TSR). The primary goal of cognitive laboratory work for the TSR is to determine if questions cover the skills teachers look for in assessing students’ mathematics performance and whether they consider some of these skills to be more or less important.
School Administrator Questionnaire. The goal of cognitive interviews for the School Administrator Questionnaire is to explore whether school administrators from schools with varying grade configurations feel that they are able to respond to the survey items. We aim to confirm that the items are specific enough to get the desired data, yet general enough to be applicable across school configurations. The cognitive interview will include items on the types and levels of assistance made available to students transitioning from elementary to middle school and middle to high school. We will also use the cognitive interview to ask questions about how best to reach administrators and ensure high response rates given past NCES experience of difficulties with this group.
The cognitive laboratory work for the MGLS:2017 instruments consists of different data collection activities. Table 3 presents a summary of the different cognitive laboratory activities and an estimate of burden. All cognitive activities will be conducted in English by researchers from the design team, who have developed the item pools and the cognitive laboratory protocols. The professional staff comprising the cognitive interviewing team will include an instrument lead and up to three interviewers. Staff overseeing the work have extensive experience using cognitive interviewing techniques for federal studies. All cognitive interview staff will participate in a specialized training prior to the commencement of the cognitive laboratory work. The lead will conduct in-person or webinar sessions, lasting half a day, that will include a step-by-step review of the protocols, and a review of the specific observation and probing techniques. Hard copy materials will be reviewed. Training will involve interactive activities as checks of understanding. Weekly debriefings will be held among the design team during cognitive laboratory work. When appropriate, we plan to coordinate cognitive laboratory work between the assessments and questionnaires for a given group of participants. For example, the teacher focus groups will discuss both assessment items and questionnaire items—the Student Mathematics Assessment, the Mathematics Teacher Questionnaire, and the Mathematics TSR. This will help ensure alignment of constructs from the Student Mathematics Assessment and the instructional measures from the Mathematics Teacher Questionnaire, and offer an opportunity to obtain important feedback on how well teachers can respond to the items on the TSR. The MGLS:2017 measures will be developed using an iterative process. 
The outcomes of the cognitive interviews will be reviewed on a continuous basis, revisions to the items will be made as necessary, and the revised items used with subsequent participants. Deciding beforehand how many revisions will be needed is problematic, but we assume that at least one round of revisions will be needed after we have completed about one-third of the scheduled cognitive interviews. The following is a description of the data collection methods for each instrument.
Student Mathematics Assessment. We will work individually with 30 students in person, 10 students per grade (6th, 7th, and 8th), for approximately 60 minutes each to test items for inclusion in the Mathematics Assessment using paper-and-pencil forms. In many cases assessments will be conducted at one of the DIR or Mathematica corporate offices. When the distance is greater than about 30 minutes’ travel, one of our staff will identify a community space in which to meet (such as a private room at the local library) and will travel to the student. Because the recruitment and consent process will be conducted without any contact with or involvement from the schools, no cognitive laboratory activities will be conducted in a school setting.
During the cognitive interview, students will complete one block of 15-20 test items, using paper and pencil, in the presence of a researcher, and will then be interviewed following a standard protocol (Volume III). The use of paper and pencil will allow us to instruct students to circle or highlight words that are difficult to understand as they work through the series of items. It also allows us to capture comments from the student without interrupting their thought process as they complete the assessment. The computerized administration that will be used in the field test will differ in the use of a mouse or touchpad instead of a pencil, but the cognitive aspects of the item will be the same. After students have completed the items, we will ask them whether the questions and instructions are clear, how familiar the tasks are, and whether they understand the question being asked (even if they do not know how to answer it). In particular, we will probe students about multi-sentence word problems by asking them to restate what the question is asking in their own words. We will also pay close attention to students’ understanding of varied item presentations (for example, students’ ability to recognize fractions and equations in multiple formats, such as a/b or ½ or as they would view it in equation editor with “a over b”). These interviews will gather information to help evaluate the language and literacy demands of the items, the ease of understanding the questions, and any ambiguities in the questions or response options.
With the permission of students and their parents, interviews will be audio-recorded for note-taking purposes. A digital recorder placed on the table between the student and the interviewer will be used to record the interview. During the interview, the interviewer will note pertinent aspects of the interview process, such as the student’s level of motivation and any special circumstances that might affect the interview. As needed, recordings will be analyzed afterward if it is determined that additional information is required beyond what was captured in the protocol or if additional members of the development team need to hear a student’s tone and complete response to a question. We will keep the paper copies of students’ assessment items to review for any indications of confusing sentences or words.
Student Executive Function Assessment. The cognitive interviews for the executive function assessment may encompass two phases. Initially, we will conduct nine one-on-one interviews of approximately 60 minutes each with 3 students per grade (6th, 7th, and 8th). If necessary, based on the findings from the initial set of nine interviews, administration procedures will be revised and a new set of nine students (3 per grade) will participate in a second round of cognitive labs. This means we will potentially have 18 respondents, 6 per grade, for approximately 60 minutes per session. The MGLS:2017 executive function tasks will measure inhibitory control and working memory. We will program four tasks on a laptop or tablet for the students to complete. As with the Student Mathematics Assessments, we expect most students will travel to one of the DIR or Mathematica corporate offices to complete the cognitive laboratory activity.
After explaining the purpose of the activities and the student’s participation, we will provide the executive function (EF) assessment to the student and observe him or her, using a standard protocol (Volume III). The EF assessments will have directions and practice items at the beginning of the assessment that will be written as well as recorded for the student to hear through headphones. We expect most of the students participating in the EF assessments to do so in a comfortable conference room at Mathematica’s Washington, DC office. With the parent’s and student’s permission, interviews will be video-recorded for note-taking purposes using an unobtrusive camera, likely placed on the laptop and focused on the student. Cognitive lab interviewers will be trained to set up the camera appropriately, record the assessment, and review the recording to supplement their notes on student performance. As with the audio recordings, the video recordings will be analyzed afterward, as needed, if it is determined that additional information is required beyond what was captured in the interviewer’s notes during the cognitive laboratory session. The interviewer will take notes during the cognitive laboratory session and will fill in any gaps using the video recording after the interviews. The interviewer will pay special attention to whether students have any trouble getting started with the tasks or seem to have difficulty understanding the directions. After the students complete the assessment, the interviewer will debrief them about their experiences to identify any problems with administration, procedures, or directions for the executive function tasks. Each task takes approximately 5 to 10 minutes to complete and, together with the debriefing questions, the cognitive activity will last about 60 minutes.
Student Interview. For the student interview, we plan to include up to 18 students: an initial set of 3 per grade (6th, 7th, and 8th), interviewed for approximately 60 minutes each, followed by a second set of 3 per grade after we review the first set of results across the different locations. The focus will be to assess items from the Student Questionnaire. As with the Student Mathematics Assessment and Executive Function Assessment, these interviews will be conducted in person at a corporate office or, when the distance is greater than about 30 minutes, an interviewer will travel to the student and conduct the interview in a community space (such as a private room at the local library). With the parents’ and participants’ permission, interviews will be audio-recorded for note-taking purposes. During the interview, the interviewer will note pertinent aspects of the interview process, such as the student’s level of motivation and any special circumstances that might affect the interview. As needed, recordings will be analyzed afterward if it is determined that additional information is required beyond what is captured within the protocol.
The student interview will follow a standard protocol (Volume III), taking about 60 minutes. We will examine Student Questionnaire items by conducting semi-structured interviews with probes to make sure students understand the questions. The interview will be interactive; for example, we use questioning techniques after individual items so that we can get sufficient feedback from students about new topics that have not been previously asked of a diverse group of middle grade students. The semi-structured interviews and probing will help inform our decisions to make modifications to the wording of pre-existing items. For example, our cognitive laboratory work will include items on students’ academic expectations and their perceptions of parenting behaviors; we will ask students about the meaning of these items, what factors they considered in thinking about the items, and how they arrived at their ultimate responses.
Parent Interviews. We will conduct eight one-on-one interviews with parents, two or three parents of students in each grade (6th, 7th, and 8th), for approximately 30 minutes to test items and topics from the Parent Questionnaire. Parents will be sent the topics and items beforehand so that they can read the instructions and be familiar with what will be covered (estimated at 5 minutes of prep time). These interviews will be conducted via telephone or in person (for example, when parents bring their child to the student activities). With the participants’ permission, interviews will be audio-recorded for note-taking purposes. As needed, recordings will be analyzed afterward if it is determined that additional information is required past what is captured within the protocol.
The parent interview will be conducted following a standard protocol (Volume III) and will include specific survey items followed by probes to inform our decisions on potential modifications to item wording. For example, we will include items on parental involvement and conversations about mathematics course taking. The section on parental involvement will focus on unpacking whether parents base their answers only on their own individual parenting and interactions with the focal child or on parenting together with another parent or guardian. We will also include questions on conversations with middle grade students regarding their future plans for course taking to ensure that the items are age appropriate and cover the necessary breadth of topics.
School Administrator Interviews. We will conduct eight one-on-one interviews with school administrators. These interviews will be conducted via telephone and will take approximately 30 minutes. With the participants’ permission, interviews will be audio-recorded for note-taking purposes. As needed, recordings will be analyzed afterward if it is determined that additional information is required past what is captured within the protocol.
Administrators will be sent the topics and items beforehand so that they can read the instructions and be familiar with what will be covered during the interview (estimated at 5 minutes of prep time). The interview will cover topics related to school services, course offerings, and assistance with student transitions, some of which may be difficult for administrators to answer. In particular, we will gather information about the sources they would consult or other people we could talk to. The interview will also gather information about the best way to reach administrators to ensure high response rates.
Mathematics Teacher Focus Groups. We plan to conduct focus groups with 15 mathematics teachers, approximately 5 teachers per grade (6th, 7th, and 8th). The focus groups will be conducted in three groups of roughly five teachers each, with each session lasting approximately 60 minutes. In addition to the time in the focus group, we will provide advance materials for review and ask that each teacher spend approximately 3 hours reviewing the assessment items and questionnaire items. Therefore, each teacher will be asked for approximately 4 hours of time in total.
The focus groups will be held in the evenings via webinar. Focus groups (as compared to one-on-one interviews) provide an opportunity for teachers to share ideas and discuss issues with assessment items in a more synergistic manner, garnering richer information. With a “virtual” focus group, we are able to solicit feedback from teachers in different regions of the country. We will determine if participants have access to the Internet and webinar technology to support that approach, which would permit focus group facilitators to display items for participants during the group discussion.
Approximately three weeks prior to the focus groups, we will send hard-copy versions of items for the Student Mathematics Assessment, Mathematics Teacher Questionnaire, and TSR for participants’ review. We expect that teachers will spend approximately 3 hours reviewing the assessment and teacher survey items. For the Mathematics Assessment, we will ask each teacher to review approximately 60 items. We will divide the MGLS:2017 item pool into 3 sets by grade-level standards with overlap of some additional items for the adjacent grade levels. If teachers teach more than one grade level, those teachers will be asked to consider the items in relation to a specific grade. We will provide teachers with a draft of all 150 student assessment items (directing them to focus on a particular set of about 60 items); the teacher survey items will be limited to only the specific items undergoing cognitive laboratory work. Prior experience from the ECLS-K suggests that teachers are usually interested in reviewing the entire item pool. For the teacher instruments, teachers will be instructed to consider a particular class period or student when completing the questions in hard-copy form, and will be asked to bring and refer to the form during the focus group. Teachers will be asked to enter any comments that they may have directly on the hard copies of the mathematics items and teacher survey instruments. We will collect these at the conclusion of the focus groups.
Each of the three “virtual” focus groups will focus on a single grade level (6th, 7th, or 8th), with 5 teachers per group. During the one-hour session, we will spend the first 20 to 30 minutes discussing the Mathematics Student Assessment and the remaining time getting feedback on the Mathematics Teacher Questionnaire and TSR. We expect that the majority of the comments on the student mathematics assessment items will be available in hard copy and easily interpretable, so there will be limited need for discussion. Following the protocol in Volume III, we will ask the teachers to comment on the face validity of the items, the familiarity of item types, and the readability and clarity of items, and to identify any ambiguities in the questions. Teachers will be sent directions for this level of review. The discussion will focus on items that pose particular concerns, eliciting recommendations for improving the accessibility of the questions (for example, use of different wording); we will also discuss any distractors the teachers detect (for example, common misconceptions of students). During the second half of the group session, we will ask teachers to provide feedback on a variety of items in the Teacher Questionnaire, including the new items on content coverage and instructional allocation. We will also solicit feedback on whether teachers feel adequately familiar with their students to respond to items to be included in the TSR.
The mathematics assessment team leader will facilitate the focus group session while a research support staff member takes notes. With the participants’ permission, the discussion will be audio-recorded for note-taking purposes. As needed, recordings will be analyzed afterward if additional information is required beyond what is captured within the protocol. In addition to soliciting verbal feedback during the groups, the advance materials will ask teachers to write comments on any items that pose concerns or raise questions for them, and then send the hard-copy versions back to the study team for review. This will help ensure that all issues are addressed and that all teachers can provide input beyond what can be covered in a one-hour focus group.
In addition to project staff at DIR, Mathematica, and ETS, several consultants have provided expert advice during the instrument development process. William Schmidt (Michigan State University) has assisted with the development of the student mathematics assessment and recommended measures and survey items for the Mathematics Teacher Questionnaire and TSR. Jacquelynne Eccles (University of Michigan) has assisted with the development of the socioemotional and executive function measures and identified content and recommended survey items for the student and parent questionnaires. Donald Rock (independent consultant, previously at ETS) has assisted with the development of the student math assessment and provided expert advice on the structure of the mathematics assessment and the design of the field test sample. Kathy Terry (Houston Independent School District) has assisted with the parent and school administrator measures. She is also assisting with the field test plan and with the approaches used to secure cooperation from districts, schools, and families in the field test.
The MGLS:2017 design team also convened a series of Content Review Panels (CRPs) to gather input on key constructs to measure in the MGLS:2017 and identify possible item sources. The MGLS:2017 Mathematics Assessment Panel first met in Washington, DC, on June 18-19, 2013 to provide input during the assessment development process and ensure that the math assessment will result in reliable and valid data on students' mathematics knowledge and skills. Specifically, the Math CRP helped to guide decisions about which domains of mathematics are most critical to sample for the assessment. The panel also provided input on the amount of assessment time that should be devoted to different domains and areas of mathematics and offered advice on the best ways to assess these given the constraints of the MGLS:2017 assessment (for example, limited assessment time). Following the meeting, members of the Math CRP were asked to review a revised version of the assessment framework and test blueprint and provide a short, written reaction to the domains and their definitions, the learning progressions, and the allocation of items within domains and depth-of-knowledge categories in the framework and blueprint. The Math panel participated in a 3-hour webinar on September 13, 2013 to discuss these topics.
The MGLS:2017 Executive Function panel met via webinar on July 18, 2013 to provide input during the assessment development process and help guide decisions about which constructs are most critical to examine. In particular, the panel gave input on potential measures and scales for assessing middle grades students’ executive functioning (for example, inhibitory control, cognitive flexibility, working memory).
The MGLS:2017 Socioemotional-Student-Family panel met in person for a 2-day meeting in Washington, DC (July 25-26, 2013) to provide input during the instrument development process and ensure that the assessment and questionnaires are likely to result in reliable and valid data on students’ socioemotional functioning, school experiences, and home life. The panel provided input on potential items and scales as well as the best reporter for various aspects of students’ socioemotional functioning (e.g., perceived competence, internalizing behavior, conscientiousness, engagement). The panel also gave input on the student and parent questionnaires that the MGLS:2017 will use to collect data on children’s family, school, and out-of-school time experiences.
Finally, the MGLS:2017 School Administrator panel met via webinar on August 16, 2013 to provide input on constructs to measure school-level characteristics. This panel too gave input on potential items and scales and the best reporter for the constructs of interest.
Following the CRP meetings, a revised construct matrix was sent out to panelists for comment. They were asked to provide written feedback on the high priority constructs that should be included in the field test questionnaires, the waves when they would recommend measuring those constructs (grade 6, 7, and/or 8), and who would be the most appropriate respondent(s). The instruments being evaluated in the cognitive interviews reflect recommendations made by these various CRPs (CRP members are listed in Table 1).
Table 1. Members of the MGLS:2017 Content Review Panels
Name | Affiliation | Expertise
Mathematics Assessment Content Review Panel | |
Tom Loveless | Brookings Institution | Policy, math curriculum
Linda Wilson | Formerly with Project 2061 | Math education, math assessment, and middle school assessment; author for Assessment Standards for School Mathematics (NCTM) and NAEP math framework; teacher
Kathleen Heid | University of Florida | Math education, use of technology, teacher knowledge; NAEP Grade 8 mathematics standing committee member
Edward Nolan | Montgomery County Schools | Math curriculum and standards, large-scale assessment of middle grade students
Lisa Keller | UMass Amherst | Psychometrics, former math teacher
Paul Sally | University of Chicago | Math education, mathematics reasoning, mathematically talented adolescents
Margie Hill | University of Kansas | Co-author of Kansas math standards, former NAEP Math Standing Committee member, former district math supervisor
Executive Function Content Review Panel | |
Lisa Jacobson | Johns Hopkins University; Kennedy Krieger Institute | Executive functioning; attention; neurodevelopmental disorders; parent and teacher scaffolding of executive functioning skills
Dan Romer | University of Pennsylvania | Adolescent risk taking
James Byrnes | Temple University | Self-regulation; decision making; cognitive processes in mathematics learning
Socioemotional-Student-Family Content Review Panel | |
James Byrnes | Temple University | Self-regulation; decision making; cognitive processes in mathematics learning
Russell Rumberger | University of California, Santa Barbara | School dropouts and ethnic and language minority student achievement
Tama Leventhal | Tufts University | Family context; adolescence; social policy; communities and neighborhood indicators
Susan Dauber | Bluestocking Research | School organization; educational transitions; urban education; parent involvement and family processes
Scott Gest | Pennsylvania State University | Social networking, social skills, and longitudinal assessment of at-risk populations
Kathryn Wentzel | University of Maryland | Social and academic motivation; self-regulation; school adjustment; peer relationships; teacher-student relationships; family-school linkages
Richard Lerner | Tufts University | Adolescent development and relationships with peers, families, schools, and communities
School Administrator Content Review Panel | |
Susan Dauber | Bluestocking Research | School organization; educational transitions; urban education; parent involvement and family processes
George Farkas | University of California, Irvine | Schooling equity and human resources
Jeremy Finn | State University of New York at Buffalo | School organization, school dropouts
Edward Nolan | Montgomery County Schools | Large urban school system administrator
Tom Loveless | Brookings Institution | Policy, math curriculum
Recruiting
DIR and Mathematica will recruit participants using networks of professional contacts (including CRP members), community organizations, and charter or private schools to help achieve the desired cultural, linguistic, grade, and geographical diversity in the samples. The cognitive laboratory activities combine instruments or link participants where possible to streamline recruitment and optimize interview time:
Students and parents will be recruited together, when we contact parents to obtain consent for their minor child. Student assessments and interviews will be conducted in person; the three target locations (Houston, Chicago, and Washington, DC) will allow us to obtain a diverse sample within a short distance of our cognitive interviewing staff. Parents will be interviewed over the phone using a toll-free number or in person while their child completes their activities.
School staff will not be linked to students, as the cognitive interviewing activities for teachers and school administrators will be done over the phone and have different participant criteria. Further, in order to achieve adequate diversity of each group, the teachers and administrators interviewed will not necessarily work together in the same school.
For the teacher focus groups, we will target a broader geographical area to capture varied curricula and a range of academic press in mathematics. We intend to recruit at least three teachers from each of five geographic regions, with at least one teacher per grade level per region (although we expect that some teachers will be teaching multiple grade levels).
For the school administrator interviews, we will build on established relationships of project staff at the three target locations. To the extent possible, we will recruit administrators working at schools with different configurations and sizes, in urban and rural areas, and in public and private school settings.
Students. To ensure a diverse sample, students will be recruited by DIR and Mathematica staff to achieve an even mix: of sixth-grade, seventh-grade, and eighth-grade students; of gender; of race/ethnicity (Black, Asian, White, Hispanic); and of socioeconomic background. Although the sample will include a mix of student characteristics, the results will not explicitly measure differences by those characteristics.
DIR and Mathematica staff will use existing contacts to reach out directly to their network of parents and other adults in the community who work with children (e.g., through churches, sports, and community-based organizations). If needed, the recruiters may also ask members of the CRPs for referrals (members of the CRP work with students participating in mathematics programs designed for both high performing students and students struggling in mathematics) or place advertisements in print or online sources. The recruitment team will use a variety of methods (e-mail, letter, and phone) to make the initial contacts.
Interested participants will be screened to ensure the mix of characteristics described above. Both companies will use the same basic recruitment screener, customized as necessary for the specific population being recruited. When recruiting student participants, staff will first speak to the parent/guardian of the interested minor before starting the screening process. During this communication, the parent/guardian will be informed about the objectives, purpose, and participation requirements of the student data collection effort and the activities it entails, followed by screening of the student.
Once parents agree to their child’s participation, they will be asked to complete and sign a written consent form for their child’s participation prior to scheduling the student activities. After confirmation that participants are qualified, willing, and available to participate in the research project, they will receive a confirmation e-mail/letter. Only after DIR or Mathematica has obtained written consent from the parent/guardian will a student be allowed to participate in the cognitive interview session. See Volume IV for representative recruitment, consent, confirmation, and thank you materials.
Parents. We will determine a parent’s interest in participating in the parent cognitive interview at the same time as we are speaking with them about their child’s participation. As with students, we are interested in a sample that includes as much diversity as possible. We will attempt to include, in order from most to least important, a mix: of parents of sixth-grade, seventh-grade and eighth-grade students; of socioeconomic backgrounds; of race/ethnicity (Black, Asian, White, Hispanic); and of mothers and fathers. We will use the same procedures for determining whether a parent satisfies these criteria as for students. Also, parents will be asked to indicate their interest in participating on the same consent form that is used for their child.
Teachers/School Administrators. To ensure a diverse sample, teachers and school administrators will be recruited to meet the following criteria:
A school population that includes grade 6, 7, or 8, as appropriate;
Teachers who are teaching grade 6, 7, or 8 math at some point during the current school year;
A mix of school sizes;
Various school configurations (i.e., schools that include grades 6-8 and others that only include one or more of these grades or that offer grades below or above grades 6-8); and
A mix of school socioeconomic demographics.
Although the sample will include a mix of these characteristics, the results will not explicitly measure differences by those characteristics. Because the teacher focus groups and administrator interviews will be done over the phone rather than in person, recruitment will not be restricted to the Houston, Chicago, and Washington DC areas. Instead, DIR and Mathematica staff will use their contacts nationwide (including but not limited to school staff they have worked with on previous studies and members of various professional organizations) to identify teachers and school administrators who meet the criteria for participation. If needed, the recruiters may also ask members of the CRPs for referrals, or place advertisements in print or online sources. All teachers and administrators will be asked to sign a consent form prior to their participation in a cognitive interview or focus group.
Incentives
To encourage participation in the cognitive laboratory activities and to thank participants for their time, incentives will be offered as follows: students will receive a $25 gift card for participating; parents who travel to the DIR or Mathematica offices or another location for the student activities will receive $20; parents who complete the parent cognitive interview will receive a $20 gift card; teachers, who will review items and participate in the one-hour focus group outside their normal work day, will receive a $75 gift card; and school administrators, who will participate in the interview outside their normal work day, will receive a $25 gift card.
Table 2. Cognitive Laboratory Incentives, by Activity
Activity (Number of Participants) | Incentive Amount
Student Interview (18), Mathematics Assessment (30), and Executive Function Assessment (18) | $25 for student + $20 for parent bringing them in
Parent Interview (8) | $20
Teacher Focus Group (15) | $75
School Administrator Interview (8) | $25
At the beginning of the cognitive interview, respondents will be informed that their participation is voluntary and that their answers may only be used for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002 (ESRA 2002), 20 U.S.C., § 9573]. Written consent will be obtained from legal guardians (of minor students), teachers, and school administrators before interviews are conducted. No personally identifiable information will be maintained after the interview analyses are completed. With respondent permission, interviews will be audio recorded (as well as videotaped for the executive function tasks) for use in later analysis. If the respondent (or the student’s parent) indicates that he/she does not want to be audio recorded (or videotaped), only written notes will be taken. The recordings and notes will be destroyed at the conclusion of the MGLS:2017 design contract.
Table 3 shows the expected burden for the cognitive laboratory activities. We anticipate initially contacting approximately 16 school administrators, 30 teachers, and 132 parents during recruitment to yield the final sample of respondent interviews to test the school administrator questionnaire, the teacher survey and teacher student report, and the student questionnaire, respectively. The estimated burden for recruitment is 10 minutes on average, for a total of approximately 30 burden hours. The estimates of the numbers of administrators, teachers, and parents who will need to be contacted about their participation and the recruitment time are based on estimates from similar cognitive laboratory studies (e.g., TIMSS, NAEP, and ECLS-K). Three focus groups will be conducted with roughly 5 teachers participating in each. Including the time to review the instruments prior to the focus group, response burden per teacher is expected to be 240 minutes. Direct child assessments will be conducted with 10 students from each grade (6th, 7th, and 8th), for a total of 30 students. In addition, we will conduct up to 18 one-on-one interviews with students completing the Executive Function (EF) assessment. Another 18 students (6 students per grade) will complete the student questionnaire. Total burden for the MGLS:2017 cognitive laboratory activities is approximately 166 hours.
Table 3 – Respondent Burden Chart for MGLS:2017 Cognitive Laboratory Activities
Activity (Number of Items/Topics per Participant) | Subpopulations Represented (grades 6, 7, and 8) | Mode | Number of Respondents | Length (mins) | Total Burden (Hrs)
Recruitment | | Mail, e-mail, telephone | 178* | 10 | 30
Student Mathematics Assessment (15 to 20 items) | 10 students per grade; cultural diversity | In-person, hard-copy assessment | 30 | 60 | 30
Student Executive Function Tasks (4 tasks) | 6 students per grade; cultural diversity | In-person, computer assessment | 18 | 60 | 18
Student Questionnaire (8 survey items/item series) | 6 students per grade; cultural diversity | In-person interview | 18 | 60 | 18
Parent Questionnaire (4 survey items/item series) | 2-3 parents of students per grade; cultural diversity | In-person or telephone interview, hard-copy topic form | 8 | 35 | 5
School Administrator Questionnaire (4 topics/survey item series) | Varying school types, sizes, and grade configurations | Telephone interview, hard-copy topic form | 8 | 35 | 5
Mathematics Teacher Questionnaire (4 survey items/item series), Mathematics Assessment (60 assessment items with discussion of about 10 items), and Mathematics Teacher Student Report (1 survey item/item series) | Approximately 5 teachers per grade who teach: different mathematics courses, diverse students, and in schools of various sizes | Hard-copy forms with focus group debriefing | 15 | 240 | 60
Study Total** | 275 responses | | 178 | N/A | 166
* These estimates reflect an anticipated attrition rate of 50 percent from initial contact to participation, yielding 66 students, 15 teachers, and 8 administrators.
** To avoid over-counting, only the total number of students/parents (n=132), teachers (n=30) and administrators (n=16) contacted during recruitment contribute to the total number of respondents.
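The burden figures in Table 3 can be checked with a short arithmetic sketch (illustrative only; the activity labels below are shorthand for the table rows, and per-row hours are rounded to the nearest whole hour as in the table):

```python
# Check of the Table 3 burden-hour arithmetic:
# each entry is (number of respondents, minutes per respondent).
rows = {
    "recruitment": (178, 10),          # 16 admins + 30 teachers + 132 parents
    "student_math_assessment": (30, 60),
    "student_ef_tasks": (18, 60),
    "student_questionnaire": (18, 60),
    "parent_questionnaire": (8, 35),
    "school_admin_questionnaire": (8, 35),
    "teacher_focus_group": (15, 240),  # includes pre-session review time
}

# Per-activity burden in hours, rounded as in the table.
hours = {name: round(n * mins / 60) for name, (n, mins) in rows.items()}

print(hours)
print("total burden hours:", sum(hours.values()))  # 166, matching the text
```

Running the sketch reproduces the per-row figures (e.g., 178 contacts at 10 minutes rounds to 30 hours) and the 166-hour study total.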
The estimated cost to the government to conduct the MGLS:2017 cognitive laboratory activities is $123,000.
Table 4 provides the schedule of milestones and deliverables for the cognitive interviews. Recruitment for the interviews is expected to begin in March 2014. Interviews and focus groups must be completed by May 2014. The results of the cognitive laboratory activities will be summarized in a May 2014 report, as well as in interim reports submitted during the cognitive laboratory process.
Table 4. Schedule of MGLS:2017 Cognitive Laboratory Milestones and Deliverables
Activity | Date
Recruitment begins | March 2014
Data collection, coding, and analysis | March – May 2014
Cognitive laboratory report completed | May 2014
1 The field test will provide much more information about how these items work for different groups of students such as racial-ethnic minorities and students from different socioeconomic backgrounds. The field test analyses will examine differences in distributions (e.g., means, ranges, response frequencies) across items and scales.