NCES System Clearance for Cognitive,
Pilot, and Field Test Studies
REQUEST FOR OMB Clearance
Supporting Statement Part A
Prepared by:
National Center for Education Statistics
U.S. Department of Education
Washington, DC
May 26, 2010
Part A JUSTIFICATION
A.1 Importance of Information
A.2 Purposes and Uses of the Data
A.3 Improved Information Technology (Reduction of Burden)
A.4 Efforts to Identify Duplication
A.5 Minimizing Burden for Small Institutions
A.6 Frequency of Data Collection
A.7 Special Circumstances
A.8 Consultations Outside NCES
A.9 Payments or Gifts to Respondents
A.10 Assurance of Confidentiality
A.11 Justification for Sensitive Questions
A.12 Estimates of Burden
A.13 Total Annual Cost Burden
A.14 Annualized Cost to Federal Government
A.15 Program Changes or Adjustments
A.16 Plans for Tabulation and Publication
A.17 Display OMB Expiration Date
A.18 Exceptions to Certification Statement
PART B COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS
B.1 Respondent Universe and Response Rates
B.2 Procedures for Collection of Information
B.3 Maximizing Response Rates
B.4 Tests of Procedures
B.5 Individuals Consulted on Statistical Design
Appendix A NCES Authorization Legislation
This is a request for a 3-year renewal of the generic clearance for the National Center for Education Statistics (NCES) that will allow it to continue to develop, test, and improve its survey and assessment instruments and methodologies. The procedures used for this purpose include, but are not limited to, experiments with levels of incentives for study participants, tests of various types of survey operations, focus groups, cognitive laboratory activities, pilot testing, exploratory interviews, experiments with questionnaire design, and usability testing of electronic data collection instruments.
NCES is requesting that the current clearance for testing of new methodologies for surveys and assessments be extended for three years. We have found that the testing clearance is a helpful vehicle for evaluating questionnaires/assessments and various data collection procedures. In the past, traditional methods typically consisted of small "hothouse" field tests accompanied by interviewer debriefing; the information collected through such methods is quite limited in its ability to detect and diagnose problems with the instruments and procedures being tested. The generic testing clearance has allowed NCES to take advantage of a variety of methods that are useful for identifying questionnaire/assessment and procedural problems, suggesting solutions, and measuring the relative effectiveness of alternative solutions. Through the use of these techniques, employed routinely in the testing phase of NCES surveys, questionnaires and assessments can be simplified for respondents, respondent burden can be reduced, and the quality of the questionnaires and assessments used in continuing and one-time surveys and assessments can be improved. As a result, the quality of the data collected through these surveys has improved as well.
NCES is requesting a three-year generic clearance for pretesting, during which NCES will provide periodic reports on pretesting activities. In addition to methods used in the past, these pretesting activities will include expanded field tests, such as split sample questionnaire experiments in multiple panels, and usability testing of electronic data collection instruments. The focus of these activities will include investigation of item types; research on incentives (cash and non-cash); research on mode (telephone, paper and pencil, computer-based, mail-out and mail-in, etc.) and the methodology of questionnaires and assessments; and testing of items.
This clearance package is intended to serve as a request for a system generic clearance. In this document we have provided a description of the scope of possible activities that might be covered under this clearance. NCES requests the same conditions that were included in the previous clearance agreement for Cognitive, Pilot, and Field Test Studies (OMB# 1850-0803 v.9). The requested clearance is important to NCES’ use of pretesting activities because of the length of time required to plan those activities. This system generic clearance will go through the usual two Federal Register review periods; subsequently, NCES requests that OMB review and comment on or clear proposed studies within a two-week period, with no 30-day Federal Register notice period required under the generic clearance. This clearance is similar to the testing clearances held by the Census Bureau, the Bureau of Labor Statistics, and the National Science Foundation.
Programs that have submitted developmental studies under this clearance include the Early Childhood Longitudinal Studies (ECLS), the National Household Education Surveys (NHES), the Schools and Staffing Survey (SASS), the pilot of the Teacher Compensation Survey (TCS), postsecondary survey activities, and the National Assessment of Educational Progress (NAEP). We anticipate that other NCES programs will also be able to use this clearance for their developmental projects.
The specific methods proposed for coverage by this clearance are described below. Also outlined are the procedures currently in place for keeping OMB informed about the identity of the surveys and the nature of the research activities being conducted.
The methods proposed for use in questionnaire and assessment development are as follows:
Field or pilot test. For the purposes of this clearance, we define field tests as data collection efforts conducted among either purposive or statistically representative samples, for which evaluation of the questionnaire and/or procedures is the main objective, and from which only research and development (R&D) and methodological reports may be published; no statistical reports or data sets will be published on the basis of these tests. Field tests are an essential component of this clearance package because they serve as the vehicle for investigating basic item properties, such as reliability, validity, and difficulty, as well as the feasibility of methods for standardized administration of forms (e.g., computerized administration). Under this clearance a variety of surveys will be pretested, and the exact nature of the surveys and the samples is undetermined at present. However, because these tests are smaller in scale, we expect that some will not involve representative samples. In those cases, the samples will essentially be convenience samples, limited to specific geographic locations, and may involve expired rotation groups of a current survey or blocks that are known to have specific aggregate demographic characteristics. The needs of the particular sample will vary based on the content of the survey being tested, but the selection of sample cases will not be completely arbitrary in any instance.
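For illustration only, the following sketch shows how basic item properties of the kind named above (difficulty and internal-consistency reliability) might be computed from scored pilot-test responses. The data, variable names, and four-item layout are hypothetical and are not part of any NCES system or planned collection.

# Illustrative sketch: item difficulty (proportion correct) and Cronbach's alpha
# computed from a hypothetical matrix of scored pilot-test responses.
import numpy as np

# rows = examinees, columns = items; 1 = correct, 0 = incorrect (hypothetical data)
scores = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 1, 0, 0],
    [1, 1, 0, 1],
])

# Item difficulty: proportion of examinees answering each item correctly
difficulty = scores.mean(axis=0)

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total scores)
k = scores.shape[1]
item_variances = scores.var(axis=0, ddof=1)
total_variance = scores.sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

print("Item difficulty (proportion correct):", difficulty)
print("Cronbach's alpha:", round(alpha, 3))

An actual field test would apply computations of this kind to the full set of pretested items rather than to the toy matrix shown here.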
Behavior coding. This method involves applying a standardized coding scheme to the completion of an interview or questionnaire, either by a coder using a tape recording of the interview or by a "live" observer at the time of the interview. The coding scheme is designed to identify situations occurring during the interview that reflect problems with the questionnaire. For example, if respondents frequently interrupt the interviewer before a question is completed, the question may be too long. If respondents frequently give inadequate answers, this suggests other problems with the question. Quantitative data derived from this type of standardized coding scheme can provide valuable information for identifying problem areas in a questionnaire, and research has demonstrated that this is a more objective and reliable method of identifying problems than the traditional interviewer debriefing, which is typically the sole tool used to evaluate the results of a traditional field test (Cannell, Kalton, Oksenberg, Bischoping, and Fowler, New Techniques for Pretesting Survey Questions, 1989).
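As a purely illustrative sketch (not part of the clearance materials), behavior codes recorded during monitored interviews could be tallied by question and screened against a chosen threshold to flag items needing review; the codes, records, and 15 percent threshold below are hypothetical.

# Illustrative sketch: tally behavior codes by question and flag questions whose
# rate of problem codes exceeds a hypothetical 15 percent threshold.
from collections import Counter, defaultdict

# (question id, behavior code) pairs recorded by coders; hypothetical data
codings = [
    ("Q1", "adequate_answer"), ("Q1", "interruption"), ("Q1", "adequate_answer"),
    ("Q2", "inadequate_answer"), ("Q2", "inadequate_answer"), ("Q2", "adequate_answer"),
    ("Q3", "adequate_answer"), ("Q3", "adequate_answer"), ("Q3", "request_clarification"),
]
PROBLEM_CODES = {"interruption", "inadequate_answer", "request_clarification"}

counts = defaultdict(Counter)
for question, code in codings:
    counts[question][code] += 1

for question, tally in sorted(counts.items()):
    total = sum(tally.values())
    problem_rate = sum(n for code, n in tally.items() if code in PROBLEM_CODES) / total
    flag = "REVIEW" if problem_rate > 0.15 else "ok"
    print(f"{question}: {total} administrations, problem-code rate {problem_rate:.0%} [{flag}]")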
Interviewer debriefing. This method employs the knowledge of the employees who have the closest contact with the respondents. In conjunction with other methods, we plan to use this method in our field tests to collect information about how interviewers react to the survey instruments.
Exploratory interviews. These may be conducted with individuals to understand a topical area and may be used in the very early stages of developing a new survey. They may cover discussions related to administrative records (e.g., what types of records exist, where, and in what format), subject matter, definitions, etc. Exploratory interviews may also be used to explore whether there are sufficient issues with an existing data collection to warrant considering a redesign.
Respondent debriefing questionnaire. In this method, standardized debriefing questionnaires are administered to respondents who have participated in a field test. The debriefing form is administered at the end of the questionnaire being tested, and contains questions that probe to determine how respondents interpret the questions and whether they have problems in completing the survey/questionnaire. This structured approach to debriefing enables quantitative analysis of data from a representative sample of respondents, to learn whether respondents can answer the questions, and whether they interpret them in the manner intended by the questionnaire designers.
Follow-up interviews (or reinterviews). This method involves re-interviewing or re-assessing a sample of respondents after the completion of a survey or assessment. Responses given in the reinterview are compared with the respondents’ initial responses for consistency. In this way, reinterviews provide data for studies of test–retest reliability and other measures of the quality of the data collected. In turn, this information aids in the development of improved (more reliable) measures.
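A minimal illustrative sketch of the kind of consistency measure a reinterview supports, using hypothetical original and reinterview answers to a single yes/no item:

# Illustrative sketch: simple agreement rate and gross difference rate between
# original and reinterview responses (hypothetical data).
original    = ["yes", "no", "yes", "yes", "no", "yes", "no", "yes"]
reinterview = ["yes", "no", "no",  "yes", "no", "yes", "yes", "yes"]

pairs = list(zip(original, reinterview))
agreement_rate = sum(1 for a, b in pairs if a == b) / len(pairs)
gross_difference_rate = 1 - agreement_rate

print(f"Agreement rate: {agreement_rate:.0%}")
print(f"Gross difference rate: {gross_difference_rate:.0%}")

A real reliability study would use chance-corrected statistics and much larger samples; the point here is only the structure of the comparison.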
Split sample experiments. This method involves testing alternative versions of questionnaires and other collection methods, such as mailing packages and incentive treatments, at least some of which have been designed to address problems identified in draft questionnaires or in questionnaires from previous survey waves. The use of multiple questionnaires, randomly assigned to permit statistical comparisons, is the critical component here; data collection can include mail, telephone, Internet, or personal visit interviews, or group sessions at which self-administered questionnaires are completed. Comparing revised questionnaires against a control version (preferably) or against each other facilitates statistical evaluation of the performance of the alternative versions. Split sample tests that incorporate questionnaire design experiments are likely to have a larger maximum sample size (e.g., several hundred cases per panel) than field tests using other methodologies. This will enable the detection of statistically significant differences and facilitate methodological experiments that can extend questionnaire design knowledge more generally for use in a variety of NCES data collection instruments.
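For illustration only, a split-sample comparison of this kind might be evaluated with a standard chi-square test of independence; the panel sizes and item-nonresponse counts below are hypothetical and do not describe any planned experiment.

# Illustrative sketch: compare item nonresponse between two randomly assigned
# questionnaire panels (control vs. revised) with a chi-square test.
from scipy.stats import chi2_contingency

# rows = panels, columns = [item answered, item left blank] (hypothetical counts)
observed = [
    [540, 60],   # control version, 600 cases
    [570, 30],   # revised version, 600 cases
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.4f}")
# A small p-value would suggest the revised version changed the nonresponse rate.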
Cognitive and usability interviews. This method involves intensive, one-on-one interviews in which the respondent is typically asked to "think aloud" as he or she answers survey questions. A number of different techniques may be involved, including asking respondents to paraphrase questions, asking probing questions to determine how respondents arrived at their answers, and so on. The objective is to identify problems of ambiguity or misunderstanding, or other difficulties respondents have in answering questions. This is frequently the first stage of revising a questionnaire.
Focus groups. This method involves group sessions guided by a moderator, who follows a topical outline containing questions or topics focused on a particular issue, rather than adhering to a standardized questionnaire. Focus groups are useful for surfacing and exploring issues (e.g., confidentiality concerns) that people may hesitate to discuss.
Procedures for Clearance
Before testing activity is undertaken, NCES will provide OMB with a memo describing the study to be conducted and a copy of the questionnaires and debriefing materials that will be used. Depending on the stage of questionnaire development, this may be a printed questionnaire, a set of prototype items showing each item type to be used and the range of topics to be covered by the questionnaire, or an interview script. When split sample experiments are conducted, either in small group sessions or as part of a field test, the different versions of the questionnaires to be used will be provided. For a test of alternative procedures, the description of and rationale for the procedures will be submitted. A brief description of the planned field activity will also be provided. NCES requests that OMB raise comments on substantive issues within 10 working days of receipt.
Data collection for this project is authorized under the authorizing legislation for the questionnaire being tested. In most cases, this will be the Education Sciences Reform Act (ESRA), Part C, Section 153, although other legislation, such as Title 13 or Title 15, may apply to surveys conducted in concert with other Federal agencies. A copy of the Education Sciences Reform Act, Part C, Section 153, is attached. We do not now know what other titles will be referenced, since we do not know which survey questionnaires will be pretested during the course of the clearance. If other than ESRA, the authorizing statute will be specified in each IC.
2. Needs and Uses
The information collected in this program of developing and testing questionnaires will be used by staff from NCES and sponsoring agencies to evaluate and improve the quality of the data in the surveys and assessments that are ultimately conducted. None of the data collected under this clearance will be published for its own sake.
Because the questionnaires being tested under this clearance are still in the process of development, the data that result from these collections are not considered official statistics of NCES or other Federal agencies. Data will not be made public, except that they may be included in research reports prepared for sponsors inside and outside of NCES. The results may also be prepared for presentations related to survey methodology at professional meetings or for publication in professional journals.
Information quality is an integral part of the pre‑dissemination review of the information disseminated by NCES (fully described in NCES’s Statistical Standards and IES Style Guide, both of which can be found at http://nces.ed.gov/statprog/standards.asp). Information quality is also integral to the information collections conducted by NCES and is incorporated into the clearance process required by the Paperwork Reduction Act.
During the past two years the generic testing clearance has been used for:
Teacher Compensation Survey (Common Core of Data) Pilot
Principals’ Employment Status Feasibility Study
National Household Education Survey (NHES) 2009 Study - Cognitive Interviews
2010 NAEP Puerto Rico Cognitive Interview Study - Mathematics Items
2011 NAEP Technology Writing Assessment Usability Study
NAEP 2007: Review of Mathematics Items Used in Puerto Rico
NHES:2009 Respondent Debrief Interviews
Certifications Items Focus Groups
2012 National Postsecondary Student Aid Study (NPSAS:12) focus group
Certifications Items cog labs
Cognitive Interviews NHES Draft Questionnaires
FRSS99: District Survey of Dropout Prevention Pretest
Certificates Items Cognitive Labs
NAEP Grade 8 Mathematics Block Difficulty Pilot Study
NAEP 2011 Writing Assessment Audiovisual Stimuli Cognitive Interviews
FRSS 98: Fast Response Survey of Distance Education in K-12 - Pretest
Certifications and Certificates Cognitive Tests
NAEP Texas Survey of First-Year College Students
ELS Items Development Cognitive Labs
Certifications/Certificates Pilot Test
3. Use of Information Technology
When the survey or assessment being pretested employs automated methods for its data collection, the research conducted under this submission will also use automated data collection techniques. This clearance offers NCES the opportunity to try innovative technologies that would reduce burden and increase the use of information technology.
4. Efforts to Identify Duplication
This research does not duplicate any other questionnaire design work being done by NCES or other Federal agencies. The purpose of this clearance is to stimulate additional research, which would not be done under other circumstances due to time constraints. This research will involve collaboration with staff from other agencies that are sponsoring the surveys conducted by NCES. The research may also involve joint efforts with staff from other Federal laboratory facilities. All efforts would be collaborative in nature, and no duplication in this area is anticipated.
To the maximum extent possible, we will make use of previous information, reviewing results of previous evaluations of survey data before we attempt to revise questionnaires. However, this information is not sufficient to refine our survey questionnaires without conducting additional research.
5. Minimizing Burden
This research will be designed as relatively small-scale data collection efforts. This will minimize the amount of burden required to improve questionnaires and procedures, test new ideas, and refine or improve upon positive or unclear results from other tests. The results of the research conducted under this clearance are expected to improve the methods and instruments utilized in full scale studies and thereby improve information quality while minimizing burden to respondents.
6. Consequences of Less Frequent Collection
This clearance involves one-time questionnaire development activities for each survey that is connected with the clearance. If this project were not carried out, the quality of the data collected in the surveys would suffer.
7. Special Circumstances
All of the OMB guidelines are met. There are no special circumstances.
8. Consultations Outside the Agency
The 60-day Federal Register notice was published on June 8, 2010 (75 FR, No. 109, p. 32426). No public comments have been received in response to this notice.
Consultation with staff from other Federal agencies that sponsor surveys conducted by NCES will occur in conjunction with the testing program for the individual survey. Consultation with staff from other Federal laboratory facilities may also occur as part of joint research efforts. These consultations will include discussions concerning potential response problems, clarity of questions and instructions, and other aspects of respondent burden. Additional efforts to consult with potential respondents to obtain their views on the availability of data, clarity of instructions, etc., may be undertaken as part of the testing that is conducted under this clearance.
9. Paying Respondents
Respondents for activities conducted in the laboratory (e.g., cognitive interviews and focus groups) under this clearance may receive compensation for travel and participation. This practice has proven necessary and effective in recruiting subjects to participate in such research, and is also employed by the other Federal cognitive laboratories. Research on incentives that may be conducted under this clearance may also involve nonmonetary incentives. The Office of Management and Budget has noted that the effectiveness of such incentives is a worthwhile research topic. If incentives need to be proposed for any research activity under this clearance, justification will be provided and we will work closely with OMB on the incentive strategy to be employed. NCES will typically propose incentives at the level approved by the Office of Management and Budget for cognitive laboratories and focus groups (currently up to $40 for cognitive interviews and up to $75 for focus groups). If a higher-level incentive is proposed for approval, a meaningful justification will be provided.
10. Assurance of Confidentiality
If the collection is under the authority of the Education Sciences Reform Act of 2002 (ESRA 2002), all respondents who participate in research under this clearance will be informed that the information they provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [ESRA 2002, Section 9573 of Title 20, U.S. Code], and that their participation is voluntary. For personal visit and telephone interviews, this information will be conveyed verbally by the interviewer. For personal visit interviews, respondents will also be notified in writing so that they have something they can keep and read. For self-administered questionnaires, the information will be included in the mailing package and in recruitment communications and materials, either on the questionnaire or in the instructions. For Internet-based data collections, this information will be displayed prominently and in a format that allows the respondent to print it out. All participants in cognitive research will be required to sign a written notification concerning the voluntary and confidential nature of their participation. We will also inform respondents in writing of the OMB number requirement. No direct participant identifiers will be maintained. If the collection is not under ESRA, the Gen IC will specify the applicable authority.
Each respondent will be assured that all information identifying them, their school, or their institution will be kept confidential, in compliance with the legislation (Education Sciences Reform Act, Section 9573, 20 U.S. Code):
“SEC. 9010. CONFIDENTIALITY.
IN GENERAL.—All collection, maintenance, use, and wide dissemination of data by the Institute, including each office, board, committee, and center of the Institute, shall conform with the requirements of section 552a of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provisions Act (20 U.S.C. 1232g, 1232h).
STUDENT INFORMATION.—The Director shall ensure that all individually identifiable information about students, their academic achievements, their families, and information with respect to individual schools, shall remain confidential in accordance with section 552a of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provisions Act (20 U.S.C. 1232g, 1232h).”
11. Justification for Sensitive Questions
Most of the questions that are included on NCES questionnaires are not of a sensitive nature and should not pose a problem to respondents. However, it is possible that some potentially sensitive questions may be included in questionnaires that are tested under this clearance. One of the purposes of the testing is to identify such questions, determine sources of sensitivity, and alleviate them insofar as possible before the actual survey is administered.
12. Estimate of Hour Burden
We estimate that the number of people involved in our exploratory, field test, pilot, cognitive, and focus group work will be at most 45,000 per year, the vast majority of whom will be contacted as part of screening and recruitment activities preceding the actual research. Given that screening and recruitment activities are included in the burden calculations, we estimate the annual burden at approximately 0.2 hours per person, or 9,000 hours overall on an annualized basis. The total estimated respondent burden is 27,000 hours for the period from October 2010 through September 2013. These hours will be distributed as follows:
Time Period                     | Respondents | Respondent Burden (hours)
October 2010 - September 2011   | 45,000      | 9,000
October 2011 - September 2012   | 45,000      | 9,000
October 2012 - September 2013   | 45,000      | 9,000
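Restated, the arithmetic behind the table is simply: 45,000 respondents per year × 0.2 hours per respondent = 9,000 hours per year, and 9,000 hours per year × 3 years = 27,000 total hours.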
A variety of forms will be used in conducting the research under this clearance, and the exact number of different forms, length of each form, and number of subjects/respondents per form are unknown at this time. However, we can project that our activities will likely include testing items, testing data collection modes, and testing incentive payment levels, in the form of “hothouse” field tests, expanded field tests including split sample questionnaire experiments in multiple panels, cognitive labs, exploratory interviews, reinterviews, and focus groups among students, teachers, parents, and other types of respondents.
13. Estimate of Cost Burden
There is typically no cost to respondents for participating in the research conducted under this clearance, other than the time required to complete the questionnaire.
14. Cost to Federal Government
There is no way to anticipate the actual number of participants, length of interview, and/or mode of data collection for the surveys to be conducted under this clearance. Thus, it is impossible to estimate in advance the cost to the Federal Government. Costs will be covered by divisions conducting the research from their data collection budgets. We will include information about costs in the individual submissions.
15. Reason for Change in Burden
We anticipate no change from the currently approved burden.
16. Project Schedule
This research program is for questionnaire and procedure development purposes. Data tabulations will be used to evaluate the results of questionnaire testing. The information collected in this effort will not be the subject of population estimates or other statistics in NCES reports; however, it may be published in research and development reports or be included as a methodological appendix or footnote in a report containing data from a larger data collection effort. The results of this research may be prepared for presentation at professional meetings or publication in professional journals. Due to the nature of this clearance, there is no definite or tentative time schedule at this point. We expect work to continue more or less continuously throughout the duration of the clearance.
17. Request to Not Display Expiration Date
No exemption is requested.
18. Exceptions to the Certification
There are no exceptions to the certification.