REQUEST FOR GENERIC CLEARANCE OF IMPROVEMENT PROJECTS
FROM THE NATIONAL CENTER FOR SCIENCE AND ENGINEERING STATISTICS (NCSES)
The National Center for Science and Engineering Statistics (NCSES) of the National Science Foundation (NSF) requests a three-year extension of the Office of Management and Budget’s (OMB’s) generic clearance that will allow NCSES to continue to rigorously develop, test, and evaluate data collection and statistical activities, methodologies, and dissemination methods and tools, and engage with data users and external stakeholders. Authorized under Section 505 of the America COMPETES Reauthorization Act of 2010, NCSES is tasked to “serve as a central Federal clearinghouse for the collection, interpretation, analysis, and dissemination of objective data on science, engineering, technology, and research and development.” This request is part of an ongoing initiative to improve NCSES surveys and other data collections as recommended by both its own guidelines and those of OMB.1
In the last decade, NCSES, along with other federal agencies, has made increasing use of state-of-the-art techniques to improve the quality and timeliness of data collection, analysis, and dissemination while simultaneously reducing public burden, and now uses these techniques routinely. The purpose of this generic clearance is to allow NCSES to continue to adopt and use these techniques to improve its current data collection, analysis, and dissemination activities on the US science and engineering (S&E) enterprise. These techniques will be used to improve the content of existing data collections, to aid in the development of new data collections, to fill gaps in coverage of the S&E enterprise in the existing NCSES portfolio, and to facilitate statistical activities in support of increased evidence building for the American public. The generic clearance will also allow NCSES to explore alternative methods of data collection, as well as data dissemination tools and mechanisms.
Following standard OMB requirements, NCSES will submit to OMB an individual request for each project it undertakes under this generic clearance. NCSES will request OMB approval in advance and provide OMB with a copy of the materials that describe the project and that will constitute the public burden (e.g., questionnaires, protocols, recruitment materials).
NCSES expects to use a variety of data collection techniques for the improvements, as appropriate to the individual projects, including, but not limited to, focus groups, cognitive testing, usability testing, field observations, participatory design workshops, exploratory interviews, key-informant interviews, behavior coding, respondent debriefings, pilot tests, experiments, and response analysis surveys. NCSES has used such techniques in previous activities conducted under generic clearance. NCSES expects to continue taking advantage of new online tools available for design, evaluation, and testing efforts, which may allow the agency to recruit larger numbers of participants to its studies.
Focus Groups. A methodology that brings together a small group of relatively homogeneous participants to discuss pre-identified topics. A trained facilitator moderates the discussion using a moderator’s guide containing questions or topics focused on a particular issue or issues. Focus groups are useful for exploring and identifying issues with either respondents or stakeholders. Focus groups can be employed when no questionnaire or survey questions on a topic yet exist, to obtain data user requirements for new or improved data tools and services, and to conduct stakeholder outreach on a variety of statistical activities. In the past, NCSES has used focus groups to assist with redesigning data collections when it became evident that the content of a survey was outdated and did not reflect the constructs being measured.
Cognitive and Usability Laboratory Techniques. A set of tools employed to study and identify errors that are introduced during the data collection process. These techniques are generally conducted by a researcher with an individual respondent. Cognitive techniques are generally used to understand the question-response process, whereas usability testing is used to understand respondent reactions to the features of a web survey instrument, website, or data tool. In concurrent interviews, probing occurs throughout the interview to assess the respondent’s ‘in-the-moment’ thinking. In retrospective interviews, respondents proceed through the survey or task as they would normally, then answer questions about their responses, thoughts, or experiences. Other techniques, which are described in the literature, will be employed as appropriate. These techniques include, but are not limited to: follow-up probing, memory cue tasks, paraphrasing, confidence ratings, response latency measurements, free and dimensional sort classification tasks, and vignette classifications. The objective of these techniques is to aid in the development of data collection and data access tools and to reduce response error and burden. These techniques are generally very useful for studying and revising pre-existing data collection instruments, websites, and data tools. NCSES has used cognitive and usability testing in previous generic clearance projects to improve existing questionnaires, to develop new data collections, and to develop new data tools.
Participatory Design Workshops. A technique where stakeholders are brought together to design a product that works for them, often using collaboration tools. This method is often used at early stages of a project and can help ensure that products meet user needs. NCSES used this technique in the past to design a mobile app for data collection.
Exploratory/Key-informant Interviews. A technique where interviews are conducted with individuals to gather information about a topical area. These may be used in the very early stages of developing a new survey or new data delivery mechanism. They may cover discussions related to administrative records, subject matter, definitions, functionality, etc. These interviews may also be used to engage with stakeholders about data products, tools, or access needs. NCSES has used such interviews extensively in recordkeeping studies with respondents to several of its establishment surveys to determine both what types of records institutions keep (and therefore what types of information they can supply), as well as where and in what format such records are kept. NCSES has also conducted interviews to assess the features that will be most useful for a national secure data service.
Respondent Debriefings. A technique in which individuals are queried about how they have responded to a particular survey, question, or series of questions. The purpose of the debriefing is to determine if the original survey questions are understood as intended, to learn about respondents’ form filling behavior and recordkeeping systems, or to elicit respondents’ satisfaction with the survey. This information can then be used (especially if it is triangulated with other information) to improve the survey. This technique can be used as a qualitative or quantitative measurement, depending on how it is administered. This technique has been employed in NCSES generic clearance projects to identify potential problems with existing survey items both quantitatively and qualitatively.
Pilot Studies. These methodologies are typically used to test a preliminary version of the data collection instrument. Pretests are used to gather data and assess reliability, validity, or other measurement issues. Pilot studies are also used to test aspects of implementation procedures. The sample may be general in nature or limited to subpopulations for whom the information is most needed. Alternatively, small samples can be selected to statistically represent at least some aspect of the survey population.
Experiments. A technique for the controlled testing of alternatives that allows researchers to choose among competing options, such as questions, definitions, web designs or features, or contact strategies, and to determine which option is more effective. Nearly any of the improvement methods can be strengthened when paired with an experimental design.
Behavior Coding. A technique in which a standard set of codes is systematically applied to respondent/interviewer interactions in interviewer-administered surveys or respondent/questionnaire interactions in self-administered surveys.
Response Analysis Surveys (RAS). These surveys, which appear at the end of a questionnaire while it is in the field, are typically brief, asking respondents a few questions about their experiences responding to the survey. Similar to respondent debriefings, a RAS can collect information about the data collection, the response process, or recordkeeping. However, because these questions appear within the data collection instrument, a RAS can capture in-the-moment reactions to the survey response process. NCSES expects to use this technique primarily to evaluate data collection tools and methods.
The National Center for Science and Engineering Statistics (NCSES) within the U.S. National Science Foundation (NSF) is responsible for collecting, analyzing, evaluating, and disseminating information on science, engineering and technology employment, workforce, and education, as well as research and development (R&D) funding and performance. In accordance with the National Science Foundation Act of 1950, 42 U.S.C. 1861, et seq (Public Law 507) and the America COMPETES Reauthorization Act of 2010, 42 U.S.C. 1862p §505 (Public Law 111-358), NCSES is directed to “serve as a central Federal clearinghouse for the collection, interpretation, analysis, and dissemination of objective data on science, engineering, technology, and research and development…that is relevant and useful to practitioners, researchers, policymakers, and the public.” Also, through 42 U.S.C. 1862p §505, NCSES is required to “support research using the data it collects, and on methodologies in areas related to the work of the Center.” NCSES publishes data in individual survey reports and in legislatively mandated reports such as Science and Engineering Indicators.2 NCSES also releases data in a variety of formats including data tables, data tools, interactive web tools, and public use files.
An extension of NCSES’ previously granted generic clearance is requested for several reasons. As a federal statistical agency, NCSES is engaged in a process of continuous improvement in the data collections it conducts and in the way it provides access to data and information. Critical to the improvement of existing data collections is the ability to engage in small-scale projects to test alternatives and evaluate current approaches. Generic clearance authority substantially enhances NCSES’ ability to engage in such exploration, testing, and evaluation. Furthermore, as the data collection and dissemination landscape changes, NCSES must continuously evaluate its data collection, analysis, and dissemination activities. Respondent behaviors will change (e.g., response rates decrease over time); technology will change (e.g., the web quickly became the preferred data collection and dissemination option); and the S&E enterprise will change (e.g., today’s students increasingly pursue multi/interdisciplinary studies rather than a single discipline). Similarly, the understanding of how to improve dissemination continues to evolve (e.g., the emphasis on developing tools and services that allow more people to easily access federal data assets).
Thus, NCSES requests an OMB generic clearance structure to continue improving the overall quality of its data collection and dissemination efforts, reduce the burden on respondents to NCSES surveys, shorten the time required for NCSES to update and improve its data collections, and redesign and improve its dissemination tools and methods.
The information obtained from these efforts will be used to develop new NCSES data collections and improve current ones. Specifically, the information will be used to reduce respondent burden and to improve the quality of the data collected. These objectives are met when respondents are presented with plain, coherent, and unambiguous questionnaires asking for data compatible with respondents’ memory and/or current reporting and recordkeeping practices. The purpose of the improvement projects will be to ensure that NCSES data collections are continuously attempting to meet the standards set forth by OMB for official statistics. In addition, the information obtained from data dissemination improvement efforts will be used to help improve and refine data access or improve existing dissemination methods. Improved data access will help policymakers, researchers, and the general public by easing and streamlining the way they find the information they are seeking.
Improved NCSES data collection will help policy makers in decisions on R&D funding, graduate education, the scientific and technical workforce, and innovation, as well as contribute to increased agency efficiency and reduced costs. In addition, methodological findings have broader implications for research and may be presented in technical papers at conferences or published in the proceedings of conferences or in peer-reviewed journals.
NCSES will employ information technology, as appropriate, to reduce the burden on respondents who agree to participate in its improvement projects. Many respondents to current NCSES data collections supply email addresses that can be used to recruit them for improvement projects; this allows respondents to communicate with NCSES at their convenience. Past NCSES projects have also used research participant vendors who communicate with participants through email. Respondents to current NCSES data collections for academic institutions can often provide website addresses where NCSES can find some of the information needed for data collection (e.g., about their schools), reducing the amount of information they have to provide. NCSES will continue to explore state-of-the-art technologies that reduce respondent burden in both individual and establishment data collections. For example, NCSES frequently uses desktop sharing and videoconferencing software to conduct cognitive interviews and usability testing in remote locations. NCSES has also used online platforms to conduct asynchronous focus groups that allow participants to respond when it is more convenient. By using these technologies, NCSES is able to capture both comments and web screen interactions and retain a complete record of each session, making it unlikely that respondents will need to be called back to clarify notes from the sessions.
Web data collection facilitates accurate data by providing respondents with automated tabulations and feedback on inconsistent answers. However, the success of web data collection depends on well-designed features. Thus, one focus of NCSES improvement activities is improving the usability of NCSES data collections. In addition, NCSES continues to explore the adoption of innovative methods that could reduce respondent burden and provide easier access to data and information. Enhanced data dissemination tools help users find, access, and organize data more easily.
NCSES may also use online tools to recruit respondents and administer unmoderated, self-administered instruments and tasks. With these tools, NCSES can conduct studies with a large number of respondents with specific characteristics of interest easily and efficiently. These online studies can allow researchers to administer smaller tasks across large groups of respondents, reducing the burden for any one respondent. Finally, these self-administered online methods allow for the use of experiments in a more rigorous design that is not possible in interviewer-administered settings due to the resources required to obtain necessary sample sizes to detect statistical differences.
Improvement projects will be conducted for both existing data collections and dissemination tools, as well as to develop new NCSES data collections and dissemination methods. The NCSES data collections themselves are subject to scrutiny to ensure there is no duplication of other efforts. Likewise, the projects conducted under the generic clearance authority will be structured in order not to duplicate other efforts either within NCSES or across the Federal Statistical System.
One goal of NCSES’ efforts to improve its data collection and dissemination activities is to minimize the burden on the small organizations that respond to NCSES data collections. By learning about organizational and recordkeeping practices of small, medium, and large organizations, NCSES is better situated to design data collections that minimize the burden for various types of respondents, especially small entities. For example, NCSES has investigated methods for collecting data from businesses with fewer than ten employees about their R&D activities.
In the case of pilot studies or experiments, if probability samples are utilized, sampling rates proportional to size are often used to make sure that a large institution has a higher probability of being selected than a small institution. This ensures that a high proportion of the attribute of interest—U.S. S&E funding, performance, employment, or education—is captured while minimizing the burden on small entities.
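The size-proportional selection described above can be sketched as follows; this is an illustrative sketch only, and the institution names, size measures, and function name are hypothetical rather than drawn from any NCSES sampling frame:

```python
import random

# Illustrative sketch of probability-proportional-to-size (PPS) selection.
# Institutions and size measures (e.g., R&D expenditures) are hypothetical.
institutions = {
    "Large University": 9_000,
    "Medium Institute": 900,
    "Small College": 100,
}

def pps_draw(units, rng):
    """Draw one unit with probability proportional to its size measure."""
    names = list(units)
    weights = [units[n] for n in names]
    return rng.choices(names, weights=weights, k=1)[0]

# Over many draws, the large institution is selected about 90% of the time
# (9,000 / 10,000), so most of the attribute of interest is captured while
# small entities are rarely burdened.
rng = random.Random(42)
draws = [pps_draw(institutions, rng) for _ in range(10_000)]
share_large = draws.count("Large University") / len(draws)
```

In an actual pilot study, the size measure would be the attribute of interest (funding, performance, employment, or education counts), and selection would typically be without replacement across the full frame.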
If NCSES were unable to conduct the improvement projects in this request, the quality of the data NCSES collects could decrease because current collection, analysis, and dissemination activities would not be systematically evaluated and updated to reflect current best practices. Over time, data collection, analysis, and dissemination activities that are currently well designed would become obsolete, and new collections and procedures could not be implemented without adequate testing and refinement. Advances in understanding how organizations or individuals provide data and how NCSES can better serve its stakeholders would be curtailed. Finally, NCSES’ ability to provide timely and accurate data would be diminished.
Under this clearance, NCSES will explain any circumstances that would result in respondents being required to:
Report information to the agency more often than quarterly;
Prepare a written response to a collection of information in fewer than 30 days after receipt of it;
Submit more than an original and two copies of any document;
Retain records, other than health, medical, government contract, grant-in-aid, or tax records for more than three years;
Respond to a statistical survey in a manner that is not designed to produce valid and reliable results;
Use a statistical data classification that differs from one approved by OMB;
Respond in a manner that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of compatible data with other agencies for confidential use;
Submit proprietary trade secret or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information’s confidentiality to the extent permitted by law.
Comments on this data collection effort were solicited in the Federal Register (90 FR 23070) on May 30, 2025 (see Attachment A). NCSES did not receive any comments related to this request.
The primary objectives of the data collection and dissemination improvement projects include involving respondents in the development of new content and collection methods, soliciting feedback on current collections and dissemination tools, observing respondent navigation of web data collection and dissemination tools, exploring how respondents’ recordkeeping systems work, and eliciting feedback on the response process. These objectives focus on consultation with respondents to reach the goals of understanding (1) how to minimize the time and effort to complete data collection tasks; (2) how to reduce other aspects of burden, such as concerns about the use of the data NCSES collects; (3) how to motivate respondents to provide information of the highest quality and accuracy; and (4) data users’ ability to access and use the data NCSES disseminates.
NCSES and its data collection partners sometimes provide incentives to participants in improvement projects. In some cases, the incentive covers travel costs only. In other cases, an incentive is offered for activities, such as focus groups or cognitive interviews. This practice has proven necessary and effective in recruiting participants in small-scale research that is more representative of the population of interest than convenience samples and is also employed by other federal agencies.
Generic clearance requests for projects offering participant incentives will explain the rationale and describe the incentive amount. Unless otherwise specified and approved by OMB, for improvement projects conducted in person or using methods equivalent to in-person administration, incentives will be limited to no more than $50 for participation in a 60-minute cognitive interview or usability test and no more than $90 for participation in a 90-minute focus group. The incentive amounts reflect the increasing difficulty of recruiting participants from the dispersed, highly specialized populations that make up NCSES’ target populations, along with the competing demands on participants’ time.3,4 These amounts are maximums, proposed with the understanding that not all improvement projects will require them. Respondents in field test activities such as experiments and pilot tests will receive an incentive only when there are extenuating circumstances that warrant it.
In situations where the incentive limits discussed above may prove ineffective for recruiting certain subpopulations, with approval granted by OMB, NCSES may request a higher incentive amount. Any such request will provide justification and citations/references (as applicable) for the use of an increased incentive amount, including a discussion of the research questions of interest and why the subpopulation requires a higher incentive.
Respondents in the improvement projects will be advised that their participation is voluntary (see Attachment B for the informed consent form). In focus groups, interviews, and other respondent activities, NCSES may ask participants for permission to record sessions via audio or video recording. Such recordings are conducted to provide project staff, including those not conducting or observing the activity, with a complete and accurate record to supplement note taking. Recording the session also allows staff to focus more on what is taking place during the session rather than on the completeness of their notes. In some cases, recordings may be used to train others to conduct this type of research. For sessions that are recorded, participants will be asked for their consent to the audio or video recording. They will be notified if there is any chance that a session may be played for audiences for research purposes.
Confidentiality will be pledged in some cases. The pledge of confidentiality will be made under the Privacy Act (where applicable) and the National Science Foundation Act of 1950 (as amended). Specifically, when confidentiality is pledged to individuals, the pledge used will be the following:
The information is solicited under authority of the National Science Foundation Act of 1950, 42 U.S.C. 1861, et seq (Public Law 507). All information you provide is protected under the NSF Act, as amended, and the Privacy Act of 1974, 5 U.S.C. §552a (Public Law 93-579) and will only be used for research or statistical purposes. Any information publicly released such as statistical summaries will be in a form that does not personally identify you.
When confidentiality is pledged to organizations, such as businesses, colleges and universities, and other non-profit organizations, the pledge used will be the following:
The information is solicited under authority of the National Science Foundation Act of 1950, 42 U.S.C. 1861, et seq (Public Law 507). All information you provide is protected under the NSF Act, as amended, and will only be used for research or statistical purposes. Any information publicly released such as statistical summaries will be in a form that does not personally identify you or your organization.
There may be occasions when NCSES funds and/or contributes to research performed by others and seeks approval for the collection under this generic clearance. In those cases, the confidentiality pledge may vary. If so, NCSES will inform OMB of the confidentiality pledges made for that project.
NCSES does not anticipate asking questions of a sensitive nature in work conducted under this generic clearance, except those usually asked for demographic and/or classification purposes (e.g., income). However, in its efforts to evaluate data collection instruments, NCSES may ask respondents whether items might be considered sensitive in the context of data collection.
Over the three years of the requested generic clearance, NCSES estimates that a total reporting burden of 12,200 hours (approximately 4,067 hours annually) will result from working to evaluate or improve existing data collection, analysis, and dissemination activities. This includes both the burden placed on respondents participating in each activity as well as burden imposed on potential respondents during screening activities. Table 1 provides potential improvement projects by collection type for which generic clearance activities might be conducted, along with estimates of the number of respondents and burden hours that might be involved. The individual potential activities that will be conducted under this generic clearance for each collection type will vary in their estimated burden depending on the methods used. Methods like focus groups and participatory design workshops tend to have longer participation times, while methods like response analysis surveys tend to have shorter participation times. Each project conducted under this generic clearance will include a burden estimate in the request for the specific activities conducted.
Table 1: Potential improvement projects by collection type with the number of respondents and burden hours
Collection Type | Respondents | Total Burden (hours) | Annualized Burden (hours)
R&D Enterprise Surveys | 800 | 1,050 | 350
Science and Engineering Workforce Surveys | 6,300 | 1,725 | 575
STEM Education Surveys | 800 | 1,000 | 334
Data Dissemination Tools and Methods | 2,500 | 425 | 142
Other projects not specified | 10,000 | 8,000 | 2,667
Total | 20,400 | 12,200 | 4,067
The cost to respondents generated by the list of potential projects is estimated to be $683,322 over the three years covered by this generic clearance. No single year’s cost would exceed $683,322; if all work were done in one year, costs in that year would be $683,322 and costs in each of the other two years would be zero. As in previous requests for generic clearance authority, the total cost was estimated by summing all the hours that might be used on all projects over the three years (12,200) and multiplying that figure by the mean hourly wage ($56.01) of the level of employee who typically answers NCSES’ questionnaires or participates in NCSES improvement projects. This wage is the May 2024 national, cross-industry estimate of the mean hourly wage for a financial analyst (occupation code 13-2051) from the Bureau of Labor Statistics (https://data.bls.gov/oesprofile/), at the time of submission of this request.
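The cost arithmetic above can be checked with a short calculation; this is an illustrative sketch using only the figures from Table 1 and the cited BLS wage estimate:

```python
# Check the respondent cost estimate: total burden hours x mean hourly wage.
# Burden hours by collection type are taken from Table 1.
burden_hours = {
    "R&D Enterprise Surveys": 1050,
    "Science and Engineering Workforce Surveys": 1725,
    "STEM Education Surveys": 1000,
    "Data Dissemination Tools and Methods": 425,
    "Other projects not specified": 8000,
}
MEAN_HOURLY_WAGE = 56.01  # May 2024 BLS mean wage, financial analyst (13-2051)

total_hours = sum(burden_hours.values())
total_cost = total_hours * MEAN_HOURLY_WAGE
annualized_hours = round(total_hours / 3)

print(total_hours)        # 12200
print(round(total_cost))  # 683322
print(annualized_hours)   # 4067
```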
There are no planned capital, startup, operation, or maintenance costs to the respondents, recordkeepers, or data users involved in these improvement projects. Some explorations involving the use of alternative sources of data (e.g., converting respondents’ records into a standard format for upload) may entail some costs; in such events, details would be provided in the materials associated with that burden request.
The 3-year cost to the Federal government generated by the improvement projects is estimated to be approximately $4,000,000. This estimate is based on an average cost of $200,000 per project and an estimated 20 projects over the three-year period. The main components of these costs are contractor costs and staff time. There are no startup, equipment, operations, or maintenance costs. Bidders on the NCSES contracts are required to have all software, licenses, and hardware needed to complete the improvement projects.
The 2022 burden request was for 11,500 burden hours across 28,515 participants. The current burden request is slightly higher at 12,200 hours across 20,400 participants. The increase in burden, despite the decrease in the number of participants, is attributed to planned activities that have a higher burden for individual participants. For example, NCSES has been conducting participatory workshops and focus groups for the development of new data dissemination methods. These workshops and focus groups tend to require fewer participants but longer participation times for the activities.
Data will be collected to develop new and improve existing data collections, analyses, and dissemination activities. Methodological findings from improvement projects may be referenced in the technical notes for published data, in methodology reports, in technical papers presented at conferences, in the proceedings of conferences, or in journals. Generic clearance activities will not be used to calculate substantive results or official estimates that will be released.
A17. OMB Approval Expiration Date
NCSES will display the expiration date for OMB approval of the information collection on project materials.
A18. Exceptions to the Certification Statement
No exceptions to the Certification Statement are anticipated. If any exception becomes necessary, OMB approval will be requested in advance of conducting the survey or data collection.
1 NSF Information Quality Guidelines are available at https://www.nsf.gov/policies/information-quality#nsfs-information-quality-guidelines-d12. OMB Information Quality Guidelines are available at https://www.govinfo.gov/content/pkg/FR-2002-02-22/pdf/R2-59.pdf. OMB standards and guidelines for statistical surveys are available at https://www.statspolicy.gov/policies/
3 Singer, E., & Ye, C. (2013). The Use and Effects of Incentives in Surveys. The Annals of the American Academy of Political and Social Science, 645, 112–141. https://doi.org/10.1177/0002716212458082
4 Heimel, S., Bottini, C., Satisky, B., & Hall, D. (2024). 2022 NTEWS Final Report of Wave 2 Analyses. Internal report, National Center for Science and Engineering Statistics. https://ncses.nsf.gov/443/assets/0/files/2022-ntews-pilot-wave-2-analyses.pdf