
Supporting Statement A

Request for Clearance of Data Collection for the Evaluation of the National Science Foundation Administered Program: Louis Stokes Alliance for Minority Participation (LSAMP)




2025





Request Submitted by:



LeRoy Jones II

Lead Program Director

The Louis Stokes Alliances for Minority Participation (LSAMP) Program





















A.1 Circumstances Making the Collection of Information Necessary

Per the National Science Foundation Act of 1950, we are presenting plans for this necessary and legally required (42 U.S.C. 1862(a)(5)) collection of information. In accordance with the requirements of the Paperwork Reduction Act of 1995, we have provided the opportunity for public comment on this action. After obtaining and considering public comment, NSF has prepared the following submission requesting Office of Management and Budget (OMB) clearance of this collection for no longer than three years. The evaluation seeks to consider the experiences of those leading and participating in the Louis Stokes Alliance for Minority Participation (LSAMP) program since the last full evaluation (which concluded in 2005), as well as current experiences, at ten select multi-institution alliance locations serving as case study sites. NSF selected the ten sites in collaboration with our research partner, NORC at the University of Chicago (NORC), which will conduct all data collection. A key component of the site selection process was ensuring inclusion of alliance sites with varied features, including sites with more remote geographies and varied alliance types.

The evaluation will enable the program to better understand where successes can be modeled and replicated and to identify opportunities to better support promising students pursuing STEM degrees. With this OMB package, we are seeking approval to conduct interviews and focus groups with individuals affiliated with the LSAMP alliances at these pre-identified sites, using the procedures discussed below. These procedures and protocols have been reviewed by the NORC Institutional Review Board (IRB).

A.2 Study Purpose and Use of the Information

The LSAMP program has the stated objective of supporting STEM education and professional opportunities for all students across affiliated campuses and of decreasing barriers to STEM education through direct program enrollment and support. There are presently over 60 LSAMP alliances across the country, each comprising multiple individual institutions. There are three types of LSAMP alliances: Bridge-to-the-Baccalaureate (B2B) alliances, STEM Pathways Implementation Only (SPIO) alliances in existence less than ten years, and STEM Pathways Research Alliances (SPRA) in existence ten years or more. To be included in this evaluation, alliances needed to be funded during the reference period of 2008-2018. Additionally, specific alliances were selected for inclusion by NSF to represent institutions and alliances of varying size, geography, and LSAMP tenure. Some SPRA alliances also have Bridge to the Doctorate (BD) projects that provide additional funding to support students’ graduate work. An additional six alliances are B2B alliances led by two-year institutions. Each alliance consists of a lead institution and several other partner institutions, all operating toward the same alliance goals (as stated in annual reports and other documents). LSAMP partner institutions typically are in the same geographic region as the lead institution and include both two- and four-year institutions. Students at alliance schools can be defined in three groups: Level 1 students, who receive direct financial support for participation in the alliance; Level 2 students, who receive no direct financial support but participate in alliance activities; and Level 3 students at LSAMP institutions, who neither receive funds nor participate in LSAMP activities. Because Level 3 students are not recognized as participants by the program or grantees, data about them will be used for comparison purposes.

Lower access to education, and the resulting underrepresentation in the STEM workforce of students from less economically advantaged families, women, members of certain racial groups, and persons with disabilities, has been observed by institutions and hiring managers and has been shown to originate in high school; it was exacerbated by the COVID-19 pandemic, with poor and historically marginalized high school students unable to participate in their high school STEM classes at the same rates as their more affluent peers (Hamilton & Kim, 2021). Research suggests that collaboratives, such as the alliances funded by the LSAMP program, can improve STEM degree completion rates, but that long-term success requires sustained change at the institutional level (Center for Urban Education, 2019; May & Bridger, 2010). For over 30 years, LSAMP has supported efforts aimed at broadening participation in STEM disciplines during college matriculation, toward STEM degree attainment and/or preparation for STEM careers and entry into the STEM workforce nationally.

This Office of Management and Budget (OMB) clearance request describes NORC’s approach for evaluating the LSAMP program, under the guidance and sponsorship of the NSF Directorate for STEM Education (EDU) Division of Equity for Excellence in STEM (EES). This evaluation aims to identify and characterize the organizational structure and institutionalization of LSAMP alliances and assess the impact of LSAMP strategies to strengthen pathways and increase STEM degrees earned by students at the undergraduate and postbaccalaureate levels.

A.3 Use of Automation

NORC will not be using any automated technologies for data collection or analysis.

A.4 Methodology and Efforts to Identify Duplication and Use of Similar Information

This proposed evaluation of the NSF LSAMP program does not duplicate other NSF efforts. Similar information, and information on the program in general, is not being collected by either NSF or other institutions. Any available information (e.g., administrative data) will be utilized rather than re-collected as part of this new data collection effort.

As part of this comprehensive evaluation, and with a focus on fully utilizing all preexisting data, NORC has completed a thorough review of available data (e.g., annual reports). The LSAMP program anticipates using the results of this evaluation (both the review of existing data and data from site visits) to: (1) identify “LSAMP best practices” that can be implemented by both LSAMP grantee/alliance institutions and non-LSAMP institutions to sustain and grow the impacts of LSAMP programming on broadening participation [4] in STEM; and (2) inform the LSAMP program on sustainability mechanisms and goals appropriate for LSAMP alliance institutions to maintain their successful outcomes. Table 1 presents the core research questions, data sources to be utilized, and analytic plan.

Table 1. Research Questions, Data Sources, and Analysis

1. Organizational Pathways, Structures, Policies, and Mechanisms (PSPM)

  a. What PSPM have LSAMP institutions implemented to recruit, support, and retain highly competitive STEM undergraduate and graduate students?
     Data sources: Administrative data; document review; site visits (staff interviews and student focus groups); literature review
     Analysis: Landscape analysis; comparative case study; descriptive; qualitative

  b. To what extent have the PSPMs implemented by alliances influenced increases in the number of STEM degrees awarded to LSAMP participants (Level 1 students) at LSAMP institutions?
     Data sources: Identified PSPM [1]; WebAMP [2] and BD monitoring system; NSC [3] and IPEDS
     Analysis: Examine associations between PSPMs and degrees, overall and by groups

2. Undergraduate and Graduate Degree Completion

  a. How do STEM bachelor’s degree enrollment and degree completion rates at LSAMP institutions compare with national and state level degree completion rates?
  b. How do STEM bachelor’s degree enrollment and completion rates at LSAMP institutions compare with institutional-level degree completion rates?
     Data sources: WebAMP; NSC and IPEDS
     Analysis: Create and compare undergraduate enrollment and degree measures, overall and by groups, for LSAMP institutions and schools at the state and national level

  c. How do STEM graduate degree enrollment and completion rates at LSAMP BD institutions compare with national and state level degree completion rates?
  d. How do STEM graduate degree enrollment and completion rates at LSAMP BD institutions compare with institutional-level degree completion rates?
     Data sources: WebAMP and BD monitoring system; NSC and IPEDS

3. Return on Investment

  a. What is the cost (funding and other resources) to implement the LSAMP programs at institutions?
  b. What are the monetized benefits connected to student outcomes?
  c. What is the ratio of the benefits to costs required to implement LSAMP? What is the internal rate of return? What are the net benefits?
     Data sources: Staff interviews; administrative data; WebAMP
     Analysis: Return on investment

This comprehensive evaluation includes several components, including utilization of existing data (see data sources in Table 1), but the components and protocols presented here focus on qualitative data collection at ten (10) case study multi-institutional LSAMP sites, which will include interviews with institutional staff and focus groups with current students. Half of these interviews and focus groups will be planned for virtual data collection (e.g., Zoom), with the remainder planned as a hybrid of in-person (i.e., on-campus) and virtual (e.g., Zoom) collection. In the hybrid model, virtual participation will be selected based on travel burden, choosing in-person options as much as practicable and in consideration of the physical accessibility of satellite locations from the central location of each selected site. The burden on respondents, discussed below, is not expected to differ between modes.

Site visits (both in-person and virtual) and corresponding data collection will begin upon receipt of OMB clearance and are anticipated to take place during calendar years 2025 and 2026 (academic year 2025-2026). Sites will be notified of the process and their selection in advance and given an opportunity to assist with scheduling and recruitment of individuals. To reduce the burden on staff, most interviews will take place during regular business hours. Similarly, focus groups will take place during the least burdensome time for students, based on individual institutional needs, as much as practicable (given facilities constraints, etc.).

Site Selection: We will study three institutions at each of ten unique alliance sites, for a total of thirty (30) institutions. Selected institutions at each alliance will include the lead institution and two partners selected in consultation with NSF and the lead institution. Five alliances (15 institutions) will be studied virtually, and five alliances (15 institutions) will be studied in a hybrid of in-person and virtual formats (note that some locations within an alliance may not be accessible in person due to logistical constraints). NSF has purposefully selected these sites based on variables of interest, including geography, alliance type (e.g., SPIO, SPRA, B2B), and alliance duration (e.g., new alliances and well-established alliances), to ensure a heterogeneous sample. The selected alliances are: “Northern New Jersey,” “Puerto Rico LSAMP,” “Islands of Opportunity (Hawaii),” “Louisiana,” “All Nations (Midwest/West),” “North Carolina,” “IINSPIRE (Iowa, Illinois, Nebraska),” “Kansas,” “Indiana,” and “CIMA (Texas).”

Respondent Universe and Sampling Methods: Data will be collected from school staff, including administrators, support staff, and faculty, and from students. Any staff member currently involved with the LSAMP program will be eligible for inclusion. These individuals will be identified through official records and participant referral within the organization to ensure all eligible individuals are considered. Students must be currently enrolled in the LSAMP program, either at the graduate or undergraduate level, and referred by a current LSAMP staff member. This approach acknowledges that current students are unlikely to have been enrolled during the reference period (2008-2018) but responds to the need to reach these students and have them participate. The protocols for data collection with this population are responsive to this design and include the opportunity for students to reflect retrospectively, should they have experience from the reference period.

Participant Characteristics and Recruitment Methods: Staff members will include those whose work is directly related to LSAMP projects (and who are officially affiliated with the work through annual reports, etc.) and those who work alongside, but not directly on, LSAMP activities; the latter will be referred by direct LSAMP project staff. We aim to recruit nine (9) staff members per alliance. Direct LSAMP participants will be administrators, faculty, and staff who run the LSAMP program or one of its affiliated activities (e.g., tutoring, mentoring, bridge programs, undergraduate research). Participants will be selected to represent varying positions and levels of experience within LSAMP programming, to ensure a broad set of experiences is captured. Other participants will include staff who may not be direct affiliates but who mentor students who are part of the program. These participants will be identified via rosters and recruited via email and telephone. In addition to other selected participants, we will include the LSAMP director at each institution within the alliance.

We will conduct no more than five rounds of combined outreach for any given participant. For staff who are initially unresponsive, we will confirm their LSAMP participation with the director. We plan to schedule in-person interviews prior to arrival; however, when on site, we will work to locate and introduce ourselves to any participants we have been unable to schedule from afar. For virtual interviews, after our five outreach attempts, we will request additional LSAMP staff contacts from the director to ensure we maintain a robust response group. Additional participants will be identified via word of mouth, either from students (e.g., naming their advisor) or from other staff who identify peripheral participants.

Students will be selected based upon their participation level (i.e., Level 1, 2, or 3) so that we capture a range of participation levels. We will oversample Level 1 students because these students are the most integrated into LSAMP and can thus offer insights on multiple programmatic components. Students will be referred by the LSAMP staff members with whom we speak. We acknowledge that this may introduce some selection bias, which we will mitigate as much as possible by providing clear instructions to administrators. For example, we will request that they refer students with more experience in the program, those in their first year in the program, and those who have needed additional support. Additionally, we will include representation of Level 3 students (non-LSAMP participants). We expect the LSAMP administration, along with staff participants, to be the primary source of student identification. We will send an email to students’ institutional email accounts, and any other accounts the program has on record will be carbon-copied (cc’d). Students will also learn about the study from LSAMP staff, so that they will expect to receive emails from NORC prior to recruitment.

Information Collection Procedures: Each site visit will involve multi-pronged data collection, with each component serving a specific purpose. First, we will collect organizational-level artifacts such as staff rosters, meeting minutes (when available), and mission statements. These artifacts, in combination with artifacts previously collected through the document review, will be used to capture alliance and/or institutional strategies and to learn about LSAMP’s influence on each institution’s broader organizational context. To better understand how the programs function and how such functioning aligns with LSAMP’s goals, beyond what is available in these administrative data, we will also conduct interviews [5] with project participants, including administrators and staff, at each site. Additionally, at each site we will conduct focus groups with current students.

Based on our interest in institutionalization and influence, interview questions will probe whether and how organizational pathways, structures, policies, and mechanisms (PSPMs) are integrated into new or pre-existing sets of practices, what meanings interviewees ascribe to these PSPMs, and how they view the impacts or effects of the PSPMs. (See Appendix 1 for tested instruments.) For coordinators and administrators, we will attempt to reduce the burden of in-person/synchronous participation by allowing them to preview the questions, including the request for staff/student referrals, as part of the scheduling process, and by offering them the opportunity to provide any information (e.g., enrollment information, contact information) ahead of time if doing so would reduce their burden.

We will also conduct one student focus group made up of Level 1, Level 2, and Level 3 students at each site (see Appendix 2 for the tested Student Focus Group instrument). Each focus group will include six to eight students, as focus group practice suggests that six to eight participants is an ideal size for a manageable discussion. Most questions will be suitable for the whole focus group; however, a subset of questions will be targeted at students based on their exposure to and engagement with LSAMP activities. Overall, the questions will focus on how students learn about LSAMP activities, how they engage with these activities (or reasons why they do not), what supports or hinders their engagement (including funding), their perceptions of LSAMP-related success, and their perceptions of ways that LSAMP has shaped their postsecondary experiences. Those not participating in LSAMP will be asked general questions, allowing a parallel comparison to general advising, courses, and other academic experiences.

Taken together, these data will provide a holistic view of each alliance/institutional site and how they engage with one another. Artifacts will provide details on formalized LSAMP activities, staff interviews will provide important perspectives into the organizational processes and practices that support or hinder the emergence and perpetuation of new practices, and student focus groups will provide critical insight into what PSPMs have been most beneficial and the ways that alliances have shaped this critical stakeholder group’s postsecondary experiences.

A.5 Impact on Small Business or Other Small Entities

There are no anticipated impacts on small businesses or other small entities.

A.6 Consequence on Collecting the Information Less Frequently

Information of this kind is collected on a relatively limited basis, with evaluation efforts conducted only about every ten years. This collection will reflect the 2008-2018 time period, and no other or similar effort exists. Should this data collection not be completed, NSF will not be able to determine the effectiveness of some key aspects of the program, nor will NSF be able to better understand how and why the program succeeds or has limitations and use that information to better construct future rounds of funding and support. This could negatively impact the LSAMP program’s ability to achieve its goals.

A.7 Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

This project fully complies with all guidelines of 5 CFR 1320.5.

A.8 Federal Register Notice

The Federal Register notice for this collection was published on October 2, 2024, at 89 FR 80268 and no comments were received.

A.9 Gifts or Remuneration and Methods to Maximize Cooperation Rates

We anticipate that cooperation will be relatively straightforward, as we will be using known and validated professional email addresses and respondent referrals. For coordinators and other staff, we will reach out directly, offering the opportunity to self-schedule and manage interview appointments as best fits their professional calendars. We will work within their expected working hours to reduce burden and offer as much advance notice as possible. For more senior individuals, who may have a dedicated executive assistant, we will include that individual in all correspondence to assist in coordinating information and cooperation. For students, we will rely more heavily on the respondent referral approach, asking staff to make connections and then engaging the students directly to offer some level of privacy regarding participation.

We do not currently plan to offer external incentives for participation. Incentives to encourage participation are not necessary for staff and administrators, as participation falls within the scope and purpose of their regular job responsibilities. Similarly, students have reported (e.g., in cognitive testing) an eagerness to share their experiences in hopes that doing so helps others, so the incentive for them is more intrinsic. With the exception of Level 3 participants, many of these students are receiving support from this program. Therefore, we will be direct and clear that their participation and any information shared will not be linked to their records or impact their funding and enrollment in the program in any way. This will be shared in writing prior to participation, and again as a consent statement read aloud prior to the start of focus group sessions.

Should our planned and tested methods prove unsuccessful after three attempts to contact and schedule participants, and should incentives become necessary to complete the data collection, we will offer a $20 incentive to students for their participation in the one-hour focus group sessions. For staff, participation falls within the overall purview of their job responsibilities, so incentives should not be needed. We will remind all staff that their participation is voluntary and will not in any way have an impact on their employment or on the funding received by their institution or alliance through the LSAMP program. While voluntary participation inherently means individuals may choose not to participate, we will clearly convey during recruitment the importance of sharing their experiences and knowledge.

A.10 Confidentiality Provided to Respondents

The data collected from interviews and focus groups will be stored in accordance with all appropriate best practices and federal guidelines. Additionally, NORC will employ a least-privilege paradigm for all identifiable data. Data will be stored on NORC’s secure servers, and only those people directly working on the LSAMP project will have access. NORC files are password protected to ensure that only direct project staff can access them.

Data will only be presented in aggregate (at both the institutional and alliance levels), without attribution to any specific individual. NORC will provide NSF only this aggregated data, and all documents linking data to individuals will be destroyed at the completion of the work, using established protocols aligned with required records management and best practices.

A.11 Questions of a Sensitive Nature

NORC does not anticipate asking any questions of a sensitive nature. All topics pertain exclusively to respondents’ work in their professional capacities. Similarly, we will not be collecting any potentially sensitive data such as tenure status or salary.

A.12 Respondent Burden Consideration and Reduction

No financial cost to respondents is expected for their efforts in this data collection and evaluation beyond the opportunity cost of time spent participating.

The total estimated burden is 435 hours across 300 participants. Table 2 below identifies the two site types across the ten alliances (five hybrid and five virtual), each with three institutions, for a total of 30 institutions. The hybrid sites will be primarily in-person, with some locations completed virtually due to distance and accessibility (e.g., the flagship campus will be visited while a distant regional campus may not). Of the 15 institutions at hybrid sites, ten will be visited in person, while five will participate virtually. For the virtual sites, all interviews and data collection will be conducted entirely virtually. Table 2 also identifies the number of hours each component is expected to take, in total, across all sites and participants.
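For illustration, the staff-member row of Table 2 can be derived as follows (a simple worked example using the $49.33 average hourly rate discussed under “Annualized Cost to Respondents” and in footnote [6]; small differences in the table reflect rounding):

  0.25 hours (recruitment and preparation) + 1 hour (interview) = 1.25 burden hours per staff member
  45 staff members x 1.25 hours = 56.25 total burden hours per site type
  1.25 hours x $49.33 per hour = $61.66 cost per respondent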

Table 2. Burden Hours for Respondents in Hybrid and Virtual Data Collection

Hybrid Sites (Five Sites, Fifteen Institutions): All Nations LSAMP (Montana), Puerto Rico LSAMP, Islands of Opportunity (Hawaii), Louisiana, Northern New Jersey

| Respondent | Recruitment and Preparation Hours | Interview Hours | Total Burden Hours per Individual | Total Number of Individuals | Cost per Respondent ($) | Total Burden Hours | Total Burden Cost ($) |
|---|---|---|---|---|---|---|---|
| Staff member | 0.25 | 1 | 1.25 | 45 (3 from each of the 15 institutions) | 61.66 | 56.25 | 2,774.50 |
| LSAMP Director | 2.25 | 1 | 3.25 | 15 (1 from each of the 15 institutions) | 160.32 | 48.75 | 2,405.00 |
| Student | 0.25 | 1 | 1.25 | 90 (one group of 6 from each of the 15 institutions) | 0 | 112.50 | 0 |

Virtual-Only Sites (Five Sites, Fifteen Institutions): North Carolina, IINSPIRE, Kansas, Indiana, CIMA

| Respondent | Recruitment and Preparation Hours | Interview Hours | Total Burden Hours per Individual | Total Number of Individuals | Cost per Respondent ($) | Total Burden Hours | Total Burden Cost ($) |
|---|---|---|---|---|---|---|---|
| Staff member | 0.25 | 1 | 1.25 | 45 (3 from each of the 15 institutions) | 61.66 | 56.25 | 2,774.50 |
| LSAMP Director | 2.25 | 1 | 3.25 | 15 (1 from each of the 15 institutions) | 160.32 | 48.75 | 2,405.00 |
| Student | 0.25 | 1 | 1.25 | 90 (one group of 6 from each of the 15 institutions) | 0 | 112.50 | 0 |

TOTAL

| Category | Total |
|---|---|
| Hybrid Cases (5 sites, 15 total institutions) | 217.50 hours |
| Virtual Cases (5 sites, 15 total institutions) | 217.50 hours |
| Total Burden Hours | 435.00 |
| Total Respondent Cost Estimate | $10,359 |




Annualized Cost to Respondents

There are no annualized costs to respondents beyond the value of their time. The overall respondent cost estimate is $21,458.55, for 435 hours of respondent time, at an average of $49.33 [6] per hour.
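As a simple check on these figures, using the values above: 435 hours x $49.33 per hour = $21,458.55, matching the overall respondent cost estimate.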

A.13 Estimate of Other Total Annual Cost Burden to Respondents or Record-Keepers

There are no capital, maintenance, or operating costs to respondents.

A.14 Annualized Cost to the Federal Government

The annual cost to the Federal Government includes the costs of developing and administering the protocols and the cost of federal personnel time. The cost to develop and administer the protocols is $422,531.84, and the cost of NSF staff time is $218.40, for a total cost of $422,750.24.

Table 3 below shows the total annualized cost to the Federal Government. This cost includes the cost to:

  1. Develop the protocols (including conducting cognitive interviews).

  2. Conduct in-person interviews with administrators and student focus groups at 5 alliances, a total of 15 institutions.

  3. Conduct virtual interviews with administrators and student focus groups at 5 alliances, a total of 15 institutions.

The estimates provided include the development of interview and focus group protocols and the scheduling and implementation of interviews and focus groups, but do not include the analysis of collected data or the preparation of reports.

Total federal government personnel costs will be $218.40. Salaries are based on the January 2025 General Schedule for the Washington, DC metropolitan area (https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2025/DCB.pdf and https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2025/DCB_h.pdf). It is anticipated that three NSF staff members will be involved, at the GS-13 and GS-15 levels. The cost assumes two GS-15 employees, each with an annual salary of $167,603, working one hour each, and one GS-13 employee with an annual salary of $120,579 working one hour.
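For illustration, the hourly rates shown in Table 3 are consistent with dividing each annual salary by a 2,087-hour federal work year (the divisor is our assumption; it is not stated in the table):

  $120,579 / 2,087 hours ≈ $57.78 per hour (GS-13)
  $167,603 / 2,087 hours ≈ $80.31 per hour (GS-15)
  (2 x $80.31) + (1 x $57.78) = $218.40 total personnel cost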





Table 3. Annualized Costs to the Federal Government

Estimated Costs for Federal Government Personnel

| Item | Annual Salary ($) | Hourly Rate ($) | Number of Hours | Annualized Cost ($) |
|---|---|---|---|---|
| Personnel from the NSF Directorate for STEM Education, GS-13 | 120,579 | 57.78 | 1 | 57.78 |
| Personnel from the NSF Directorate for STEM Education, GS-15 (two employees, one hour each) | 167,603 | 80.31 | 2 | 160.62 |
| Total Cost for Government Personnel | | | | 218.40 |

Estimated Costs for Contractor to Develop Protocols

| Item | Annualized Cost ($) |
|---|---|
| Administrator Interview Protocol Development | 19,603.64 |
| Student Focus Group Protocol Development | 19,603.64 |
| Total Cost for Contractor to Develop Protocols | 39,207.27 |

Estimated Cost for Contractor to Administer Protocols

| Item | Number of Interviews | Hours per Interview | Hourly Rate ($) | Annualized Cost ($) |
|---|---|---|---|---|
| Interviews with LSAMP Administrators | 120 | 1 | 1,277.75 | 153,329.83 |
| Focus Groups with LSAMP Students | 180 | 1 | 1,277.75 | 229,994.74 |
| Total Cost for Contractor to Administer Protocols | | | | 383,324.57 |

Total Annualized Cost to the Federal Government: $422,531.84
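As a check on the totals above: $39,207.27 (protocol development) + $383,324.57 (protocol administration) = $422,531.84 in contractor costs; adding the $218.40 in federal personnel costs yields the $422,750.24 total cost cited in Section A.14.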

A.15 Explanation for Program Changes or Adjustments

This is a new collection of information.

A.16 Plans for Tabulation, Publication and Project Time Schedule

NORC at the University of Chicago will be completing all data collection, analysis, and reporting to NSF. NORC specializes in these types of evaluations and data collection efforts, with extensive experience in providing these services to the federal government. In terms of public dissemination, NSF will determine if and how any findings are shared and will lead all efforts in publication and dissemination.

Estimated Project Time Schedule

| Activity | Timeline |
|---|---|
| Perform site visits, conducting interviews and focus groups | Will begin within six months of clearance (depending on clearance date, we may need to delay to align with academic schedules) and will take four to six months to complete (completion within 12 months of OMB approval). |
| Analyze interview and focus group data | Will conclude within six months of site visit completion (completion within 18 months of OMB approval). |
| Provide NSF with final reporting and tabulations on case studies, integrating into the larger evaluation effort | Will begin concurrently with site visits and conclude within twelve months of site visit completion (completion within 24 months of OMB approval). |

A.17 Reasons why Display of OMB Expiration Date is Inappropriate

No exceptions are sought; all instruments will display the OMB expiration date.

A.18 Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are sought from the Paperwork Reduction Act.

[1] PSPM: organizational pathways, structures, policies, and mechanisms.

[2] WebAMP is the LSAMP monitoring system; BD means Bridge to the Doctorate.

[3] NSC means National Student Clearinghouse; IPEDS means Integrated Postsecondary Education Data System.

[4] Broadening Participation in STEM: NSF has a mandate to broaden participation in science and engineering, as articulated and reaffirmed in law since 1950. Congress has charged NSF to “develop intellectual capital, both people and ideas, with particular emphasis on groups and regions that traditionally have not participated fully in science, mathematics, and engineering.”

[5] For the interview protocol and student focus group protocol, we will conduct cognitive testing with higher education staff and students before finalizing the protocols. We will hold two focus groups, one for staff and one for students, focused on gathering feedback on the relevance and flow of the protocols.

[6] Derived from the median hourly pay for such roles, according to the BLS Occupational Outlook Handbook, retrieved from https://www.bls.gov/ooh/management/postsecondary-education-administrators.htm.

