Strengthening Community Colleges Training Grants Program Round 4 (SCC4) Evaluation Supporting Statement A
OMB Control Number: 1290-0NEW
OMB Expiration Date: TBD
The Chief Evaluation Office (CEO) and the Employment and Training Administration (ETA) in the U.S. Department of Labor (DOL) are partnering to commission an evaluation of the fourth round of the Strengthening Community Colleges Training Grants Program (SCC4). The program, which awarded approximately $65 million in grants to 16 community college grantees, aims to enable community colleges to adopt and enhance evidence-based career pathways programs that lead to employment opportunities in good jobs in locally in-demand industries. The SCC4 evaluation will shed light on the impact on participants of receiving comprehensive supports through the program, which services are effective for which types of students, why they are effective, and the core components of the SCC4 programs that support success. In addition, the evaluation will build on existing evidence regarding successful programs in community college settings and advance the understanding of how career pathways programs promote economic mobility. CEO contracted with Mathematica and its subcontractors, the Community College Research Center and Social Policy Research Associates, to conduct impact, outcomes, and implementation studies. This information collection request seeks Office of Management and Budget (OMB) clearance for new data collection for the SCC4 evaluation. This package requests clearance for seven data collection instruments as part of the evaluation:
Participant baseline survey and consent form
Participant follow-up survey
Contact information update request
Service receipt log
College survey
Semi-structured interview topic guide for college administrators, program directors, program staff, college instructors and faculty, and partners
Semi-structured interview topic guide for SCC4 participants
The SCC4 evaluation will include three components: (1) an impact study of student success coaching and enhanced supports, (2) an outcomes study, and (3) an implementation study. We will use the first four data collection instruments for the impact study and draw on the college survey for the outcomes study. The implementation study will draw on the college survey and the remaining data collection instruments.
Wages and employment continue to show long-term stagnation for workers who do not have postsecondary degrees.1 Sector-based career pathway programs offer promising approaches to meeting employers’ needs and promoting economic mobility for workers.2 Within this context, the DOL’s ETA is investing in a fourth round of the Strengthening Community Colleges grants. Through these grants, community colleges will adopt or enhance evidence-based career pathways programs that increase employment opportunities in good jobs in locally in-demand industries. SCC4 seeks to build community colleges’ capacity to support students in obtaining good jobs and meet employers’ needs for skilled workers by enhancing sector-based career pathways programs.3 Since May 2024, DOL has awarded approximately $65 million to 16 community college grantees. The SCC4 evaluation will build on existing evidence and shed light on which sector-based career pathways and strategies are effective for which types of students, why they are effective, and the core components that support success.
Citation of sections of laws that justify this information collection: This evaluation is authorized by Section 169 of the Workforce Innovation and Opportunity Act (WIOA), which authorizes research and evaluations to improve the management and effectiveness of workforce programs and activities, such as the Strengthening Community Colleges (SCC) grant program.
This package requests clearance for seven data collection instruments that will be used for the impact, outcomes, and implementation studies described below.
DOL will use the data collected through the instruments summarized in this request to describe SCC4 participants and programs, describe participants’ outcomes, assess the impacts of key components of SCC4 on these outcomes, and understand implementation of the SCC4 grants. These data and the study team’s descriptive and impact analyses will provide DOL and other policymakers with important information to guide management decisions, support future planning efforts about such grant programs, and share evidence of the effectiveness of various components of career pathways programs to help meet the skill development needs of employers and support students in obtaining good jobs in in-demand industries.
The SCC4 evaluation comprises three components: (1) an impact study to measure the effects of student success coaching and enhanced supports on participant outcomes, (2) an outcomes study to measure the educational and labor market outcomes of SCC4 participants, and (3) an implementation study to understand program implementation. The evaluation will occur over five years (2024 to 2029).
The impact study will be a randomized controlled trial (RCT) in which participants will be randomly assigned to either a treatment group that receives student success coaching and enhanced supports or a comparison group that receives status quo support services offered as part of SCC4. Student success coaching will involve the coach using proactive outreach to assess student needs and meet regularly with students. Enhanced supports will include providing resources to students to support program completion, such as funding for transportation, employment-related costs, or emergency needs. The study team identified impact study grantees during an evaluability assessment in summer 2024; at that time, they assessed all grantees on a series of criteria, including the potential number of enrolled students, feasibility of implementing a student success coaching and enhanced supports intervention sufficiently distinct from current college offerings, and interest in and feasibility of random assignment of students to treatment and comparison groups. The impact study will include five community colleges from four grantees.
The impact study will compare the outcomes of SCC4 participants in the treatment group to SCC4 participants in the comparison group. It will address three key research questions (see below) and will include a baseline survey, follow-up survey, and contact information update requests involving all study participants (treatment and comparison groups). The study team will merge these data with administrative data on employment and earnings to address the impact study research questions. The impact study will also include service receipt logs completed by coaches providing student success coaching and enhanced supports to participants. The three research questions are as follows:
What is the impact of providing student success coaching and enhanced supports to participants in SCC4 training programs on participant outcomes, including:
Services received through the SCC4 program
Program completion and credential attainment
Enrollment in further postsecondary education
Employment and earnings
Job characteristics and satisfaction
How do these impacts vary by participant characteristics?
How do these impacts vary based on the service needs of participants?
The outcomes study will describe the earnings and employment of SCC4 participants following program enrollment at all 16 grantees. It will rely primarily on administrative data on earnings and educational outcomes to measure the outcomes of SCC4 participants. It will also use the college survey to describe how outcomes varied across SCC4 programs based on key characteristics. Specifically, the outcomes study will address the following research questions.
To what extent did SCC4 participants complete their SCC4 programs and receive the targeted industry-recognized credentials?
What were the additional postsecondary education outcomes of SCC4 participants following their SCC4 programs?
What were the employment and earnings outcomes of SCC4 participants following their SCC4 programs?
How did the outcomes found in questions 1–3 vary by participant characteristics and grantee characteristics?
The implementation study will include all 16 grantees and examine the context of SCC4 programs, including how grantees integrate employer perspectives and student voice to shape programming; partnerships; the training, student success coaching, and enhanced supports provided; the populations served; and common implementation successes and challenges. The implementation study will include a web-based survey of each grantee; interviews conducted during site visits to approximately five impact study colleges across four grantees with grantee staff and leaders, college administrators, program directors, instructors and faculty, service support staff, select partners, and students; and phone interviews with the remaining grantees’ grant administrators and program managers, as well as partners. This information collection request will include the college survey, the topic guides for each of these interviews, and the participant surveys (baseline and follow-up) and service receipt logs that will inform the impact study.
The SCC4 evaluation implementation study will address nine key research questions:
What are the variations in the model, structure, sectors, partnerships, and services of the grants across sites? What are the reasons for this variation?
How were the various sectors selected? What is the sector strategy, role of the sector convener and what type of entity plays that role?
How did SCC4 grantees plan for implementation of their program model?
What activities took place during the planning phase?
What types of partners were involved in planning and what roles did they play?
What strategies did SCC4 grantees plan and implement in their initial models to promote program access, persistence, completion, and transitions to good jobs and for whom?
How have the planned strategies operated in practice and how have representatives from the student groups of interest experienced them?
To what extent did the planned strategies align with the strategy options and core elements outlined by DOL?
To what extent did the planned strategies occur within and address specific features of a sector pathway?
What types of partners did SCC4 grantees work with to meet the goals of the grant and what roles did those partners play in implementation? In particular, how did grantees engage with students and communities to incorporate their input across planning and implementation?
At RCT sites, what enhanced services are provided to SCC4 students in the intervention group?
What factors affect the choices the colleges made regarding frequency of coaching sessions, what supports to proactively offer, how to monitor uptake, and other decisions about intervention design and implementation?
How do the enhanced services differ from business-as-usual student support services at the college (i.e., what is the service contrast)?
To what extent does the actual provision of enhanced services align with the prescribed components of the intervention (i.e., is there fidelity to the model)?
How does implementation of the enhanced services vary by site and sector pathway and change over time?
How do take-up rates vary by available services and by program and business-as-usual student groups? How does the use of available services vary by institutional, program, and student characteristics?
How did grantees work with industry and labor to promote career development and pathways to good jobs for program participants?
What career planning and employment transition services are provided to SCC4 students as a result of the grant?
How do these services differ from business-as-usual career planning and employment transition services at the college (i.e., what services were available before the grant)?
How do take-up rates vary by available services and by institutional, program, and student characteristics?
How did grantees plan for sustainability after the end of the grant period?
What lessons are they drawing?
What policies and practices are they institutionalizing?
What resources are required?
What barriers did institutions identify when pursuing systems change?
Understanding the impact and implementation of the SCC4 grant program requires collecting data from multiple sources. For the impact and outcomes studies, the study team will collect outcomes data from community college student records, employment and earnings records from the National Directory of New Hires database, and postsecondary enrollment data from the National Student Clearinghouse; these sources do not require OMB approval and are outside the scope of this request to OMB. The study team will collect outcomes data for all SCC4 participants. It will also collect survey data from all participants in the impact study, including through a baseline survey, a follow-up survey, and periodic contact information update requests. Additionally, the team will collect service receipt data from coaches in the form of logs. For the implementation study, the study team’s data collection will include interviews from site visits and phone calls with college administrators and program directors, instructors and faculty, SCC4 participants, program staff, and partners, as well as a survey of colleges. We also will use the college survey for the outcomes study.
The data collection instruments included in this clearance request are as follows: (1) a participant baseline survey and consent form; (2) a participant follow-up survey; (3) contact information update requests; (4) service receipt logs; (5) a college survey; (6) a semi-structured interview topic guide for college administrators and program directors, instructors and faculty, program staff, and partners; and (7) a semi-structured interview topic guide for SCC4 participants (see Table A.1 for details on how the study team will use data from these instruments).
The impact study instruments are as follows:
Participant baseline survey and consent form. For the impact study, before random assignment, participants enrolling in an SCC4 program at one of the participating colleges will complete baseline surveys, including active consent. The consent form will include permission for collection of the participant’s administrative earnings data, postsecondary enrollment data, baseline and follow-up surveys, and participation in interviews conducted during site visits. The study team will administer the baseline survey to approximately 6,000 participants in study Years 2 through 4, when colleges conduct intake for the SCC4 program. The participant baseline survey will collect basic demographic information, background earnings and educational attainment, and interest in receiving student success coaching and enhanced supports. We will administer it electronically by web; it will take approximately 15 minutes to complete.
Participant follow-up survey. We will field the participant follow-up survey to all impact study participants (approximately 6,000 participants) at the five impact study colleges in fall 2027, toward the end of the evaluation period. The survey will ask about participants’ current employment and earnings, further education and training received since study enrollment, experiences receiving student success coaching and enhanced supports, and overall well-being. The study team will administer it electronically by web; it will take approximately 15 minutes to complete.
Contact information update requests. The study team will send information update requests to all impact study participants at four points throughout the study—every six months following the start of study enrollment. The requests will ask participants to confirm or provide updates on their phone number, email, mailing address, and preferred method of contact. The study team will administer the requests via a web-based form sent to participants by text message.
Service receipt logs. As part of the impact study data collection, the study team will ask grantee staff to track information on the frequency and type of services they provide to participants. Fifteen grantee staff across the five colleges participating in the impact study, primarily coaches, will be responsible for inputting information into these logs electronically.
The implementation study instruments are as follows:
College survey. As part of the implementation study, a college survey will provide information about all colleges funded for SCC4 programs, not just those participating in the impact study. We will administer the college survey electronically via the web to all colleges in the study during fall 2025. This survey will collect details on service delivery models, staffing, partnerships, and implementation of the main program elements. We will use information from the college survey to support an analysis of how outcomes may vary across the SCC4 program models. We will send the survey to 42 colleges across the 16 grantees; it will take approximately 40 minutes to complete.
Semi-structured interview topic guides. As part of the in-depth implementation study, the study team will conduct two rounds of visits to five colleges in the impact study and up to three phone interviews with program leaders at all grantees. The first visit will occur over three days in winter 2026 and will collect information on how colleges have implemented the intervention, any early challenges encountered, and their solutions. The second visit will occur over three days in spring 2027 and will provide information on how intervention models have changed over time, any additional challenges faced, sustainability, and systems changes. For grantees funded for SCC4 programs but not participating in the impact study, the study team will conduct one round of three phone calls during the 2026–2027 school year. Major topics during the site visits and phone calls will include the education and economic context surrounding the program, the organization and administrative structure, recruitment and enrollment, partnerships, employer engagement and work-based learning, integrated academic and career learning, academic and career counseling, wrap-around services, program sustainability, and systems change.
Semi-structured interview topic guide for college administrators, program leaders, program staff, instructors and faculty, and partners. During the two rounds of site visits, we will conduct one-on-one or small group semi-structured interviews with college administrators, program leaders, program staff, instructors and faculty, and any partners. Phone interviews will include program leaders and program staff.
Semi-structured interview topic guide for SCC4 participants. During the two rounds of site visits with impact study colleges, we will conduct semi-structured interviews with SCC4 participants.
Table A.1. Data collection instruments and uses of data
| Data collection instrument | How study team will use the data |
| --- | --- |
| Impact study instruments | |
| Participant baseline survey and consent form | This survey will gather information about the characteristics of SCC4 grant program participants and the services they are interested in receiving. The study team will use this information to describe the study sample, compare the characteristics of participants in the treatment and comparison groups, develop covariates for the impact analysis, develop subgroups for analysis, and potentially develop sample weights to adjust for an imbalance or survey nonresponse. |
| Participant follow-up survey | This survey will gather information from SCC4 impact study participants to measure outcomes that cannot be collected using administrative data. The survey will collect data on participants’ employment status and earnings, further education and training after the SCC4 program, experiences in receiving or accessing student success coaching and enhanced supports, and overall well-being. |
| Contact information update requests | This request will confirm or update participant contact information periodically to increase response rates for the participant follow-up survey. |
| Service receipt logs | These logs will create a record of the student success coaching and enhanced supports provided to SCC4 students during and after their program enrollment. The study team will use this information to describe service receipt outcomes and the differences in services outcomes for the treatment and comparison groups. |
| Implementation study instruments | |
| College survey | This survey will gather information on grantees, including the targeted sector and strategy elements being implemented; planned and established partnerships; services offered; strategies to reach the intended population; and early successes and challenges. The study team will use this survey to describe implementation of SCC4 programs and grantees’ key characteristics for the outcomes study. |
| Semi-structured interview topic guide for college administrators, program directors, program staff, instructors and faculty, and partners | This topic guide will serve as the basis for detailed interview protocols to be developed for two rounds of site visits to colleges and phone interviews with all grantees. It will collect information on the institutional and community context of colleges; SCC4 program planning, management, and staffing; participant recruitment; SCC4 services; partnerships; participant outcomes; systems changes at colleges; and sustainability of services. |
| Semi-structured interview topic guide for SCC4 participants | This topic guide will provide an overview of key topics to be explored during semi-structured interviews with SCC4 program participants. The topics will include participant background, recruitment and enrollment, and overall program experience. |
This project will use multiple applications of information technology to reduce burden. The baseline, follow-up, and college surveys will be hosted on the internet via a live secure web link. To reduce burden, the surveys will employ the following: (1) secure log-ins and passwords so respondents can save and complete the survey in multiple sessions, (2) drop-down response categories so respondents can quickly select from a list, (3) dynamic questions and automated skip patterns so respondents see only the questions that apply to them (including those based on answers provided previously in the survey), and (4) logical rules for responses so respondents’ answers are restricted to those intended by the question. The service receipt logs will also be designed as a short web-based form composed primarily of closed-ended, “select one,” or “select all that apply” questions. The study team will work with grantees to set up the service receipt logs, either in the system used to collect participant consent or via an existing data collection or case management platform, to streamline systems and reduce burden on the coaches who enter data.
Evaluation of the SCC4 grant program will not require collecting information available through alternate sources. For example, the evaluation will use information available from grantee applications and existing administrative data sets to ensure that data collected through interviews are not available elsewhere. Though the participant follow-up survey and service receipt logs will both collect information about participant service receipt, the survey will ask for an overview of the participant experience, whereas the logs will gather information on the frequency and content of each interaction.
Employer partners will participate in interviews as part of the implementation study site visits. Some of these partners might be from small businesses. To minimize burden on any small businesses that participate, we will request only the information required for the intended use and minimize burden by restricting the length of interviews to the minimum required time. We will also consider partners’ schedules when making decisions about the timing and locations of the interviews. As with all data collection activities, we will remind participants that their participation is completely voluntary.
The evaluation represents an important opportunity for DOL to add to the growing body of knowledge about what works in sector-based career pathways programs and strategies to improve employment outcomes for workers. Without collecting data on the SCC4 grant program through the surveys of participants and grant administrators, and qualitative interviews with impact study and non-impact study college administrators and program leaders, DOL will not be able to conduct a comprehensive evaluation of the grant program. Policymakers thus would not have information about the context in which the partnerships and programs operated, any operational challenges grantees and partners faced, or how the partnerships and sector-based career pathways programs and services evolved over time. Similarly, failure to collect baseline information from impact study participants would preclude DOL from ensuring that the treatment and comparison groups were equivalent, limiting the ability to determine the impact of the SCC4 grant services. Policymakers and the field thus would not gain high-quality information about the effectiveness of grantees’ approaches.
Explain any special circumstances that would cause an information collection to be conducted in a manner:
requiring respondents to report information to the agency more often than quarterly;
requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;
requiring respondents to submit more than an original and two copies of any document;
requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records for more than three years;
in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;
requiring the use of statistical data classification that has not been reviewed and approved by OMB;
that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or
requiring respondents to submit proprietary, trade secret, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.
To collect information about respondents’ race and ethnicity, the study team requests an exemption from the requirement to collect detailed information, as outlined in the revised “Statistical Policy Directive No. 15 (SPD-15): Standards for Maintaining, Collecting, and Presenting Federal Data on Race and Ethnicity.” The study team plans to use the minimum categories in asking respondents to report on their race/ethnicity. The study team does not plan to collect detailed information on race/ethnicity (as outlined in SPD-15), as this is not necessary for planned data analysis and reporting. Requesting the detailed information would also create an additional, unjustifiable burden for respondents, who are likely busy with their job responsibilities and program participation. Asking straightforward questions using the minimum categories will provide the necessary information with minimal respondent burden.
Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.
Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years—even if the collection-of-information activity is the same as in prior periods. There may be circumstances that preclude consultation in a specific situation; these circumstances should be explained.
The 60-day notice FR Doc. 2024-27044 (89 FR 91802) to solicit public comments was published in the Federal Register on November 20, 2024. No comments were received.
The study team is coordinating consultation on the research design and data needs. The process involves convening experts in a technical working group (TWG), convening a student advisory group comprising current and former students in SCC-funded programs, and conducting discussions with grantee-level program staff. Table A.2 provides the names, titles, and affiliations of the five individuals expected to participate in the TWG meetings. A list of study team members is in Table A.3.
The TWG will provide substantive feedback throughout the project period, particularly on the impact study design, data collection and analysis, and reports. The TWG members have expertise in research methodology as well as on programs and populations similar to those being served in the SCC4 grant program. We have not yet confirmed the TWG members.
We will also consult an advisory group of current and former students in SCC-funded programs to prioritize student voice throughout the evaluation, from design to dissemination. For example, we will engage students in sense-making conversations to contextualize the findings during the reporting stage.
We will consult grantee and partner administrators and staff to better understand how the research design fits in with the institutional and regional context of grantees.
Table A.2. Expected participants in the TWG meetings
| Name | Title and Affiliation |
| --- | --- |
| Cecilia Rios-Aguilar | Professor of Education and Associate Dean, Graduate School of Education and Information Studies, University of California-Los Angeles |
| Peter Riley Bahr | Associate Professor, Marsal Family School of Education, University of Michigan |
| Matthew Giani | Research Associate Professor, Department of Sociology; Assistant Professor of Practice, Department of Educational Leadership and Policy, University of Texas-Austin |
| Mark Potter | Provost; Chief Academic Officer, City Colleges of Chicago |
| LaShawn Richburg-Hayes | Former Vice President for Education Studies, Westat |
Table A.3. SCC4 study team
| Organization | Individuals |
| --- | --- |
| Mathematica | Ms. Jeanne Bellotti, Dr. Ann Person, Ms. Brittany English, Dr. Ariella Spitzer, Dr. Lisbeth Goble |
| Social Policy Research Associates | Ms. Lea Folsom, Kate Dunham |
| Community College Research Center | Dr. Thomas Brock, Dr. Maria Cormier |
Program or partner staff will not receive any payments or gifts because activities will be carried out in the course of their employment, with no additional compensation outside of their normal pay. Impact study participants will be eligible for up to $50 in payments for their time across the baseline survey, contact information update requests, and follow-up survey. These payments will include a $10 gift card for the baseline survey, a $5 gift card for each of the second and fourth responses to the contact information update requests, and a $30 gift card for completing the participant follow-up survey. Participants who complete semi-structured interviews as part of the implementation study will receive an additional $45 incentive payment for their time.
We will keep information we collect private to the extent permitted by law. The study team complies with DOL data security requirements by implementing security controls for processes that it routinely uses in projects that involve sensitive data. Further, we are conducting the evaluation in accordance with all relevant regulations and requirements.
Evaluating the SCC4 grant program using impact study methodology requires asking sensitive questions about wage rates and earnings. Past evaluations have included similar questions without any evidence of significant harm. As described earlier, the study team will assure all participants of the privacy of their responses before asking them to complete the baseline survey; the team also will inform them that they can skip any questions they do not wish to answer. We will report all data in aggregate, summary format only, eliminating the possibility of individual identification and ensuring that individual responses are private.
The study team will seek institutional review board approval for final, OMB-approved instruments.
Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. General estimates should not include burden hours for customary and usual business practices.
If this request for approval covers more than one form, provide separate hour burden estimates for each form.
Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 14.
Table A.4 includes assumptions about the annual number of respondents expected; the average number of responses per respondent; the average hours of burden per response; the annual burden hours estimated; the time value assumed for respondents; and the total annualized monetized burden for each data collection activity in this request. All of the activities this request covers are annualized over three years. Here, we summarize the burden estimates, rounded to the nearest whole number, for each of the data collection activities:
Participant baseline survey and consent form. The study team will administer this survey to about 6,000 impact study participants. We estimate each respondent will spend about 15 minutes on the survey. The annualized burden is approximately 500 hours.
Participant follow-up survey. The study team will field this follow-up survey with about 6,000 impact study participants and expects about 1,800 participants to complete the survey. We estimate each respondent will spend about 15 minutes on the survey. The annualized burden is approximately 150 hours.
Contact information update requests. As part of the impact study, the study team will send a contact information update request via text to all impact study participants every six months. We estimate each respondent will spend about two minutes on each contact information update request. The annualized burden is approximately 133 hours.
Service receipt logs. As part of the impact study, the study team will ask 15 SCC4 grantee staff across the five colleges in the impact study to complete service receipt logs. We estimate each respondent will complete 3,315 service receipt logs, which will take about two minutes each. The annualized burden is approximately 553 hours.
College survey. The study team will administer this survey to 42 college representatives at the 16 grantees. We estimate each respondent will spend about 40 minutes on the survey. The annualized burden is approximately nine hours.
Semi-structured interview topic guide for college administrators, program directors, program staff, instructors and faculty, and partners. As part of the implementation study, which will be conducted across all grantees, the study team will conduct semi-structured interviews with key respondent groups. Below we break out respondents by group to calculate anticipated burden.
College administrators. These interviews will occur across two rounds of site visits for the five impact study colleges and during a round of phone interviews for the 12 remaining non-impact study grantees. We will invite up to two participants from each college to participate in the site visit interviews and expect all of them to participate ((five colleges x two rounds of site visits) + (12 grantees x one round of phone interviews) = 22 interviews). Interviews will average 60 minutes. Given that each site visit will include two participants, the annualized burden is approximately 15 hours.
Program directors. These interviews will occur across two rounds of site visits for the five impact study colleges and during a round of phone interviews for the 12 remaining non-impact study grantees. We will invite one program director from each college to participate in the interviews and expect all of them to participate ((five colleges x two rounds of site visits) + (12 grantees x one round of phone interviews) = 22 interviews). Interviews will average 60 minutes. Given that each interview will include one participant, the annualized burden is approximately seven hours.
Instructors and faculty. These interviews will occur across two rounds of site visits for the five impact study colleges (four grantees). We will invite about three participants from each site to participate in the interviews and expect all of them to participate (three participants x five colleges x two rounds of site visits = 30 person-interviews). Interviews will average 60 minutes. The annualized burden is approximately 10 hours.
Program and college staff. These interviews will occur across two rounds of site visits for the five impact study colleges (four grantees). We will invite about three participants from each college to participate in the interviews and expect all of them to participate (three participants x five colleges x two rounds of site visits = 30 person-interviews). Interviews will average 60 minutes. The annualized burden is approximately 10 hours.
Partners. The study team will conduct semi-structured interviews with partners, such as employers who may have hired SCC4 participants. These interviews will occur during a single round of site visits for the five impact study colleges (four grantees) and during a round of phone interviews for the 12 remaining non-impact study grantees. We will invite about four participants from each college to participate in the site visit interviews and expect one participant per phone interview ((four participants x five colleges x one round of site visits) + (one participant x 12 colleges x one phone call) = 32 person-interviews). Interviews will average 60 minutes. The annualized burden is approximately 11 hours.
Semi-structured interview topic guide for SCC4 participants. As part of the implementation site visits, which will occur at five colleges (four grantees), the evaluation will conduct semi-structured interviews with SCC4 participants. We will invite up to four participants from each college to participate in the interviews and expect three to participate (three participants x five colleges x two rounds of site visits = 30 participants). Interviews will average 60 minutes. The annualized burden is approximately 10 hours.
Table A.4. Estimated annualized respondent cost and hour burden
Activity |
No. of Respondentsa |
No. of Responses per Respondent |
Total Responses |
Average Burden (hours) |
Total Burden (hours) |
Hourly Wageb |
Monetized Value of Time |
Baseline Survey of Study Participants |
6,000 |
1 |
6,000 |
0.25 |
1,500 |
$7.25 |
$3,625.00 |
Follow-Up Survey of Study Participants |
1,800 |
1 |
1,800 |
0.25 |
450 |
$7.25 |
$1,087.50 |
Contact Information Update Requests to Study Participants |
3,000 |
4 |
12,000 |
0.03 |
400 |
$7.25 |
$966.67 |
College Survey |
40 |
1 |
40 |
0.67 |
27 |
$49.33 |
$443.97 |
Program Staff Service Receipt Logs |
15 |
3,315 |
49,725 |
0.03 |
1,658 |
$22.24 |
$12,291.31 |
Interviews with College Administratorsc |
34 |
1.3 |
44 |
1.00 |
44 |
$49.33 |
$723.51 |
Interviews with Program Directors |
17 |
1.3 |
22 |
1.00 |
22 |
$49.33 |
$361.75 |
Interviews with Instructors and Faculty |
15 |
2 |
30 |
1.00 |
30 |
$28.82 |
$288.20 |
Interviews with Program and College Staff |
15 |
2 |
30 |
1.00 |
30 |
$22.24 |
$222.40 |
Interviews with Partners |
32 |
1 |
32 |
1.00 |
32 |
$24.98 |
$266.45 |
Interviews with Study Participants |
36 |
1 |
36 |
0.75 |
27 |
$7.25 |
$65.25 |
Total |
11,004 |
|
69,759 |
|
4,220 |
|
$20,342.01 |
Note: Numbers are rounded to the nearest whole number in all columns other than the “No. of Responses per Respondent,” “Average Burden (hours),” and “Hourly Wage” columns.
aAll annual totals reflect a three-year clearance and study data collection period.
bWe obtained the average hourly wages from the U.S. Bureau of Labor Statistics.4 Estimates of study participants’ wages are based on the federal minimum wage ($7.25). Estimates of college administrators’ wages are based on the median wages for all “education administrators, postsecondary” occupations ($49.33). Estimates for instructors and faculty are based on the median wages for “educational instruction and library occupations” ($28.82). Estimates of wages for program staff are based on the median wages for “miscellaneous community and social service specialists” ($22.24). Estimates for partners are based on the median wages for “counselors, social workers, and other community and social service specialists” ($24.98).
cInterviews with college administrators, program directors, instructors and faculty, program and college staff, and partners will all use the same instrument: Semi-structured interview topic guide for college administrators, program directors, program staff, instructors and faculty, and partners.
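The annualized burden figures for the participant-facing activities above follow directly from the stated respondent counts, response frequencies, and minutes per response. The sketch below is a cross-check only, using the figures from the text, not an independent estimate.

```python
# Cross-check of annualized burden hours: respondents x responses x hours
# per response, divided by the three-year clearance period.
CLEARANCE_YEARS = 3

activities = {
    # activity: (respondents, responses per respondent, minutes per response)
    "baseline survey": (6_000, 1, 15),
    "follow-up survey": (1_800, 1, 15),
    "contact info updates": (3_000, 4, 2),
    "service receipt logs": (15, 3_315, 2),
}

annualized = {
    name: respondents * responses * minutes / 60 / CLEARANCE_YEARS
    for name, (respondents, responses, minutes) in activities.items()
}

for name, hours in annualized.items():
    print(f"{name}: about {hours:.0f} annualized burden hours")
```

Run as written, this reproduces the approximately 500, 150, 133, and 553 annualized hours cited in the bulleted summaries.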
The cost estimate should be split into two components: (a) a total capital and start up cost component (annualized over its expected useful life); and (b) a total operation and maintenance and purchase of service component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.
If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collection services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.
Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.
There will be no direct costs to respondents for the SCC4 evaluation other than their time.
Costs result from the following categories:
The estimated cost to the federal government for the contractor to conduct the data collection activities included in this package is $5,198,968. Annualized over four years of data collection, this amount comes to $1,299,742 per year.
The annual cost DOL will bear for federal technical staff to oversee the contract is estimated at $49,133. We expect the annual level of effort to perform these duties will require 200 hours for one Washington, DC-based federal GS 14 Step 2 employee earning $70.55 per hour, and 200 hours for one Washington, DC-based federal GS 15 Step 2 employee earning $82.99 per hour.5 To account for fringe benefits and other overhead costs, the agency has applied a multiplication factor of 1.6. Thus, [(200 hours × $70.55) + (200 hours × $82.99)] × 1.6 = $49,133.
The total cost to the federal government is $5,395,500, which includes costs for the contractor to conduct data collection, and costs for federal technical staff to oversee the contract. Therefore, the annualized cost is $5,395,500 / 4 = $1,348,875.
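The oversight and total cost figures above can be reproduced in a few lines. This sketch simply restates the document's own inputs (hours, hourly rates, the 1.6 overhead multiplier, and the contractor cost); it is an arithmetic check, not a new estimate.

```python
# Recompute the federal cost figures stated above.
HOURS_PER_STAFF = 200
GS14_RATE = 70.55        # GS 14 Step 2 hourly rate
GS15_RATE = 82.99        # GS 15 Step 2 hourly rate
OVERHEAD = 1.6           # fringe benefits and overhead multiplier
YEARS = 4                # years of data collection
CONTRACTOR_COST = 5_198_968

# Annual oversight cost, rounded to the nearest dollar as in the text.
annual_oversight = round(
    (HOURS_PER_STAFF * GS14_RATE + HOURS_PER_STAFF * GS15_RATE) * OVERHEAD
)
total_cost = CONTRACTOR_COST + annual_oversight * YEARS

print(annual_oversight)      # 49133
print(total_cost)            # 5395500
print(total_cost / YEARS)    # 1348875.0
```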
This is a new information collection.
The study team will address the research questions for each component of the study using the following analyses:
To measure the impact of the SCC4 programs, the evaluation will use an RCT that includes five colleges from four grantees. The impact analyses will focus on program progress and completion, enrollment in further education, and post-program employment and earnings. The RCT will also assess the characteristics of jobs by analyzing outcomes, such as hourly wages, benefits offered, and career growth opportunities, using participants’ survey data. When analyzing the RCT data, we will use a treatment effects framework to estimate impacts, considering both the impact on individuals given access to student success coaching and enhanced supports (intent to treat) and the impact on individuals who were induced into using student success coaching and enhanced supports through the intervention (local average treatment effect). To estimate the intent to treat, we will use a regression analysis of outcomes on an indicator for whether an individual was assigned to the treatment group, controlling for individual characteristics, program fixed effects, and time fixed effects. To estimate the local average treatment effect, we will use an instrumental variable model in which treatment assignment is an instrument for the intensity of student success coaching and enhanced supports received. For both analyses, we will also perform subgroup analyses by individual characteristics and service receipt.
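The two estimators described above can be illustrated with a small simulation. The sketch below uses hypothetical data and effect sizes (not study results): it estimates the intent to treat as a regression of the outcome on random assignment, and the local average treatment effect as the Wald/2SLS ratio of the reduced-form and first-stage effects, with a binary take-up indicator standing in for intensity of service receipt.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 20_000

# Hypothetical simulated data: z is random assignment to the enhanced-supports
# group, d indicates actual take-up of coaching/supports (60% compliance among
# the treatment group), and y is an outcome such as an earnings index.
z = rng.integers(0, 2, n)
d = z * (rng.random(n) < 0.6)
true_effect = 2.0                       # assumed effect of take-up on the outcome
y = 1.0 + true_effect * d + rng.normal(0.0, 1.0, n)

# Intent to treat: OLS regression of the outcome on the assignment indicator.
X = np.column_stack([np.ones(n), z])
itt = np.linalg.lstsq(X, y, rcond=None)[0][1]

# Local average treatment effect: the reduced-form (ITT) effect divided by the
# first-stage effect of assignment on take-up (Wald estimator, equivalent to
# 2SLS with a single binary instrument).
first_stage = d[z == 1].mean() - d[z == 0].mean()
late = itt / first_stage

print(f"ITT estimate: {itt:.2f}")       # close to 0.6 * 2.0 = 1.2
print(f"LATE estimate: {late:.2f}")     # close to 2.0
```

In the actual analysis, the ITT regression would also include individual characteristics and program and time fixed effects, and the instrumental variable model would instrument for the intensity of services received rather than a binary take-up indicator.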
We will conduct an analysis of outcomes of SCC4 participants in the full set of grantees that includes the four grantees in the RCT plus the additional 12 grantees. The outcomes analysis will measure outcomes collected using administrative data, focusing on participants’ educational progress and completion, enrollment in further education, and post-program employment and earnings. The outcomes study will quantify these key outcomes among SCC4 participants as a whole and among subgroups. First, we will consider subgroups by grantee characteristics, estimated using the college survey. We will also consider subgroup analyses by individual grantees and groups defined by participant characteristics. Outcomes analyses will primarily rely on descriptive statistics, such as means and simple tabulations.
Implementation analyses will use data collected from all 16 grantees. We will integrate data from all implementation sources to answer the implementation study’s research questions. First, we will analyze data from the college survey to describe key features of program implementation, such as grantee and program staffing structure, staff background, partnership structure, services offered, and other program components. These analyses will primarily rely on tabulations and means. We will also analyze data from interviews using qualitative data analysis software such as NVivo. This coding will facilitate identification of key themes across colleges and enable the study team to describe SCC4 implementation. We will analyze coded data and develop college-level summaries that identify within- and across-college themes, similarities, and contrasts.
The evaluation includes an implementation study and an impact study. In 2025, we will submit a public-facing design report for the impact study. Data collection for the implementation and impact studies will begin in 2025 and end in 2027. The following products will be developed:
Implementation interim report. The study team will complete an interim report describing the findings from the implementation study. This report will answer research questions on program design and planning, drawing on findings from the document review, college survey, and first round of site visits to impact study colleges. The interim report will also provide an important snapshot of implementation to provide timely insights into grantees’ efforts to embed students’ and workers’ voice in program design, establish partnerships necessary to implement selected strategy elements, and recruit and enroll students.
Implementation study final report. The study team will also complete a final implementation study report addressing remaining research questions focused on program implementation, systems change, and program sustainability by incorporating data from the second round of site visits, phone interviews, and administrative data. It will also examine how sector-based career pathways can promote or inhibit program and labor market access and success.
Impact study interim report. The study team will complete an impact study interim report in summer 2027, documenting initial findings from impact analyses that can be completed with the data collected. This report will describe the implementation of the intervention, including how it impacted participants’ receipt of student success coaching and enhanced supports. Likely outcomes will include program completion and employment outcomes of initial SCC4 program cohorts. If there is sufficient statistical power, the report will also examine the effects on these outcomes for subgroups and present an analysis of the association between program components and participant outcomes.
Impact study final report. The study team also will complete a final report documenting how accessing the student success coaching and enhanced supports intervention affected participants’ outcomes. Likely findings will include employment, earnings, and education outcomes. This report will also examine the effects for subgroups and present an analysis of the association between program components and participant outcomes. It will also include the results of the outcomes study.
The OMB Control Number and expiration date will be displayed or cited on all forms that are part of the data collection.
There are no exceptions to the certification statement in this information collection.
1 Groshen, E., and H. Holzer. “Labor Market Trends and Outcomes: What Has Changed Since the Great Recession?” August 2021. https://journals.sagepub.com/doi/full/10.1177/00027162211022326. Accessed January 30, 2025.
2 Peck, L.R., D. Schwartz, J. Strawn, C.C. Weiss, R. Juras, S. Mills de la Rosa, N. Greenstein, et al. “A Meta-Analysis of 46 Career Pathways Impact Evaluations.” Abt, December 2021. https://www.dol.gov/sites/dolgov/files/OASP/evaluation/pdf/A%20Meta-Analysis%20of%2046%20Career%20Pathways%20Impact%20Evaluations_final%20report.pdf. Accessed January 30, 2025.
3 U.S. Department of Labor. “Notice of Availability of Funds and Funding Opportunity Announcement for: Strengthening Community Colleges Training Grants.” 2023. https://www.dol.gov/sites/dolgov/files/ETA/grants/FOA-ETA-23-15_.pdf. Accessed January 30, 2025.
4 U.S. Bureau of Labor Statistics. “National, State, Metropolitan, and Nonmetropolitan Area Occupational Employment and Wage Estimates.” June 2023. https://www.bls.gov/oes/current/oes_nat.htm. Accessed December 4, 2024.
5 See Office of Personnel Management. “Salary Table 2025-DCB.” https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2025/DCB_h.pdf. Accessed June 9, 2025.