Alternative Supporting Statement for Information Collections Designed for
Research, Public Health Surveillance, and Program Evaluation Purposes
Next Generation of Enhanced Employment Strategies Project
OMB Information Collection Request
0970-0545
Supporting Statement
Part A
March 2020
Revised February 2021
Submitted By:
Office of Planning, Research, and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
4th Floor, Mary E. Switzer Building
330 C Street, SW
Washington, D.C. 20201
Project Officer:
Hilary Bruck
Part A
Executive Summary
Type of Request: This Information Collection Request is for changes to the information collection approved in April 2020 under OMB #0970-0545.
Description of Request: The Office of Planning, Research, and Evaluation (OPRE) within the Administration for Children and Families (ACF) will conduct data collection activities for the Next Generation of Enhanced Employment Strategies Project (NextGen Project). The project will include experimental impact, descriptive, and cost studies of about 10 programs. This request covers changes to the first-phase instruments, as well as approval to use a subset of second-phase instruments, with changes, in programs selected for inclusion in the NextGen Project. As described in the initial request, we are using a two-phased approach for our information collection requests. The first phase includes instruments that will be uniform across programs selected for evaluation. The second phase includes materials that could be tailored to programs and are therefore finalized after recruitment of specific programs. We do not intend for this information to be used as the principal basis for public policy decisions.
Time Sensitivity: We plan to begin these data collections in some selected programs in mid-2021.
The Office of Planning, Research, and Evaluation (OPRE) within the Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services (HHS) seeks approval for data collection activities conducted for the Next Generation of Enhanced Employment Strategies Project (NextGen Project). OPRE contracted with Mathematica to conduct the NextGen Project.
A1. Necessity for Collection
OPRE has spent decades studying strategies to help low-income people find and keep jobs. Findings from these studies have been mixed, revealing variation in what works for whom and the duration and magnitude of impacts. Some studies have also demonstrated that certain programs are less accessible to individuals with complex challenges, such as low educational attainment or involvement with the criminal justice system, due to the program’s eligibility requirements.
The NextGen Project is intended to build on the findings and lessons learned from these past and ongoing evaluations by identifying and rigorously evaluating the “next generation” of employment strategies for highly vulnerable populations with complex barriers to obtaining and retaining employment. These strategies may be enhancements or adaptations of previously evaluated strategies, or innovative approaches showing promise in the field and ready to be tested. Additionally, the project has a particular interest in the role of market-oriented, employment-focused programs, such as social enterprises and public/private partnerships, in assisting highly vulnerable populations in obtaining and retaining employment. The current data collection request is necessary to conduct these rigorous evaluations.
A2. Purpose
Purpose and Use
The information collected through the instruments included in this Information Collection Request (ICR) will be used to evaluate innovative programs serving low-income individuals facing complex challenges to employment and economic independence to expand the evidence base in this area.
The NextGen Project is actively coordinating with another current project sponsored by OPRE, the Building Evidence on Employment Strategies for Low-Income Families (BEES) study (OMB #0970-0537). BEES may include impact and/or implementation studies of up to 21 employment-focused programs; these will not overlap with programs selected for the NextGen Project. The NextGen Project and BEES have a common goal to foster stronger understanding of the types of programs that can improve labor market outcomes for low-income individuals; however, the projects also maintain separate domains of focus. In addition, both projects are involved in a joint effort with the Social Security Administration (SSA). SSA has provided demonstration program funds to ACF to support the addition of a disability focus in both projects; specifically, to identify and evaluate employment-related programs for potential SSI applicants. This is intended to assist SSA in better understanding the types of early interventions that effectively connect or reconnect potential SSI applicants to work before they apply for SSI. See Section A4 for information about coordination and efforts to avoid duplication of activities.
Data collection instruments for the NextGen Project impact studies will provide baseline and outcome data about study participants, which the project team will use to estimate the effectiveness of each program. The project team will use data collection instruments for the descriptive studies to describe each program’s design, staffing, service provision, partnerships, and other details necessary to understand the nature of and context for the programs, and for other organizations to replicate them. The instruments will also help inform the interpretation of impact findings. Finally, the project team will use data collection for the cost studies to estimate the costs of implementing each evaluated program and to estimate the cost-effectiveness of the programs. The results will provide policymakers and practitioners with high-quality information on the effects, design and implementation, and the cost of the programs. Having this information will help strengthen policy and practice to better serve individuals facing complex challenges to employment and economic independence. Study findings may also inform future studies in this area.
The information collected is meant to contribute to the body of knowledge on ACF programs. It is not intended to be used as the principal basis for a decision by a federal decision-maker and is not expected to meet the threshold of influential or highly influential scientific information.
Research Questions or Tests
The questions this evaluation will answer are in Table A.1.
Table A.1. Research questions for the NextGen Project
Impact studies
- Did the program affect the amounts and types of services participants receive?
- Did the program improve participants’ employment outcomes (employment, earnings, job retention and advancement, and quality of job) and economic independence (income, public assistance receipt)?
- Did the program improve outcomes relevant to the challenges faced by the target population, for example, reduce substance abuse, reduce criminal justice involvement, or increase education, credentialing, and training?
- Did the program improve participants’ physical health, mental health, and well-being?
- Was the program more effective for some groups of participants than others?
- Did the impacts of the program change over time? If so, how?
- How did the program’s costs compare to the benefits of the impacts it generated? What were the net benefits for participants and society as a whole?

Descriptive studies
- How was the program designed and implemented?
- What contextual, organizational, and other factors impeded or facilitated implementation?
- What were the challenges faced, solutions, and lessons learned?
- What were the characteristics of study participants?
- What services were participants offered, and what were the participation and outcome patterns?
- What role did employers play in the program? How do local labor market conditions affect the program design, implementation, and employers’ and participants’ involvement?
- Which program services or implementation features appear to be related to program impacts? Which components or services do participants and staff perceive to be helpful?
- What were the backgrounds and experience of program staff and program leaders?
- How did staff spend their time, and how many participants did they work with?
- How did program leaders spend their time?
- How did participants perceive the program? What were the most helpful elements? How did the program affect their lives?

Cost studies
- How was the program funded? What were its costs? Was the program sustainable?
Study Design
The NextGen Project will include experimental impact, descriptive, and cost studies of about 10 programs. It will study programs that include a wide range of supports designed to serve individuals with multiple challenges to employment and that might be delivered by public–private partnerships, interagency collaborations, government initiatives, nonprofit agencies, or social enterprises. In addition to these studies, the project will include case studies of employers and social enterprises using novel strategies to serve the target population of interest. These case studies will not include programs or employers that participate in the impact, descriptive, or cost studies for the broader evaluation.
The impact studies are intended to produce internally valid estimates of each program’s causal impact. The descriptive and cost studies are intended to present internally valid descriptions of the service population, implementation, and cost of the programs in the chosen sites. Neither is intended to support statistical generalization to other sites or service populations. See Section B.1 of this ICR for further information about the appropriateness of the design and its limitations.
The NextGen Project team is currently identifying and assessing innovative programs for inclusion in the evaluation; these activities are approved under the generic clearance for Formative Data Collections for ACF Research (OMB #0970-0356). The programs will be assessed to determine whether they meet three general criteria: (1) the program addresses the research priorities of this project; (2) the program is well implemented, or could be after some technical assistance; and (3) a rigorous evaluation of the program using an experimental design is feasible, or could be after the program receives some technical assistance. Additionally, programs should have some evidence that they might be effective, and an evaluation of the program should build on existing evidence and be valuable to the field. Some programs to be selected will also address SSA’s research interests. The programs to be studied are not national programs, and the study is not designed to be nationally representative, nor will the project team attempt to generalize the evaluation results beyond the programs and target populations under study.
Phased Approach to Data Collection Approval
As noted in the Executive Summary, the NextGen Project will use a two-phased approach for OMB approval of this ICR.
Phase 1
In Phase 1, the project team is formally recruiting the programs being identified and assessed through the approved generic IC (discussed above). In April 2020, OMB granted approval for the project team to administer the baseline survey (Instrument 1) and to collect identifying and contact information for study participants (Instrument 2). We intend for these two baseline data collections to be uniform across programs selected for evaluation and do not anticipate that they will need to be tailored to a specific program beyond the program-based skip logic in the instruments.
Phase 2
In the first ICR submission we indicated that, under Phase 2, we would request approval of the remaining instruments. We anticipated that some of the Phase 2 instruments would require some revisions to tailor to each program selected for the evaluation. The initial ICR submission included drafts of these instruments and burden estimates for initial review and informational purposes (Appendices F and H – O), but did not seek approval at that time. Phase 2 instruments were also included in the Federal Register Notices, allowing for public comment on the initial versions. We indicated that once programs are selected for the evaluation, we would submit updated materials and burden estimates as either a non-substantive change request or a revision with abbreviated public comment time, dependent on the level of changes and guidance provided by the OMB Office of Information and Regulatory Affairs.
In a non-substantive change request approved in December 2020, we requested official approval to use a subset of the Phase 2 instruments across all selected NextGen sites, with non-substantive changes to all but one of the instruments as well as changes to capture how programs responded to COVID-19 and the resulting recession. As explained in that request, rather than tailoring instruments to each selected site, as initially proposed in the first ICR, we intend to use those same Phase 2 instruments across all sites, with skip patterns and/or instructions to interviewers indicating whether certain items only apply to certain types of respondents or programs. The following Phase 2 instruments were part of that request:
Instrument 6. Staff characteristics survey - revised
Instrument 7. Program leadership survey - revised
Instrument 8. Semi-structured program discussion guide - revised
Instrument 10. In-depth participant interview guide - revised
Instrument 11. Cost workbook
In this change request, we are seeking clearance for changes to the previously approved Phase 1 instruments; updates to the previously approved consent form and clearance for a parent/guardian consent form and a youth assent form for use in the evaluation of one selected program that serves youth; and approval to use a subset of Phase 2 instruments with programs selected for inclusion in the project with some changes made to those instruments. Specifically, the following instruments are part of this request:
Phase 1:
Appendix A. Informed consent form - revised
Appendix A.1. Bridges informed consent form
Instrument 1. Baseline survey - revised
Instrument 2. Identifying and contact information - revised
Phase 2:
Instrument 5. Service receipt tracking - revised
Instrument 9. Semi-structured employer discussion guide - revised
The original ICR submission included burden estimates for each instrument. The burden for completing the data collection for the instruments included in this request falls within those original estimates; the proposed changes do not change the burden estimates.
Impact studies. The experimental impact studies will provide rigorous evidence on whether each program is effective, for whom, and under what circumstances. Participants eligible for the programs will be asked to consent to participate in the study (Appendix A) and, if they provide consent, will be randomly assigned to one of at least two groups: one or more treatment groups offered the program or a control group not offered the program. Members of all study groups will continue to have access to other services offered in the community. Individuals who do not consent to participate in the study will not be randomly assigned, will not participate in the data collection efforts, and will not be eligible to receive the intervention (until after the second follow-up survey has been fielded).
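For illustration, the following is a minimal sketch, in Python, of the kind of random assignment step described above. The group labels, the 50/50 split, and the simple unstratified coin-flip design are illustrative assumptions; this is not a description of how RAPTER® implements random assignment.

```python
import random

def randomly_assign(participant_ids, treatment_share=0.5, seed=12345):
    """Assign each consenting participant to a treatment or control group.

    A simple coin-flip design; actual evaluations often use blocked or
    stratified randomization to balance group sizes within sites.
    """
    rng = random.Random(seed)  # fixed seed makes the assignment reproducible
    return {
        pid: ("treatment" if rng.random() < treatment_share else "control")
        for pid in participant_ids
    }

# Example: assign ten hypothetical consenting participants
print(randomly_assign([f"P{i:03d}" for i in range(1, 11)]))
```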
The project team will collect information from study participants for the impact studies at three points: (1) at program entry before random assignment occurs (baseline); (2) at about 6 to 12 months after random assignment via the first follow-up survey; and (3) at about 18 to 24 months after random assignment via a second follow-up survey. (Note that the timing of the follow-up surveys might vary depending on when each program’s theory of change suggests impacts might be expected.) Table A.2 presents the data collection activities for the impact studies.
As noted above, this change request seeks approval for revisions to Instrument 1. Baseline survey and Instrument 2. Identifying and contact information. Changes to these instruments are detailed in Appendix Q. Summary of requested changes. The burden for these activities fits within the burden request for Phase 1 instruments submitted in the original ICR submission.
Table A.2. Data collection activities for the impact studies
Each entry below lists the data collection activity and associated instrument(s); the respondents, content, and purpose of the collection; and the mode and duration.

Proposed Phase 1 Instruments

Baseline data collection
Instruments: Instrument 1 (revised): Baseline survey; Instrument 2 (revised): Identifying and contact information
Respondents: All consenting study participants.
Content: The baseline survey includes information on demographics, receipt of Social Security Administration benefits, employment history, social trust, COVID-19-related challenges, and challenges to maintaining employment. Identifying information includes name, Social Security number, and date of birth. Contact information includes physical and electronic addresses and social media information for participants and up to three friends or relatives. Instrument 2 also includes the Center for Epidemiologic Studies Depression Scale Revised (CESD-R), which one program being considered for inclusion in the evaluation uses as a program eligibility screening tool.
Purpose: Baseline survey data will be used to describe the study sample and to check that the characteristics of the study participants are similar on average across groups. The data will also be used to define subgroups, as covariates in regression models, and for nonresponse weighting. A question-by-question justification for the items included in the baseline survey is presented in Appendix B - revised (updated to reflect the proposed changes under this change request).
Identifying information is used before random assignment to make sure participants have not already been enrolled in the study. The project team will use this information later to match study participants to their administrative data records to assess outcomes. In addition, the team will collect detailed contact information to locate participants to complete follow-up surveys.
A program under strong consideration for inclusion in the evaluation currently uses the CESD-R screening tool during program intake to assess eligibility for the program. Including it with the collection of identifying and contact information will streamline study intake procedures for this program. The CESD-R will only display for that program; other programs will skip these items. The CESD-R items will not add to the evaluation-related information collection burden; the items will be administered before study consent and used only to determine program eligibility, in keeping with the program’s current intake requirements. The study team will retain CESD-R scores for those who are eligible for the program (that is, those with scores of more than 16 on the scale; see the illustrative sketch following this table) and who consent to participate in the study. The revised consent form indicates that the program may share the eligibility screener score with the study team (Appendix A. Informed consent form - revised).
A question-by-question justification for the items included in the identifying and contact information is presented in Appendix C - revised (updated to reflect the proposed changes under this change request).
Mode: The baseline survey will allow for multiple administration options: administered by program staff, self-administered by study participants via the web, or administered by NextGen Project staff via telephone. Identifying and contact information and responses to the CESD-R questions will be provided verbally by study participants and entered into RAPTER® by program staff.
Duration: 25 minutes (total to complete the baseline survey and provide identifying and contact information)

Proposed Phase 2 Instruments

Follow-up data collection
Instruments: Appendix F. Instrument 3 (draft): First follow-up survey; Appendix H. Instrument 4 (draft): Second follow-up survey
Respondents: The project team will attempt to survey all study participants.
Content: The follow-up surveys collect data on outcomes of interest, including service receipt, employment, earnings, economic independence, well-being, health status, substance use, and involvement in the criminal justice system; perceptions of the usefulness of the program being evaluated (treatment group only); and updated contact information (first follow-up survey only). The exact questions asked could vary by site depending on the site’s target population.
Purpose: The project team will use survey data to estimate program impacts on outcomes of interest; estimate program impacts on the services the study participants receive; describe treatment group members’ perceptions of the usefulness of the program being evaluated; and describe the study sample. The updated contact information from the first follow-up survey will be used to assist in locating study participants for the second follow-up survey. A question-by-question justification for the items included in the follow-up surveys is in Appendix D.
Mode: Participants will self-administer the surveys via the web; alternatively, NextGen Project staff will administer them via telephone.
Duration: 50 minutes per follow-up survey
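As referenced in the baseline data collection entry above, the CESD-R eligibility rule for the one program that uses the screener can be expressed as a simple total-score cutoff. The sketch below, in Python, assumes item responses are already coded numerically according to the scale’s published scoring rules (which are not reproduced here); the example responses are hypothetical.

```python
def cesdr_total(item_responses):
    """Total CESD-R score: the sum of the numerically coded item responses."""
    return sum(item_responses)

def eligible_for_program(item_responses, cutoff=16):
    """Eligibility rule described above: total scores above 16 qualify."""
    return cesdr_total(item_responses) > cutoff

# Hypothetical responses summing to 18, which exceeds the cutoff of 16
print(eligible_for_program([1, 2, 0, 3, 2, 1, 2, 0, 1, 2, 3, 1]))  # True
```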
Descriptive studies. The descriptive study for each program will describe the following: (1) the community, economic, and program context in which the program operates; (2) the characteristics of the program model, including the target population, services offered, role of partners and employers, theory of change, and plans for sustainability and replication; and (3) the implementation and cost drivers of the program, such as leadership, organizational culture and structure, staffing and staff development, and service delivery. The data collection period for the descriptive study will vary by participating program, typically around 4 to 8 months after the study begins enrolling participants. Table A.3 summarizes the proposed data collection activities for the descriptive studies. If respondents consent to being recorded, the interviewer will audiorecord discussions with program administrators, supervisors, staff; key partner staff, including employers; and participants.
As noted above, the study gained approval for a non-substantive change request to use a subset of the descriptive study instruments with all NextGen Project sites, with changes to those instruments, in December 2020. These included the staff characteristics survey (Instrument 6. Staff characteristics survey - revised), program leadership survey (Instrument 7. Program leadership survey - revised), semi-structured program discussion guide (Instrument 8. Semi-structured program discussion guide - revised), and in-depth participant interview guide (Instrument 10. In-depth participant interview guide - revised).
This change request seeks approval to use an additional subset of descriptive instruments with all NextGen Project sites, with changes to the instruments, as noted below in Table A.3. These include the service receipt tracking instrument (Instrument 5. Service receipt tracking - revised) and guide for employer discussions (Instrument 9. Semi-structured employer discussion guide - revised). Additional information regarding the proposed changes to these instruments is provided in Appendix Q. Summary of requested changes. The burden for these activities fits within the burden request for Phase 2 instruments submitted in the original ICR submission.
Table A.3. Data collection activities for the descriptive studies
Each entry below lists the data collection activity and associated instrument(s); the respondents, content, and purpose of the collection; and the mode and duration.

Proposed Phase 2 Instruments

Treatment group service receipt
Instrument: Instrument 5 (revised): Service receipt tracking
Respondents: Program staff
Content: Information about the treatment group members’ participation in the program. In programs that also provide services to control group members, program staff might also record information on the receipt of services by control group members.
Purpose: To describe the service receipt of treatment group members, including type of service, duration, location, and mode.
Mode: Program staff will enter information about services received by study participants through the program into RAPTER®. If a program already collects data on service receipt through its own database, the study will use the information the program already collects.
Duration: 5 minutes per entry

Characteristics of program staff and leaders
Instruments: Instrument 6. Staff characteristics survey - revised; Instrument 7. Program leadership survey - revised
Respondents: Program staff and leaders.
Content: Staff members’ and leaders’ professional backgrounds, skills, experience, credentials, and perceptions of the program; leaders’ resource investments and decision-making processes; and changes due to COVID-19.
Purpose: To provide insight into how program structure, staffing, and leadership might affect implementation of the program. Compared with the semi-structured interviews described below, the surveys will enable the collection of information (1) in a more structured format, (2) on topics that staff and leaders might be uncomfortable talking about in a group setting, and (3) from a broader set of staff and leaders than would have the time to participate in a semi-structured interview.
Mode: Program staff and leaders will self-administer the surveys via the web.
Duration: 25 minutes for the staff survey; 15 minutes for the leadership survey

Discussions with program staff, partners, and employers
Instruments: Instrument 8. Semi-structured program discussion guide - revised; Instrument 9 (revised): Semi-structured employer discussion guide
Respondents: Program administrators, supervisors, and staff; key partner staff, including employers
Content: Semi-structured discussions with program administrators, supervisors, direct service staff, community partners, and specialized treatment providers will provide information about the program’s design and implementation and any COVID-19-related challenges. Semi-structured discussions with employers will collect information about their involvement in developing and executing the programs of interest, including any changes to the employers’ relationships with the programs as a result of the COVID-19 pandemic.
Purpose: To describe each program’s design, staffing, service provision, partnerships, and other details necessary to understand the nature of and context for the programs, and for other programs to replicate them; also to help inform the interpretation of impact findings.
Mode: The interviews will be conducted in person during site visits, either individually or in small groups. Interviews may also be conducted via telephone or video, depending on any COVID-related restrictions.
Duration: 90 minutes per administrator; 60 minutes per program supervisor, key partner staff member, or employer; 45 minutes per direct service staff member

In-depth participant interviews
Instrument: Instrument 10. In-depth participant interview guide - revised
Respondents: Select study participants
Content: Participants’ background and goals; experiences and challenges finding and retaining employment; experiences with the program, including reasons for disengaging from the program, if applicable; and challenges related to COVID-19.
Purpose: To provide the “stories” that will make the findings from the implementation and impact studies more meaningful. They might also inform the understanding of whether the program was implemented as planned and suggest possible refinements.
Mode: The interviews will be conducted in person during site visits. Interviews may also be conducted via telephone or video, depending on any COVID-related restrictions.
Duration: 120 minutes
Cost studies. The cost study for each program will (1) provide descriptive information about the amount, sources, and types of its funding, and (2) produce an estimate of the average cost of the program per participant. The average cost per participant will be used in the benefit-cost analysis, in which the benefits that accrue to program participants, such as increased earnings and reduced receipt of public benefits, will be compared with the cost of providing program services. Data collection for the cost studies will ideally take place around the same time as the data collection for the descriptive studies. These activities are summarized in Table A.4.
In December 2020, the study received approval to use the Phase 2 Excel-based cost workbook to collect cost study data across all NextGen Project sites with no changes proposed to the instrument. The instrument does not need to be tailored for each site. The burden for data collection fits within the burden request for Phase 2 instruments submitted in the original ICR submission.
Table A.4. Data collection activities for the cost studies
Each entry below lists the data collection activity and associated instrument; the respondents, content, and purpose of the collection; and the mode and duration.

Proposed Phase 2 Instruments

Cost data collection
Instrument: Instrument 11. Cost workbook
Respondents: Program leader (or a designee)
Content: An Excel-based cost workbook to record information on the expenditures associated with the program for a recent 12-month period.
Purpose: To estimate the costs of implementing each evaluated program and to estimate the cost-effectiveness of the programs.
Mode: The project team will ask program leaders for their accounting records or financial reports and obtain as much information as possible from these records. If additional information is needed after review of financial records, the project team will ask the programs to complete the workbook in part or in full, depending on the information required.
Duration: 32 hours
Other Data Sources and Uses of Information
The NextGen Project will collect administrative records data for outcomes of interest; this information is already being collected and represents no additional burden for participants or program staff. The project team will collect administrative data on quarterly earnings, receipt of unemployment insurance, and new hires for all study participants from the National Directory of New Hires (NDNH), which is maintained by the Office of Child Support Enforcement at ACF. If applicable, the project team will also collect TANF receipt records and contact information for study participants from state or local TANF agencies. For some programs, administrative data will be collected from SSA on annual taxable earnings and receipt of SSI and Social Security Disability Insurance. In addition, as applicable and informative to the programs’ theories of change, data might also be collected on receipt of Supplemental Nutrition Assistance Program (SNAP) benefits and contact information; receipt of benefits and contact information from the Special Supplemental Nutrition Program for Women, Infants, and Children; state records on child support owed or paid; health care outcomes (Medicare enrollment and claims) from the Centers for Medicare & Medicaid Services; involvement with the criminal justice system from court records; educational attainment and completion from school districts; and receipt of housing benefits (such as participation in a housing choice voucher program) from housing authorities.
The project is using information collected or expected to be collected under the generic clearance for Formative Data Collections for ACF Research (OMB #0970-0356), including information collected to gather feedback from stakeholders, identify sites, and assess activities and characteristics.
A3. Use of Information Technology to Reduce Burden
This project will use multiple applications of information technology to reduce burden. As described below, information technology will be used to collect baseline data and participant identifying and contact information; conduct the two follow-up surveys; collect information on service receipt; conduct surveys with program staff and leaders; and collect cost information from the programs. The semi-structured staff discussions and in-depth participant interviews will be audiorecorded, if respondents consent to being recorded. Additionally, interviews may be conducted via telephone or video, depending on any COVID-related restrictions.
RAPTER®. RAPTER® is a secure, web-based system that program staff will use to administer consent to participants, collect their identifying and contact information, conduct random assignment, and enter information on the services received by study participants. The use of check boxes, drop-down menus, and predefined response categories will minimize data entry burden.
Baseline, follow-up, staff, and leadership surveys. All surveys will have the capability to be hosted on the Internet via a live secure web-link. To reduce burden, the surveys will employ (1) secure log-ins and passwords so respondents can save and complete the survey in multiple sessions, (2) drop-down response categories so respondents can quickly select from a list, (3) dynamic questions and automated skip patterns so respondents only see those questions that apply to them (including those based on answers provided previously in the survey), and (4) logical rules for responses so respondents’ answers are restricted to those intended by the question.
Respondents also have the option to complete the baseline survey and first and second follow-up surveys using computer-assisted telephone interviewing (CATI). CATI reduces respondent burden, relative to interviewing via telephone without a computer, by automating skip logic and question adaptations and by eliminating delays caused when interviewers must determine the next question to ask.
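To illustrate the automated skip logic that both the web and CATI modes provide, here is a minimal sketch in Python. The question identifiers and the skip rule are hypothetical, not items from the NextGen instruments.

```python
def next_question(responses):
    """Return the next item to display, given the answers recorded so far.

    Automated skip logic of this kind routes respondents past questions
    that do not apply to them, as described above.
    """
    if responses.get("employed_now") == "no":
        # Skip the job-detail items for respondents who are not working
        return "job_search_activities"
    return "current_wage_rate"

print(next_question({"employed_now": "no"}))   # job_search_activities
print(next_question({"employed_now": "yes"}))  # current_wage_rate
```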
Excel-based workbook for collecting cost data. A Microsoft Excel-based data collection tool will be used to collect cost data. To reduce respondent burden, the project team will ask program leaders for their accounting records or financial reports and obtain as much information as possible from these records to complete the workbook. If additional information is needed after review of financial records, the project team will ask the programs to complete the remaining sections of the workbook. Formatting, data checks, and layout built into the template will assist staff in completing it.
A4. Use of Existing Data: Efforts to reduce duplication, minimize burden, and increase utility and government efficiency
Information that is already available from alternative data sources will not be collected again for this project. For example, if a program in the study has an existing management information system that collects information needed for this project that is exportable and of sufficient quality, we will accept data from its existing system. In these cases, the project team will request the program only enter into RAPTER® data that the program is not already collecting.
Although information on employment will be collected from administrative records and via the survey, this information is not duplicative because the two sources differ in accuracy and coverage of jobs. NDNH administrative records will provide information on quarterly earnings from jobs covered by unemployment insurance as well as new hires. The baseline survey and follow-up surveys will ask for information about all jobs held, including those not covered by unemployment insurance. The follow-up surveys will also collect information about the characteristics of the jobs (such as the wage rate, hours worked, and benefits offered) that are not included in the NDNH data.
The follow-up surveys will collect information on whether participants received assistance from public assistance programs such as TANF, SNAP, unemployment insurance, and other assistance programs. However, these surveys will not ask for details about the receipt of these benefits, which we will collect via administrative records. It is important to ask about receipt of benefits on the survey because administrative records will not be available for those respondents who do not provide their Social Security number.
As noted in Section A2, the NextGen Project is actively coordinating with OPRE’s BEES study. OPRE is intentionally and strategically coordinating these projects in order to prevent duplication of effort; fully capitalize on the opportunity the projects afford for large-scale, rigorous evaluation; advance the knowledge base regarding effective employment strategies for low-income, vulnerable populations; and meet SSA’s priorities across both projects. The projects intentionally included some common questions within instruments. Areas of overlap with the existing BEES data collection instruments are described in the question-by-question justifications for the baseline data collection and follow-up surveys (Appendices B, C, and D). The projects differ in that BEES is especially interested in evaluating programs for individuals struggling with opioid dependency, abuse of other substances, and/or mental health issues, while the NextGen Project is especially focused on evaluating interventions that are market-oriented and/or employer-driven. Additional domains of focus may emerge as both projects complete knowledge development and identify potential sites for participation.
A5. Impact on Small Businesses
Although we have not yet recruited the specific programs to be evaluated, small organizations, such as businesses or nonprofit organizations, might be involved in implementing a program selected for evaluation. If small organizations are involved, we will minimize the burden for respondents by collecting data at times convenient for them and by requiring minimal record keeping or written responses.
A6. Consequences of Less Frequent Collection
The project team will collect information only once for the baseline survey and identifying participant information, staff characteristics survey, program leadership survey, semi-structured staff discussions, semi-structured employer discussions, in-depth participant interviews, and the Excel-based workbook for collecting cost data.
The project team will administer two similar follow-up surveys. Collecting data at two points of time will allow an examination of whether the impacts of the program changed over time and whether changes in intermediary outcomes (such as health or skills) were associated with changes in longer-term outcomes (namely employment and economic independence outcomes). This also reduces the chance of recall error from respondents when collecting information on their receipt of services and jobs held over a period of time, relative to collecting it only once at the end of the follow-up period. Similarly, updated contact information will be collected from respondents upon administration of the first follow-up survey to assist in locating them for the second follow-up survey.
Program staff will use the RAPTER® system or their existing management information system to record service receipt for each participant each time he or she receives a service. Staff will be asked to enter the information into RAPTER® immediately after the service is provided. Doing so less frequently would contribute to recall error and affect the quality of data collected.
A7. Now subsumed under 2(b) above and 10 (below)
A8. Consultation
Federal Register Notice and Comments
In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on January 8, 2020 (Volume 85, Number 5, pages 906-907) and provided a 60-day period for public comment. A copy of this notice is attached as Appendix P. During the notice and comment period, no substantive comments were received.
ACF published an additional notice in the Federal Register announcing the agency’s intention to request an OMB review of these proposed changes to the information collection activity. This notice was published on January 6, 2021 (Volume 86, Number 3, pages 541-543) and provided a 30-day period for public comment. A copy of this notice is included in Appendix P.1. During the notice and comment period, no substantive comments were received.
Experts in their respective fields from OPRE and Mathematica were consulted in developing the design, data collection plan, and instruments for which clearance is requested. Select agency staff within SSA and HHS were also consulted. We also consulted with the BEES project staff to coordinate measurement of key outcomes across projects.
A9. Tokens of Appreciation
The proposed structure of tokens of appreciation for this study is designed to support the retention of respondents over the course of the longitudinal data collection and enhance the quality of information derived from in-depth interviews. OMB approved the proposed structure of tokens of appreciation for this study in April 2020.
Study Enrollment
After finishing the study enrollment process, participants will receive a study packet designed to establish their engagement with the study. This packet will include a copy of the consent form, a one-page study flyer that describes upcoming data collection activities (see Appendix G), and a small study-specific item (valued between $1 and $3), such as a magnet, keychain, or screen cleaner, that features the study logo and contact information for our call center. The purpose of these materials is to establish a positive association with the study and support familiarity when respondents are contacted to participate in an interview.
Longitudinal Surveys
To increase survey participation following successful contact, we propose that respondents to the first follow-up survey receive a $40 gift card and that respondents to the second follow-up survey receive a $50 gift card. While both surveys are estimated to take 50 minutes, the increase in amount between the first and second survey reflects an expectation that respondents, particularly control group members and treatment group members who may have been less engaged in program services, may perceive the study as less salient over time. The risk of biased impact estimates increases with lower overall survey response rates or larger differences in survey response rates between key research groups (What Works Clearinghouse 2017). Continued high rates of participation in the study, through the second follow-up, are necessary to produce unbiased estimates of the program impacts and maximize the utility of survey data in this multipart study.
In some study sites, respondents may be offered a small gift instead of a gift card as appreciation for survey participation. The project team will discuss with program staff whether a gift or a gift card would be most effective at encouraging survey response among the population they serve. The gift would be selected with input from program staff and be of similar value to the gift card.
The dollar amounts proposed here are based on observational information from recent randomized controlled trials with similar service populations. For each of four recent studies, Table A.5 presents information about the type of data collection, the token of appreciation offered, the survey duration, the timeframe, and the response rates obtained. Three of these studies used tokens of appreciation of between $40 and $50, as proposed for NextGen, and achieved survey participation rates and levels of nonresponse bias sufficient to estimate program impacts.
Emerging information from the fourth study, ACF’s Evaluation of Employment Coaching for TANF and Other Related Populations (Employment Coaching), suggests that lower dollar amounts may not be enough to support our targeted response rate. The Employment Coaching study currently offers $35 to respondents for completing a 60-minute follow-up survey within the first four weeks of fielding (and $25 after). Despite intensive outreach and notification efforts similar to those planned for the NextGen study, in four of six Employment Coaching study sites, patterns of overall nonresponse and differences in nonresponse between the treatment and control groups indicate that estimates of program impacts are at higher risk of bias than expected.
Table A.5. Tokens of appreciation and response rates obtained in similar follow-up surveys
Study | Instrument | Duration (minutes) | Data collection timeframe | Amount of token of appreciation | Response rate
Evaluation of Employment Coaching for TANF and Other Related Populations, OMB #0970-0506 | 6- to 12-month follow-up | 60 | 2018-present | $35 first four weeks; $25 after four weeks | 41-81 percent depending on site, for cases in the field six months or longer; 48-82 percent treatment; 35-81 percent control
Enhanced Transitional Jobs Demonstration, OMB #0970-0413 | 12-month follow-up | 45 | 2012-14 | $40 | 67-82 percent depending on site; 69-82 percent treatment; 65-81 percent control
Self-Employment Training (SET) Demonstration, full sample, OMB #1205-0505 | 18-month follow-up | 20 | 2015-17 | $50 first four weeks; $25 after four weeks | 80 percent overall; 83 percent treatment; 78 percent control
YouthBuild, full sample, OMB #1205-0503 | 12-month follow-up | 60 | 2012-14 | $40 first four weeks; $25 after four weeks | 81 percent overall; 82 percent treatment; 79 percent control
Note: Treatment and control groups in this table refer to the overall evaluation (that is, the original conditions to which sample members were assigned upon enrollment) and not any incentive experiment. The SET sample includes the full survey sample, including the time before and after the conclusion of the incentive experiments described in the text. The Employment Coaching response rates include only those cases that have been in the field for six or more months.
In-depth Interviews
Respondents to the in-depth participant interviews, which are estimated to take 120 minutes on average, will receive a $60 gift card, intended to offset the costs of participation in the study. Interview data will not be representative in a statistical sense, in that they will not be used to make statements about the prevalence of experiences for the entire service populations. However, it is important to secure participants with a range of background characteristics in order to capture a variety of possible experiences with these programs. Without offsetting the direct costs incurred by respondents for participating in the interviews, such as arranging child care, transportation, or time off from paid work, the research team increases the risk that only those individuals able to overcome the financial barriers to participation will agree to an interview, which would reduce the overall quality of the qualitative data collection.
A10. Privacy: Procedures to protect privacy of information, while maximizing data sharing
Personally Identifiable Information
The information provided by or about participants during the baseline data collection, follow-up surveys, service receipt tracking, and in-depth participant interviews will contain participant-level personally identifiable information (PII). This includes names, addresses, email addresses, social media accounts, phone numbers, birth dates, and Social Security numbers. This information is needed to ensure that the prospective study participant has not already enrolled in the study, that the project team can locate study participants to complete the follow-up surveys, and that the project team can link participants to their corresponding administrative data. See Section A11 for further details. In addition, the project team will collect the names and email addresses of program staff in order to administer the staff characteristics and program leadership surveys.
Mathematica will share study participants’ information with SSA, which will do additional research on how programs affect earnings and receipt of disability benefits. SSA will do this research through 2028. Mathematica will share information such as name, sex, date of birth, and Social Security number so researchers at SSA can locate participants’ records. SSA will use this information only to do research. The information will not be used to make decisions about benefits participants receive from SSA, now or in the future. The sharing of information with SSA for these purposes and for the specified timeframe is described to participants in the informed consent form (Appendix A).
Information will not be maintained in a paper or electronic system from which data are actually or directly retrieved by an individual’s personal identifier.
Assurances of Privacy
Mathematica will protect respondents’ privacy to the extent permitted by law and will comply with all Federal and departmental regulations for private information. Mathematica has developed a data safety and monitoring plan that assesses all protections of respondents’ PII. Mathematica will ensure that all of its employees, subcontractors (at all tiers), and employees of each subcontractor who perform work under this contract are trained on data privacy issues and comply with the above requirements. All study staff with access to PII—including program staff who are entering information about study participants and their service receipt into RAPTER®—will receive study-specific training on (1) limitations on disclosure; (2) safeguarding the physical work environment; and (3) storing, transmitting, and destroying data securely. These procedures will be documented in training manuals for study staff, and refresher training will occur annually.
Respondents will be informed of all planned uses of data, that their participation is voluntary, and that their information will be kept private to the extent permitted by law. As specified in the contract, Mathematica (the Contractor) will comply with all Federal and departmental regulations for private information.
The project team is in the process of seeking Institutional Review Board (IRB) approval from the Health Media Lab IRB; to date, IRB approval has been granted for two of the selected programs. The project team is also seeking a Certificate of Confidentiality (CoC) from the National Institutes of Health. The CoC helps assure participants that their information will be kept private to the fullest extent permitted by law.
Data Security and Monitoring
The project team will use Federal Information Processing Standards (FIPS)-compliant encryption (Security Requirements for Cryptographic Modules, as amended) to protect all instances of sensitive information during storage and transmission. The team will securely generate and manage encryption keys to prevent unauthorized decryption of information, in accordance with FIPS. The team will incorporate this standard into its property management/control system and establish a procedure to account for all laptop computers, desktop computers, and other mobile devices and portable media that store or process sensitive information. Any data stored electronically, including audiorecordings of discussions with program administrators, supervisors, and staff, key partner staff, and participants, will be secured in accordance with the most current National Institute of Standards and Technology requirements and other applicable Federal and departmental regulations. In addition, the project team will submit a plan for minimizing, to the extent possible, the inclusion of PII and other sensitive information in paper records, and for protecting any paper records, field notes, or other documents that contain PII or other sensitive information, ensuring secure storage and limits on access.
Information shared with researchers at SSA (see discussion above) and exchanged between programs and Mathematica will be sent via a secure file transfer protocol.
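As a minimal sketch of the kind of encryption step described above, the following Python example uses AES-GCM (a FIPS-approved algorithm) via the third-party cryptography package. The choice of library and the record format are illustrative assumptions; in practice, keys must be generated and managed within a validated cryptographic module under the project’s approved security plan.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_record(key: bytes, plaintext: bytes) -> bytes:
    """Encrypt one record with AES-256-GCM and prepend the nonce."""
    nonce = os.urandom(12)                     # unique nonce per record
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return nonce + ciphertext                  # store nonce with ciphertext

def decrypt_record(key: bytes, blob: bytes) -> bytes:
    """Reverse encrypt_record: split off the nonce, then decrypt."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=256)
blob = encrypt_record(key, b"example sensitive record")
assert decrypt_record(key, blob) == b"example sensitive record"
```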
At the end of the study, de-identified project data will be archived to make them available to other researchers. Mathematica will work with ACF to develop a comprehensive data archive plan and to produce an archive data file or files. Any restricted- or public-use files will be reviewed for appropriateness of public or restricted release, including appropriate masking techniques for each level of release. A non-disclosure review will also be conducted to ensure that the data cannot be used to re-identify study participants.
A11. Sensitive Information
To evaluate the effectiveness of employment programs for vulnerable populations, it is necessary to ask some sensitive questions. Before starting the baseline and follow-up surveys and the in-depth interviews, all respondents will be informed that their identities will be kept private to the extent permitted by law, that results will only be reported in the aggregate, that their responses will not affect any services or benefits they or their family members receive, and that they do not have to answer any questions that make them uncomfortable.
The sensitive questions in the data collection instruments and proposed data collection instruments relevant for this ICR include the following:
Respondents’ Social Security numbers. Respondents’ Social Security numbers are necessary to collect administrative data used to estimate impacts on earnings, employment, and public benefit receipt. The consent form will inform study participants that the project team might collect administrative data about them. Social Security numbers will also be used to collect information through online databases containing information on the location of study participants for the follow-up surveys. Along with names, birthdates, and other data from baseline surveys, Social Security numbers will be used to verify respondents’ identities for follow-up surveys. The project team chose not to rely on name and address matching (or similar techniques) for collecting administrative data because such matching leads to an inability to match administrative data for a high proportion of participants, unacceptably high uncertainty in match success, or both. This would affect the study’s ability to estimate impacts and draw conclusions for findings that rely on administrative data.
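A minimal sketch, in Python, of the exact-match linkage described above; the field names and record layout are hypothetical. Exact matching on Social Security number avoids the match uncertainty that name-and-address techniques introduce.

```python
def match_by_ssn(study_roster, admin_records):
    """Link study participants to administrative records by exact SSN match."""
    admin_by_ssn = {rec["ssn"]: rec for rec in admin_records}
    matched, unmatched = [], []
    for person in study_roster:
        rec = admin_by_ssn.get(person["ssn"])
        if rec is not None:
            matched.append((person, rec))
        else:
            unmatched.append(person)
    return matched, unmatched

# Example with hypothetical roster and administrative records
roster = [{"id": "P001", "ssn": "000-00-0001"}, {"id": "P002", "ssn": "000-00-0002"}]
records = [{"ssn": "000-00-0001", "quarterly_earnings": 5200}]
matched, unmatched = match_by_ssn(roster, records)
print(len(matched), len(unmatched))  # 1 1
```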
Wage rates and earnings. It is necessary to ask about earnings because increasing participants’ earnings is a key goal of these programs. The follow-up surveys ask about each job worked since random assignment, the wage rate, and the number of hours worked per week. This information will be collected on the first and second follow-up surveys.
Challenges to employment. It is important to ask about challenges to employment both at baseline and at follow-up. The reported challenges at baseline can be used to define subgroups for whom the program might be particularly effective. It is important to ask about challenges to employment in the follow-up survey because the program might have addressed these challenges. Challenges measured through the two follow-up surveys include problems with transportation, needing to take care of a family member, lack of clothes or tools, not having the right education or skills, and having a criminal record. These challenges might also be discussed during the in-depth participant interviews.
Economic hardships. The follow-up surveys ask about economic hardships, such as food insecurity. These outcomes reflect a lack of economic independence and might be affected by the program. Economic hardships might also be discussed as part of the in-depth participant interviews.
Disabilities, mental and physical health, and substance misuse. The baseline and follow-up surveys will collect information about disabilities, mental or other health problems, and substance misuse; the severity of those issues; and how much they affect the ability to work. These issues might also be discussed in the in-depth participant interviews. The Center for Epidemiologic Studies Depression Scale Revised will also be administered for one program during eligibility screening, with responses saved for those determined to be eligible for the program. All of these are important potential challenges to finding or maintaining employment and could play a role in the effectiveness of the program.
Involvement in the criminal justice system. The baseline survey asks about prior involvement in the criminal justice system, including the number of convictions and felony convictions, details about parole or probation, the type of crime committed, and the time spent in the most recent incarceration, because such involvement often makes it harder to find employment. The two follow-up surveys will also ask about arrests, convictions, and incarcerations that occurred after random assignment because these outcomes might be affected by the program. Criminal history might also be discussed during the in-depth participant interviews.
COVID-19-related challenges. The baseline survey asks whether respondents are fully vaccinated against COVID-19 because vaccination is expected to be associated with employment outcomes. It also asks whether COVID-19 posed specific challenges to employment for study participants and whether the pandemic affected their previous employment.
A12. Burden
Explanation of Burden Estimates
Table A.6 reflects the estimated burden and cost for the information collection proposed in Phase 1 of this ICR; no changes to Phase 1 burden are proposed as part of this change request. Table A.7 reflects the estimated reporting burden and cost for the Phase 2 data collection instruments that this change request seeks approval to administer (these instruments were previously included in Appendix E of the ICR approved by OMB in April 2020 under OMB #0970-0545 and revised in December 2020 with the non-substantive change request; the burden and cost for the remaining Phase 2 instruments are included in the revised Appendix E submitted with this request). No changes to Phase 2 burden are proposed either: the burden for completing the subset of Phase 2 instruments included in this request falls within the original estimates, and the proposed instrument changes do not affect them.
Details of the estimates for data collections in Phases 1 and 2 of this request are as follows (an illustrative sketch of the annualization arithmetic appears after this list):
Baseline data collection. Baseline data collection involves both study participants and program staff. The burden estimates assume that program staff will assist study participants in baseline data collection, which includes collecting the baseline survey (Instrument 1) and using RAPTER® to collect participant identifying and contact information (Instrument 2).
We expect about 10,000 study participants (1,000 in each of 10 programs) to complete baseline data collection. We expect each baseline data collection (inclusive of the baseline survey and RAPTER® identifying and contact information) to last 0.42 hours, for a total of 4,200 burden hours, or 1,400 hours per year for study participants when annualized over three years.
We assume that 200 program staff across all 10 programs (approximately 20 per program) will perform the baseline data collection. Each staff member will administer the baseline data collection (inclusive of the baseline survey and RAPTER® identifying and contact information) 50 times, and each session is expected to last 0.42 hours, for a total of 4,200 burden hours, or 1,400 hours per year annualized over three years.
Service receipt tracking. We anticipate that 200 program staff (20 in each of the 10 programs) will enter data on program service receipt into RAPTER®. We expect 250 entries per staff member and expect each entry to take 5 minutes (0.08 hours), for a total of 1,340 annual burden hours.
Staff characteristics survey. We expect to survey 200 program staff who directly interact with participants (20 per program). The survey is expected to take 25 minutes (0.42 hours) to complete, or a total of 28 annual burden hours.
Program leadership survey. We expect to survey 50 program leaders (five per program). The survey is expected to take 15 minutes (0.25 hours) to complete, or a total of four annual burden hours.
Semi-structured program discussion guide – program leaders. We expect to interview 40 program leaders across all 10 programs (approximately four per program). We expect each interview to last 1.5 hours on average, or a total of 20 annual burden hours.
Semi-structured program discussion guide – program supervisors and partners. We expect to interview 80 program supervisors or partners across all 10 programs (approximately eight per program). We expect each interview to last one hour on average, or a total of 27 annual burden hours.
Semi-structured program discussion guide – program staff and providers. We expect to interview 80 direct service staff across all 10 programs (approximately eight per program). We expect each interview to last one hour on average, or a total of 27 annual burden hours.
Semi-structured employer discussion guide. We expect to interview staff at 50 employers across all 10 programs (approximately five per program). We expect each interview to last one hour on average, or a total of 17 annual burden hours.
In-depth participant interview guide. We expect to interview 200 study participants (20 in each of the 10 programs). These interviews are expected to last two hours on average, or a total of 134 annual burden hours.
Cost workbook. We expect that 40 program staff (four in each of the 10 programs) will enter data on expenditures and costs into Excel. We expect one entry per staff member and expect each entry to take 32 hours, for a total of 416 annual burden hours.
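The annualization arithmetic behind these estimates is the same throughout: total burden hours (number of respondents × responses per respondent × average hours per response) divided by the three-year request period. The following minimal sketch, written in Python purely for illustration (the function name and rounding convention are our assumptions, not part of the approved instruments), reproduces several of the figures above:

```python
# Illustrative sketch of the burden annualization described above.
# Assumption: annual burden = (respondents x responses per respondent
# x average hours per response) / three-year request period.

REQUEST_PERIOD_YEARS = 3

def annual_burden_hours(respondents, responses_each, hours_per_response):
    """Return estimated annual burden hours, rounded to whole hours."""
    total_hours = respondents * responses_each * hours_per_response
    return round(total_hours / REQUEST_PERIOD_YEARS)

# Baseline data collection, participants: 10,000 x 1 x 0.42 = 4,200 total hours
print(annual_burden_hours(10_000, 1, 0.42))  # -> 1400 hours per year

# Baseline data collection, staff: 200 x 50 x 0.42 = 4,200 total hours
print(annual_burden_hours(200, 50, 0.42))    # -> 1400 hours per year

# Program leader discussion guide: 40 x 1 x 1.5 = 60 total hours
print(annual_burden_hours(40, 1, 1.5))       # -> 20 hours per year
```

A few table entries (for example, service receipt tracking) reflect additional rounding by the study team, so this formula approximates rather than exactly reproduces every row.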
Estimated Annualized Cost to Respondents
Phase 1:
The total annual cost for data collection instruments in Phase 1 of this request is $34,258. The total estimated cost figures are computed by multiplying the total annual burden hours by an average hourly wage for staff and participants. The wage rate for program staff administering the survey is based on May 2018 wage data from the Bureau of Labor Statistics Occupational Employment Statistics survey (http://www.bls.gov/oes/current/oes_stru.htm). The rate used for direct service staff, $17.22, is the mean wage for social and human services assistants (SOC code 21-1093). The average hourly wage of study participants is estimated to be $7.25, the federal minimum wage.
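As an illustration of that calculation, the minimal sketch below (Python, for exposition only; the per-row rounding to whole dollars is our assumption) reproduces the Phase 1 total from the annual burden hours and wage rates in Table A.6:

```python
# Illustrative check of the Phase 1 respondent cost calculation:
# annual cost = annual burden hours x average hourly wage, summed
# across respondent types, rounding each row to whole dollars.

phase1_rows = [
    # (annual burden hours, average hourly wage)
    (1_400, 7.25),   # study participants, baseline data collection
    (1_400, 17.22),  # program staff administering baseline data collection
]

total_cost = sum(round(hours * wage) for hours, wage in phase1_rows)
print(f"${total_cost:,}")  # -> $34,258, matching Table A.6
```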
Table A.6. Burden and cost for information collection proposed in Phase 1
Instrument | No. of respondents (total over request period) | No. of responses per respondent (total over request period) | Avg. burden per response (in hours) | Total burden (in hours) | Annual burden (in hours) | Average hourly wage rate | Total annual respondent cost
Baseline survey & Identifying and contact information – participants | 10,000 | 1 | 0.42 | 4,200 | 1,400 | $7.25 | $10,150
Baseline survey & Identifying and contact information – staff | 200 | 50 | 0.42 | 4,200 | 1,400 | $17.22 | $24,108
Estimated annual burden total | | | | | 2,800 | | $34,258
Phase 2:
The total annual cost for the data collection instruments in Phase 2 for which we are currently requesting approval is estimated to be $35,319. The total estimated cost figures are computed by multiplying the total annual burden hours by an average hourly wage for staff, participants, and employers. Hourly wage estimates were derived from the U.S. Bureau of Labor Statistics May 2018 Occupational Employment Statistics survey (http://www.bls.gov/oes/current/oes_stru.htm).
We estimate the average hourly wage for program leaders to be $50.73, the mean hourly wage of general and operations managers in local government (SOC code 11-1021).
The rate used for program and partner supervisors, $34.46, is the mean wage for social and community services managers (SOC code 11-9151).
The rate used for direct service staff, $17.22, is the mean wage for social and human services assistants (SOC code 21-1093).
The average hourly wage of study participants is estimated to be $7.25, the federal minimum wage.
The average hourly wage for employer respondents is estimated to be $59.56, the mean wage of general and operations managers across industries (SOC code 11-1021).
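The Phase 2 total can be checked the same way. In the sketch below (Python, for exposition only; instrument labels are abbreviated and per-row rounding to whole dollars is our assumption), each instrument’s annual burden hours from Table A.7 are multiplied by the applicable wage rate:

```python
# Illustrative check of the Phase 2 respondent cost calculation,
# using the annual burden hours and wage rates reported in Table A.7.

phase2_rows = [
    # (instrument, annual burden hours, average hourly wage)
    ("Service receipt tracking",           1_340, 17.22),
    ("Staff characteristics survey",          28, 17.22),
    ("Program leadership survey",              4, 50.73),
    ("Discussion guide - program leaders",    20, 50.73),
    ("Discussion guide - supervisors",        27, 34.46),
    ("Discussion guide - staff/providers",    27, 17.22),
    ("Employer discussion guide",             17, 59.56),
    ("In-depth participant interviews",      134,  7.25),
    ("Cost workbook",                        416, 17.22),
]

total_cost = sum(round(hours * wage) for _, hours, wage in phase2_rows)
print(f"${total_cost:,}")  # -> $35,319, matching Table A.7
```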
Table A.7. Burden and cost for information collection proposed in Phase 2
Instrument | No. of respondents (total over request period) | Annual number of respondents | No. of responses per respondent (total over request period) | Avg. burden per response (in hours) | Annual burden (in hours) | Average hourly wage rate | Total annual respondent cost
Service receipt tracking – staff | 200 | 67 | 250 | 0.08 | 1,340 | $17.22 | $23,075
Staff characteristics survey – staff | 200 | 67 | 1 | 0.42 | 28 | $17.22 | $482
Program leadership survey – program leaders | 50 | 17 | 1 | 0.25 | 4 | $50.73 | $203
Semi-structured program discussion guide – program leaders | 40 | 13 | 1 | 1.50 | 20 | $50.73 | $1,015
Semi-structured program discussion guide – program supervisors and partners | 80 | 27 | 1 | 1.00 | 27 | $34.46 | $930
Semi-structured program discussion guide – program staff and providers | 80 | 27 | 1 | 1.00 | 27 | $17.22 | $465
Semi-structured employer discussion guide – employers | 50 | 17 | 1 | 1.00 | 17 | $59.56 | $1,013
In-depth participant interviews – participants | 200 | 67 | 1 | 2.00 | 134 | $7.25 | $972
Cost workbook – staff | 40 | 13 | 1 | 32.00 | 416 | $17.22 | $7,164
Estimated annual burden total | | | | | 2,013 | | $35,319
The total estimated burden for previously approved Phase 1 instruments (2,800 hours) and the Phase 2 instruments for which we currently request approval (2,013 hours) is 4,813 hours.
A13. Costs
There are no additional costs to respondents.
A14. Estimated Annualized Costs to the Federal Government
Phase 1:
The total cost to the Federal government for the data collection activities under the first phase of this ICR will be about $3,305,200, or about $1,101,733 per year annualized over the three-year request period. These cost estimates are derived from Mathematica’s budgeted estimates and include labor rates, direct costs, and tokens of appreciation for respondents.
Cost category | Estimated costs
PHASE 1 |
Instrument development and OMB clearance | $195,600
Field work | $1,889,800
Analysis | $527,400
Publications/dissemination | $692,400
Total costs over the request period | $3,305,200
Annual costs | $1,101,733
Phase 2:
The total cost to the Federal government for all Phase 2 data collection activities (inclusive of the Phase 2 instruments pertinent to this request and the remaining Phase 2 instruments reflected in the revised Appendix E) will be about $13,220,800, or about $4,406,933 per year annualized over the three-year request period. These cost estimates are derived from Mathematica’s budgeted estimates and include labor rates, direct costs, and tokens of appreciation for respondents.
Cost category | Estimated costs
PHASE 2 |
Instrument development and OMB clearance | $782,400
Field work | $7,559,200
Analysis | $2,109,600
Publications/dissemination | $2,769,600
Total costs over the request period | $13,220,800
Annual costs | $4,406,933
A15. Reasons for Changes in Burden
The changes submitted as part of this request do not alter the burden estimates for either Phase 1 or Phase 2.
A16. Timeline
The beginning of participant intake and baseline data collection is expected to be staggered by program. Because of current and anticipated COVID-19-related delays in the study schedule, we expect the first programs to begin baseline data collection in summer 2021, with other programs beginning intake later in 2021. For each program, we expect intake and baseline data collection to continue for about 12 to 24 months. Data collection for the descriptive and cost studies began in 2020 for some sites and will begin in 2021 for others. We anticipate that the first follow-up survey will take place in 2021 and 2022, and the second follow-up survey in 2022 and 2023.
Findings from the project will be published throughout the study in technical reports and briefs. We anticipate that reporting on the descriptive and cost studies will begin in 2021 and continue through 2023. Reporting on the intermediate impact findings will likely begin in 2023 and continue through 2024. Reporting on final impact findings will likely begin in 2024 and continue through 2026.
We anticipate that data archives (restricted or public use) will become available in 2026, hosted on a data archive platform such as the Inter-university Consortium for Political and Social Research (ICPSR).
A17. Exceptions
No exceptions are necessary for this information collection.
Attachments:
Instruments
Instrument 1. Baseline survey - revised
Instrument 2. Identifying and contact information - revised
Instrument 5. Service receipt tracking - revised
Instrument 6. Staff characteristics survey - revised
Instrument 7. Program leadership survey - revised
Instrument 8. Semi-structured program discussion guide - revised
Instrument 9. Semi-structured employer discussion guide - revised
Instrument 10. In-depth participant interview guide - revised
Instrument 11. Cost workbook
Appendices
Appendix A. Informed consent form – revised
Appendix A.1. Bridges consent forms
Appendix B. Question-by-question justification for baseline survey - revised
Appendix C. Question-by-question justification for identifying and contact information - revised
Appendix D. Question-by-question justification for follow-up surveys
Appendix E. Reporting burden and cost for Phase 2 data collection instruments – revised Feb 2021
Appendix F. Instrument 3 (draft): First follow-up survey
Appendix G. Follow-up survey reminders and notifications
Appendix H. Instrument 4 (draft): Second follow-up survey
Appendix P. Federal Register Notice
Appendix P.1. Federal Register Notice – 30-day request
Appendix Q. Summary of requested changes
Footnote: One program selected for the evaluation will involve participants under the age of 18. In these cases, informed consent will also be collected from the participant’s parent or guardian, and assent will be collected from the participant. This request seeks approval for these instruments. Some interventions might also involve adults or youths with cognitive disabilities. For these interventions, the NextGen Project will rely on determinations, screenings, or assessments made by site staff to ensure that potential participants are capable of understanding the consent process and the implications of participating in the study. If site staff determine that a potential participant is unable to understand, that individual will not be enrolled in the NextGen Project and will not be included in any data collection.