
Supporting Statement for OMB Clearance Request


Part A


National and Tribal Evaluation of the 2nd Generation of the Health Profession Opportunity Grants



0970-0462



Revised January 2018



Submitted by:

Office of Planning,
Research & Evaluation

Administration for Children & Families

U.S. Department of Health
and Human Services


Federal Project Officers:

Hilary Forster

Nicole Constance

Amelia Popham

Table of Contents



Attachments:

Previously Approved Instruments

Instrument 1: PAGES Grantee- and Participant-Level Data Items List

Instrument 2: HPOG 2.0 National Evaluation Screening Interview

Instrument 3: HPOG 2.0 National Evaluation first-round telephone interview protocol

Instrument 4: HPOG 2.0 National Evaluation in-person implementation interviews

  • Instrument 4A HPOG 2.0 National Evaluation In-Person Implementation Interview

  • Instrument 4B HPOG 2.0 National Evaluation In-Person Implementation Interviews Basic Skills Training

  • Instrument 4C HPOG 2.0 National Evaluation In-Person Implementation Interviews Career Pathways

  • Instrument 4D HPOG 2.0 National Evaluation In-Person Implementation Interviews Work-Readiness

  • Instrument 4E HPOG 2.0 National Evaluation In-Person Implementation Interviews Sustainability

Instrument 5: HPOG 2.0 National Evaluation welcome packet and participant contact update forms

  • Instrument 5a: HPOG 2.0 National Evaluation welcome packet and contact update form

  • Instrument 5b: HPOG 2.0 National Evaluation participant contact update letter and form

Instrument 6: HPOG 2.0 Tribal Evaluation grantee and partner administrative staff interviews

Instrument 7: HPOG 2.0 Tribal Evaluation program implementation staff interviews

Instrument 8: HPOG 2.0 Tribal Evaluation employer interviews

Instrument 9: HPOG 2.0 Tribal Evaluation program participant focus groups

Instrument 10: HPOG 2.0 Tribal Evaluation program participant completer interviews

Instrument 11: HPOG 2.0 Tribal Evaluation program participant non-completer interviews

New Instruments Included in this Request

Instrument 12: HPOG 2.0 National Evaluation Short-Term Follow-up Survey



Attachment A: References

Attachment B: Previously Approved Informed Consent Forms

  • Attachment B2: Tribal Evaluation informed consent form A (SSNs)

  • Attachment B3: Tribal Evaluation informed consent form B (Unique identifiers)

Attachment C: 60 Day Federal Register Notice

Attachment D: Previously Approved Sources and Justification for PAGES Grantee- and Participant-Level Data Items

Attachment E: Previously Approved Final Updated Attachment E PPR Data List and Mockup

Attachment F: First Round of HPOG Grantees Research Portfolio

Attachment G: Previously Approved Participant Contact Information Update Letter and Form (Obsolete, replaced by Instrument 5a and 5b)



Attachment H: HPOG Logic Model

Attachment I: Previously Approved Focus Group Participant Consent Form

Attachment J: Previously Approved Interview Verbal Informed Consent Form

Attachment K: NEW HPOG 2.0 National Evaluation Short-Term Follow-up Survey Advance Letter

Attachment L: NEW HPOG 2.0 National Evaluation Short-Term Follow-up Survey Sources

Attachment M: NEW HPOG 2.0 National Evaluation Short-Term Follow-up Survey Trying to Reach You Flyer

Attachment N: NEW HPOG 2.0 National Evaluation Short-Term Follow-up Survey Email Reminder

Attachment O: NEW Research Questions for Previously Approved Data Collection Efforts (National Evaluation and Tribal Evaluation)





Part A: Justification

This document provides supporting statements for the collection of information for the National and Tribal Evaluations of the Health Profession Opportunity Grants (HPOG) program, funded by the U.S. Department of Health and Human Services (HHS), Administration for Children and Families (ACF). The document is cumulative in nature, in that each section has a description of the new information request followed by the previously approved information collections. The HPOG grants fund programs that provide education and training to Temporary Assistance for Needy Families (TANF) recipients and other low-income individuals for occupations in the health care field that pay well and are expected to either experience labor shortages or be in high demand. ACF awarded the first set of HPOG grants in September 2010, and the second set of HPOG grants in September 2015 (referred to as HPOG 2.0). Under HPOG 2.0, ACF funded 32 grants—five to tribal-affiliated organizations and 27 to non-tribal entities.

The ACF Office of Planning, Research and Evaluation (OPRE) has developed a multi-pronged research and evaluation portfolio for the HPOG 2.0 Program to better understand and assess the activities conducted and their results. This submission is in support of two components of the evaluation portfolio, the HPOG 2.0 National Evaluation and HPOG 2.0 Tribal Evaluation. Abt Associates and their partners, MEF Associates, the Urban Institute, Insight Policy Research, and NORC at the University of Chicago, are leading the evaluation of HPOG 2.0. These studies will use data collected from the HPOG management information system—the HPOG Participant Accomplishment and Grantee Evaluation System (PAGES)—designed under The Evaluation and System Design for Career Pathways Programs: 2nd Generation of Health Profession Opportunity Grants (HPOG Next Gen Design). OMB previously approved baseline data collection in the PAGES system and informed consent forms for the HPOG 2.0 evaluation under OMB Control Number 0970-0462. PAGES is internet-based and gathers data from the HPOG 2.0 grantees on: (1) grantee program designs and offerings; (2) intake information on eligible applicants (both treatment and control) through baseline data collection; and (3) individual enrolled program participants’ activities and outcomes. The original OMB submission was approved in August 2015. A nonsubstantive change request was approved in January 2016 for changes to the informed consent forms for non-tribal grantees. A second nonsubstantive change request was approved in July 2016 for changes to the informed consent forms for Tribal grantees. A third request for OMB approval, covering additional data collection efforts for the HPOG 2.0 National Evaluation and the HPOG 2.0 Tribal Evaluation, was approved on June 27, 2017, with a nonsubstantive change for the contact update forms approved in July 2017. This submission seeks approval for the National Evaluation impact study’s Short-Term Follow-up Survey, to be conducted 15 months after randomization.

A1: Necessity for the Data Collection

ACF at HHS seeks approval for the data collection activities described in this request to support the HPOG 2.0 National Evaluation and HPOG 2.0 Tribal Evaluation.

A1.1 Study Background

The HPOG Program, established by the Patient Protection and Affordable Care Act of 2010 (ACA), funds training in high-demand healthcare professions, targeted to TANF recipients and other low-income individuals. The HPOG Program is administered by the ACF Office of Family Assistance (OFA). The first round of HPOG grants was awarded in 2010. In September 2015, OFA awarded a second round of HPOG grants—approximately $72 million was awarded to 32 organizations located across 21 states. Grantees include six community-based organizations, four state government entities, seven local workforce development agencies, ten institutions of higher education, two tribal colleges, one tribal human service agency, one tribe, and one Indian Health Board. Those 32 grantees oversee 43 individual HPOG programs.

HPOG programs: (1) target skills and competencies demanded by the healthcare industry; (2) support career pathways, such as an articulated career ladder; (3) result in an employer- or industry-recognized credential (which can include a license, third-party certification, postsecondary educational certificate or degree, as well as a Registered Apprenticeship certificate); and (4) combine supportive services with education and training services to help participants overcome barriers to employment, as necessary.

HPOG’s authorizing legislation calls for a comprehensive evaluation of the funded demonstration projects. Accordingly, ACF plans to rigorously evaluate the effectiveness of funded HPOG 2.0 programs. The federal evaluation activities are intended to expand the career pathways evidence base and to build on what has been learned to date about how to design and implement successful career pathways programs for low-income and low-skilled individuals, and improve the outcomes of individuals who participate in these programs. All grantees will participate in a federal evaluation. The federal evaluation for the non-Tribal HPOG 2.0 grantees involves random assignment of individual participants. Tribal grantees are participating in a coordinated evaluation that does not involve random assignment.

The OMB-approved HPOG PAGES data system (OMB Control Number 0970-0462) is collecting and storing uniform data needed for performance management and the federal evaluations, incorporating the required semi-annual grantee performance reports to ACF (Attachment E). These reports include a quantitative section with metrics automatically generated from data in the PAGES system and a narrative section that must be filled out by grantees. The system also provides necessary data for other research and evaluation efforts, including the HPOG 2.0 National Evaluation and HPOG 2.0 Tribal Evaluation.

Abt Associates is the prime contractor for the HPOG 2.0 Evaluation. Abt and the Urban Institute led the design of the PAGES data system, and both organizations are overseeing PAGES data collection. Abt is leading the HPOG 2.0 National Evaluation. Partners MEF Associates, Insight Policy Research, and the Urban Institute are assisting with the site monitoring, descriptive evaluation, and cost benefit analysis. NORC at the University of Chicago is leading the HPOG 2.0 Tribal Evaluation under subcontract to Abt Associates. Abt and their partners are also conducting several other evaluations on behalf of ACF as part of the HPOG research portfolio on the first round of HPOG grantees, for which there are numerous data collections already approved by OMB (see Attachment F for further details). ACF and its contractors are engaged in many efforts to coordinate these evaluation activities and avoid duplication of work. The HPOG 2.0 Evaluation team has used the extensive knowledge generated to date from the research activities on the first round of HPOG and current Pathways for Advancing Careers and Education (PACE) programs to inform the proposed new data collection efforts for the second round of HPOG grantees.

A1.2 Legal or Administrative Requirements that Necessitate the Collection

H.R. 3590, the ACA, requires an evaluation of the HPOG demonstration projects (H.R. 3590, Title V, Subtitle F, Sec. 5507, sec. 2008, (a)(3)(B)). The Act further indicates that the evaluation will be used to inform the final report to Congress (H.R. 3590, Title V, Subtitle F, Sec. 5507, sec. 2008, (a)(3)(C)). The Act calls for evaluation activities to assess the success of HPOG in “creating opportunities for developing and sustaining, particularly with respect to low-income individuals and other entry-level workers, a health professions workforce that has accessible entry points, that meets high standards for education, training, certification, and professional development, and that provides increased wages and affordable benefits, including healthcare coverage, that are responsive to the workforce’s needs” (H.R. 3590, Title V, Subtitle F, Sec. 5507, sec. 2008, (a)(3)(B)).

There were two Funding Opportunity Announcements (FOAs) for the second round of HPOG grants—one for non-Tribal grantees (HHS-2015-ACF-OFA-FX-0951) and one for Tribal grantees (HHS-2015-ACF-OFA-FY-0952). Both FOAs required all HPOG 2.0 grantees to participate in a federal evaluation and to follow all evaluation protocols established by ACF or its designated contractors. Participating in the federal evaluations includes, but is not limited to, the use of the PAGES data system to collect uniform data elements and, for non-Tribal grantees participating in the National Evaluation, the facilitation of random assignment.

Data collected under PAGES will be used to automatically generate the federally required semi-annual reports, to inform ACF reports to Congress, to monitor and manage the performance of the grant-funded projects, to inform the HPOG 2.0 National Evaluation impact, outcomes and implementation studies and HPOG 2.0 Tribal Evaluation, and to inform other future research and evaluation efforts.

A2: Purpose of Survey and Data Collection Procedures

A2.1 Overview of Purpose and Approach

ACF is rigorously evaluating the effectiveness of the second round of HPOG grants. OPRE oversees the federal evaluation activities, which include an implementation and impact evaluation (with long-term follow-up) of the non-Tribal HPOG grantees under the National Evaluation, plus a cost-benefit study, and a separate but coordinated Tribal Evaluation of the Tribal HPOG grantees. The federal evaluation activities are intended to expand the career pathways evidence base and to build on what has been learned to date about how to design and implement successful career pathways programs for low-income individuals, and improve the outcomes of individuals who participate in these programs. OMB has previously approved several information collection requests under OMB Control Number 0970-0462 (each described in Section A.2.4) in support of both the National and Tribal evaluations. Under this information collection request, ACF seeks approval for an additional data collection protocol for the National Evaluation impact evaluation.

HPOG 2.0 National Evaluation

The National Evaluation involves random assignment of individual participants. As stated in the FOA, the non-Tribal HPOG 2.0 grantees are required to abide by random assignment procedures and facilitate the random assignment process for individuals by entering eligible HPOG program applicants into a lottery to determine if they will be invited to participate in the program.

Applicants who are not invited to participate will serve as a control group in the evaluation. The control group members will not receive HPOG program services, but may enroll in any other program or service for which they are eligible. Individuals must complete the application process prior to random assignment; only individuals who have been deemed both eligible and suitable for program participation may be entered into the lottery.

For the National Evaluation, this information collection request covers the Short-Term Follow-up Survey data collection for the impact evaluation (Instrument 12).



HPOG 2.0 Tribal Evaluation

The purpose of the HPOG 2.0 Tribal Evaluation is to conduct a comprehensive implementation and outcome evaluation of the five Tribal HPOG 2.0 grantee programs. The HPOG 2.0 Tribal Evaluation will employ sound scientific methods and be grounded in sound cultural methods. At the start and throughout the process, the evaluators will engage with tribal leadership or authorized designee(s) to ensure that the evaluation is firmly anchored in questions that are meaningful to local stakeholders and that assist local service providers in better serving their communities. The evaluation of the Tribal HPOG 2.0 grantees will not involve random assignment. The evaluation will assess the HPOG 2.0 programs administered by tribes, tribal organizations, and tribal colleges to identify and assess how programmatic health profession training operations are working; determine differences in approaches used when programs are serving different sub-populations, including participants with different characteristics and skill levels; and identify programs and practices that seem to be successful in supporting the target population to achieve portable industry-recognized certificates or degrees as well as employment-related outcomes. For the Tribal Evaluation, data collection protocols to be used in the evaluation (Instruments 6-11) were previously approved in June 2017.

A2.2 Research Questions

The National Evaluation will address several research questions through the descriptive and impact evaluations. There is alignment between the National and Tribal evaluations. The research questions for the National Evaluation descriptive evaluation and the Tribal Evaluation were summarized in a previously approved request for clearance, along with their respective data collection protocols (OMB Control Number 0970-0462, approved June 2017). In that submission, key research questions applicable to both studies were shown in bold. The research questions from that prior submission are included in Attachment O.

This request for clearance covers the research questions applicable to the National Evaluation impact evaluation—to be addressed by the Short-Term Follow-up Survey. Exhibit A-1 provides a schematic and theory of action for the HPOG 2.0 National Evaluation’s impact evaluation. The top row of the exhibit represents the experiences of applicants randomized to the treatment group—that is, those offered a “slot” in an HPOG 2.0 program, where “slot” means the package of training and associated support services offered by the program, whether or not the individual uses the components of that package. Conversely, the bottom row represents the experiences of those in the control group, who are not offered an HPOG 2.0 slot.

Exhibit A-1: Schematic and Theory of Action of the HPOG 2.0 Impact Evaluation

From left to right in the top row of the exhibit, an applicant randomly assigned to the treatment group is offered an HPOG 2.0 slot and gets access to the training and associated support services from the HPOG 2.0 program and, potentially, from other sources. (Although not explicitly shown in the exhibit, not everyone offered access to HPOG services will use everything—or even anything—offered.) The hypothesis to be tested is that job training and these HPOG services lead to educational and occupational credentials and employment with certain working conditions (hours, hourly wage, shift work, benefits) and to earnings. Impacts on public assistance receipt (TANF, SNAP, Medicaid, and unemployment insurance) and broader aspects of well-being (food security, housing stability, and marital status) may also emerge.

In contrast, those randomly assigned to the control group (the bottom row of the exhibit) are not offered access to the HPOG 2.0 program, but may obtain training and other support services from other sources. The same set of outcomes emerges, though possibly at different levels: education and credentials, employment and earnings, public assistance, and overall well-being.

Though not everyone in the treatment group gets training and many in the control group do get training, the two contrasting flows in Exhibit A-1 represent the very contrast relevant to future policy decisions on funding HPOG-like services. Random assignment creates a treatment group and a control group that differ only by the offer of HPOG 2.0 and chance. Because the two groups are otherwise statistically equivalent, comparisons of outcomes between them provide an unambiguous estimate of the impact of HPOG 2.0; by “impact,” we mean outcomes for those offered HPOG in a world with the program relative to what outcomes for those same individuals would have been had HPOG not existed.
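To make this estimand concrete, a minimal sketch may be helpful (the notation here is ours and does not appear in the evaluation design documents): for any outcome of interest, the impact of the offer of an HPOG 2.0 slot is estimated as the difference in mean outcomes between the two randomized groups,

\[
\widehat{\Delta}_{\mathrm{ITT}} = \bar{Y}_{T} - \bar{Y}_{C},
\]

where \(\bar{Y}_{T}\) is the mean outcome (for example, earnings) among all applicants assigned to the treatment group, whether or not they used HPOG 2.0 services, and \(\bar{Y}_{C}\) is the corresponding mean for the control group. Because randomization equates the two groups in expectation at baseline, this difference reflects the offer of HPOG 2.0 rather than pre-existing differences between the groups.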

The National Evaluation considers all aspects of Exhibit A-1:

  • Addressing implementation research questions, the descriptive evaluation describes the HPOG 2.0 program as implemented.

  • Addressing service contrast research questions, the service contrast analysis estimates the impact of the offer of HPOG 2.0 on services received—training and other support services.1

  • Addressing impact research questions (shown below), the impact analysis estimates the impact of the offer of HPOG 2.0 on outcomes of interest—including educational programs completed, credentials received, employment, earnings, and participation in public assistance programs.2

  • Assessing the costs of implementing the program relative to the benefit to participants and grantees is the primary goal of the cost benefit analysis.

The Impact Evaluation’s major research questions can be summarized as:

  • What is the impact of an offer of an HPOG 2.0 slot on participants’ receipt of training and support services, earnings, and broader measures of well-being?

  • How does the impact of an offer of an HPOG 2.0 slot on participants vary with baseline characteristics or HPOG 2.0 program characteristics?

RQ1: What is the impact of an offer of an HPOG 2.0 slot on participant earnings? (Confirmatory outcome for the intermediate impacts report.)

RQ1A: What is the impact of an offer of an HPOG 2.0 slot on successful educational progress—defined as still enrolled in or having completed an education or training program? (Confirmatory outcome for the early impacts report only.)

RQ2: What is the impact of an offer of an HPOG 2.0 slot on receipt of training, financial assistance for training, child care and financial assistance for child care, and various forms of personal and supportive services such as tutoring, academic or financial advising, or case management?

RQ3: What is the impact of an offer of an HPOG 2.0 slot 1) on credentials earned internal to an education or training program and 2) on receipt of external credentials or certifications?

RQ4: What is the impact of an offer of an HPOG 2.0 slot on participant employment, employment in a healthcare profession, hours of work, hours of work in a healthcare profession, receipt of employment benefits (e.g., health insurance, retirement, paid sick leave, paid vacation), and other terms of employment (e.g., shift work)?

RQ5: What is the impact of an offer of an HPOG 2.0 slot on broader measures of well-being (e.g., household income, marital status, and health)?

RQ6: How does the impact of an offer of an HPOG 2.0 slot on key outcomes—educational progress, productive activity, and earnings—vary with baseline (i.e., pre-randomization) characteristics of individuals, including gender, education, race/ethnicity, age, and receipt of public assistance?

RQ7: For specific programs, what is the impact of an offer of an HPOG 2.0 slot on key outcomes of educational progress, productive activity, and earnings?

RQ8: How does the impact of an offer of an HPOG 2.0 slot on key outcomes—educational progress, productive activity, and earnings—vary with HPOG 2.0 program characteristics, including median starting wage of targeted professions and the quality of instruction?

RQ9: How do the benefits of being offered an HPOG 2.0 slot compare to the costs of providing an HPOG 2.0 slot—from the perspective of the applicant randomly assigned to the offer of treatment, the government, and society?

These research questions are framed as the impact of being offered a slot. This is both because the offer is what the program can control and because the impact of the offer is what is naturally estimated from a random assignment design. If sufficient resources are available, the evaluators will address an additional research question:

RQ10: What is the impact of receipt of HPOG 2.0 training—not merely the offer of an HPOG 2.0 slot—on earnings? How does that impact compare with the impact of receipt of non-HPOG 2.0 training on earnings?



A2.3 Study Design

HPOG 2.0 National Evaluation Study Design

The HPOG 2.0 National Evaluation is guided by the career pathways framework, as shown in the HPOG logic model (Attachment H). The framework puts into practice the assertion that “post-secondary training should be organized as a series of manageable and well-articulated steps accompanied by strong supports and connections to employment” (Fein et al., 2012). These articulated steps provide opportunities for students to advance through successively higher levels of education and training, exiting into employment at multiple possible points. The framework also incorporates customization, supports, and employer connections.

The design for the HPOG 2.0 National Evaluation features a descriptive evaluation (including implementation, systems, and outcome studies) and a cost benefit study. In addition, the National Evaluation will conduct an impact evaluation, using a classic experimental design to measure and analyze key participant outcomes including completion of education and training, receipt of certificates and/or degrees, earnings, and employment in a healthcare career.

Exhibit A-2 provides a visual description of the major components and sub-components of the HPOG 2.0 National Evaluation.

Exhibit A-2: Components of the HPOG 2.0 National Evaluation


Briefly, and as discussed above, the impact evaluation design includes randomizing program-eligible participants to treatment and control status in all non-Tribal sites. Follow-up to answer the research questions will involve both queries of administrative data systems and surveys. The research team will match participant data collected through the impact evaluation for both the treatment and control groups to long-term employment and earnings data from ACF’s National Directory of New Hires (NDNH) and to school enrollment data from the National Student Clearinghouse (NSC). Agreements with the Office of Child Support Enforcement (OCSE) to use the NDNH and with the NSC to use their data are underway.

The impact evaluation will collect two types of participant data: 1) quarterly contact update requests; and 2) two follow-up surveys—the Short-Term Follow-up Survey roughly 15 months after randomization and the Intermediate Survey 36 months after random assignment. The contact update requests were approved in June 2017. The Short-Term Follow-up Survey data collection is the subject of this request for approval. The 36-month follow-up survey and materials for the cost benefit study will be submitted for OMB review and approval at a later date.

HPOG 2.0 Tribal Evaluation Study Design

The HPOG 2.0 Tribal Evaluation is designed as a comprehensive implementation and outcome evaluation. The approach for the evaluation is guided by the seven values outlined in the Roadmap for Collaborative and Effective Evaluation in Tribal Communities, developed by the Child Welfare Research and Evaluation Tribal Workgroup.3 The values provide guidance for partnering with tribal communities and are grounded in community-based participatory research. All five tribal grantees will participate in the HPOG 2.0 Tribal Evaluation. The evaluation will use a mixed-methods approach, including collection of qualitative data through interviews and focus groups and analysis of program documentation and program data. Qualitative data will be collected during annual site visits to each of the five Tribal HPOG 2.0 grantees.



HPOG Participant Accomplishment and Grantee Evaluation System (PAGES)

The previously approved PAGES system is designed to meet the performance data needs of the grantees and of OFA to monitor grantee performance and prepare the report to Congress on the grants. PAGES will support the National and Tribal Evaluations, as well as other future research and evaluation efforts sponsored by ACF. The information collection gathers data on (1) grantee program designs and offerings; (2) intake information on eligible applicants (at both the tribal and non-Tribal grantees) through baseline data collection; and (3) individual enrolled program participants’ activities and outcomes.

Grantees will use the data collected through the system to generate the required Performance Progress Reports (PPRs) for OFA. The PPR includes two sections—a narrative section and a quantitative section. (See the full list of PPR items and a mockup of the PPR in the previously approved Attachment E.) Data collected in PAGES will also be used in other components of the HPOG 2.0 National Evaluation and HPOG 2.0 Tribal Evaluation. Program level data will help analyze each grantee’s and program’s inputs and outputs and place analytic results into the appropriate context. Participant-level data will be used in the impact evaluation to assess balance between the treatment and control groups, to increase the precision of estimates regarding the impact of program components, and to identify subgroups for subgroup impact analysis at follow-up. PAGES will support the National Evaluation descriptive evaluation by providing information on grantee program characteristics and program performance to date. Participant-level data will also enable the HPOG 2.0 National Evaluation and HPOG 2.0 Tribal Evaluation teams to track participants’ educational and employment outcomes.
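As an illustrative sketch of how baseline data support precision and subgroup analysis (the notation is ours; the evaluation’s analysis plan is the authoritative specification), a covariate-adjusted impact model takes the form

\[
Y_{i} = \alpha + \Delta\, T_{i} + X_{i}'\beta + \varepsilon_{i},
\]

where \(Y_{i}\) is a follow-up outcome for sample member \(i\), \(T_{i}\) indicates random assignment to the treatment group, and \(X_{i}\) is a vector of PAGES baseline characteristics (for example, education and prior employment). Including \(X_{i}\) does not change what \(\Delta\) estimates, but it reduces its standard error; interacting \(T_{i}\) with selected baseline characteristics supports the planned subgroup impact analyses.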

A2.4 Universe of Data Collection Efforts

To address these research questions, the HPOG 2.0 National and Tribal Evaluations will use a number of data collection instruments. This request for clearance covers just one instrument:

  1. HPOG 2.0 Short-Term Follow-up Survey. This survey will be administered to a subset of the HPOG 2.0 National Evaluation sample (those enrolled in the study between March 2017 and March 2018) about 15 months after randomization (Instrument 12).

As shown in Exhibit A-3, the Short-Term Follow-up Survey will be administered to HPOG 2.0 impact evaluation participants in selected randomization cohorts only (March 2017-March 2018). For participants in those selected cohorts, the survey data collection will take place approximately 15 months following random assignment. Pending OMB approval, data collection will begin in late June 2018 and continue for approximately 18 months. Local interviewers will attempt to complete interviews first by telephone and then in-person for those respondents who cannot be reached by telephone.

These data are not available through any current sources. Many of the questions to be asked in this survey were approved for other studies in ACF’s Career Pathways portfolio, specifically the Pathways for Advancing Careers and Education 15-, 36-, and 72-month follow-up surveys (OMB #: 0970-0397); and the first round of the Health Profession Opportunity Grant (HPOG) 15-, 36-, and 72-month surveys (OMB #: 0970-0394). A summary of the survey item sources is provided in Attachment L.

Exhibit A-3 describes the target respondents, content, and reason for collection (i.e., which analyses will use the information) for the new data collection activity submitted with this request. All other survey support materials are provided in Attachments K, M, and N.

Exhibit A-3: HPOG 2.0 National Evaluation Short-Term Follow-up Survey Instrument Overview

Data Collection Activity: Impact Evaluation Participant Follow-up Survey

Data Collection Instrument(s): Short-Term Follow-up Survey (15 months after randomization) (See Instrument 12)

Respondents: Overall expected sample of 13,000 (all participants randomized between March 2017 and March 2018).


Content:

  • Training and Employment History from randomization through date of interview

  • School Experiences

  • Earned Credentials

  • Current/most recent job conditions, job quality, benefits, on-the-job training

  • Income and economic well-being, student debt, financial resilience

  • Adult Well-Being: physical health, housing conditions

  • Household composition, family formation, and marital stability

  • 21st Century Skills/Cognitive Skills

  • Contact information


Reason: The Short-Term Follow-up Survey will collect information on events subsequent to random assignment in many areas—particularly the receipt of training and related supports, and receipt of credentials. For participants randomly assigned to the treatment group, the Short-Term Follow-up Survey will also collect opinions on the HPOG 2.0 services provided. Finally, the surveys will collect information on attitudes about work and self; current employment status and job characteristics (e.g., hours worked and job quality); current earnings; household composition; receipt of public benefits; household income; and economic hardship. This survey information will be used as outcomes for the impact analysis and to construct mediators for impacts at 36 months.




Study instruments approved by OMB in prior information collection requests include the following:

  1. PAGES Grantee- and Participant-Level Data Collection. This includes grantee-level data collection on program components (e.g., training courses offered, types of supports offered) and participant-level data on participation, services provided, and program outputs. (Instrument 1, approved in August 2015)

  2. PAGES Participant-Level Baseline Data Collection (participants at non-tribal grantees participating in the impact evaluation). This includes data on characteristics of eligible individual participants at intake (e.g., demographics, household characteristics, employment and education experiences, a child roster, and baseline data on expectations for the program) at the non-tribal grantees. (Instrument 1 approved in August 2015)

  3. PAGES Participant-Level Baseline Data Collection (participants at tribal grantees). This includes data on characteristics of eligible individual participants at intake (e.g., demographics, household characteristics, employment and education experiences, and tribal specific data items) at the tribal grantees. (Instrument 1 approved in August 2015)

  4. Informed Consent Forms (Form A: National Evaluation lottery required; Form B: National Evaluation, lottery not required; Tribal Informed Consent Form A -SSNs Included; Tribal Informed Consent Form B-Unique Identifiers Included). The informed consent forms provide information to participants to ensure they understand the nature of the research and evaluation activities being conducted. (Attachment B, B2 and B3 approved in August 2015, with revisions approved in January and July 2016)

  5. Screening Interview to identify respondents for the HPOG 2.0 National Evaluation first-round telephone grantee interviews. (Instrument 2 approved in June 2017, now complete.)

  6. HPOG 2.0 National Evaluation first-round telephone interviews. These interviews, conducted with management staff, partners and stakeholders, will collect information about the HPOG program context and about program administration, activities and services, partner and stakeholder roles and networks, and respondent perceptions of the program’s strengths. (Instrument 3 approved in June 2017, now complete.)

  7. HPOG 2.0 National Evaluation in-person implementation interviews will collect information from five to ten HPOG 2.0 programs with promising approaches to the topic areas of specific interest to ACF, including employer engagement, basic skills instruction, career pathways training opportunities, work-readiness training, and program sustainability after the end of the HPOG 2.0 grant period. In consultation with ACF, the programs selected for this limited data collection will be identified through the first-round telephone interviews as those that show the most promising or innovative approaches in each topic area. (Instrument 4 approved in June 2017)

  8. HPOG 2.0 National Evaluation contact update forms. This form will collect updated participant contact information for impact evaluation participants (treatment and control) during the follow-up period. This form is included as part of the welcome packet (Instrument 5a) and then sent every three months, accompanied by the contact update letter and form. Attachment G is replaced by this contact update letter and form. (Instrument 5b approved in June 2017, with revisions approved in July 2017)

  9. HPOG 2.0 Tribal Evaluation grantee and partner administrative staff interviews will collect information on high-level program strategies, partnerships in place to implement the Tribal HPOG 2.0 program, program development and lessons learned. (Instrument 6 approved in June 2017)

  10. HPOG 2.0 Tribal Evaluation program implementation staff interviews will collect information from instructors, trainers, recruitment and orientation staff, and providers of program or supportive services on Tribal HPOG 2.0 program processes including recruitment, screening, orientation, provision of supportive services, and program implementation. (Instrument 7 approved in June 2017)

  11. HPOG 2.0 Tribal Evaluation employer interviews will collect information from local or regional employers that are partnering with Tribal HPOG 2.0 programs or have employed program participants, focusing on employers’ impressions of the Tribal HPOG 2.0 program and program graduates. (Instrument 8 approved in June 2017)

  12. HPOG 2.0 Tribal Evaluation program participant focus groups will collect information on participants’ perceptions, experience, outcomes and satisfaction with the Tribal HPOG 2.0 program. (Instrument 9 approved in June 2017)

  13. HPOG 2.0 Tribal Evaluation program participant completer interviews will collect information on the current employment status of the participants who completed a training program and their perceptions of and satisfaction with the Tribal HPOG 2.0 program. (Instrument 10 approved in June 2017)

  14. HPOG 2.0 Tribal Evaluation program participant non-completer interviews will collect information on reasons participants left the program, short-term outcomes, how they feel the program could be improved, and any plans for future academic training. (Instrument 11 approved in June 2017)

As part of the HPOG 2.0 data collection, we anticipate submitting two additional OMB clearance requests for the HPOG 2.0 National Evaluation. The first submission will support the descriptive, cost benefit and impact evaluations. The protocols will include:

  1. HPOG 2.0 National Evaluation descriptive evaluation second-round telephone interview guides for management and staff. These interviews will collect information about notable implementation and performance issues as well as changes to the HPOG network and systems.

  2. HPOG 2.0 National Evaluation descriptive evaluation in-depth participant interview guides. These interviews will collect information about participant experiences not otherwise available through the follow-up surveys or the PAGES data.

  3. HPOG 2.0 National Evaluation cost benefit study cost forms, which will collect data on costs associated with the implementation of the HPOG program to support a cost benefit analysis.

  4. HPOG 2.0 National Evaluation impact evaluation academic assessment pilot survey, which will help to assess whether HPOG 2.0 programs are addressing the low levels of skills among participants, and whether improvements to basic literacy and math skills are related to key outcomes of interest, including educational attainment and employment.

The final OMB clearance request will include a protocol for use in the HPOG 2.0 National Evaluation impact evaluation:

  1. HPOG 2.0 National Evaluation Intermediate Follow-up Survey to measure participant outcomes 36 months after random assignment.

Other extant data sources will be used for the HPOG 2.0 National and Tribal Evaluations. These include the following:

  1. National Directory of New Hires (NDNH). These data will provide information about employment and earnings of HPOG participants.

  2. National Student Clearinghouse (NSC). These data will provide information on student enrollment in credit-bearing courses (and some enrollment in non-credit bearing courses) and receipt of post-secondary degrees.

  3. HPOG program management information, including initial applications and ongoing management reports, which will provide supplemental information in tracking the evaluation of the grant, and information on the local healthcare labor market and needs for occupational training.

  4. Government sources of labor market data, from the U.S. Census Bureau and Bureau of Labor Statistics (BLS), such as County Business Patterns, Local Area Unemployment Statistics (LAUS), and Quarterly Workforce Indicators (QWI), which will provide a picture of the local labor market.

A3: Improved Information Technology to Reduce Burden

The HPOG 2.0 National Evaluation and HPOG 2.0 Tribal Evaluation will generate a substantial amount of data using a combination of data collection methods. The evaluation team designs each data collection protocol to limit the reporting burden for respondents. For each data collection activity, the study team has selected the form of technology that enables the collection of valid and reliable information in an efficient way while minimizing burden. As described in the originally approved supporting statement (May 2015), with revisions in January and July 2016 and June 2017, participant- and grantee-level data will be collected through PAGES, a cloud-based data system. The evaluation teams will use the quantitative data collected through PAGES to reduce respondent burden wherever possible. The team will rely on administrative data—such as NDNH—to capture employment and wage data. This removes the burden of collecting this information from participants during the follow-up survey. Any requests for program documentation will be collected electronically as well.

The HPOG 2.0 National Evaluation impact evaluation will offer study participants the option to update their contact information online, by mail, or by telephone. The follow-up survey will be administered using computer-assisted personal interviewing (CAPI) technology for all interviews. CAPI technology reduces respondent burden, as interviewers can proceed more quickly and accurately through the survey instruments, minimizing the interview length. Computerized questionnaires ensure that the skip patterns are properly implemented, minimizing respondent burden by not asking inappropriate or non-applicable questions. For example, respondents who did not participate in postsecondary training will be routed past questions only relevant to those who did. Computer-assisted interviewing can build in checkpoints, which allow the interviewer or respondent to confirm responses, thereby minimizing data entry errors. Finally, automated survey administration can incorporate hard edits to check for allowable ranges for quantity and range value questions, minimizing out-of-range or unallowable values.

A4: Efforts to Identify Duplication

The purpose of the HPOG 2.0 National Evaluation’s Short-Term Follow-up Survey is to obtain current information on the status and well-being of HPOG 2.0 National Evaluation study sample members 15 months after randomization. Information about these respondents' educational achievement, economic well-being, job skills development and progression, and overall well-being is not available through any other source, nor is information about family composition, student debt, or 21st century skills. The evaluation will utilize administrative data (e.g., wage records) in conjunction with survey data to avoid duplication of reporting.

The research team will also avoid duplication in this study by use of a study-specific database, maintained by Abt, which links all the data collected at baseline with subsequent information gathered from future surveys and administrative sources. This eliminates the need to ask about personal characteristics or background factors for known household members on follow-up surveys.

A5: Involvement of Small Organizations

The National Evaluation and Tribal Evaluation will have minimal impact on small organizations. The primary organizations involved in this study will be tribal and community colleges, workforce development agencies, tribal organizations, and community-based organizations that operate occupational training programs. The funding announcement informed all grantees of the federal evaluation and reporting requirements, and adequate resources have been provided to coordinate the data collection and reporting. There should be no adverse impact for any grantees participating in the study.

Small business professionals will only be interviewed if they are employers of National or Tribal HPOG 2.0 program graduates or grantee administrative partners. In an effort to reduce burden, the duration of each employer interview will be no more than 45 minutes.

There is no small business involvement in the National Evaluation’s Short-Term Follow-up Survey data collection.

A6: Consequences of Less Frequent Data Collection

For the HPOG 2.0 National Evaluation impact evaluation, the evaluation team is planning only two rounds of substantive data collection with individual participants. One round will start at 15 months following randomization, and the second will start at 36 months following randomization. Skipping the data collection at 15 months would compromise the National Impact Evaluation in several ways. Most seriously, it would make it nearly impossible to collect good data on the receipt of support services during training. These data are essential for calculation of costs in support of the cost benefit analysis. Second, it would jeopardize the quality of data collected on the classroom experiences of students due to respondent recall issues. These data are essential for research into the reasons for variation in impacts across programs. Third, it would eliminate the ability for policy makers to determine whether there are early signs that the HPOG 2.0 grants are achieving their purpose.

A7: Special Circumstances

There are no special circumstances for the proposed data collection.

A8: Federal Register Notice and Consultation

Federal Register Notice and Comments

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on August 30, 2017, Volume 82, Number 167, pages 41269-41270, and provided a sixty-day period for public comment. A copy of this notice is attached as Attachment C. During the notice and comment period, the government received no requests for information or substantive comments.

Consultation with Experts Outside of the Study

The HPOG 2.0 National Evaluation team had limited consultation with external experts in developing the Short-Term Follow-up Survey. The external consultation focused primarily on how to measure basic skills. Because the team had designed similar instruments for PACE (OMB Control Number 0970-0397) and for the evaluation of the first round of HPOG, no further external consultation was required.

The HPOG 2.0 Tribal Evaluation team consulted with outside experts about the proposed data collection and evaluation plan. Experts in the fields of health professions training and research in tribal communities reviewed the HPOG 2.0 Tribal Evaluation design and all comments, questions and suggestions were resolved during consultation. Additionally, the HPOG 2.0 Tribal Evaluation team worked closely with each tribal grantee for their input on the evaluation design and draft protocols.

Consultants are listed in Exhibit A-4 below. This consultation took place in 2016.

Exhibit A-4: Experts Consulted Outside of the Study

Name

Title/Organization

Contact Information

HPOG 2.0 National Evaluation

Meredith Larson

Education Research Analyst
National Center for Education Research
Institute of Education Sciences

Meredith.Larson@ed.gov
(202) 245-7037

Stephen Provasnik

National Center for Education Research

Stephen.Provasnik@ed.gov
(202) 245-6442
(202) 245-6442

Irwin Kirsch

Director of the Center for Global Assessment, Educational Testing Service

ikirsch@ets.org

1-609-921-9000


HPOG 2.0 Tribal Evaluation

Mark Doescher, MD, MSPH

Stephenson Cancer Center, University of Oklahoma

Mark-Doescher@ouhsc.edu

Loretta Heuer, PhD, RN, FAAN

School of Nursing, North Dakota State University

loretta.heuer@ndsu.edu

701.231.8205

Joan LaFrance, Ed.D

Mekinak Consulting

lafrancejl@gmail.com

Myra Parker, JD, MPH, PhD

Center for the Study of Health and Risk Behaviors, University of Washington

myrap@uw.edu

(206) 616-5887

Rick Haverkate

Deputy Director, Indian Health Service

Richard.Haverkate@ihs.gov

301-945-3224



The majority of PAGES grantee- and participant-level data items are adapted from previously approved data collection instruments for PACE (clearance number 0970-0343) and HPOG ISO and HPOG-Impact (both under clearance number 0970-0394), as described in Attachment D.

PAGES data items were also developed in consultation with senior methodological and substantive experts, including: Karen Staha, Department of Labor; Yvette Chocolaad, National Association of State Workforce Agencies; Burt Barnow, George Washington University; Tim Harmon, Workforce Enterprise Services; Sung-Woo Cho, Matthew Zeidenberg, and David Fein, Abt Associates; and Keith Watson, Lauren Eyster, and Alan Dodkowitz, Urban Institute.

A9: Incentives for Respondents

There are no incentives provided to respondents for the data collection via PAGES, as that information is necessary for program participation, not simply for evaluation purposes. The evaluators plan to offer incentives to respondents for both the National Evaluation impact evaluation and the Tribal Evaluation. The justification and incentive plans for each study are provided below.

Incentives—National Evaluation

Monetary incentives show study participants that the researchers appreciate their continued involvement in the HPOG 2.0 National Evaluation information collection activities. The HPOG 2.0 National Evaluation impact evaluation is a panel study intended to follow selected impact evaluation participants for up to three years. Although there is little published evidence of the effectiveness of incentives in reducing nonresponse bias, it is well established that incentives strongly reduce attrition (i.e., increase response rates) in panel studies such as the HPOG 2.0 National Evaluation.45 In accordance with OMB guidelines, the team took several factors into consideration when determining whether or not to use incentives.6 Specifically, the team took into account data quality issues, efforts to reduce non-response bias, the complexity of the study design and panel retention over a 36-month period, and prior use of incentives for this study population.

In a panel study such as the HPOG 2.0 National Evaluation, panel retention during the follow-up period is critical to minimizing the risk of nonresponse bias and to achieving a sufficient sample for analysis. Although low response rates do not necessarily lead to nonresponse bias and it is at least theoretically possible to increase nonresponse bias by employing some techniques to boost response rates (Groves, 2006), most statisticians and econometricians involved in the design and analysis of randomized field trials of social programs agree that it is generally desirable to obtain a response rate close to 80 percent in all arms of the trial (Deke and Chiang, 2016). The work of Deke and Chiang underlies the influential guidelines of the What Works Clearinghouse (WWC). Under those guidelines, the evidential quality rating of an evaluation is sharply downgraded if the treatment-control response rate differential exceeds a certain tolerance (e.g., 4 percentage points at an overall response rate of 80 percent). Based on the research team’s experience with differential response rates in the PACE and HPOG 1.0 data collection efforts, the team believes that there is some risk that the HPOG 2.0 study might be in the penalized range if the team continues to use the follow-up protocols employed in the prior studies. PACE had a differential response rate of 5.1 percentage points. In the HPOG 1.0 three-armed experiment, the response rate differential for the standard treatment vs. the control group was 7.17 percentage points, and 6.57 percentage points for the enhanced treatment vs. the control group.
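To make the arithmetic behind this concern concrete (a worked illustration using the figures cited above, not an additional analysis), the quantity at issue is the treatment-control difference in response rates,

\[
\text{differential} = RR_{T} - RR_{C}.
\]

At an overall response rate near 80 percent, the WWC tolerance is roughly 4 percentage points, so the 5.1 percentage point differential observed in PACE and the 7.17 and 6.57 percentage point differentials observed in HPOG 1.0 would all fall in the penalized range.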

Incentives show study participants that the study team appreciates their continued support and cooperation with the study and their ongoing participation in information collection activities related to the study. The team theorizes that incentives will be a particularly powerful tool for maintaining a high response rate in the control group given that these sample members do not receive any (other) program benefits or services.

In most panel studies, response rates decline over follow-up rounds. The team has tried to minimize this expected decline and ensure a high response rate with a low treatment-control differential through the repeated use of the previously approved welcome packet and participant contact update forms and the provision of incentives, as discussed below. Through these tools the team hopes to address three goals:

  • Overcome participant mobility—over a long follow-up period, many study participants relocate multiple times, making it difficult to find them to complete a follow-up interview;

  • Reduce survey data collection costs— the more quickly interviewers can locate the respondent and complete an interview, the lower the costs per completed survey; and

  • Maintain participant engagement in a complex panel study—the ability to keep participants engaged in the research study for at least three years after enrollment is crucial to understanding long-term outcomes and the effectiveness of the HPOG Program.

Previously Approved Incentives—National Evaluation

In an effort to strengthen participants’ engagement, the study team sends each study participant selected for participation in the Short-Term Follow-up Survey a welcome packet the month after enrollment, followed by quarterly requests for updated contact information. (See Instrument 5a for the welcome packet, previously approved in July 2017.) We provide a non-monetary incentive (a portable cell phone charger) to all participants as part of the welcome packet. This item is branded with the HPOG 2.0 study logo and toll-free number. This incentive is intended in part to remind the participant about the study (rather than the program).

All study participants selected for participation in the Short-Term Follow-up Survey receive periodic requests to update their contact information using the previously approved contact update form, in the time between randomization and the Short-Term Follow-up Survey (about 15 months later) (see Instrument 5b for the contact update form, also previously approved in June 2017).

The participant contact update form does not collect any data for analytic use, but these updates are crucial to ensuring that the contact information in the sample database is as up to date as possible during the follow-up period. The study team will offer an incentive valued at $5 for each round of quarterly participant contact updates. Participants will receive their incentive after they provide updated contact information. These incentives were approved under OMB control number 0970-0462 in June 2017.

Incentives under this Request for Clearance—National Evaluation

In addition to the previously approved, very modest incentives for contact updates, the National Evaluation team requests permission in this clearance request to provide incentives for completion of the Short-Term Follow-up Survey.

Three factors informed the study’s choice of the incentive amounts for survey respondents:

  1. Respondent burden, both at the time of the interview and over the life of the study;

  2. Costs associated with participating in the interview at that time; and

  3. Other studies of comparable populations and burden.


Given a target response rate of 80 percent for the Short-Term Follow-up Survey, and based on the incentive amounts approved for previous rounds of data collection on prior Career Pathways studies (Pathways for Advancing Careers and Education or PACE, and the first round HPOG Impact studies; OMB control numbers 0970-0397 and 0970-0394, respectively), we feel that the appropriate incentive level is $40. Short-Term Follow-up Survey respondents will receive a gift card valued at $40: they will receive an email with instructions to log in to a secure study portal where they can redeem the $40 gift card from their choice of approved vendors.7

The impact evaluation team believes that a meaningful incentive is required at this contact to meet the quality targets set by the What Works Clearinghouse (WWC) and to keep participants actively engaged in ongoing contact updating over the 36-month follow-up period. These incentives also help offset expenses participants may incur, such as cell phone minutes for surveys completed by telephone or childcare and transportation costs for surveys completed in person.

Following completion of the Short-Term Follow-up Survey, a participant will continue to receive quarterly contact update requests in preparation for the Intermediate Survey. The Intermediate Survey will occur approximately 36 months after randomization for selected study participants. The instrument and supporting materials for that effort will be submitted as a separate information collection request.

Incentives—Tribal Evaluation

OMB previously approved the use of incentives for the Tribal Evaluation, as described below, in June 2017 under this OMB Control Number (0970-0462). The Tribal Evaluation will use incentives to encourage participation in focus groups and individual follow-up interviews. Offering incentives to gain cooperation and solicit participation is a well-established practice in social science research and program evaluation for both small-scale studies and sample surveys. Participants are provided incentives as a gesture of appreciation for voluntary participation in data collection activities.

The Tribal Evaluation team worked closely with the tribal grantees to design and implement a culturally responsive evaluation. Our previous experience with the Tribal Evaluation of the first round of HPOG showed the potential for non-response bias due to circumstances experienced by Tribal HPOG participants. HPOG participants in tribal programs very often have substantial family commitments, including caregiving for both children and elderly family members, and may live considerable distances from grantee organizations (where focus groups and interviews typically are conducted). These commitments and the required travel time pose additional burdens on research participation compared with other populations. In addition, the expenses associated with participation, including childcare and transportation, place additional burden on potential respondents.

Additionally, based on our experience working with Tribal grantees and HPOG participants during the first round of HPOG, tribal members can be reluctant to participate in research activities. Monetary incentives are used regularly when conducting research in tribal communities (Sobeck, 2003). Researchers have found financial incentives to be a motivator for tribal participation in research. Use of incentives also increases the likelihood that recruited participants will participate in the data collection activities.

Given these circumstances, there is the potential for non-response bias in the data collection, as participants who have family commitments, longer travel times, expenses associated with research participation, or reluctance to participate in research may not take part in data collection activities. Our prior work with this population showed that participants were more likely to be single mothers, and many participants traveled significant distances to participate in evaluation activities. An insufficient incentive is likely to reduce participation among those with family commitments or longer travel requirements such that they would be underrepresented in data collection activities, thereby resulting in non-response bias. Offering an incentive to participate in the study will therefore help offset the potential for non-response bias.

Given the travel time required for an in-person focus group or interview, incentives for participation in the in-person 90-minute focus group or the in-person 60-minute completer or non-completer interview will be a non-cash honorarium valued at $50. The HPOG 2.0 Tribal Evaluation team will consult with each grantee to determine the most appropriate non-cash honorarium (e.g., a gift certificate to a local grocery store) to send to the participant.

A10: Privacy of Respondents

Information collected will be kept private to the extent permitted by law. Respondents will be informed of all planned uses of data, that their participation is voluntary, and that their information will be kept private to the extent permitted by law.

Participants will be allowed to receive services under the Tribal Evaluation if they do not provide a social security number. For all the non-tribal grantees participating in the national evaluation—and some of the tribal grantees participating in the tribal evaluation—study participants must provide an SSN in order to enroll in the program. For national evaluation participants and tribal evaluation participants who provide SSNs, the previously approved consent forms (Informed Consent Form A, Informed Consent Form B, and Tribal Informed Consent Form A) clearly state how SSNs will be used in the evaluation.

As specified in the evaluator’s contract, the Contractor shall protect respondent privacy to the extent permitted by law and will comply with all Federal and Departmental regulations for private information. The Contractor has developed a Data Security and Monitoring Plan that assesses all protections of respondents’ personally identifiable information. The Contractor shall ensure that all of its employees, subcontractors (at all tiers), and employees of each subcontractor who perform work under this contract/subcontract are trained on data privacy issues and comply with the above requirements. All project and grantee staff with access to PAGES sign a New User Data Security Agreement and undergo training on data privacy and security. Grantees participating in the National Evaluation that do not have their own Institutional Review Board (IRB) or Federalwide Assurance (FWA) sign individual investigator agreements, which allow them protection under Abt’s FWA. Grantees participating in the Tribal Evaluation that do not have their own IRB or FWA will sign individual investigator agreements, which will allow them protection under NORC’s FWA.

As specified in the evaluator’s contract, the Contractor shall use Federal Information Processing Standards (currently FIPS 140-2) compliant encryption (Security Requirements for Cryptographic Modules, as amended) to protect all instances of sensitive information during storage and transmission. The Contractor shall securely generate and manage encryption keys to prevent unauthorized decryption of information, in accordance with the Federal Information Processing Standards. The Contractor shall ensure that this standard is incorporated into its property management/control system and establish a procedure to account for all laptop computers, desktop computers, and other mobile devices and portable media that store or process sensitive information. Any data stored electronically will be secured in accordance with the most current National Institute of Standards and Technology (NIST) requirements and other applicable Federal and Departmental regulations. In addition, the Contractor has submitted a plan for minimizing, to the extent possible, the inclusion of sensitive information on paper records (e.g., the consent forms) and for the protection of any paper records, field notes, or other documents that contain sensitive or personally identifiable information, ensuring secure storage and limits on access.

For the HPOG 2.0 National Evaluation and Tribal Evaluation, none of the respondents that participate in interviews or focus groups will be identified in any report or publication of this study or its results; their participation will be voluntary; and their information will be kept private. This information will be provided verbally to interview respondents in both studies, and verbal consent will be requested.

As a part of informed consent, the following rationale for data collection and privacy assurances will be provided to HPOG 2.0 participants by grantees:

  • Research is being conducted to see if and how HPOG 2.0 makes a difference in people’s lives by helping them complete training and get healthcare jobs. This program and research are funded by HHS, and HHS may fund other research on this program in the future.

  • In this program, grantees will collect some personal information from individuals, such as their name, date of birth, Social Security number, and involvement in other programs.8 The researchers studying the program for the government also need this information. Researchers will use data security procedures to keep all of the study data private and to protect individuals’ personal information. All of the information collected for the program or for the research studies will be kept completely private to the extent allowed by law, and no one’s name will ever appear in any report or discussion of the evaluation results.

  • Researchers may contact applicants at grantees participating in the impact evaluation in the future. Individuals may refuse to answer any specific question at any time.

A.10.2 PAGES

OPRE published a Privacy Impact Assessment (PIA) to ensure that information handling conforms to applicable legal, regulatory, and policy requirements regarding privacy; to determine the risks of collecting and maintaining PII; to assist in identifying protections and alternative processes for handling PII to mitigate potential privacy risks; and to communicate the information system’s privacy practices to the public. This PIA, titled Participant Accomplishment and Grant Evaluation System (PAGES), was approved on October 9, 2015 and is available online through HHS at http://www.hhs.gov/pia/#System.

PAGES was developed using the highest standards of technology and data security. Grantee-level and individual-level records will be stored securely in a SQL Server database. The web interface for data entry and reporting is built on the industry-leading Microsoft Dynamics customer relationship management (CRM) platform. The system is hosted and maintained on Microsoft Dynamics CRM Online Government, a highly secure, Federal Information Security Management Act (FISMA) Moderate-compliant, cloud-based Software as a Service (SaaS) solution.

Accounts on the web server will be protected with dual-factor authentication, consisting of a password and an additional means of authentication. Dynamics CRM uses HTTPS with the SSL/TLS protocol, providing encrypted communication and secure identification of the network web server.

The platform is heavily used by other Federal Government organizations with externally facing instances and has undergone and passed all Authority to Operate (ATO) and security protocols within those organizations. All data is filtered using the security model, so records and fields containing Personally Identifiable Information (PII) are removed for users who do not have authorization. Logging and output files will not contain private data and will be limited to generic system and error data.

PAGES will support field-level security so users without authorization to specific data do not see those data on forms, views, or reports. Thus, private participant data such as Social Security numbers will be entered into the system and encrypted at the field level but will not be visually displayed or downloadable by system users. User-identifiable participant-specific data will be stored separately from grantee-level data and will be available for updating only by the grantee representative who originally entered the data. Grantee-specific data will be available to the project team in specific extracts and reports once the information has been entered and submitted. Information will not be maintained in a paper or electronic system from which it is actually or directly retrieved by an individual’s personal identifier.
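For illustration only, the following minimal sketch shows the general field-level security pattern described above: a record is filtered so that PII fields are withheld from users whose role does not authorize access. The field names, roles, and permission map are hypothetical and are not drawn from the PAGES data dictionary; the production system enforces this behavior through Microsoft Dynamics CRM security roles rather than application code like this.

```python
# Illustrative sketch only -- not the PAGES/Dynamics CRM implementation.
# Field names, roles, and the permission map below are hypothetical.

PII_FIELDS = {"ssn", "date_of_birth", "last_name", "first_name"}

ROLE_CAN_VIEW_PII = {
    "grantee_data_entry": True,    # e.g., the grantee staff member who entered the record
    "project_analyst": False,      # analysts see only de-identified fields
    "system_admin": False,
}

def filter_record(record: dict, role: str) -> dict:
    """Return a copy of the record with PII fields masked for unauthorized roles."""
    if ROLE_CAN_VIEW_PII.get(role, False):
        return dict(record)
    return {k: ("<restricted>" if k in PII_FIELDS else v) for k, v in record.items()}

# Example usage with a fabricated record:
participant = {"ssn": "000-00-0000", "first_name": "Pat", "enrolled": True}
print(filter_record(participant, "project_analyst"))
# {'ssn': '<restricted>', 'first_name': '<restricted>', 'enrolled': True}
```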

Grantees have received detailed guidelines and training on data entry and security procedures. Clearly defined variables and labeled fields specify how to enter each data element. Training and supporting guidance documents have been provided to grantees and technical assistance on the system is available to grantees throughout their grant period of performance.

A.10.3 Data Storage and Handling of Survey Data

To ensure data security and enhance data quality, the Short-Term Follow-up Survey will be administered using computer-assisted personal interviewing (CAPI) technology. Survey data will be collected using the Confirmit CAPI System, which has the following security features:

  1. Data on the CAPI console is encrypted with the Rijndael (AES) algorithm using a 256-bit key (an illustrative sketch of this type of encryption appears after the lists below).

  2. CAPI data transfers use Web Services Enhancements (WSE 3.0) for security; messages sent to and received from the console are encrypted. WSE 3.0 provides AES-128 and RSA 1.5 as the default algorithms for symmetric encryption and key wrap. The research team has also implemented Secure Conversation with an X.509 certificate (which uses a 1,024-bit key).


In addition to the standard security features offered through the CAPI software, the research team has implemented the following enhancements:

  1. Use of PGP whole disk encryption on all CAPI laptops and tablets, and

  2. The file transfers are made to servers running SSL.
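For illustration only, the following minimal sketch shows AES-256 authenticated encryption of a survey record at rest, using Python’s third-party cryptography package. It is not the Confirmit CAPI System’s actual implementation: the record contents are fabricated, and real keys would be generated and managed under the project’s FIPS-compliant key-management procedures rather than created in application code.

```python
# Illustrative sketch only -- not the Confirmit CAPI implementation.
# Requires the third-party "cryptography" package (pip install cryptography).
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# 256-bit key; in practice the key would be generated and stored per the
# project's FIPS 140-2 compliant key-management procedures, never hard-coded.
key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

response_data = b'{"case_id": "EXAMPLE-001", "q1": "some answer"}'  # fabricated record
nonce = os.urandom(12)                     # unique per encryption operation
ciphertext = aesgcm.encrypt(nonce, response_data, None)

# Decryption by an authorized process holding the same key and nonce:
plaintext = aesgcm.decrypt(nonce, ciphertext, None)
assert plaintext == response_data
```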



As surveys are completed, data will be transferred from the CAPI system to the study’s database in a secure manner, using a FIPS-certified encryption algorithm. Once the Short-Term Follow-up Survey data collection is complete, all survey records will be transferred to the analytic database, stored in Abt Associates’ secure Analytical Computing Environment (ACE3), a FISMA Moderate-compliant server environment where most analyses will be conducted. ACE3 currently provides:

  • A secure, isolated environment utilizing Amazon's FedRAMP Moderate accredited services as infrastructure

  • Secure server and application configurations that meet NIST SP 800-53 Revision 4 FISMA Moderate standards where appropriate, with compliant policies and procedures

  • FedRAMP Moderate accredited file transfer services for moving data in and out of the system

  • Fully redundant architecture where possible, with architected scalability and elasticity to handle the storage and processing of large data sets by increasing available memory, CPU, or disk space

  • Enhanced monitoring by AWS CloudWatch and a leading third-party log monitoring vendor (Dell SecureWorks)

  • Enhanced availability and backups using native AWS services

The analytic databases are designed to limit access to authorized users, with levels of access commensurate with each person’s role on the project. PII will be separated from the rest of the information and stored in a separate folder that only the project director (PD), the deputy project director (DPD), and one “firewalled” analyst will be able to access. The de-identified survey data will be stored in folders accessible by the PD, the DPD, the principal investigator (PI), the director of analysis, and a small team of other statisticians, economists, and analysts. Only tabular data and other high-level summaries (such as regression coefficients) will be stored on the general servers of the prime contractor, shared with subcontractors via email, and eventually published. The web server hosting the database is maintained in a secure facility with power backup, network redundancy, and system monitoring. In addition, a daily backup of the server is maintained at the data center and an off-site location. The database and website are password protected, and access is provided only after user authentication.
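The separation of PII from de-identified analytic data described above follows a common pattern: identifiers are held in a restricted crosswalk keyed to a study ID, while the analysis team works only with de-identified files. The minimal sketch below, written in Python with the pandas package and using hypothetical column names and fabricated values, illustrates that pattern only; it is not the project’s actual file structure or code.

```python
# Illustrative de-identification sketch; all column names and values are hypothetical.
# Requires the third-party pandas package (pip install pandas).
import pandas as pd

survey = pd.DataFrame({
    "study_id": ["S001", "S002"],                 # study identifier used for linking
    "ssn": ["000-00-0001", "000-00-0002"],        # fabricated PII values
    "name": ["Pat Example", "Sam Example"],
    "employed_at_followup": [1, 0],
    "hourly_wage": [14.50, None],
})

PII_COLS = ["ssn", "name"]

# Restricted crosswalk linking study IDs to identifiers; in practice this file
# would sit in the folder accessible only to the PD, DPD, and firewalled analyst.
crosswalk = survey[["study_id"] + PII_COLS]

# De-identified analytic file for the broader analysis team.
analytic = survey.drop(columns=PII_COLS)

print(list(crosswalk.columns))   # ['study_id', 'ssn', 'name']
print(list(analytic.columns))    # ['study_id', 'employed_at_followup', 'hourly_wage']
```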

For participant-level data collected from both the survey and corresponding administrative data from the National Student Clearinghouse (NSC), computer security will be maintained through individual passwords and folder permissions that limit file access to only those project staff members who require it and have appropriate permission.

All administrative data from the National Directory of New Hires (NDNH) will reside on secure ACF servers. Only Abt staff members granted ACF security clearance will have access to the data, via ACF-loaned laptops and the secure folder. All analysis of NDNH data will be conducted on ACF’s secure server.

Information will not be maintained in a paper or electronic system from which it is actually or directly retrieved by an individual’s personal identifier.

A11: Sensitive Questions

This section summarizes the sensitive questions asked of respondents under the HPOG 2.0 National and Tribal Evaluations. It first summarizes the items that may be perceived as sensitive in nature by participants for the previously approved participant-level data collection protocols—the PAGES and Tribal Evaluation protocols. It then provides an overview of the sensitive questions contained in the National Evaluation impact evaluation’s Short-Term Follow-up Survey, the subject of this information collection request.

PAGES Participant-Level Baseline Questions

The previously approved PAGES participant-level baseline questions—those pertaining to individual participant characteristics—may be considered sensitive by some program participants; examples include questions about criminal records, disabilities, and limited English proficiency. These data are standard items for other workforce development programs and will allow important comparisons between HPOG and other similar efforts for evaluation and management purposes. In addition, these data are needed to fully identify programs and program characteristics that are most successful in serving the vulnerable populations that HPOG was designed to support and that are a focus of ACF’s assistance programs.

Individual identifying information of a sensitive, personal, or private nature that all HPOG 2.0 grantee applicants will provide includes: (1) last and first name; (2) Social Security number; (3) date of birth; (4) ethnicity and race; (5) marital status; (6) number of children; (7) whether the individual is a TANF or SNAP recipient; (8) disabilities; (9) limited English proficiency; and (10) employment status at program intake and exit. These items were previously approved under this OMB control number (0970-0462) in August 2015.

HPOG 2.0 Tribal Evaluation Participant Data Collection Protocols

Several questions in the HPOG 2.0 Tribal Evaluation program participant focus groups (Instrument 9), HPOG 2.0 Tribal Evaluation program participant completer interviews (Instrument 10), and HPOG 2.0 Tribal Evaluation program participant non-completer interviews (Instrument 11) may be considered sensitive by some program participants. These questions ask about participant and family needs and what types of supportive services were received, including academic, social, and employment-related services. These questions are necessary because supportive services are a key component of the HPOG Program. Data collected will be used to identify how HPOG programs assess student needs and what types of services they are offering as part of their programs. Participants will be informed that their participation is voluntary, that they may decline to answer any question, and that their information will be kept private and they will not be identified in any report or publication of this study or its results. These questions were previously approved under this OMB control number (0970-0462) in June 2017.

HPOG 2.0 National Evaluation Short-Term Follow-up Survey

The Short-Term Follow-up Survey includes several questions about overall physical health, income, receipt of government benefits, fertility, and household composition, items that some respondents may consider sensitive. Because it is hoped that HPOG 2.0 will have favorable impacts in all of these areas, failure to ask about any of them would limit the findings of the evaluation. Interviewers will remind study members during the interview that they may refuse to answer individual items. Study members will also be reminded that their responses will be kept private, to encourage candid responses.

A12: Estimation of Information Collection Burden

A12.1 Previously Approved Information Collections

Total Burden Previously Approved


The previously approved burden estimates included: 1) burden on grantee staff members who enter grantee-level and ongoing participant-level data into PAGES to complete the HPOG PPRs; 2) burden on HPOG applicants to complete the baseline questions; and 3) burden on grantee staff who enter the baseline data into PAGES.

The total burden for all previously approved instruments was estimated to be 7,609 hours annually, or 22,828 hours total.

Burden Remaining from Previously Approved Information Collection

Estimated burden remaining to continue use of the previously approved instruments is 5,849 hours annually, or 17,547 total hours over the next three years. Exhibit A-5 shows the remaining hourly and cost burden.

Exhibit A-5: Burden Remaining from Previously Approved Information Collection

| Instrument | Total Number of Respondents | Annual Number of Respondents | Number of Responses Per Respondent | Average Burden Hours Per Response | Annual Burden Hours | Average Hourly Wage | Total Annual Cost |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Instrument 1: PAGES Grantee- and Participant-Level Data Collection (all grantees) | 56 | 19 | 2 | 31.75 | 1,207 | $28.29 | $34,146.03 |
| Instrument 1: PAGES Participant-Level Baseline Data Collection (participants at non-Tribal grantees) | 12,780 | 4,260 | 1 | 0.5 | 2,130 | $3.94 | $8,392.20 |
| Instrument 1: PAGES Participant-Level Baseline Data Collection (participants at Tribal grantees) | 615 | 205 | 1 | 0.25 | 51 | $3.94 | $200.94 |
| Instrument 4: HPOG 2.0 National Evaluation in-person implementation interviews9 | 100 | 33 | 1 | 1.5 | 50 | $28.29 | $1,414.50 |
| Instrument 5a: HPOG 2.0 National Evaluation welcome packet and participant contact update forms | 13,650 | 4,550 | 1 | 0.1 | 455 | $10.15 | $4,618.25 |
| Instrument 5b: HPOG 2.0 National Evaluation letter and participant contact update form | 14,700 | 4,900 | 3 | 0.1 | 1,470 | $10.15 | $14,920.50 |
| HPOG 2.0 Tribal Evaluation | | | | | | | |
| Instrument 6: HPOG 2.0 Tribal Evaluation grantee and partner administrative staff interviews | 105 | 35 | 1 | 1 | 35 | $28.29 | $990.15 |
| Instrument 7: HPOG 2.0 Tribal Evaluation program implementation staff interviews | 150 | 50 | 1 | 1.5 | 75 | $28.29 | $2,121.75 |
| Instrument 8: HPOG 2.0 Tribal Evaluation employer interviews | 90 | 30 | 1 | 0.75 | 23 | $50.99 | $1,172.77 |
| Instrument 9: HPOG 2.0 Tribal Evaluation program participant focus groups | 405 | 135 | 1 | 1.5 | 203 | $10.15 | $2,060.45 |
| Instrument 10: HPOG 2.0 Tribal Evaluation program participant completer interviews | 300 | 100 | 1 | 1 | 100 | $10.15 | $1,015.00 |
| Instrument 11: HPOG 2.0 Tribal Evaluation program participant non-completer interviews | 150 | 50 | 1 | 1 | 50 | $10.15 | $507.50 |
| Estimated Annual Burden Previously Approved | | | | | 5,849 | | $71,560.04 |
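For transparency, the annual burden hours in each row of Exhibit A-5 are the product of the annual number of respondents, the number of responses per respondent, and the average burden hours per response (rounded to whole hours), and each row’s annual cost is its burden hours multiplied by the applicable hourly wage. The short sketch below reproduces this arithmetic for two rows and checks the annual total; it uses only figures shown in the exhibit.

```python
# Reproduces the arithmetic behind Exhibit A-5 (all figures taken from the exhibit).
def annual_burden_hours(annual_respondents, responses_per_respondent, hours_per_response):
    """Annual burden hours for one instrument row, rounded to whole hours."""
    return round(annual_respondents * responses_per_respondent * hours_per_response)

# Instrument 1, participants at non-Tribal grantees: 4,260 x 1 x 0.5 = 2,130 hours.
hours_baseline = annual_burden_hours(4260, 1, 0.5)
cost_baseline = round(hours_baseline * 3.94, 2)           # 2,130 x $3.94 = $8,392.20

# Instrument 5b, contact update letter and form: 4,900 x 3 x 0.1 = 1,470 hours.
hours_5b = annual_burden_hours(4900, 3, 0.1)
cost_5b = round(hours_5b * 10.15, 2)                      # 1,470 x $10.15 = $14,920.50

# The annual burden hours across all rows of Exhibit A-5 sum to 5,849.
row_hours = [1207, 2130, 51, 50, 455, 1470, 35, 75, 23, 203, 100, 50]
assert hours_baseline == 2130 and hours_5b == 1470
assert sum(row_hours) == 5849
```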


A12.2 Additional Burden for Previously Approved Information Collection

The actual number of participants expected in the National Evaluation and the Tribal Evaluation over the next three years exceeds the enrollment numbers assumed in the previously approved participant burden estimates (see Section A15 for more detail). See Supporting Statement B1 for details on the respondent universe. Exhibit A-6 shows the additional burden estimates, in both hours and cost, associated with the higher-than-expected projected enrollment.

Exhibit A-6: Additional Burden for Previously Approved Information Collection

| Instrument | Total Number of Respondents | Annual Number of Respondents | Number of Responses Per Respondent | Average Burden Hours Per Response | Annual Burden Hours | Average Hourly Wage | Total Annual Cost |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Instrument 1: PAGES Participant-Level Baseline Data Collection (participants at non-Tribal grantees) | 9,400 | 3,133 | 1 | 0.5 | 1,567 | $3.94 | $6,173.98 |
| Instrument 1: PAGES Participant-Level Baseline Data Collection (participants at Tribal grantees) | 1,463 | 488 | 1 | 0.25 | 122 | $3.94 | $480.68 |
| Estimated Additional Annual Burden Previously Approved | | | | | 1,689 | | $6,654.66 |


A12.3 Newly Requested Information Collections

Exhibit A-7 presents the reporting burden in both hours and cost for National Evaluation impact evaluation participants who complete the Short-Term Follow-up Survey.


Exhibit A-7: Burden for Newly Requested Information Collection

| Instrument | Total Number of Respondents | Annual Number of Respondents | Number of Responses Per Respondent | Average Burden Hours Per Response | Annual Burden Hours | Average Hourly Wage | Total Annual Cost |
| --- | --- | --- | --- | --- | --- | --- | --- |
| HPOG 2.0 National Evaluation | | | | | | | |
| Instrument 12: Short-Term Follow-up Survey for the HPOG 2.0 National Evaluation Impact Evaluation | 10,400 | 3,467 | 1 | 1 | 3,467 | $10.15 | $35,190.05 |
| Estimated Annual Burden Total | | | | | 3,467 | | $35,190.05 |



A12.4 Total Burden under OMB #0970-0462

Exhibit A-8 shows that the estimated annual respondent burden over the next three years is 11,005 hours. This total comprises the burden remaining from the previously approved information collection, the additional burden hours for the previously approved information collection due to higher enrollment projections, and the annual burden for the new information collection.

Exhibit A-8: Total Burden under OMB #0970-0462

| Instrument | Annual Burden Hours |
| --- | --- |
| Burden Remaining from Previously Approved Information Collection | 5,849 |
| Additional Burden for Previously Approved Information Collection | 1,689 |
| Burden for New Information Collection | 3,467 |
| Total Annual Burden Hours | 11,005 |

Total Annual Cost

To compute the total estimated annual cost reported in Exhibits A-5, A-6, and A-7, evaluators used the average wage for HPOG 1.0 participants employed at program intake ($10.64) multiplied by the proportion of participants working at intake (0.37), for an average hourly wage of $3.94. Evaluators believe the HPOG 1.0 data provide an accurate basis for estimating wages of HPOG 2.0 study participants for the previously approved information collection under PAGES; the baseline wage was appropriate for the original HPOG Next Generation submission because the PAGES system collects wage information at the time of enrollment. Because this is a job training program, we have revised the cost in the burden table in Supporting Statement A to reflect the loaded federal minimum wage. The loaded federal minimum wage was used in the previously approved information collection requests for the HPOG 1.0 15- and 36-Month Follow-up Surveys and the PACE 15- and 36-Month Follow-up Surveys (OMB Nos. 0970-0394 and 0970-0397, respectively). For grantee and partner organization data collection efforts, the total burden costs were multiplied by the average hourly wage according to the Bureau of Labor Statistics, National Compensation Survey, 2010 ($28.29/hour).10 The average hourly wage for the employer interviews is based on Bureau of Labor Statistics code 11-9111, Medical and Health Services Managers ($50.99). The total annual cost burden of collecting this new information is $35,190.05. We estimate that the annual cost for the remaining previously approved data collection is $71,560.04 per year over the next three years. The annual cost associated with the additional previously approved information collection is $6,654.66. The total annual cost burden for all efforts combined is estimated at $113,404.75.
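The wage assumption and combined totals described above can be reproduced directly from the figures stated in this section; the short sketch below shows the derivation of the $3.94 participant wage and the $113,404.75 combined annual cost.

```python
# Reproduces the cost arithmetic described above; all figures appear in this section.

# Participant wage used for the previously approved PAGES baseline burden estimates:
hpog1_wage_at_intake = 10.64       # average wage of HPOG 1.0 participants employed at intake
share_working_at_intake = 0.37     # proportion of HPOG 1.0 participants working at intake
participant_wage = round(hpog1_wage_at_intake * share_working_at_intake, 2)
assert participant_wage == 3.94

# Combined annual cost burden across the three exhibits:
new_collection = 35190.05          # Exhibit A-7, Short-Term Follow-up Survey
previously_approved = 71560.04     # Exhibit A-5, remaining previously approved collection
additional_previous = 6654.66      # Exhibit A-6, additional previously approved collection
total = round(new_collection + previously_approved + additional_previous, 2)
assert total == 113404.75
```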

A13: Cost Burden to Respondents or Record Keepers

Not applicable. The proposed information collection activities do not place any new capital cost or cost of maintaining capital requirements on respondents.

A14: Estimate of Cost to the Federal Government

The total cost for the data collection activities under this request is $8,473,750. The costs for the previously approved requests will be $1,788,164 for the National Evaluation descriptive evaluation and $1,225,193 for the Tribal Evaluation, for a total of $3,013,357, plus $2,020,248 for the original submission. Thus, the total cost to the Federal government is $14,732,548. Annual costs to the Federal government will be $4,910,849 per year for three years.

A15: Change in Burden

This is an additional information collection request under OMB #0970-0462. The burden estimates include estimates for the new HPOG 2.0 National Evaluation Short-Term Follow-up Survey, the subject of this information collection request, as well as changes to the previously approved burden estimates. The previously approved burden estimates covered the first three years of study enrollment under the HPOG 2.0 grants; the tribal and non-tribal grantees will enroll participants over a four-and-a-half-year period. This revised burden estimate covers the next three years, incorporating the last year and a half of enrollment. As a result, enrollment projections over the next three years are higher and, by extension, the number of participants expected to enroll in HPOG 2.0 and complete the baseline intake form (Instrument 1) is higher for both the tribal and non-tribal grantees.

A16: Plan and Time Schedule for Information Collection, Tabulation and Publication

16.1 Analysis Plan

Exhibit A-9 summarizes the primary domains covered in the Short-Term Follow-up Survey instrument and briefly discusses how each will be used.

Exhibit A-9: Domains for HPOG 2.0 Short-Term Follow-up Survey Instrument

A. Training and employment history

Notes: Dates of every school and job spell since randomization; reasons for no school/job during gaps; careful probing for simultaneous study and work as well as multiple job holding.

Uses:

  • Collecting school names to match to the Integrated Postsecondary Education Data System (IPEDS) improves the ability to classify school type and control beyond what IPEDS alone allows

  • Maximize reporting of short-term job and training spells

  • Get accurate measurement of total months of training

B. School Experiences

Notes: For each school spell: length of break periods, credits, typical weekly instructional hours, program completion, financing of training, support services, employer involvement, other skills training, student evaluation of teaching, and student evaluation of counseling services.

Uses:

  • Improve measurement of total hours of training

  • Measure credits and program completion (confirmatory outcome) as signs of progress toward credentialing

  • Measure program costs for the cost-benefit analysis

  • Measure variation in implementation across programs for use in attempting to explain variation in program impact

C. Credential attainment and education/career goals

Notes: Mostly about credentials, both those issued by schools and those issued by other authorities.

Uses:

  • Secondary and exploratory outcomes for short-term report

D. Terms of employment and conditions at current/last job

Notes: Occupation, scheduling, hourly wage rate, typical hours, benefits, other quality measures.

Uses:

  • Exploratory outcomes for short-term report

E. Household composition

Notes: Living arrangements, counts of adults and children, family formation, child bearing.

Uses:

  • Exploratory outcomes for short-term report

F. Income and financial well-being

Notes: Personal and household participation in government anti-poverty programs as well as income; questions on financial well-being and material hardship.

Uses:

  • Secondary and exploratory outcomes for short-term report

G. 21st Century Skills

Notes: Use of literacy and numeracy skills at work and in everyday life; self-directed learning.

Uses:

  • Exploratory outcomes for short-term report

  • Possible mediators for 36-month report

The HPOG 2.0 National Evaluation team will produce several reports using the data collected for the descriptive evaluation. The reports will include:

  • Descriptive Evaluation Report. This report will summarize the information on program implementation features, challenges, and best practices using the descriptive evaluation interviews, site visit data, and data from PAGES. This report will include the implementation, outcome, and systems studies. The evaluation will also use results from the implementation study to produce short case study reports on focus areas of specific interest to ACF.

  • Impact Evaluation Reports. Findings from the implementation study will inform the analysis in the evaluation’s impact evaluation reports. The evaluation is expected to produce reports on results based on 15-month and 36-month follow-up surveys and associated administrative data analysis.

The HPOG 2.0 Tribal Evaluation will use a systematic approach to analyze the data obtained through the interviews and focus groups conducted during and following annual site visits. The evaluation team will use NVivo software to store and analyze the large volume of data collected over the course of the evaluation. NVivo will be used to develop a coding scheme for analyzing these data. The coding scheme will be organized around evaluation topic areas derived from the evaluation questions. The coding scheme will be applied to all data and emergent key themes relating to evaluation topic areas will be identified.

The HPOG 2.0 Tribal Evaluation team will prepare a variety of reports, including site visit reports, practice briefs, and a final report.

  • Site Visit Reports. These reports will be developed after each annual site visit and summarize the findings from the interviews and focus groups.

  • Practice Briefs. Practice briefs will be shorter documents that highlight findings from the evaluation and share lessons learned.

  • Final Report. The final report will reflect the aggregated analysis of all qualitative and quantitative data collected throughout the evaluation.

PAGES will provide a platform for grantee representatives who monitor overall grant implementation to enter semi-annual progress toward grant objectives, which will be submitted as required by ACF. The system will automatically generate quantitative measures for the federally required semi-annual PPRs, which will include aggregated participant-level data, and will also store narrative-based grantee-level performance information. Grantees will print the PPRs from PAGES, sign the paper documents, and submit them to ACF. ACF will use these tables when preparing reports to Congress on the HPOG initiative. HPOG PAGES data collection activities will also support three annual report deliverables that will include information such as characteristics of grantee programs, the number and characteristics of participants, and information on program participants’ receipt of training and services and their employment and training outcomes. The PAGES team will produce a number of reports using the data collected, including the six semi-annual PPRs and three annual reports.

16.2 Time Schedule and Publications

The National Evaluation descriptive evaluation data collection began in July 2017, following OMB approval of the previous package. Contact updates for participants in the National Evaluation impact evaluation sample began in November 2017 and continue throughout the follow-up period. The Tribal Evaluation data collection began in October 2017. PAGES data collection will occur as individuals apply for the programs and as enrollees receive training and services throughout the next three years of the grant period. Exhibit A-10 presents an overview of the project schedule for information collection.

Exhibit A-10: Project Schedule for Data Collection, Analysis, and Publication

| Task | Timing |
| --- | --- |
| National Evaluation: Descriptive Evaluation | |
| Descriptive evaluation data collection (includes costs, systems, and program implementation) | June 2017-December 2018 |
| National evaluation descriptive study site visits | Fall 2018 |
| Descriptive evaluation Analysis Plan | Fall 2017 |
| Descriptive evaluation Report (including implementation, outcome, and systems studies) | Final March 2020 |
| National Evaluation: Impact Evaluation Participant Contact Updates | |
| Welcome Packets | Monthly, one month after random assignment, beginning in Fall 2017 |
| Contact Update Mailing | Quarterly, beginning 3 months after random assignment (November 2017) |
| Short-Term Follow-up Survey Data Collection | June 2018-September 2019 (15 months after randomization for participants enrolled between March 2017 and March 2018) |
| Draft Report to ACF | May 2020 |
| HPOG 2.0 Tribal Evaluation | |
| Site visits to tribal grantees (1/year) | Annually, spring/summer of 2017, 2018, 2019, and 2020 |
| Conduct data analysis | 2017-2021 |
| Develop Practice Briefs | Annually, September 2017-September 2021 |
| Develop Final Report | September 2021 |
| HPOG 2.0 National and Tribal Evaluation Participant Accomplishment and Grantee Evaluation System (PAGES) | |
| PAGES grantee-level and ongoing participant-level data collection | September 2015-September 2018 |
| 6 semi-annual PPRs | September 2015-September 2018 |
| Two annual reports | September 2015-September 2018 |

A17: Reasons Not to Display OMB Expiration Date

All instruments created for the HPOG 2.0 National Evaluation and Tribal Evaluation will display the OMB approval number and the expiration date for OMB approval.

A18: Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary for this information collection.

1 By “counseling” we mean services such as tutoring, academic advising, financial aid advising, career counseling, job search or placement assistance, and case management.

2 The research questions for the previously approved National Evaluation descriptive study and the Tribal Evaluation are shown in Attachment O. This request for clearance focuses on the research questions for the National Evaluation impact study. Research questions for the National Evaluation cost benefit study will be provided in a subsequent request for clearance.

3

4 The HPOG 2.0 impact evaluation is a panel study. The three primary points of data collection are the previously approved Baseline Intake Form administered immediately prior to randomization, the Short-Term Survey, initiated 15 months after randomization (for which we request clearance in this package) and the Intermediate Survey, projected to begin 36 months after randomization (but for which clearance is not requested in this package).

5 See Chapter 12 of Lynn (2009), in particular, section 12.5 that reviews the effects of incentives in several prominent panel studies.

7 In accordance with HPOG funding requirements, the incentives can be redeemed only through vendors that do not sell alcohol, tobacco, firearms or other entertainment.

8 Two Tribal grantees will not collect social security numbers from some or all of their participants. A unique identifier will be assigned for these participants. Two versions of the Tribal informed consent forms were developed, one that includes social security numbers and one for grantees using unique identifiers.

9 This burden estimate reflects the average across all instruments included in Instrument 4.

10 Source: Bureau of Labor Statistics, National Compensation Survey, 2010: combined average hourly wage across education, training, and library occupations and community and social services occupations.

