Supporting Statement for OMB Clearance Request
Part B
National and Tribal Evaluation of the 2nd Generation of the Health Profession Opportunity Grants
0970-0462
Revised September 2018
Submitted by:
Office of Planning, Research & Evaluation
Administration for Children & Families
U.S. Department of Health and Human Services
Federal Project Officers:
Hilary Forster
Nicole Constance
Amelia Popham
B.1 Respondent Universe and Sampling Methods
B.2 Procedures for Collection of Information
B.3 Methods to Maximize Response Rates and Deal with Non-response
B.4 Tests of Procedures or Methods to be Undertaken
B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
Attachments:
Previously Approved Instruments
Instrument 1: PAGES Grantee- and Participant-Level Data Items List
Instrument 2: HPOG 2.0 National Evaluation Screening Interview
Instrument 3: HPOG 2.0 National Evaluation First-Round Telephone Interview Protocol
Instrument 4: HPOG 2.0 National Evaluation In-Person Implementation Interviews
Instrument 4A: HPOG 2.0 National Evaluation In-Person Implementation Interview
Instrument 4B: HPOG 2.0 National Evaluation In-Person Implementation Interviews (Basic Skills Training)
Instrument 4C: HPOG 2.0 National Evaluation In-Person Implementation Interviews (Career Pathways)
Instrument 4D: HPOG 2.0 National Evaluation In-Person Implementation Interviews (Work-Readiness)
Instrument 4E: HPOG 2.0 National Evaluation In-Person Implementation Interviews (Sustainability)
Instrument 5: HPOG 2.0 National Evaluation Welcome Packet and Participant Contact Update Forms
Instrument 5a: HPOG 2.0 National Evaluation Welcome Packet and Contact Update Form
Instrument 5b: HPOG 2.0 National Evaluation Participant Contact Update Letter and Form
Instrument 6: HPOG 2.0 Tribal Evaluation Grantee and Partner Administrative Staff Interviews
Instrument 7: HPOG 2.0 Tribal Evaluation Program Implementation Staff Interviews
Instrument 8: HPOG 2.0 Tribal Evaluation Employer Interviews
Instrument 9: HPOG 2.0 Tribal Evaluation Program Participant Focus Groups
Instrument 10: HPOG 2.0 Tribal Evaluation Program Participant Completer Interviews
Instrument 11: HPOG 2.0 Tribal Evaluation Program Participant Non-Completer Interviews
New Instruments Included in this Request
Instrument 12: HPOG 2.0 National Evaluation Short-Term Follow-up Survey
Attachment A: References
Attachment B: Previously Approved Informed Consent Forms
Attachment B2: Tribal Evaluation informed consent form A (SSNs)
Attachment B3: Tribal Evaluation informed consent form B (Unique identifiers)
Attachment C: 60 Day Federal Register Notice
Attachment D: Previously Approved Sources and Justification for PAGES Grantee- and Participant-Level Data Items
Attachment E: Previously Approved Final Updated Attachment E PPR Data List and Mockup
Attachment F: First Round of HPOG Grantees Research Portfolio
Attachment G: Previously Approved Participant Contact Information Update Letter and Form
Attachment H: HPOG Logic Model
Attachment I: Previously Approved Focus Group Participant Consent Form
Attachment J: Previously Approved Interview Verbal Informed Consent Form
Attachment K: NEW HPOG 2.0 National Evaluation Short-Term Follow-up Survey Advance Letter
Attachment L: NEW HPOG 2.0 National Evaluation Short-Term Follow-up Survey Sources
Attachment M: NEW HPOG 2.0 National Evaluation Short-Term Follow-up Survey Trying to Reach You Flyer
Attachment N: NEW HPOG 2.0 National Evaluation Short-Term Follow-up Survey Email Reminder
Attachment O: NEW Research Questions for Previously Approved Data Collection Efforts (National Evaluation and Tribal Evaluation)
Part B of the Supporting Statement for the Health Profession Opportunity Grants 2.0 (HPOG 2.0) National and Tribal Evaluation, sponsored by the Administration for Children and Families (ACF) in the U.S. Department of Health and Human Services (HHS), addresses the issues pertaining to Collections of Information Employing Statistical Methods. Abt Associates (Abt) is the prime contractor for the study. Abt and its partners MEF Associates, the Urban Institute, and Insight Policy Research are conducting the HPOG 2.0 National Evaluation, and partner NORC is conducting the Tribal Evaluation. Abt and partners the Urban Institute and Green Beacon are responsible for the design and implementation of the Participant Accomplishment and Grantee Evaluation System (PAGES). PAGES collects information from all HPOG grantees on their program designs and offerings, baseline intake information on eligible applicants (both treatment and control group members), and records of participants' activities and outcomes.
The federal evaluations of the HPOG 2.0 National and Tribal grantees will evaluate postsecondary career pathway programs, focused on the healthcare sector, that target Temporary Assistance for Needy Families (TANF) recipients and other low-income individuals. This submission seeks approval for three non-substantive changes to previously approved information collection requests. First, it requests approval for minor changes in content to the National Evaluation impact study's Short-Term Follow-up Survey (Instrument #12, approved in June 2018). Second, it seeks approval to add one additional question to Instrument #12. Third, it seeks approval for a modest increase in burden for the previously approved in-person implementation interviews (Instrument #4, approved in June 2017). Justification for these non-substantive changes can be found in the supplementary document HPOG 2.0 Memo to OMB_Pretest changes_Expanded Site Visits_V4_REV091118.docx.
B.1 Respondent Universe and Sampling Methods

Thirty-two HPOG grants were awarded in September 2015 to government agencies, community-based organizations, post-secondary educational institutions, and tribal-affiliated organizations. Of these, 27 were awarded to non-tribal entities and five were awarded to tribal organizations.
All 32 grantees will participate in this federally sponsored evaluation. No statistical sampling is required for the HPOG 2.0 National Evaluation descriptive evaluation or the HPOG 2.0 Tribal Evaluation. Evaluators will work closely with grantees to identify participants in the respective studies. For the National Evaluation impact evaluation, the evaluators will select up to 13,000 study participants, beginning with the cohort enrolled in March 2017, for inclusion in the follow-up survey sample. Rather than randomly sampling from everyone randomized, the surveys will sample only those randomized within a narrow time period: the Short-Term Follow-up Survey will include all of the projected 13,000 people randomized from March 2017 through March 2018 (i.e., 13 monthly cohorts of approximately 1,000 cases per month). The evaluators waited until the March 2017 cohort to begin survey sample selection in order to maximize efficiency for the survey data collection effort (i.e., to lower survey costs relative to taking a true random sample of everyone randomized) and to allow all programs time to complete start-up activities and reach steady-state operations. Allowing time for programs to mature helps alleviate challenges typically associated with early enrollment cohorts in random assignment studies, such as very small monthly enrollment cohorts or grantees modifying eligibility criteria or intake processes. Compressing the field period was the most efficient way to ensure that evaluators could meet the survey sample size requirements within the available resources. The evaluators will rely on baseline equivalency testing to determine whether participant characteristics differ significantly between those enrolled before March 2017 and those enrolled later. The evaluators will also use post-randomization administrative data from the National Student Clearinghouse (NSC) and the National Directory of New Hires (NDNH) to determine whether impacts on college persistence and earnings vary by enrollment period. If noteworthy differences by enrollment period are discovered, appropriate caveats will be added to impact findings based on survey outcomes.
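To illustrate the kind of baseline equivalency testing described above, the sketch below computes standardized mean differences in baseline characteristics between participants enrolled before March 2017 and the survey cohort. This is a minimal illustration only, not the evaluation team's analysis code; the cohort labels and column names are hypothetical rather than actual PAGES field names.

```python
# Illustrative baseline-equivalency check; column and cohort names are
# hypothetical, not actual PAGES fields.
import numpy as np
import pandas as pd

def standardized_differences(df: pd.DataFrame, cohort_col: str,
                             covariates: list[str]) -> pd.Series:
    """Standardized mean difference of each covariate across two cohorts."""
    early = df[df[cohort_col] == "pre_march_2017"]
    survey = df[df[cohort_col] == "survey_cohort"]
    diffs = {}
    for cov in covariates:
        pooled_sd = np.sqrt((early[cov].var() + survey[cov].var()) / 2.0)
        diffs[cov] = ((survey[cov].mean() - early[cov].mean()) / pooled_sd
                      if pooled_sd > 0 else 0.0)
    return pd.Series(diffs, name="std_diff")

# Hypothetical usage with illustrative intake fields:
# baseline = pd.read_csv("pages_baseline.csv")
# print(standardized_differences(
#     baseline, "enroll_cohort",
#     ["age", "has_hs_credential", "employed_at_intake"]))
```

Large standardized differences on key characteristics would trigger the caveats to survey-based impact findings described above.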
Study participants will receive contact update requests every three months leading up to the Short-Term Follow-up Survey. Once the Short-Term Follow-up Survey data collection period ends, the contact update requests will resume in preparation for the Intermediate Follow-up Survey (to be conducted 36 months after randomization under a later OMB information collection request).
All five tribal grantees will participate in the federally sponsored HPOG 2.0 Tribal Evaluation. For the HPOG 2.0 Tribal Evaluation, there are two major respondent universes: (1) Tribal HPOG 2.0 grantees, partners, and employers; and (2) Tribal HPOG participants, including program completers and non-completers. Exhibit B-1 presents the sampling methods and target response rates for each of the HPOG 2.0 National and Tribal Evaluation respondent subgroups. The respondent subgroup and instrument (the Short-Term Follow-up Survey) shown in bold font are the subjects of this information collection request. All other instruments and their corresponding subgroups were previously approved under this OMB control number.
Exhibit B-1: HPOG 2.0 National and Tribal Evaluation Respondents
Respondent Universe | Respondent Subgroup | Sampling Methods and Target Response Rates1 | Data Collection Strategies
National HPOG 2.0 Evaluation
Grantees, partners, and employers | Grantees | Evaluation team members review the topics of interest with grantees using the HPOG 2.0 Screening Interview (Instrument 2) to identify the respondent(s) most knowledgeable about those topics. Grantees have agreed to participate in the evaluation as a condition of receiving HPOG grant funding; the team therefore expects a 100 percent response rate. | Semi-structured telephone interviews (Instruments 2, 3 and 4)
Grantees, partners, and employers | Managers and staff | A very high response rate (at least 80 percent) is expected among grantee managers and staff. | Semi-structured in-person interviews (Instruments 2, 3 and 4)
Grantees, partners, and employers | Partners | A very high response rate (at least 80 percent) is expected among grantee partners. | Semi-structured in-person interviews (Instruments 2, 3 and 4)
Grantees, partners, and employers | Employers | A very high response rate (at least 80 percent) is expected among employers. | Semi-structured in-person interviews (Instruments 2, 3 and 4)
Impact evaluation participants selected for Short-Term Follow-up Survey sample | A sample of participants (up to 13,000) beginning with those enrolled in March 2017 | Up to 13,000 study participants, beginning with those enrolled in March 2017, will be part of the participant contact update efforts. The team expects 35 percent of participants to respond to each quarterly contact update effort.2 | Contact updates by mail, online portal, or telephone (Instruments 5a and 5b)
Impact evaluation participants selected for Short-Term Follow-up Survey sample | A sample of participants (up to 13,000) beginning with those enrolled in March 2017 | Up to 13,000 study participants, beginning with those enrolled in March 2017, will be part of the Short-Term Follow-up Survey. The team expects 80 percent of selected participants to complete the survey, resulting in 10,400 completes. | Telephone or in-person interviews conducted by local interviewers with CAPI technology (Instrument 12)
Tribal HPOG 2.0 Evaluation
Grantees, partners, and employers | Grantees | Grantees have agreed to participate in the evaluation as a condition of receiving HPOG grant funding; the team therefore expects a 100 percent response rate. | Semi-structured in-person interviews (Instruments 6 and 7)
Grantees, partners, and employers | Management and staff | A very high response rate (at least 80 percent) is expected among grantee staff. | Semi-structured in-person interviews (Instruments 6 and 7)
Grantees, partners, and employers | Partners | Partners have agreed to participate in the evaluation as a condition of receiving HPOG grant funding; the team therefore expects a 100 percent response rate. | Semi-structured in-person interviews (Instruments 6 and 7)
Grantees, partners, and employers | Employers | A very high response rate (at least 80 percent) is expected among HPOG employers. | Semi-structured in-person interviews (Instrument 8)
Participants | Program participants (current) | The tribal evaluation team will work with the grantees to recruit participants during the annual site visit planning period. The team expects a 25-50 percent response rate from current program participants. | In-person focus groups (Instrument 9)
Participants | Program completers | The tribal evaluation team will work with the grantees to recruit participants during the annual site visit planning period. The team expects a 25-50 percent response rate from program completers. | Semi-structured in-person interviews (Instrument 10)
Participants | Program non-completers | The tribal evaluation team will work with the grantees to recruit participants during the annual site visit planning period. The team expects a 10-25 percent response rate from program non-completers. | Semi-structured in-person interviews (Instrument 11)
HPOG National and Tribal Evaluation Participant Accomplishment and Grantee Evaluation System (PAGES)
Participants | National Evaluation (Non-Tribal) HPOG Participants | No sampling techniques will be employed for PAGES data collection. A 100 percent response rate is expected. | Baseline and ongoing participant-level data
Participants | Tribal HPOG Participants | No sampling techniques will be employed for PAGES data collection. A 100 percent response rate is expected. | Baseline and ongoing participant-level data
PAGES includes the applicant population of the 32 organizations that received HPOG funding. As discussed, the system provides data at the grantee and individual levels. Data are collected, and will continue to be collected, from the 32 grantees on their program designs and offerings, from all eligible applicants on their baseline characteristics, and from all individuals the grantees serve on their participation and outcomes.
Approximately 44,163 individuals are expected to complete the baseline data collection across the 32 grantees during the HPOG 2.0 grant period. The grantees under the National and Tribal evaluations will enroll participants over a four-and-a-half-year period.3 The National Evaluation team expects the impact evaluation sample to include up to 40,000 individuals who apply to participate in the HPOG programs operated by the 27 non-tribal HPOG 2.0 grantees (26,667 treatment group members and 13,333 control group members). The Tribal Evaluation team expects the tribal grantees to enroll 2,663 participants. This represents an increase from the previous submission because it now includes the full program enrollment; Supporting Statement A, Section A15, provides more detail on this increase. These projections imply that approximately 9,400 more National Evaluation participants will complete the PAGES baseline intake form than previously estimated. Approximately 1,500 participants from the first round of HPOG grants are expected to receive additional services under HPOG 2.0, bringing the total National Evaluation sample to an estimated 41,500 participants. Further, up to 2,663 individuals are anticipated to apply to participate in the HPOG programs operated by the five HPOG 2.0 Tribal grantees over the life of the grants. No sampling techniques will be employed for PAGES data collection.
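The sample-size accounting above can be reconciled with simple bookkeeping of the figures already cited; the sketch below is only that arithmetic, not a new projection.

```python
# Reconcile the enrollment projections cited above (figures from the text).
national_new = 40_000                       # new non-tribal impact sample
controls = round(national_new / 3)          # 13,333 control group members
treatments = national_new - controls        # 26,667 treatment group members
hpog1_carryover = 1_500                     # HPOG 1.0 participants served again
national_total = national_new + hpog1_carryover   # 41,500
tribal_total = 2_663
assert national_total + tribal_total == 44_163    # total baseline completions
```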
B.2 Procedures for Collection of Information

This section describes the data collection procedures for the previously approved HPOG 2.0 National Evaluation descriptive evaluation and the impact evaluation contact updates, followed by the impact evaluation's Short-Term Follow-up Survey, the focus of this request for clearance. The section then describes the previously approved HPOG 2.0 Tribal Evaluation data collection procedures and concludes with the PAGES data collection procedures.
The sample frame for the HPOG 2.0 National Evaluation descriptive evaluation includes all 27 non-tribal grantees. No statistical methods will be used for stratification and sample selection; because it is a descriptive study, the descriptive evaluation uses purposive sampling exclusively. The primary data collection approach for the descriptive evaluation will be two rounds of semi-structured interviews conducted by telephone and up to two sets of in-person site visits with program directors, case managers, and other relevant grantee staff. The first round of telephone interviews will focus on early implementation efforts; the second round will update the first and collect information to help lay the groundwork for the systems and cost studies. Site visits will focus on programs implementing promising approaches to program components of specific interest to ACF. Telephone and site visit data collection will be supplemented with data from PAGES and other existing site-specific materials developed earlier by the national evaluation team.
Protocols for the first round of descriptive evaluation data collection were previously approved under OMB Control Number 0970-0462 in June 2017. The data collection efforts using the grantee telephone interview and screening guides (Instruments 2 and 3) are now complete. The site visit data collection (previously approved Instrument 4) is still underway.
Site Visits
Site visits will take place at five to ten grantees participating in the National Evaluation during the first round of the descriptive evaluation. The research team will rely on data from the first round of descriptive evaluation telephone interviews and other extant data (such as PAGES data and other existing site-specific materials developed earlier by the national evaluation team) to identify up to ten grantees that are implementing programs with innovative approaches to program components of interest to ACF. The national evaluation team will send two-person teams to conduct this phase of the data collection (previously approved Instrument 4).
The national evaluation team has developed semi-structured interview protocols for the first-round implementation telephone interviews and evaluation site visits (Instruments 3 and 4). All protocols begin with a brief introductory script that summarizes the overall evaluation, the focus of the interview, how respondent privacy will be protected, and how data will be aggregated.
Participant Contact Update Request Procedures
Participant contact update efforts for the impact evaluation cover study participants randomly assigned beginning in March 2017; all study participants enrolled during this timeframe will be included. The participant contact update form (Instrument 5b), previously approved in June 2017, will be self-administered. The form will be mailed to sample members quarterly, beginning three months after random assignment. Participants will be encouraged to respond by returning the form by mail, through a secure online portal, or by updating their contact information by phone, and they can either confirm that the information on file is correct or make any necessary changes.
Short-Term Follow-up Survey Procedures
This information collection request covers the Short-Term Follow-up Survey. Prior to the start of data collection, the evaluator will recruit and train a team of local interviewers. The team will also send an advance letter to all participants selected for the Short-Term Follow-up Survey data collection effort.
Interviewer Staffing: An experienced, trained staff of local interviewers will conduct the HPOG 2.0 Short-Term Follow-up Survey, starting 15 months after randomization. Interviewer training includes didactic presentations, numerous hands-on practice exercises, and role-play interviews. The evaluator's training materials will place special emphasis on project knowledge and sample complexity, gaining cooperation, refusal avoidance, and refusal conversion.
Abt maintains a roster of approximately 1,700 experienced in-person interviewers across the country. To the extent possible, the new study will recruit in-person interviewers who worked successfully on prior career pathways studies for ACF (such as the first-round HPOG Impact and PACE 15-month, 36-month, and 72-month surveys, OMB control numbers 0970-0394 and 0970-0397, respectively). These interviewers are familiar with the career pathways model and study goals, and they have valuable experience locating difficult-to-reach respondents. The team will also recruit other in-person interviewers with expert locating skills to ensure that the needs of all studies are met successfully.
All potential in-person interviewers will be carefully screened for their overall suitability and ability to work within the project’s schedule, as well as the specific skill sets and experience needed for this study (e.g., previous in-person data collection experience, strong test scores for accurately recording information, attention to detail, reliability, and self-discipline).
Advance Letter: To support the Short-Term Follow-up Survey effort, an advance letter will be mailed to study participants selected to participate in the survey approximately one and a half weeks before interviewers begin the data collection. The advance letter serves as a way to re-engage the participant in the study and alert them to the upcoming effort so that they are more prepared for the interviewer’s call. (See Attachment K.)
The evaluators will update the sample database prior to mailing advance letters, using address data from the quarterly contact update requests and matches to proprietary databases, to help ensure that each letter is mailed to the most up-to-date address. The team will send a personalized advance letter to each selected study participant, reminding them of the timing of the upcoming data collection effort, the voluntary nature of participation, and that researchers will keep all answers private. The letter also provides a toll-free number that participants can call to set up an interview. Interviewers may send an email version of the advance letter to participants with email addresses on file if early attempts to reach them are unsuccessful.
Email Reminder: Interviewers will attempt to contact participants by telephone first. If initial phone attempts are unsuccessful, interviewers can use their project-specific email accounts to introduce themselves as the local data collection staff, explain the study, and attempt to set up an interview. They send this email, along with the advance letter, about halfway through the period during which they are working their cases. (See Attachment N for the email reminder text.)
Interviewing: Data collection begins when local interviewers attempt to reach the selected study participants by telephone. Interviewers will have access to all telephone numbers collected at baseline and updated throughout the 15-month follow-up period, both for the study participant and for any alternate contacts (family or friends the participant identified as people likely to know how to reach them). Interviewers will first call all phone numbers collected for the participant and then attempt the alternate contacts.
After the interviewers exhaust all phone efforts, they will work non-completed cases in person. Interviewers may leave specially designed project flyers with family members or friends (see Attachment M for the survey flyer).
The sample frame for the HPOG 2.0 Tribal Evaluation includes all five tribal grantees. No statistical methods will be used for stratification and sample selection; because the Tribal Evaluation is a descriptive study, it uses purposive sampling exclusively. The tribal evaluation team will use multiple sources of data for the process and outcome evaluation, centered on annual site visits to Tribal HPOG grantees: semi-structured in-person interviews with grantee and partner administrative staff, program implementation staff, and local employers; focus groups and follow-up interviews with program participants, including program completers and non-completers; and program operations data collected through PAGES. The previously approved Tribal Evaluation data collection effort is still underway. (See Supporting Statements A and B, OMB Control Number 0970-0462, approved in June 2017, for more detail.)
During the initial 120-day planning period after grant award, all grantees were required to provide information on their program components and offerings for inclusion in PAGES, the previously approved internet-based data system constructed to collect data for the HPOG 2.0 programs. All HPOG grantees update the grantee-level and ongoing participant-level data at least semi-annually, but may enter data at any time, either manually or by uploading data from existing data systems. Abt Associates manages PAGES.
Participant-Level Baseline Data Collection
Grantees administer a previously approved informed consent form relating to PAGES data items, administrative data, and follow-up surveys (if applicable) when individuals who apply to the program are determined eligible to participate in HPOG 2.0. All non-Tribal HPOG 2.0 grantees are required to participate in the impact evaluation, which includes randomly assigning new eligible HPOG applicants either to be invited to receive HPOG services or to serve in the control group.4 However, prior HPOG and Pathways for Advancing Careers and Education (PACE) participants who were randomly assigned in the first impact studies will not be subject to random assignment again and will be allowed to enroll in the program. After the consent process, grantee staff members administer the PAGES baseline questions, including questions on individuals' expectations for education and employment, barriers to employment, work preferences, and self-efficacy. After individuals complete the informed consent form and the required baseline questions, a secure, web-based software program randomly assigns them to either the treatment or control group.
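The sketch below illustrates the kind of random assignment step just described. It is not the secure, web-based PAGES software itself; the 2:1 treatment-to-control ratio is inferred from the projected split of 26,667 treatment and 13,333 control group members cited earlier.

```python
# Minimal illustration of 2:1 random assignment; not the PAGES system.
import random

def assign_next_applicant(rng: random.Random) -> str:
    """Assign one consented, eligible applicant with 2:1 odds of treatment."""
    return "treatment" if rng.random() < 2.0 / 3.0 else "control"

# A fixed seed keeps this illustration reproducible; a production system
# would draw from a secured, centrally managed random source instead.
rng = random.Random(2015)
print([assign_next_applicant(rng) for _ in range(6)])
```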
The HPOG 2.0 Tribal grantees are participating in a federal evaluation that will not require random assignment. Applicants at the Tribal grantees complete the previously approved non-random assignment informed consent form that asks permission for researchers to access data individuals provide at intake, information about the training and services they receive after enrollment, administrative data, and follow-up surveys. After the consent process, a grantee staff member will administer PAGES baseline questions, excluding the baseline questions on individuals’ expectations for education and employment, barriers to employment, work preferences, and self-efficacy.
To ensure that participants can understand each of the documents, the National Evaluation welcome packet, the participant contact update forms, the previously approved informed consent forms, and the PAGES data elements were all designed at an 8th-grade readability level. The HPOG 2.0 team will provide a Spanish version of the informed consent forms and will work with grantees on ways staff can assist where translation of other data collection instruments may be needed. To ensure that the instruments are culturally responsive, the Tribal Evaluation protocols were reviewed by the Tribal HPOG 2.0 grantees and by consultants with expertise conducting research in tribal communities.
B.3 Methods to Maximize Response Rates and Deal with Non-response

This section describes the methods to maximize response rates for the HPOG 2.0 National Evaluation descriptive evaluation, the HPOG 2.0 National Evaluation impact evaluation, and the HPOG 2.0 Tribal Evaluation. The focus of this request for clearance is on the procedures to maximize response and deal with non-response for the National Evaluation impact evaluation's Short-Term Follow-up Survey. The National Evaluation descriptive evaluation and Tribal Evaluation plans were previously approved in June 2017; see Supporting Statement B, approved in June 2017, for discussion of methods to maximize response rates to those efforts. Because the previously approved contact update forms are part of the methods to maximize Short-Term Follow-up Survey response rates, we retain them in this document.
The evaluation team will use the following methods to maximize response to the Short-Term Follow-up Survey effort:
Participant contact updates and locating;
Incentives; and
Sample control during the data collection period.
We now discuss each of these methods.
The HPOG 2.0 National Evaluation impact evaluation anticipates two rounds of follow-up survey data collection: a short-term survey about 15 months after randomization and an intermediate survey about 36 months after randomization. The evaluation team developed a comprehensive participant contact update system to maximize response to the two follow-up surveys. This multi-stage locating strategy blends active locating efforts (which involve direct participant contact) with passive locating efforts (which rely on various consumer database searches). The contact updates were previously approved under this OMB control number in June 2017, with non-substantive changes approved in July 2017, and will continue until the Intermediate Follow-up Survey data collection (to begin about three years after randomization, under a separate information collection request). The Intermediate Follow-up Survey instrument, along with plans to maximize response to that survey and minimize nonresponse bias, will be submitted under a separate request for clearance. This request covers the new Short-Term Follow-up Survey data collection efforts.
All impact evaluation participants will be included in the administrative data collection. However, to maximize efficiency for the survey data collection effort and allow all programs time to mature, the evaluators will select up to 13,000 study participants, beginning with those enrolled in March 2017, for inclusion in the Short-Term Follow-up Survey sample. The plans to maintain updated participant contact information for the impact evaluation begin with a welcome packet sent to all sample members within the first month of enrollment, followed by quarterly contact updates. The welcome letter and study brochure (Instrument 5a, approved by OMB in June 2017) provide comprehensive information about the study requirements, the contact update efforts, and survey data collection activities. The evaluators send quarterly requests to participants to update their name, address, telephone, and email information, and their preferred method of contact, using the previously approved contact update form (Instrument 5b, approved in June 2017). The form also collects new or updated contact data for up to three people who do not live with the participant but who will likely know how to reach him or her. Interviewers will use this secondary contact data only if the primary contact information proves invalid (e.g., a disconnected telephone number or a letter returned as undeliverable). Previously approved Instrument 5b shows a copy of the contact update form.
To boost the response rate to the contact update requests, researchers offer three ways for participants to provide updated data: participants can return the contact update form by mail, respond through a secure online portal, or call in their updated information. Offering multiple options helps ensure that participants can respond in the mode most convenient for them.
The evaluation team will offer an incentive valued at $5 for each contact update form received from participants. The incentive, previously approved in June 2017, thanks participants for returning the form and for remaining engaged in the study. After providing updated contact information, participants will receive an email with instructions for logging in to a secure study portal where they can redeem a $5 gift card from their choice of approved vendors.5
The evaluation team also proposes to offer an incentive for completion of the Short-Term Follow-up Survey. Given the target response rate of 80 percent, the projected 60-minute length of the survey, and the incentive amounts approved for previous rounds of data collection in OPRE's prior career pathways studies (PACE and HPOG 1.0 Impact, OMB control numbers 0970-0397 and 0970-0394, respectively), the team considers $40 the appropriate incentive level. As with the contact update forms, respondents will receive an email with customized instructions for logging in to a secure study portal where they can redeem a $40 gift card from their choice of approved vendors.6
Without an incentive of this magnitude, the impact evaluation is unlikely to meet the previously mentioned quality targets set by the What Works Clearinghouse (WWC) (see Supporting Statement A, Section A9, for more information). The team also believes that a meaningful incentive is required at this contact point to keep participants engaged in the study and responsive to the continued contact update efforts leading up to the Intermediate Follow-up Survey, 36 months after randomization.
Incentives at one or more phases of data collection have been used successfully on a number of similar federally sponsored surveys, such as PACE (OMB control number 0970-0397) and the HPOG 1.0 Impact Study (OMB control number 0970-0394). The planned incentive amount is comparable to what was offered for the follow-up survey efforts in both of those studies.
Finally, the team does not rely solely on the use of incentives to maximize response rates and reduce nonresponse bias. The next section summarizes other efforts to maximize response rates.
During the Short-Term Follow-up Survey data collection period, the evaluation team will minimize nonresponse levels and the risk of nonresponse bias by:
Using trained interviewers with experience on prior career pathways studies, who are skilled at working with low-income adults and at maintaining rapport with respondents, minimizing break-offs and the incidence of nonresponse bias.
Providing a Spanish language version of the survey instrument to help achieve a high response rate among study participants for whom Spanish is their first language.
Using a mixed-mode approach (telephone with in-person follow-up) with a single team of local interviewers. Experience on prior career pathways studies, such as the HPOG 1.0 Impact and PACE studies, shows that respondents are more likely to answer calls from local interviewers than from the phone center. This approach also allows local interviewers to tailor their approach to the in-person effort.
Sending email reminders to non-respondents (for whom we have an email address) informing them of the study and allowing them the opportunity to schedule an interview (Attachment N).
Providing a toll-free study hotline number—which will be included in all communications to study participants—to help them ask questions about the survey, update their contact information, and indicate a preferred time to be called for the survey.
Taking additional locating steps in the field, as needed, when the interviewers do not find sample members at the phone numbers or addresses previously collected.
Using customized materials in the field, such as “trying to reach you” flyers with study information and the toll-free number (Attachment M).
Requiring the survey supervisors to manage the sample so that the treatment and control groups achieve roughly equal response rates (see the monitoring sketch below).
Through these methods, the research team anticipates achieving the targeted 80 percent response rate for the Short-Term Follow-up Survey.
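As a minimal illustration of the sample-control monitoring noted in the final bullet above, the sketch below computes cumulative response rates by study arm and flags divergence. The field names and the five-percentage-point tolerance are assumptions for illustration, not project specifications.

```python
# Illustrative fielding check; field names and tolerance are hypothetical.
import pandas as pd

def response_rates_by_arm(cases: pd.DataFrame) -> pd.Series:
    """Share of released cases completed so far, by study arm."""
    return cases.groupby("arm")["completed"].mean()

def arms_out_of_balance(rates: pd.Series, tolerance: float = 0.05) -> bool:
    """True when treatment and control response rates diverge too far."""
    return abs(rates["treatment"] - rates["control"]) > tolerance

# Hypothetical snapshot: 1 = completed interview, 0 = still pending.
snapshot = pd.DataFrame({
    "arm": ["treatment", "treatment", "treatment", "control", "control"],
    "completed": [1, 1, 0, 1, 0],
})
rates = response_rates_by_arm(snapshot)
print(rates, arms_out_of_balance(rates))
```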
If interviewers achieve a response rate below 80 percent, the research team will conduct a nonresponse bias analysis. The evaluation team plans to deal with both unit nonresponse and item nonresponse in the Short-Term Follow-up Survey. Unit nonresponse is almost inevitable because interviewers will likely be unable to complete a short-term follow-up interview with all 13,000 selected participants. Item nonresponse is also inevitable because participants have the right to refuse to answer any item in the survey, or they may not know the response.
To address unit nonresponse, the evaluation team will construct survey weights7 using pre-randomization individual characteristics from PAGES. The analysis will use these weights when estimating impacts for outcomes reported in these surveys.8 The team will construct these nonresponse weights (with baseline characteristics) using traditional methods: fitting a model for unit response propensity, stratifying the sample by predicted response propensity, calculating actual response rates for the strata, and finally inverting the average response rate for each stratum to serve as the nonresponse adjustment weight.9 These weights are similar to traditional base weights for surveys, except that instead of being inverse probabilities of selection, they are inverse probabilities of response.
The study team plans to carry out these steps separately for the treatment and control samples, rather than on the pooled sample, as described in the National Evaluation's Draft Impact Evaluation Design and Execution Plan (currently under review).
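A minimal sketch of these weighting steps appears below, assuming a logistic response-propensity model and propensity quintiles. The variable names are illustrative rather than actual PAGES fields, and the evaluation team's production approach may differ in its details.

```python
# Illustrative nonresponse-weight construction; names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression

def nonresponse_weights(df: pd.DataFrame, covariates: list[str],
                        responded: str = "responded",
                        n_strata: int = 5) -> pd.Series:
    """Inverse-response-rate weights; per the text, run once per study arm."""
    # Step 1: model unit response propensity on baseline characteristics.
    model = LogisticRegression(max_iter=1000)
    model.fit(df[covariates], df[responded])
    propensity = pd.Series(model.predict_proba(df[covariates])[:, 1],
                           index=df.index)

    # Step 2: stratify the sample by predicted response propensity.
    strata = pd.qcut(propensity, q=n_strata, labels=False, duplicates="drop")

    # Step 3: compute each stratum's actual response rate and invert it.
    stratum_rate = df[responded].groupby(strata).transform("mean")
    weights = 1.0 / stratum_rate

    # Weights apply to respondents only; nonrespondents drop from analysis.
    return weights.where(df[responded] == 1)
```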
Unless item nonresponse is common for primary or secondary outcomes, the tentative plan is to deal with item nonresponse by dropping cases with missing outcomes. The team anticipates that item nonresponse may be high for a few survey items (e.g., family income); however, the items with likely high nonresponse are neither primary nor secondary outcomes. In the HPOG 1.0 analysis, multiple imputation of survey items was computationally intense and required substantial processing time for variance estimation, but had little effect on bias or variance. The study team and OPRE are currently discussing options for addressing missing data for this evaluation.10
B.4 Tests of Procedures or Methods to be Undertaken

The data collection instruments for the HPOG 2.0 National Evaluation descriptive evaluation (Instruments 2-4) were developed and reviewed by Federal staff and evaluation team members. Many questions were taken from, or modified from, instruments used successfully in the HPOG 1.0 evaluation. The research team pretested the screening interview and first-round telephone interview instruments with three non-tribal HPOG grantees. Grantees that completed the survey during the pretest will be given their completed surveys to review and update when the full survey is fielded, reducing burden while ensuring all responses are accurate and up to date. Experienced interviewers conducted the interviews and discussed respondents' perceptions of the clarity and flow of survey items, ease of completion, and time requirements. After pretesting, the team revised the instruments based on this feedback. All changes were reflected in the version of the instruments previously approved in June 2017.
The participant contact update form for the HPOG 2.0 National Evaluation impact evaluation (Instrument 5b) draws largely from the contact update forms previously approved for other ongoing career pathways studies, particularly PACE and the first round of the HPOG Impact Study (OMB control numbers 0970-0397 and 0970-0394, respectively). Based on experience in those studies, the team made two minor modifications to the HPOG 2.0 contact update form. The first allows participants to give consent for researchers to text any new phone numbers provided on the update form. The second allows participants to indicate their preferred mode of contact, so that researchers can tailor their approach to participant preferences.
In designing the Short-Term Follow-up survey, the evaluation team included items used successfully in other national surveys, particularly the PACE and first round of HPOG impact follow-up surveys (OMB control numbers 0970-0397/0394). Consequently, many of the survey questions have been thoroughly tested on large samples.
To ensure that the length of the instrument was within the burden estimate, the evaluation team conducted a pretest of the Short-Term Follow-up Survey (Instrument #12) after it was approved in June 2018. Following the pretest, the team recommended several minor changes to shorten the interviews and improve the clarity of the questions. The changes in content requested as part of this non-substantive change request serve four primary purposes:
Revising or dropping questions to reduce the administration time;
Adding logic checks or follow-up probes to improve data quality;
Correcting skip logic (i.e., which question to ask next based on the respondent’s previous responses); and
Adding introductory or clarifying text to explain the purpose of a question.
The pretest also showed that a question capturing a critical data item, the respondent's earnings at the current or most recent job, was inadvertently omitted from the original submission. The team seeks approval of questions D7-D7c to capture respondents' earnings.
The details of the changes requested are provided in the supporting document Draft HPOG 2.0 Memo to OMB_Pretest changes_Expanded Site Visits_V4_REV091118.docx.
The data collection instruments for the HPOG 2.0 Tribal Evaluation have been reviewed by (1) ACF staff, (2) a Technical Working Group comprised of consultants with expertise in workforce development and tribal research, and (3) all five Tribal HPOG grantees. Their comments were incorporated into the final versions.
Abt Associates subcontracted with Green Beacon Solutions, a Microsoft Gold Certified software development firm, to create the HPOG 2.0 PAGES in consultation with ACF. Federal staff and evaluation team members had informal discussions with six first-round HPOG grantees on possible data system designs and data elements. Prior to the initial launch of PAGES, there were two distinct environments, one for development and one for testing, and development and testing were conducted in multiple stages. Abt Associates' professional software testers tested major modules deployed to the test environment and submitted issues; once these issues were resolved, the broader team of program staff reviewed each module and offered feedback. In the month prior to the release of the system, Abt technology staff re-tested all parts of the system to ensure that it met the stated functional requirements and was free of bugs. Abt Associates, Urban Institute, and ACF program staff also conducted user acceptance testing.
Since the launch of the system, three environments have been maintained: Development, Test, and Production. Any changes to the system must move through the Development and Test environments and procedures before being deployed to the live Production environment. All three environments will be maintained throughout the life of the project.
The baseline questions are either identical or similar to questions used in previous Abt Associates or other national surveys. As such, they have been thoroughly tested on large samples.
B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

With ACF oversight, Abt and its partners MEF Associates, the Urban Institute, and Insight Policy Research are responsible for conducting the HPOG 2.0 National Evaluation. Abt and the Urban Institute are responsible for developing and maintaining PAGES, providing HPOG 2.0 grantees with support for using the system to produce the required semi-annual reports, and collecting the needed baseline data. NORC, under subcontract to Abt Associates, is responsible for conducting the HPOG 2.0 Tribal Evaluation.
Statistical analyses of the data for annual program performance reports will be limited to descriptive tabulations included in the contractor's annual reports to ACF. Other, as yet unspecified, statistical analyses may be planned for the impact evaluation and other future research efforts; such analyses will be the subject of a later request for clearance.
The individuals listed in Exhibit B-2 below contributed to this information collection request.
Exhibit B-2: Contributors
Name | Role in HPOG 2.0 National and Tribal Evaluation | Organization/Affiliation
Gretchen Locke | National Evaluation Project Director | Abt Associates
Jacob Klerman | National Evaluation Co-Principal Investigator | Abt Associates
Bob Konrad | National Evaluation Co-Principal Investigator | Abt Associates
Robin Koralek | National Evaluation Deputy Project Director | Abt Associates
David Judkins | National Evaluation Director of Impact Analysis | Abt Associates
Debi McInnis | National Evaluation Site Coordinator | Abt Associates
Michael Meit | Tribal Evaluation Project Director | NORC
Kate Fromknecht | Tribal Evaluation Project Manager | NORC
Emily Phillips | Tribal Evaluation Senior Research Analyst | NORC
Julie Strawn | PAGES Project Director | Abt Associates
Dr. Laura R. Peck | PAGES Co-Principal Investigator | Abt Associates
Dr. Pam Loprest | PAGES Co-Principal Investigator | Urban Institute
Dr. Eleanor Harvill | PAGES Evaluation Design Task Lead | Abt Associates
Brian Sokol | PAGES Data System Task Lead | Abt Associates
Dr. Alan Werner | Key staff on HPOG 2.0 project | Abt Associates
Jennifer Buell | PAGES Deputy Project Director | Abt Associates
Dr. Howard Rolston | PAGES Project Quality Advisor | Abt Associates
Karen Gardiner | PAGES Project Quality Advisor | Abt Associates
Inquiries regarding the statistical aspects of the design of the data system should be directed to:
Julie Strawn, Project Director
Abt Associates
4550 Montgomery Ave #800N
Bethesda, MD 20814
(301) 347-5853
Inquiries regarding the statistical aspects of the HPOG 2.0 National Evaluation design should be directed to:
Gretchen Locke, Project Director
Abt Associates
55 Wheeler St
Cambridge, MA 02138
(617) 349-2373
Inquiries regarding the statistical aspects of the HPOG 2.0 Tribal Evaluation design should be directed to:
Michael Meit, Project Director
NORC at the University of Chicago
4350 East West Highway, 8th Floor
Bethesda, MD 20814
(301) 634-9324
The evaluators also consulted a team of outside experts on the design of the HPOG 2.0 evaluations and on recruitment strategies and data collection instruments for the HPOG 2.0 Tribal Evaluation; the grantees will also contribute to data collection via the PAGES system. These consultants and grantee representatives are:
Technical Working Group (Consultants)
Mark Doescher, MD, MSPH, Stephenson Cancer Center, University of Oklahoma
Rick Haverkate, MPH, Deputy Director, Indian Health Service
Loretta Heuer, PhD, RN, FAAN, School of Nursing, North Dakota State University
Joan LaFrance, Ed.D., Mekinak Consulting
Myra Parker, JD, MPH, PhD, Center for the Study of Health and Risk Behaviors, University of Washington
Tribal Health Professions Opportunities Grants 2.0 grantees
Mark Hiratsuka, Cook Inlet Tribal Council
Phillip Longie, Cankdeska Cikana Community College
Irene BearRunner, Turtle Mountain Community College
Scott Baker, Ute Mountain Ute Tribe
Kathleen Thurman, Great Plains Tribal Chairmen’s Health Board
The HHS federal project officers, Hilary Forster, Nicole Constance, and Amelia Popham, have overseen the design process and can be contacted at:
Hilary Forster
Office of Planning, Research, and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
330 C Street S.W., 4th Floor, Washington, D.C. 20201
(202) 619-1790
Nicole Constance
Office of Planning, Research, and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
330 C Street S.W., 4th Floor, Washington, D.C. 20201
(202) 401-7260
Amelia Popham, MSW
Office of Planning, Research, and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
330 C Street S.W., 4th Floor, Washington, D.C. 20201
(202) 401-5322
1 Response rate expectations are based on a variety of factors. Grantees have agreed to participate in the evaluation as a condition of receiving HPOG funding, so grantee, partner, and employer response rates are expected to be very high. Participation in the evaluation studies is voluntary for HPOG participants, so their response rates are expected to be lower. Previous experience with similar populations indicates that response rates will be lower for participants who do not complete the program than for those who do.
2 The projected response rate for the contact update form is based on prior experience with similar approaches in studies of comparable populations, primarily the PACE and HPOG 1.0 Impact study samples (OMB Nos. 0970-0397 and 0970-0394, respectively).
3 Although it is a five-year grant, enrollment did not start until four to six months into the grant period, and we expect enrollment to slow toward the end of the grant period to allow participants ample time to take advantage of HPOG services. Thus, we view the enrollment period as four and a half years.
4 Although all new applicants are subject to random assignment, they are able to withdraw from the study at any time and they can choose not to answer any survey questions.
5 In accordance with HPOG funding requirements, the incentive will be offered to vendors that do not sell alcohol, tobacco, firearms or other entertainment.
6 In accordance with HPOG funding requirements, the incentive will be offered to vendors that do not sell alcohol, tobacco, firearms or other entertainment.
7 Given current plans to select the follow-up sample by selecting full monthly cohorts for a compact range of enrollment months, there will be no need to reflect probabilities of selection in these weights.
8 Some would argue that including regressors is sufficient to control for differential survey response with respect to observables. Nevertheless, the use of response-propensity adjusted weights in conjunction with regression models for outcomes confers a benefit known as “double robustness” (Scharfstein, Rotnitzky, & Robins, 1999); that is, if either the response-propensity model or the outcome model is correct, then the estimated treatment effects will be unbiased. Kang and Schafer (2007, with discussion) have argued that using weights needlessly increases variance compared with analysis with a correct outcome model. However, specifying a correct outcome model can be difficult and typically involves the exploration of alternate sets of covariates, something we would prefer to avoid. An advantage of using weights is that they can be developed just once without looking at any outcomes and then be used in the analysis of all outcomes.
9 See, for example, Section 3.5 of the textbook by Valliant, Dever, and Kreuter (2013); Chapter 5 of the textbook by Kim and Shao (2014); Section 2.7 of the textbook by Heeringa, West, and Berglund (2010). See also Göksel, Judkins, & Mosher (1992); Buskirk & Kolenikov (2015).
10 The PACE team, in consultation with OPRE and the HPOG 1.0 team, decided not to use multiple imputation. The HPOG 1.0 team reports that multiple imputation is quite computationally intense and that results—both impact estimates and standard errors—are nearly invariant to whether or not multiple imputation is used. In addition, multiple imputation often precludes using conventional software to deal with other technical issues (e.g., multi-level modelling issues, instrumental variables). Given this experience, the added cost and complexity do not seem worthwhile for HPOG 2.0.