Goal-Oriented Adult Learning in Self-Sufficiency (GOALS) Study
OMB Information Collection Request
New Collection
Supporting Statement
Part B
September 2015
Submitted By:
Office of Planning, Research and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
7th Floor, West Aerospace Building
370 L’Enfant Promenade, SW
Washington, D.C. 20447
CONTENTS
B1. Respondent Universe and Sampling Methods
B2. Procedures for Collection of Information
B3. Methods to Maximize Response Rates and Deal with Nonresponse
B4. Tests of Procedures or Methods to be Undertaken
B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
B1. Respondent Universe and Sampling Methods
To identify candidate programs, the research team will first compile a list of nominations through four approaches:
Examine relevant literature and systematic reviews to identify programs that meet criteria;
Solicit nominations from Mathematica’s internal advisors, consultants, and experts;
Explore programs identified through other existing Mathematica projects;
Solicit nominations from ACF’s staff, partners, and regional offices.
The next step will be to gather additional background on nominated programs and compile basic data on each. This will involve reviewing any available documentation on the nominated programs and compiling data to enable a comparison across programs. Examples of data include referral source, target population, demographics, program size, location and geographic coverage, length of time in existence, availability of outcome data, and the intervention model or key programmatic features that form the basis for its recommendation for potential exploration.
The final step will be to define clear parameters for prioritizing which programs warrant exploratory telephone calls. This will maximize the potential for identifying programs that are doing interesting and relevant work. The research team will prioritize those that serve disadvantaged adults and young adults with no strong attachment to the labor force and programs that show some evidence of effectiveness or promise for producing improved outcomes through development of goal-oriented skills. The GOALS Program Director/Administrator Exploratory Telephone Interview Guide (Attachment A) will serve as the basis for the interviews with up to 24 program directors. Site visitors will adapt the questions in the master guide to make them relevant to the unique aspects of each program.
The research team will develop criteria for program selection seeking to include programs that best incorporate research-based principles of goal-directed behavior and skills development and strengthening. Criteria may include dimensions such as:
Research/evidence-based programs
Aspects of skills development included in approach
Replicability of program
Outcomes of programs
Type of program
Targeted populations
Size of program
Intensity of services
Site location and geographic coverage
Length of time in existence
Up to 12 sites will be selected to receive visits, all implementing programs that incorporate research-based principles of goal-directed behavior and skill assessment and development.
The site visits will include updates from the exploratory calls with program directors/administrators, individual or small group discussions (of no more than 2-3 respondents per group) with program staff and community partners, and small group discussions with program participants. The GOALS Site Visit Master Interview Guide and Topics by Respondent (Attachment B) and GOALS Site Visit Participant Interview Guide (Attachment C) will serve as the basis for the site visit interviews. On average the research team will interview 15 staff per site, for a total of 180 interviews across all sites. It will not be possible to interview all program supervisors, program staff, and community partners. The research team will purposively select individuals to interview representing different program components, positions (for example, eligibility workers, case managers, workshop facilitators), and locations if programs include more than one site. The study team expects to include an average of seven participants in the group discussions at each site for a total of 84 program participants across all sites. The team will rely on a liaison at each program site to help recruit a mix of participants at different points in the service delivery process (entry and early participation, mid-way, late participation to program exit) and with different levels of program engagement (low to moderate engagement and high engagement).
B2. Procedures for Collection of Information
Exploratory telephone calls
The semi-structured exploratory discussions with program directors will be conducted over the phone. The research team will begin contacting programs and scheduling visits once OMB clearance is received (expected Winter 2016). Exploratory calls are to be completed from Winter through Spring 2016. To engage and obtain cooperation from program directors for the exploratory telephone calls, the research team will send an advance letter (Attachment E) by email to the program director/administrator that describes the research, requests the program director’s participation in a one-hour interview, and provides an overview of the topics to be covered. The team member responsible for scheduling the calls will follow up with the program director to provide further clarification, respond to questions, request program materials that are not available on the program’s website, and schedule the one-hour telephone interview. Each call will be led by a senior member of the Mathematica team; junior staff will take notes.
Site visits
Discussions will be conducted in person in the form of semi-structured individual and group discussions. They will be conducted from Spring 2016 through Spring 2017.
Members of the study team will make contact with the director of each program via email followed by a site visit introductory call. The study team will have already established a relationship with the program director during the exploratory call. During the introductory call, study team members will ask about program updates, request permission to conduct the field work, and describe the process for conducting the site visit. Scheduling of interviews will be done collaboratively with the organization to ensure minimal disruption to program operations. The study team will recommend that the program director identify a site liaison within the program to work with the study team to plan the visit sessions and recruit program participants for the small group discussions. The research team anticipates that interviews will last between 60 and 90 minutes.
B3. Methods to Maximize Response Rates and Deal with Nonresponse
Expected Response Rates
Response rates at the program level for the exploratory telephone interviews are expected to be 95 percent. In the research team’s previous experience, few if any programs decline to participate in a phone interview. For the site visits, the team estimates that only one or two of the invited programs may decline to participate. To prepare for this possibility, the study team will identify two additional programs to serve as alternates, as needed. Based on experience conducting research in similar settings using similar techniques, few programs decline invitations to participate in studies of this type because programs often view contributing to knowledge development as part of their mission.
At the staff level during the site visits, the research team expects that most staff who are invited will participate, but some will be absent or otherwise unavailable on the day of the visit. To meet with an average of 15 staff members per site, the study team anticipates targeting 15 to 20 staff members for recruitment to allow for nonresponse or unexpected work absences on the day of the site visit.
Depending on the type of program, it may be more challenging to achieve the optimal discussion group size of approximately seven program participants; for this reason, the study team will request that each program seek to recruit approximately 12 participants.
Specific approaches to achieving good response rates are described in the maximizing response rates section below for each data collection activity.
Data reliability. Strategies to ensure that the data are reliable and as complete as possible include flexibility in scheduling of visits and the assurance given to respondents of confidentiality of the information that they provide. Furthermore, the neutral tone of the questions in the data collection protocols and the absence of sensitive questions, along with the training of the site visitors, will facilitate a high degree of accuracy in the data. In addition, shortly after each site visit, the site visit team members will synthesize the data from each interview, observation, and group discussion to complete a structured site visit summary. Because most questions will be asked of more than one respondent during a visit, the analysis will allow for triangulation of the data so that discrepancies among different respondents can be interpreted.
Dealing with Nonresponse
If several programs refuse to participate in the exploratory calls or the site visits, the study team will confer with ACF and the expert panel to discuss alternate strategies for recruiting sites and revisit the recruitment approach.
If the site visitor finds that despite providing supports to the site liaison for planning the staff and program participant discussion sessions, participation rates are lower than expected, the site visitor will immediately begin problem solving with the site liaison and the program director to attempt to recruit additional participants for the study.
Maximizing Response Rates
Exploratory calls
Studies using similar methods have obtained a high response among sites. Several factors will help ensure a high rate of cooperation. First, senior members of the research team who are familiar with the programs will contact program leadership. Second, the research team will be recruiting program directors who are heavily invested in improving programs and interventions designed to strengthen psychological processes associated with goal-directed behaviors. The research team anticipates that directors will be eager to engage in these conversations.
Site visits
Rates of participation are usually high for studies using similar methods with programs interested in contributing to advancing the knowledge in their fields and sharing what they have learned. On-site, qualitative data collection engages participants and helps to ensure that the data are reliable. Site visit team members will begin working with program staff and the site liaison well in advance of each visit to ensure that the timing of the visit is convenient. Because the visits will involve several interviews and activities each day, there will be flexibility in the scheduling of specific interviews and activities to accommodate the particular needs of respondents and site operations. Discussions with program participants will be held at a time and location that is convenient to them. In addition, the study team will offer a $25 token of appreciation to program participants who are willing to take part in the discussion group. Experience with other studies has shown that providing a $25 token of appreciation may increase participation among program participants who might not otherwise attend.
B4. Tests of Procedures or Methods to be Undertaken
Exploratory calls
There are no plans to test the procedures. Similar discussions have been conducted in the past by the research team, as well as by ACF for other projects, and have been an effective strategy for gathering information.
Site visits
To ensure that the site visit master interview guide provides an effective field guide that will yield comprehensive and comparable data, one of the senior members of the study team or one of the expert consultants will conduct the first of these visits to test the data collection instruments and reporting procedures. This first site visit will help to ensure that the instruments are easy for the site visitors to use to gather the in-depth information needed on the topics of interest and do not omit relevant topics of inquiry.
B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
The following individuals will be involved in the design, data collection, and analysis for this study:
Kimberly Boller, Senior Fellow, Mathematica Policy Research
Michelle Derr, Senior Researcher, Mathematica Policy Research
Jackie Kauff, Senior Researcher, Mathematica Policy Research
Elizabeth Cavadel, Senior Researcher, Mathematica Policy Research
Clancy Blair, Professor of Applied Psychology, New York University
Richard Guare, Director, Center for Learning and Attention Disorders at Seacoast Mental Health Center
Mary Anne Anderson, Research Analyst, Mathematica Policy Research
Valerie Caplan, Research Analyst, Mathematica Policy Research
Michelle Lee, Program Associate, Mathematica Policy Research