UNDERSTANDING THE DYNAMICS OF DISCONNECTION FROM EMPLOYMENT AND ASSISTANCE
OMB CLEARANCE PACKAGE:
SUPPORTING STATEMENT, PART B
Federal Project Officer
Emily Schmitt
Department of Health and Human Services
Administration for Children and Families
Office of Planning, Research and Evaluation
370 L'Enfant Promenade SW
Washington, DC 20447
December 2012; Revised March 2013
B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
B.1 Respondent Universe and Sampling Methods
B.2 Procedures for Collection of Information
B.3 Methods to Maximize Response Rates
B.4 Tests of Procedures or Methods to be Undertaken
B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
UNDERSTANDING THE DYNAMICS OF DISCONNECTION FROM EMPLOYMENT AND ASSISTANCE
SUPPORTING STATEMENT PART B:
COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
B.1 Respondent Universe and Sampling Methods
According to recent estimates, 20 to 25 percent of low-income single mothers are disconnected from work and cash assistance for some period of time over the course of a year. Studies generally define "disconnected" as having had no or low earnings and no TANF or SSI receipt for some period of time.1 To be eligible for our study, respondents must either be disconnected currently (defined for our study as not employed and not receiving TANF or Supplemental Security Income (SSI) for themselves) or must have experienced at least six months of disconnection in the past two years.
The sample for the current study will consist of 66 disconnected, low-income women. Respondents must be unmarried, have resident children, and be currently disconnected, that is, not employed and not receiving TANF or Supplemental Security Income (SSI) for themselves. Women who are currently employed or receiving TANF may also be included if they experienced at least six months of unemployment in the past two years, had a child and were unmarried during that period, and were not receiving TANF at the time.
Information will be collected in two sites with relatively high concentrations of low-income families: Los Angeles, California and Southeast Michigan. Respondents will be sampled from two existing longitudinal surveys in those sites: the Best Start Los Angeles Pilot Community Evaluation, currently led by the Urban Institute's Health Policy Center and the Center for Healthier Children, Families and Communities at the University of California, Los Angeles (UCLA), and the Michigan Recession and Recovery Study (MRRS), conducted by the National Poverty Center of the University of Michigan. Sampling from respondents in these existing surveys is advantageous because the surveys provide access to potentially hard-to-reach individuals who fit the study criteria and because the contracting organizations are already involved in these surveys.
Los Angeles, California
The Best Start LA longitudinal survey sample includes 734 participants who are mothers of young children. Of those, 300 will be surveyed for the final time between April 2013 and August 2013, in time to be included in the study under consideration here. To ensure high response rates among eligible participants, the ACF study will be advertised in person by the UCLA field staff who collect the Best Start LA survey data, immediately after participants complete that survey. Field staff will describe the purposes and requirements of the study (locally referred to by the friendlier name Family Coping Strategies) and share a recruitment flyer with participants (see Appendices B-1 and B-1S). Those who are interested and think they may be eligible (estimated to be approximately 100 women) will consent to have UCLA share their names and contact information with the Urban Institute research team (see Appendices A-1 and A-1S). Each month, UCLA will send the Urban Institute a list of those who consent to have their information shared. Once a list is received, an Urban Institute research assistant will immediately begin making phone calls to recruit and screen potential respondents for eligibility (see screening materials in Appendices A-2 and A-2S). We estimate that 80 of the 300 Best Start LA survey participants will be eligible for the study and that we will achieve a response rate of approximately 70%, an estimate based on the experience of the Best Start LA evaluation. Once a sample of 36 participants is reached, recruitment calls will cease. (The study has resources to conduct 66 interviews, 36 of which are budgeted to take place in LA.) A follow-up call will be made to schedule the visit; this call will also remind participants of the study and help retain the sample (see Appendices A-3 and A-3S for this script).
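As a minimal sketch of the recruitment arithmetic implied by these estimates (the figures are those given above; the variable names are ours and purely illustrative), the expected yield comfortably exceeds the 36-interview target for Los Angeles:

    # Illustrative check of the Los Angeles recruitment yield, using the estimates above
    surveyed = 300           # Best Start LA participants surveyed April to August 2013
    interested = 100         # estimated to express interest and consent to referral
    eligible = 80            # estimated to screen as eligible for the study
    response_rate = 0.70     # based on the Best Start LA evaluation's experience
    expected_interviews = eligible * response_rate   # 80 * 0.70 = 56
    la_target = 36           # interviews budgeted for the LA site
    print(expected_interviews >= la_target)          # True: recruitment calls stop once 36 are reached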
Southeast Michigan
The Michigan Recession and Recovery Study (MRRS) is a panel study of a stratified sample of 900 households in Southeast Michigan with three waves of data collection. Once the third wave is complete in spring 2013, researchers will select a pool of MRRS respondents who may be eligible for participation in the current study based on their reported household composition, employment history, and benefit receipt. MRRS survey participants who meet the selection criteria will be sent a letter informing them of their potential eligibility for the qualitative interview (see Appendix B-2). The letter contains this statement: "Kristin Seefeldt, an MRRS researcher and University of Michigan faculty member, will contact you in the next few days to provide more information about this phase of the study and your eligibility to participate. If you are eligible, she can set up a time to do the interview." Within a week of receipt of this letter, the co-PI will follow up with a phone call to determine interest. Those who are interested will then be screened to determine whether they are eligible (see Appendix A-7). If eligible, an in-person interview will be scheduled to occur within two months of initial contact. We estimate that 35 of the 900 survey participants will be eligible for the study and that we will achieve a response rate of approximately 85%, an estimate based on the co-PI's previous experience with the MRRS respondents.
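As a similar minimal sketch, assuming the eligibility and response-rate figures above (variable names are illustrative), the expected Michigan yield of roughly 30 interviews, combined with the 36 budgeted for Los Angeles, accounts for the study's total of 66:

    # Illustrative check of the Michigan yield and the combined sample, using the estimates above
    mrrs_eligible = 35            # MRRS respondents estimated to be eligible for the study
    response_rate = 0.85          # based on the co-PI's previous experience with MRRS respondents
    expected_mi_interviews = round(mrrs_eligible * response_rate)   # 35 * 0.85 = 29.75, about 30
    la_interviews = 36            # interviews budgeted for the Los Angeles site
    total_interviews = la_interviews + expected_mi_interviews       # 36 + 30 = 66
    print(total_interviews)       # 66, the total number of interviews the study has resources to conduct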
B.2 Procedures for Collection of Information
The sample will not be stratified and no special sampling techniques will be used. Individuals who participate in the Best Start LA Pilot Community Evaluation survey or the Michigan Recession and Recovery Study and who are identified as eligible after completing a telephone screener will be invited to participate in the study, up to a total of 66 participants (as described under B.1).
Qualitative data will be collected through one-time, in-person, 90-minute interviews, using a guide with key topics and open-ended questions rather than closed-ended questions (i.e., rigidly specified and directly quantifiable questions). This approach is the best data collection method for understanding in depth the nuanced reasons for disconnection, the contexts in which families live, and the strategies they use to cope and manage. The approach will allow flexibility in adapting the discussion guide to capture key aspects of disconnection based on families' unique circumstances. The field research teams will be prepared to conduct the interviews in either English or Spanish, as needed. At least one member of the research team in each site will be fluent in both Spanish and English. The Spanish translations of the interview instruments are included in the Appendices.
For respondents in Los Angeles, once a minimum of 15 participants have consented, the Urban Institute team will plan the first site visit to LA and begin scheduling interviews. It will likely take several months to recruit a sufficient number of participants for a site visit. This process will be repeated for a second site visit, during which the remaining sample will be interviewed. The maximum expected time between first receiving a flyer and the date of the interview is five months; for those recruited closer to the point of a site visit, the duration could be one month.
For respondents in Michigan, the interview will be scheduled to occur within two months of initial contact.
All interviews are expected to take place between April 2013 and September 2013.
Interviews will take place at the respondent’s home or in another place convenient to the respondent.
Audio-recorded interviews will be fully transcribed and translated as needed, and then coded using NVivo software to identify patterns and themes among respondents for each key research question.
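As an illustration only, and not the actual NVivo workflow, the pattern-finding described here amounts to tallying, across respondents, the themes applied to transcript excerpts for each research question. A minimal sketch in Python, using hypothetical respondent IDs and theme labels:

    # Illustrative sketch of tallying coded themes across respondents. The actual coding will
    # be done in NVivo; the respondent IDs and theme labels below are hypothetical examples.
    from collections import Counter

    coded_transcripts = {
        "respondent_01": ["informal_work", "help_from_family"],
        "respondent_02": ["help_from_family", "benefit_barriers"],
        "respondent_03": ["benefit_barriers", "informal_work"],
    }

    theme_counts = Counter(theme for themes in coded_transcripts.values() for theme in themes)
    for theme, count in theme_counts.most_common():
        print(f"{theme}: mentioned by {count} respondents")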
B.3 Methods to Maximize Response Rates
The researchers will take a number of steps to minimize the burden on interview respondents and maximize response rates. First, the site visits will be scheduled in a manner that allows respondents to identify the most convenient time and location for the visit within the study timeframe. For example, interviews may be arranged in the evenings or on weekends to accommodate work or school schedules, and in the home or at locations that are readily accessible via public transportation. Respondents will receive a reminder call several days in advance to confirm the scheduled time and location of the meeting. A $40 token of appreciation will be provided to each respondent.
B.4 Tests of Procedures or Methods to be Undertaken
In October 2012, a two-member interviewing team from the Urban Institute and a separate team from the University of Michigan each pre-tested the data collection tools with a respondent whose characteristics were similar to those of the targeted study population. The two respondents were recruited through local community-based organizations with which the researchers had established relationships. Each respondent was debriefed following her interview to provide feedback on the content, clarity, and order of questions. Adjustments were made to several items to improve comprehension, flow, and timing.
Specific modifications to the conversation guide following pre-tests
To pre-test our original draft conversation guide, we interviewed two respondents, A and B, in English. Revisions were made to the draft guide based on these interviews, and the conversation guide attached to the original OMB submission reflects those changes. Interview A took about 80 minutes and interview B took 90 minutes. The conversation guide generally flowed well. The following describes the specific changes made to the original draft guide in response to these pre-tests.
The employment section went quickly for both respondents. Based on this, we substituted a more compact employment history calendar that is easier for the interviewer to use.
The social network circle worked well and both respondents seemed to like that section since they had a visual guide.
By the time we got to the set of coping questions, we had touched on many of the coping topics already. We realized based on the pre-tests that some of the coping questions should come after the financial/sources of income questions, because expenses and income were already being discussed in that section and respondents were starting to describe what they do to get food, clothing, and other assistance. There was also some overlap with the coping strategies section, since the social network piece began to touch on coping strategies. In response, we moved the coping strategies section after the financial section and before the social network section and cut down on the number of coping questions.
The original summary questions we had included were too broad and vague for respondents to answer. We clarified the wording of two of these and moved one question up to the section on benefits.
We realized we needed to ask how the respondent heard about benefit programs and whether she would know where to get information on benefits, so we added a question on this. We deleted questions that were originally at the end of the guide and seemed out of sequence in the pre-test ("If you needed to get help from welfare/public assistance or non-profit agencies or charities, would you know how to do this? Where would you get information?").
The income section was a bit confusing to respondents. We had started with yearly household income, then monthly income, but in between asked how stable income is month to month and who it supports. We changed this to ask first about all the sources of income the respondent received last month, listing each source and its amount, the stability of each, and who the money is used to support. We then ask, "How much did you make last year?", since after listing the different sources respondents can more easily calculate how much they made last year from those same sources. We also clarified that by "last year's income" we mean the past 12 months.
We had several questions about combining work and parenting, separately asking about child care difficulties affecting work and work responsibilities affecting child care. We combined these into one question so the topics are discussed together and repetition is reduced.
In response to OMB's request for additional pre-tests, we conducted two more pre-tests, C and D, one in English and one in Spanish. Interview C took about 90 minutes and interview D took about 100 minutes. The changes we had made to the conversation guide (detailed above) worked well and the interviews flowed well. Based on these pre-tests, we have made the following additional changes to the conversation guide. The revised conversation guide is attached (Appendix A-9 Conversation Guide REV).
We changed “What is your birthdate?” to “How old are you?” as it seemed less intrusive to respondents.
For respondents not born in the US, we added "What country are you from?" and, for immigrants who indicate language barriers, "Have you taken English classes?"
We added WIC to the list of programs we probe for in Question 31.
When asking about benefit receipt, we added a probe for how long the process took (Question 32h).
We clarified that the monthly income we are talking about in Question 39 refers to household income.
In Question 64, we inserted <If Yes> to show when the probe is appropriate.
For the social network diagram, we dropped the request for the initials of the people in the respondent's network. Simply writing the relationship worked well during the pre-tests.
When asked for specific examples of the type of help provided in Question 74, participants tended to volunteer this information on their own. The lettered bullets that ask for specific examples will therefore be used as probes instead.
Both of these pre-tests used the new employment history calendar, which worked well in these interviews.
We also made a few translation-related changes in the Spanish-language conversation guide. These are:
We changed the wording for food stamps from "Cupones de alimentos" to "Estampillas de comida," since respondents are typically more familiar with the latter translation.
When asking whether participants expect to earn more, the same, or less than last year, we changed the wording from "Espera ganar…" to "Cree que va a ganar…" (Question 36).
Several other small Spanish wording changes were made to Questions 43, 44, and 74.
B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
The two project Co-Principal Investigators designed the data collection protocols and analytic plan, and both will engage in data collection and analysis. Their names, degrees, and affiliations are provided below.
Heather Sandstrom, Ph.D., Urban Institute
Kristin Seefeldt, Ph.D., University of Michigan
Each of these individuals has extensive field experience and training, including in measurement design, qualitative interviewing, and the analysis of qualitative data.
The Federal project officer for this project is Emily Schmitt.
1 Loprest, Pamela (2011) “Disconnected Families and TANF” (OPRE Report #2011-49). http://www.acf.hhs.gov/sites/default/files/opre/disconnected.pdf