
Alternative Supporting Statement for Information Collections Designed for

Research, Public Health Surveillance, and Program Evaluation Purposes






Evaluation of LifeSet



OMB Information Collection Request

New Collection





Supporting Statement

Part B



June 2021

Updated May 2022








Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officers: Kathleen Dwyer, Alysia Blandon


Part B


B1. Objectives

Study Objectives

The primary objective of this study, funded by the Administration for Children and Families (ACF) Office of Planning, Research, and Evaluation (OPRE), is to test, through a rigorous evaluation, the impact of youth participation in the LifeSet program1 in New Jersey on youths’ education and employment, social connections, housing stability, resiliency, social-emotional competence, mental health, criminal justice system contact, intimate partner violence, and economic well-being. The study will have two main components that support this objective: an impact study to assess the effects of participation in LifeSet on outcomes of interest and an implementation study to describe how LifeSet is implemented and what factors influence service delivery. The specific research questions for the studies are outlined in SSA section A2.


Generalizability of Results

Impact Study:

This randomized controlled trial (RCT) evaluates the impact of a program designed to serve young people ages 17 to 21 who are current or former foster youth. As such, the study is intended to produce internally valid estimates of the intervention's causal impact, not to promote statistical generalization to other sites or service populations.


Implementation Study:

This implementation study is intended to present an internally valid description of the implementation of the LifeSet program in New Jersey, not to promote statistical generalization to other sites or service populations.


Appropriateness of Study Design and Methods for Planned Uses

Impact Study:

The use of an RCT design will allow us to learn about the causal impacts of LifeSet and whether youth who receive the program fare better than youth who receive services as usual. A baseline youth survey and child welfare administrative data will be used to verify that randomization produced groups that are equivalent on key pre-test measures and to determine whether the groups remain equivalent after attrition. The youth survey will collect data on pre-test measures and demographics not captured in the child welfare administrative data. The project team will use child welfare administrative data to collect accurate information about participants' foster care histories without additional burden to participants. Additionally, administrative data collected from other agencies will provide information on service receipt and on the participant outcomes addressed by the research questions listed in section A2. Future requests will include two waves of follow-up youth surveys.


Implementation Study:

The accompanying implementation study will allow us to learn how LifeSet is implemented, how the context for youth transitioning out of foster care in New Jersey influences implementation, and how LifeSet compares to other services youth may receive in the state. The two site visits currently planned will allow the project team to interview child welfare agency administrators, administrators of local LifeSet providers, and staff and administrators of the LifeSet developer. The project team will also hold focus groups with the LifeSet Team Supervisors and Specialists who provide frontline services to youth in the program. These interviews and focus groups will allow us to gain a detailed understanding of the different aspects of LifeSet's implementation, the services youth in the state may receive, and the context around youth transitioning to adulthood in the state. Future requests will include additional site visits to conduct interviews and focus groups with staff of the child welfare agency, the LifeSet providers, and the LifeSet developer, as well as with young adults receiving LifeSet services and young adults receiving services as usual. Future requests will also include a brief online survey of LifeSet Specialists and observations of program activities.

Data from the impact and implementation studies are not intended to be representative of all programs serving vulnerable youth or youth ages 17 to 21 transitioning from foster care to adulthood. As such, findings from this study are not generalizable to all programs serving youth transitioning out of foster care. Key limitations will be included in written products associated with this study. As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information.



B2. Methods and Design

Target Population

Impact Study:

The target population for LifeSet in New Jersey is youth ages 17 to 21 who have an open case with the New Jersey Department of Children and Families (DCF), are in the custody or guardianship of DCF, are living in the community (i.e., not incarcerated, in residential treatment, or hospitalized), are living in the counties served by the provider agencies, and do not have a disqualifying condition (e.g., serious violent criminal history, residence out of state, severe mental illness, or intellectual disability). Prior to the start of randomization, the target population was estimated at 900 age-eligible youth in the target counties. Since randomization began in August 2021, around 18 percent of age-eligible youth have met the program's exclusion criteria, a rate that was unknown when the evaluation plan was finalized. Combined with a pandemic-related decline of about 200 youth ages 17 to 21 in care, the estimated target population is now closer to 680.


The program will have slots to admit approximately 200 study-eligible youth per year, assuming an average length of service of nine months and a maximum length of service of twelve months. Because the program will begin serving youth before the evaluation starts, some program slots will not be available to study youth during the first six to nine months of study enrollment, so fewer than 200 youth will likely enroll during the first year of study intake. The project team anticipates enrollment will reach 200 in the second year of study intake. Thus, we estimate that it will take up to two years to enroll the target sample of at least 300 youth in the LifeSet program.


Implementation Study:

For the implementation study, the target population includes administrators and staff of DCF, the LifeSet provider agencies, and the LifeSet developer; young adults enrolled in LifeSet; and young adults receiving services as usual. The sampling frame will consist of the roster of staff and of young adults randomized as part of the impact study. The unit of analysis is the individual, and the project team will use a nonrepresentative sample. The current information request includes data collection only from administrators and staff of DCF, the LifeSet provider agencies, and the LifeSet developer. Future requests will include youth randomized in the impact study to LifeSet and to services as usual.


Sampling and Site Selection

LifeSet is currently implemented in 15 states. New Jersey was selected for the proposed evaluation after conversations with the LifeSet developer and the state DCF indicated the state would support a rigorous evaluation. All four local provider agencies that implement LifeSet in New Jersey are included in the study.



Impact Study:

The sampling frame for the impact study is the list of target youth determined by New Jersey DCF to be eligible for LifeSet services, stratified by the four provider regions. New Jersey DCF determines eligibility for LifeSet using its administrative data and screening conversations with caseworkers. Each month, the DCF Research Coordinator will send a list of youth determined to be eligible for LifeSet to the Urban Institute project team, which will randomly assign approximately 300 eligible youth to receive LifeSet and approximately 300 eligible youth to receive services as usual during the two-year intake period.
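The supporting statement does not specify the exact assignment mechanics, so the sketch below illustrates one common way to implement stratified random assignment: permuted blocks within each provider region, with the block composition set by the randomization ratio (1T:1C originally; 2T:1C after March 2022, as described below). The function name and the use of Python are assumptions for illustration only.

```python
import random

def assign_stratum(youth_ids, ratio=(1, 1), seed=None):
    """Permuted-block random assignment within one provider region.

    ratio -- (treatment, control) slots per block, e.g. (1, 1) or (2, 1).
    Returns a dict mapping youth ID -> "treatment" or "control".
    """
    rng = random.Random(seed)
    assignments, block = {}, []
    for yid in youth_ids:
        if not block:  # start a new permuted block
            block = ["treatment"] * ratio[0] + ["control"] * ratio[1]
            rng.shuffle(block)
        assignments[yid] = block.pop()
    return assignments

# Example: one monthly eligibility list for a single region at the 2T:1C ratio
print(assign_stratum(["Y101", "Y102", "Y103", "Y104"], ratio=(2, 1), seed=7))
```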



Randomization began in August 2021 using a 1:1 randomization ratio (1 treatment to 1 control, or 1T:1C). However, only 60 percent of treatment group youth have gone on to enroll in LifeSet, much lower than the anticipated rate of 90 percent. To account for the decrease in excess demand, the randomization ratio was changed in March 2022 to a 2:1 ratio (2 treatment to 1 control, or 2T:1C). The anticipated sample size has increased from 600 to 661 to allow as many eligible youth as possible to be randomized into the treatment group while preserving statistical power with unequally sized groups. A total of 340 youth were randomized at the 1T:1C ratio (170 in each group), and an expected 321 additional youth will be randomized at the 2T:1C ratio, for a final sample size of 661 (384 treatment, 277 control). New minimum detectable effects (MDEs) were calculated using an alpha of 0.05, 80 percent power, and an R2 of 0.30. The MDE for administrative data is 0.185 and for survey data is 0.201 (assuming an 85% retention rate at the third survey wave).2 We note that the target population is not evenly distributed across the four providers' catchment areas. To prevent one provider from having all youth in its catchment area randomly assigned to a single group, randomization is stratified by provider catchment area, using a 2T:1C ratio within each stratum.
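The stated MDEs can be reproduced with the standard two-sample formula for a covariate-adjusted RCT, MDE = (z_{1-alpha/2} + z_{power}) * sqrt((1 - R2) * (1/nT + 1/nC)). The sketch below is illustrative only; the function name is an assumption, but the inputs (alpha of 0.05, 80 percent power, R2 of 0.30, 384 treatment and 277 control youth, 85 percent retention) come from the paragraph above.

```python
from scipy.stats import norm

def mde(n_treat, n_control, alpha=0.05, power=0.80, r2=0.30):
    """Minimum detectable effect in standard-deviation units."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return z * ((1 - r2) * (1 / n_treat + 1 / n_control)) ** 0.5

# Administrative data: full randomized sample of 384 treatment / 277 control
print(f"{mde(384, 277):.3f}")  # 0.185
# Survey data: 85 percent retention (about 326 and 235 completed interviews)
print(f"{mde(326, 235):.3f}")  # 0.201
```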







Implementation Study:

The project team will use non-probability, purposive sampling (recruitment) to identify potential respondents in each respondent group who can provide information on the study's key constructs. Because respondents will be purposively selected, they will not be representative of the full population of administrators or staff. For administrators of DCF, the LifeSet developer, and the LifeSet provider agencies, the project team will solicit suggestions for whom to interview based on individuals' knowledge of LifeSet and DCF processes or involvement in the implementation of LifeSet in New Jersey. For focus groups with LifeSet Team Supervisors and LifeSet Specialists, the project team will request from each LifeSet provider agency a roster of front-line staff in those positions. Future requests will include a staff survey of LifeSet Specialists, observations of program activities, and interviews and focus groups with youth receiving LifeSet and services as usual.



B3. Design of Data Collection Instruments

Development of Data Collection Instruments

Impact Study:

Baseline Youth Survey: Items on the baseline youth self-report survey were drawn from instruments used in similar studies of the target population (Courtney et al. 2016; Dworsky et al. 2011) or similar aged respondents (Bureau of Labor Statistics 2013; Center for the Study of Social Policy 2020; Harris et al. 2009) that aligned with the study objectives. Table B1 below describes how the baseline survey sections align with the impact study research questions listed in A2.


Table B1.

Instrument 1 – Baseline Youth Survey Section | Research Question(s)
Living Arrangements (LA1_W1 – LA5_W1) | 3
Social Support (SSN0_W1 – SSN7_W1) | 2
Education (EDU1_W1 – EDU10_W1) | 1
Employment and Earnings (EMP0_W1 – EMP9_W1) | 1, 8
Economic Hardship (ECON1_W1 – ECON3_W1) | 8
Mental Health Services (MH1_W1 – MH4_W1) | 5
Substance Abuse (SA1_W1 – SA5_W1) | 5
Criminal Justice Involvement (CRM1_W1 – CRM2_W1) | 6
Spouse/Partner Violence (REL0_W1 – CTS4_W1) | 7
Youth Resiliency (YR0_W1 – YR10_W1) | 4
Social-Emotional Competence (SEC0_W1 – SEC16_W1) | 4


Data collection was streamlined by only including items in the survey that are required for establishing baseline equivalence of treatment and control groups on demographic characteristics and pre-test measures. Skip logic patterns were designed to streamline data collection by ensuring that only relevant items are asked of respondents. Where applicable, specific language in items was modified to match the New Jersey context. The baseline survey will be pretested with a small (n<10) number of youth similar to the target population to ensure usability. The survey data will be collected by trained field interviewers using computer assisted personal interviewing (CAPI) technology in order to reduce measurement error. Additionally, audio computer assisted self-interviewing (audio-CASI) will allow participants to privately answer the most sensitive items on the survey to prevent social desirability bias in responses. Future requests will include two waves of youth follow-up surveys.


Administrative Data: The project team will use administrative data for all aspects of the impact study, including for baseline data, service receipt, and outcomes. The project team will pursue data sharing agreements with additional administrative data sources noted in Table B2 to supplement or replace the self-report outcome data from the survey.


Since each agency already collects the information the project team is requesting, pulling administrative data is the lowest-burden method of collecting the necessary information. Collecting administrative data also reduces the measurement error associated with gathering similar information through participant self-report. Table B2 below lists the sources of administrative data and describes how each aligns with the research questions. The Administrative Data List (Instrument 2) provides a detailed description of the data fields to be requested from each source.


Table B2.

Instrument 2 – Administrative Data List | Research Question(s) | Source(s)
Public child welfare agency | 3, baseline equivalence | New Jersey Department of Children and Families (DCF)
LifeSet provider agencies | 1, 2, 3 | LifeSet provider agencies
Criminal justice agencies | 6 | New Jersey Department of Corrections (NJDOC), New Jersey Juvenile Justice Commission, New Jersey Administrative Office of the Courts Superior Court Probation Services Division
Public benefits agency | 8 | New Jersey Department of Human Services Division of Family Development
Homeless management information system | 3 | New Jersey Statewide Homelessness Management Information System (HMIS)
National Student Clearinghouse | 1 | National Student Clearinghouse (NSC)
National Directory of New Hires | 1 | National Directory of New Hires (NDNH)
State wage records | 1, 8 | New Jersey Department of Labor and Workforce Development Unemployment Insurance (UI) employer wage records


Implementation Study: The study will use administrative data collected by the LifeSet provider agencies to understand service delivery and whether the program was delivered with fidelity to the LifeSet model. The project team developed focus group and interview guides appropriate for addressing the key research questions listed in Supporting Statement A2. For each research question, the project team first identified appropriate data sources (i.e., the target respondents most knowledgeable about the topic) and the type of instrument (i.e., focus group or individual interview) most appropriate to address the question. Next, the project team created a crosswalk of the instruments and identified target respondents, topics, and constructs. The crosswalk was used to develop protocols tailored to specific respondents. After developing each of the initial protocols, the project team compared the data collection instruments and streamlined each protocol to avoid duplication. The team also mapped the questions in the data collection instruments to the study research questions to ensure the instruments asked only what was necessary to answer the research questions and understand the implementation of the program.


The project team does not plan to pilot the discussion guides with respondents similar to the target population; the guides were only piloted by the team internally to determine burden hours. The implementation study relies on triangulation, as stakeholders will be asked about similar topics to give a full picture of the questions the project team is attempting to answer. Collecting diverse perspectives will allow for a deeper examination of the study research questions of how LifeSet is implemented in New Jersey, the contextual factors that shaped implementation, whether modifications were made to the program, infrastructure that supported implementation, and the services young people would receive in the absence of LifeSet. See Table B3 below for a description of how each data collection instrument aligns with the research questions listed in Supporting Statement A2. Future requests will include a staff survey of LifeSet Specialists, observations of program activities, and interviews and focus groups with youth receiving LifeSet and youth receiving services as usual.


Table B3.

Instrument | Research Question(s)
Instrument 2 – Administrative Data List, LifeSet Provider Agency Table | 1, 4, 6
Instrument 3A – Site Visit 1 Interview Guide for Administrators: Child Welfare Agency Administrators | 1, 2, 5
Instrument 3B – Site Visit 1 Interview Guide for Administrators: Licensed LifeSet Experts | 1, 5
Instrument 3C – Site Visit 1 Interview Guide for Administrators: LifeSet Developer Administrators | 1, 5
Instrument 3D – Site Visit 1 Interview Guide for Administrators: Provider Agency Administrators | 1, 2, 5
Instrument 4A – Site Visit 2 Focus Group Guide for Staff: LifeSet Specialists | 1, 2, 4, 5
Instrument 4B – Site Visit 2 Focus Group Guide for Staff: LifeSet Team Supervisors | 1, 2, 4, 5
Instrument 5A – Site Visit 2 Interview Guide for Administrators: Child Welfare Agency Administrators | 1, 3, 4, 5, 6
Instrument 5B – Site Visit 2 Interview Guide for Administrators: Licensed LifeSet Experts | 4, 5, 6
Instrument 5C – Site Visit 2 Interview Guide for Administrators: LifeSet Developer Administrators | 1, 4, 5, 6
Instrument 5D – Site Visit 2 Interview Guide for Administrators: Provider Agency Administrators | 1, 4, 5, 6



B4. Collection of Data and Quality Control

Impact Study:


Baseline Youth Survey: The Urban Institute will share the contact information for youth in both the LifeSet and services-as-usual groups with a contracted independent survey firm, RTI International, which will collect youths' baseline survey responses. RTI International will mail participants a lead letter and fact sheet introducing the study and making youth aware that someone will be contacting them (Appendices D and E). Field interviewers will initiate telephone and in-person contact attempts one week after the mailings to schedule a time to conduct the baseline survey interview. Field interviewers will obtain youths' informed consent (or assent for minors) to participate in study activities (Appendices A and B). The project team cognitively tested the consent forms with a small number (n=3) of young adults aged 18 to 21 who had spent time in foster care to ensure comprehension of the informed consent process. The project team made revisions to the language and formatting of the baseline youth survey (Instrument 1), young adult consent form (Appendix A), youth assent form (Appendix B), and baseline study fact sheets (Appendix E) to ensure participant comprehension of study materials. In addition, the project team moved the request for consent to access administrative data to the end of the baseline youth survey (Instrument 1), as cognitive testing suggested some respondents would be more comfortable being asked for this consent after rapport has been built with the interviewer. To encourage youths' participation and avoid refusals, interviewers will emphasize privacy and highlight the opportunity participation provides for voicing opinions about their experiences. Baseline survey data will be collected by field interviewers either in person or virtually by phone using CAPI.


To ensure data quality, field interviewers will receive two days of training from RTI International on how to ethically obtain informed consent, use the CAPI software, and accurately record participants’ responses. In addition, with participants’ consent, portions of survey interviews will be recorded and reviewed by trained validators for quality assurance.


Panel Outreach: In advance of future requests that will include two waves of follow-up youth surveys, the project team has created an outreach process to maintain contact with participants. Panel outreach activities are designed to limit attrition between survey waves. See Table B4 below for a description of the panel outreach activities to be conducted by RTI International.


Table B4.

Step: Four-month tracking calls (Appendix G)
Purpose: Each sample member will receive a phone call 4 months after completing the baseline interview.
Benefit: Maintains communication; uncovers recent moves or unreliable phone service.
Actions: Re-dial disconnected numbers at different time points; transfer cases among field interviewers to contact respondents from different telephone numbers; send text message alerts; contact secondary sources (e.g., friends); contact service providers and child welfare staff for updates.

Step: Panel outreach mailing (Appendices H and I)
Purpose: Each sample member will receive a letter and business reply postcard 8 months after the baseline interview.
Benefit: Allows respondents to call to confirm/update information; uncovers recent moves; informs sample members of the upcoming interview.
Actions: Update system with new leads obtained; send undelivered packages to in-house tracing unit; send a new mailing for any mail returned with forwarding information.

Step: Intensive tracing
Purpose: Specialized staff will use databases to generate new leads, then work to confirm new contact information.
Benefit: Decreases field tracing costs during follow-up.
Actions: Create a new panel maintenance mailing for any leads generated from the tracing unit; update system with new leads obtained.

Administrative Data: The elements to be collected from administrative records are outlined in the Administrative Data List (Instrument 2). For each agency, this activity will consist of four components. The first is an agency-specific conversation with the staff most familiar with the data to understand what data are available and their structure and quality; during this conversation, the project team will establish a timeline and procedures for transferring the data to the evaluation team. The second component is the first data pull, which will occur one year after the last youth is randomized. This first pull will require the data administrator at each agency to identify youth randomized in the study and extract the relevant data elements. The third component is a follow-up conversation with the data staff to answer any questions or address any concerns about data quality that arise from the first data pull. The fourth component is the second and final data pull, which will occur two years after the last youth is randomized. The project team will develop data sharing agreements with each agency as necessary for data access and will also facilitate data sharing agreements across agencies as necessary.


Implementation Study:


Interviews and Focus Groups: Project team members from the Urban Institute will collect data for the implementation study. Data collection will consist of interviews and focus groups that will take place during site visits (all virtual, pending changes to COVID-19 travel restrictions and safety concerns). Table A1 in Supporting Statement A summarizes the data collection that will be conducted for the implementation study. The project team will contact agency and program leaders and front-line LifeSet staff via email, using contact information provided by each organization and an email outreach script (Appendices J and K). If necessary, the team will follow up with individualized emails or phone calls to answer any questions respondents may have.


All project team members who participate in site visits will be trained on consent and interview procedures prior to entering the field. The team will all sign the Urban Institute Confidentiality Forms prior to data collection and will store all data in locked cabinets or on a secure drive.


At least one senior researcher from the project team will attend each site visit. To reduce burden and work disruption, site visits will be scheduled based on interview and focus group participants' availability. If in-person site visits become feasible, the visits will generally last two days. The discussion guide questions for interviews and focus groups are designed to elicit nuanced responses, and the project team will probe with individualized follow-ups when answers are vague or ambiguous or when more specific or in-depth information is needed. At the start of the interviews and focus groups with staff, the project team will ask respondents for verbal consent to participate and permission to record the conversation, using an informed consent form (Appendix L). Verbal consent will also be requested during program participant focus groups, and the team will ask program participants to sign consent forms for focus groups conducted in person, if feasible. The team will cover the following during the consent process: the study's purpose and funder, the nature of the information that will be collected, how the information will be used, the potential benefits and risks of participating, and assurance that participation in the study is voluntary. The team will also inform study participants that they may choose to skip any questions or stop participating in the interview at any time. Additionally, the team will ask program participants for consent to audio record the focus group; the team will use the recording only to supplement notes taken during the focus group and will not record if any participant does not consent to being recorded. If at any time a study participant becomes distressed, the study team members conducting the interview will stop the interview.


Administrative Data: Administrative data will be requested from the local LifeSet providers and the LifeSet developer. Local LifeSet provider agencies collect service data in their own management information systems. The process for obtaining these data will be the same as outlined above for the impact study administrative data.


B5. Response Rates and Potential Nonresponse Bias

Response Rates

Impact Study:


Baseline Youth Survey: For the baseline youth survey data, the project team expects a 95 percent unit response rate, consistent with previous studies of the transition-aged youth population (Courtney, Valentine, and Skemer 2019; Courtney et al. 2018; 2011; Dworsky et al. 2011; U.S. Department of Health and Human Services, Administration for Children and Families 2008). Unit response rates will be calculated for the overall study sample and separately for the treatment and control groups by dividing the number of youth for whom survey data are available by the total number of youth in the study and in each group. We expect the survey item response rate to be at least 95 percent based on previous studies (Courtney et al. 2018). Item response rates will be calculated by dividing the number of respondents who answered an item by the total number of respondents asked the item. To obtain the expected response rates, the project has contracted with an independent survey firm, RTI International, with a demonstrated track record of collecting data from youth formerly in foster care and similar populations.
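The unit and item response rate definitions above reduce to simple ratios; the minimal sketch below uses illustrative function names and counts only (the 628-of-661 example is hypothetical, chosen to show a 95 percent rate on the planned sample).

```python
def unit_response_rate(n_completed, n_sampled):
    """Share of sampled youth with survey data (overall or by group)."""
    return n_completed / n_sampled

def item_response_rate(n_answered, n_asked):
    """Share of respondents asked an item who answered it."""
    return n_answered / n_asked

# Hypothetical: 628 completed baseline interviews out of 661 randomized youth
print(f"{unit_response_rate(628, 661):.1%}")  # 95.0%
```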


Administrative data: For the child welfare agency administrative data, we expect a 100 percent response rate. A majority of the data the project team intends to collect already exist in the form required for submission to the federal government through the Adoption and Foster Care Analysis and Reporting System (AFCARS)3. We also expect a 100 percent response rate for administrative data from the LifeSet provider agencies; the developer of LifeSet requires providers to collect a majority of the information the project team intends to request as part of certification. The project team assumes it will have child welfare administrative data for all youth in the study and LifeSet program administrative data for all treatment group youth who enrolled in the program. Past impact studies of child welfare-involved families using administrative data (Pergamit et al. 2019; Pergamit et al. 2017) have had 99 percent response rates on the impact study data collection. For non-child welfare agencies, the project team expects to receive administrative data from at least half of the agencies with which it intends to seek data sharing agreements. The project team does not anticipate difficulties receiving data from the National Student Clearinghouse (NSC) or the National Directory of New Hires (NDNH), as these sources have streamlined processes for requesting data that do not require negotiation of data sharing agreements. Data from the homeless management information system (HMIS), state wage records, and the NSC are the most important for answering the primary research questions, particularly for participants who do not respond to the follow-up surveys, and these sources are likely to share data related to study participants. Criminal justice agencies will be the least likely to share data. The project team has experience negotiating for and obtaining data from similar agencies in prior projects, and New Jersey DCF has agreed to facilitate discussions with other state agencies to assist the project team in negotiating data sharing agreements. We expect a 90 percent match rate, on average, in linking study participants' data across the agencies from which the project team obtains data.


Implementation Study:

The interviews and focus groups are not designed to produce statistically generalizable findings, and participation is wholly at the respondent's discretion. Response rates will not be calculated or reported, though ACF anticipates a high response: the project team expects that program administrators and other staff will be interested in sharing their insights with ACF. To make participating as easy as possible, the project team will conduct interviews on site or virtually and will work collaboratively with respondents to schedule interviews at the times most convenient for program staff.


Nonresponse

Impact Study:


Baseline Youth Survey: For the baseline youth survey, nonresponse bias will be analyzed using youth characteristics noted in the child welfare administrative data to determine whether certain characteristics are significantly associated with nonresponse. These include youth demographics (age, race/ethnicity, sex, county of residence) and foster care experiences (age at removal, total number of removals, length of time in placement, out-of-home placement types, etc.). As indicated above, item nonresponse is anticipated to be 5 percent or less. Item nonresponse will be analyzed, using the child welfare administrative data as well as valid responses to survey items, to determine whether data are missing at random or certain items are consistently missing. If data are missing at random, the project team will consider whether imputation techniques to account for any missing data will be useful and feasible based on how extensive the missingness is and the availability of variables useful for imputation. The project team may also use relevant administrative data sources to determine youths' baseline status on pre-test measures. If the project team decides to impute any missing variables, they will use Stata to run multiple imputation (Little & Rubin, 2002), as sketched below. If missingness exceeds 5 percent for any items, the project team will work with the subcontracted survey firm to improve question wording for the follow-up survey waves to prevent future missingness. Items that are missing 10 percent or more of the time will be dropped from the survey.
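The team plans to run multiple imputation in Stata; the sketch below illustrates the equivalent workflow in Python as a point of reference only. The number of imputations (m = 20) and the chained-equations imputer are assumptions, not the team's specification; analyses would be run on each completed dataset and the results pooled using Rubin's rules (Little & Rubin, 2002).

```python
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

def multiply_impute(df: pd.DataFrame, m: int = 20) -> list[pd.DataFrame]:
    """Generate m completed copies of df via chained-equations imputation."""
    completed = []
    for seed in range(m):
        # sample_posterior=True draws imputed values rather than using point
        # predictions, which is what multiple imputation requires
        imputer = IterativeImputer(sample_posterior=True, random_state=seed)
        completed.append(pd.DataFrame(imputer.fit_transform(df),
                                      columns=df.columns, index=df.index))
    return completed
```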


Administrative data: We expect DCF child welfare administrative data to be 95% complete. If the data have a higher rate of incomplete or missing information, the project team will note the data elements that are missing and work with agency staff to understand what factors drive missingness. In the analysis, the project team will consider whether imputation techniques to account for any missing data will be useful and feasible based on how extensive the missingness is and the availability of variables useful for imputation. If the project team decides to impute any missing variables, they will use Stata to run multiple imputation (Little & Rubin, 2002).


Implementation Study:


Interviews and Focus Groups: Because participants will not be randomly sampled and findings are not intended to be representative, nonresponse bias will not be calculated. Respondent demographics will be documented and reported in written materials associated with the data collection. In the event that staff are not available to participate in interviews, the project team will work closely with administrators either to identify other staff with similar knowledge or to schedule telephone interviews or follow-up conversations. Because the evaluation is voluntary, any staff member may choose not to participate; any substantial nonresponse from staff will be documented and reported as a study limitation.


Administrative data: We expect administrative program data from LifeSet provider agencies to be 95% complete. If the data have a higher rate of incomplete or missing information, the project team will note the data elements that are missing and work with agency staff to understand what factors drive missingness. In the analysis, the project team will consider whether imputation techniques to account for any missing data will be useful and feasible based on how extensive the missingness is and the availability of variables useful for imputation. If the project team decides to impute any missing variables, they will use Stata to run multiple imputation (Little & Rubin, 2002).



B6. Production of Estimates and Projections

Impact Study:

The impact study will produce internally valid estimates of the program's impact, and the findings will be released to the public. The estimates will not be representative of the population of all youth transitioning out of foster care and as such will not be used to make policy decisions. Publications will clearly state that findings are not generalizable beyond the specific study population.


Impact estimation methods: The project team will conduct an Intent-to-Treat (ITT) analysis to estimate the impact of program participation on outcomes. The ITT estimate is the average outcome for all youth randomized to the treatment group less the average outcome for all youth randomized to the control group. The analysis will control for pre-randomization covariates using a regression framework: for continuous outcomes, the team will use an ordinary least squares (OLS) regression model, and for binary outcomes, the team will use a logit or probit model. The exact covariates will be finalized after reviewing the data for quality and completeness. In addition, the sample will be evaluated for equivalence between the treatment and control groups on observable variables collected before and shortly after randomization; variables that show significant differences between the two groups at p<.05 will be included as covariates in the regressions. The project team anticipates conducting exploratory subgroup analyses of program impacts on substantively important subpopulations, such as those defined by youths' gender, parenting status at baseline, criminal justice history at baseline, and age at randomization. Depending on the take-up and crossover rates for the evaluation, the team may also estimate the Treatment-on-the-Treated (TOT) effect using an instrumental variable (IV) estimator (Angrist, Imbens, & Rubin, 1996). The IV estimate is a "per-person-served" estimate among those who comply with their random assignment; it accounts for take-up and crossovers.
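As a point of reference, the sketch below shows a covariate-adjusted ITT regression and the Wald/IV adjustment that converts it to a TOT estimate when random assignment is the only instrument. It is illustrative only: the variable names are assumptions, and the team's actual models will use the covariate selection and logit/probit specifications described above.

```python
import numpy as np
import statsmodels.api as sm

def itt_and_tot(y, assigned, enrolled, covariates):
    """y: outcome; assigned: 0/1 random assignment; enrolled: 0/1 actual
    LifeSet receipt; covariates: (n, k) array of pre-randomization controls."""
    X = sm.add_constant(np.column_stack([assigned, covariates]))
    itt = sm.OLS(y, X).fit().params[1]        # coefficient on assignment
    take_up = enrolled[assigned == 1].mean()  # ~0.60 per the text above
    crossover = enrolled[assigned == 0].mean()
    tot = itt / (take_up - crossover)         # Wald/IV estimate
    return itt, tot
```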


Auxiliary data: Administrative data may be used to analyze and correct for nonresponse bias. The child welfare administrative data contain demographic characteristics of all study participants; the team will use these data to substitute for missing demographic data on the surveys and to determine patterns of nonresponse. The project team will seek access to administrative data from the agencies listed in section B3. The team will use these data to improve the quality of the impact estimates by substituting for missing outcome data from the follow-up surveys and improving the accuracy of any imputation used to address item nonresponse.


Data archiving: Quantitative datasets from the impact study will be archived at the National Data Archive on Child Abuse and Neglect (NDACAN) at Cornell University. Files will be stored as Restricted Access Files. Files will be deidentified, including removal or masking of personally identifiable information (PII) as well as both direct and indirect identifiers. Documentation such as a detailed codebook, user manual, and data collection instruments will also be submitted to NDACAN to increase accuracy of any secondary analysis performed by individuals who were not part of the project team.


Implementation Study:

The data will not be used to generate population estimates, either for internal use or dissemination. The information gathered from the implementation interviews and focus groups with administrators and staff will be combined and quantified based on patterns across sites and will also be archived at NDACAN.



B7. Data Handling and Analysis

Data Handling

Impact Study:


Baseline Youth Survey: The use of trained interviewers and audio computer-assisted self-interview (ACASI) software to administer the youth self-report survey will minimize errors for youth with reading difficulties or language comprehension issues. Interviewers will be trained to ensure youth understand the questions being asked in order to elicit a valid response. The use of ACASI software for sensitive questions will minimize reporting errors due to socially desirable responding. The survey will incorporate the use of skip logic to ensure that only relevant items are asked of participants. All youth survey data will be collected on computers or tablets to minimize errors in data entry.


Administrative Data: The project team will request codebooks or other documentation for all administrative data. Data files will be checked for common detectable errors, such as invalid date pairings (e.g., birth date out of age range, stop date before start date), values other than those noted in the data documentation, and differences in data elements that should be stable for individuals across time and data sources (e.g., birth date, sex, race, ethnicity). The project team will consult with the administrative data source to correct detectable errors wherever feasible. When data incongruities exist for the same study participant among data sources, the administrative data from the New Jersey DCF will be held as the "data of record" and will supersede other data sources. Any additional rules needed for correcting detectable errors will be clearly defined and documented.
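A minimal sketch of the date checks described above, assuming pandas and illustrative column names (birth_date, start_date, stop_date); the actual field names will come from each agency's codebook.

```python
import pandas as pd

def flag_date_errors(df: pd.DataFrame) -> pd.DataFrame:
    """Return rows with detectable date errors. Assumes the date columns
    have already been parsed with pd.to_datetime."""
    stop_before_start = df["stop_date"] < df["start_date"]
    age_at_start = (df["start_date"] - df["birth_date"]).dt.days / 365.25
    out_of_range = ~age_at_start.between(17, 21)  # target population ages
    return df[stop_before_start | out_of_range]
```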


Implementation Study:

Interviews and focus groups will be audio recorded, with participant permission, to minimize errors in the final transcripts. The project team will use a transcription service to fully transcribe audio recordings of interviews and focus groups. In cases when participants did not consent to being recorded, the project team will clean and code notes taken during the interview or focus group.


As discussed below, interview and focus group data will be qualitatively coded. Once the coding scheme is established, the project team will ensure inter-rater reliability by having multiple coders code several transcripts. The project team will run coding comparison queries in NVivo and re-code until a kappa coefficient above 0.80, considered a high level of agreement between coders, is achieved (McHugh, 2012). If the initial level of agreement is below 0.80, the coders will meet to discuss the definitions of each code before returning to recode the transcripts.
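For reference, Cohen's kappa compares observed agreement with the agreement expected by chance. The team will compute it via NVivo's coding comparison queries; the toy example below, with made-up codes, shows the same statistic computed directly.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical codes applied by two coders to the same transcript segments
coder_a = ["barriers", "fidelity", "context", "barriers", "services"]
coder_b = ["barriers", "fidelity", "barriers", "barriers", "services"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"kappa = {kappa:.2f}")  # recode until kappa exceeds 0.80
```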


Data Analysis

Impact Study:

See section B6 above for impact study data analysis. The project team will pre-register the study with the American Economic Association’s registry for RCTs. An analysis plan will be publicly posted on OPRE’s website prior to the start of analysis.


Implementation Study:

The implementation study will use qualitative methods to analyze information from the various data sources. The project team will follow a deductive coding process, beginning with development of a coding scheme based on key constructs drawn from the implementation study research questions outlined in Supporting Statement A2. The coding scheme will be further developed by reviewing transcripts and interview notes in conjunction with the instruments and the research questions to identify key themes or topic areas that arise through different interviews. Project team members who participated in qualitative data collection will review the coding scheme to ensure that important points are not missed.


Transcripts of interviews and focus groups will be uploaded into qualitative data analysis software. Project team members will test the coding scheme by coding one or two interviews each before running a query to examine coder reliability. The coding team will then meet to review any questions and divergent codes, and will continue coding sections of interviews until they reach an interrater reliability (kappa coefficient) of 0.80 or greater.


The full study will be pre-registered on the American Economic Association RCT registry.


Data Use

The project team will archive data as discussed in Supporting Statement A, section A10 (Data Archiving), to facilitate secondary data analysis. Findings from both the impact and implementation studies will be disseminated to the public through a comprehensive evaluation report and a peer-reviewed journal article. The project team will include a detailed study methodology in its final report to help the public understand and properly interpret the information derived from the data collection. The methodology section will include, but not be limited to, the information discussed in sections B2, B5, and B6 above. As discussed in Supporting Statement A, section A10 (Limitations), the study's limitations will be included in all written products associated with the study.



B8. Contact Person(s)

The information for this study is being collected by the Urban Institute on behalf of ACF. The Co-Principal Investigators for this study are:

Mike Pergamit

Urban Institute

202.261.5276

mpergamit@urban.org

Mark Courtney

University of Chicago

773.702.1219

markc@uchicago.edu




Attachments

Instrument 1: Baseline Youth Survey

Instrument 2: Administrative Data List

Instrument 3A: Site Visit 1 Interview Guide for Administrators: Child Welfare Agency Administrators

Instrument 3B: Site Visit 1 Interview Guide for Administrators: Licensed LifeSet Experts

Instrument 3C: Site Visit 1 Interview Guide for Administrators: LifeSet Developer Administrators

Instrument 3D: Site Visit 1 Interview Guide for Administrators: Provider Agency Administrators

Instrument 4A: Site Visit 2 Focus Group Guide for Staff: LifeSet Specialists

Instrument 4B: Site Visit 2 Focus Group Guide for Staff: LifeSet Team Supervisors

Instrument 5A: Site Visit 2 Interview Guide for Administrators: Child Welfare Agency Administrators

Instrument 5B: Site Visit 2 Interview Guide for Administrators: Licensed LifeSet Experts

Instrument 5C: Site Visit 2 Interview Guide for Administrators: LifeSet Developer Administrators

Instrument 5D: Site Visit 2 Interview Guide for Administrators: Provider Agency Administrators

Appendix A: Young Adult Study Consent Form

Appendix B: Youth Study Assent Form

Appendix C: Notification Letter to Parents of Minors at Baseline

Appendix D: Baseline Youth Survey Lead Letters

Appendix E: Baseline Youth Survey Fact Sheets

Appendix F: Baseline Youth Survey Refusal Letters

Appendix G: Panel Maintenance Tracking Scripts

Appendix H: Panel Maintenance Letter

Appendix I: Panel Maintenance Postcard

Appendix J: Outreach Email Staff Interviews and Focus Groups – Staff Connected to via Program/Agency

Appendix K: Outreach Email Staff Interviews and Focus Groups – Staff Not Connected to via Program/Agency

Appendix L: Implementation Study Informed Consent for Staff

Appendix M: IRB Approval Letter



References

Angrist, Joshua, Guido W. Imbens, and Donald Rubin. "Identification of Causal Effects Using Instrumental Variables." Journal of the American Statistical Association 91, no. 434 (1996): 444-455.

Bureau of Labor Statistics, U.S. Department of Labor. “National Longitudinal Survey of Youth 1997 Cohort, 1997-2011 (Rounds 1-15),” 2013. https://www.nlsinfo.org/content/cohorts/nlsy97.

Center for the Study of Social Policy. “Youth Thrive Survey User Manual.” Center for the Study of Social Policy, August 2020. https://cssp.org/resource/youth-thrive-survey-user-manual/.

Courtney, Mark E., N.J. Okpych, P. Charles, D. Mikell, B. Stevenson, K. Park, B. Kindle, J. Harty, and H. Feng. “Findings from the California Youth Transitions to Adulthood Study (CalYOUTH): Conditions of Foster Youth at Age 19.” Chicago, IL: Chapin Hall at the University of Chicago, 2016. https://www.chapinhall.org/research/majority-of-california-youth-in-foster-care-believe-extended-care-helps-them-reach-life-goals/.

Courtney, Mark E., N.J. Okpych, K. Park, J. Harty, H. Feng, A. Torres-Garcia, and S. Sayed. "Findings from the California Youth Transitions to Adulthood Study (CalYOUTH): Conditions of Youth at Age 21." Chicago, IL: Chapin Hall at the University of Chicago, 2018. https://www.chapinhall.org/research/calyouth-wave3/.

Courtney, Mark E., Erin J. Valentine, and Melanie Skemer. “Experimental Evaluation of Transitional Living Services for System-Involved Youth: Implications for Policy and Practice.” Children and Youth Services Review 96 (January 1, 2019): 396–408. https://doi.org/10.1016/j.childyouth.2018.11.031.

Courtney, Mark E., Andrew Zinn, Heidi Johnson, and Karin E. Malm. “Evaluation of the Massachusetts Adolescent Outreach Program for Youths in Intensive Foster Care: Final Report.” Multi-Site Evaluation of Foster Youth Programs. Washington, D.C.: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services, 2011. https://www.acf.hhs.gov/opre/resource/evaluation-of-the-massachusetts-adolescent-outreach-program-for-youths-in.

Dworsky, Amy, Mark E. Courtney, Jennifer Hook, Adam Brown, Colleen Cary, Kara Love, Vanessa Vorhies, et al. “Midwest Evaluation of the Adult Functioning of Former Foster Youth,” 2011. https://www.chapinhall.org/research/midwest-evaluation-of-the-adult-functioning-of-former-foster-youth/.

Harris, K.M., C.T. Halpern, E. Whitsel, J. Hussey, J. Tabor, P. Entzel, and J.R. Udry. “The National Longitudinal Study of Adolescent to Adult Health: Research Design [WWW Document],” 2009. https://addhealth.cpc.unc.edu/documentation/study-design/.

Little, Roderick J. A., and Donald B. Rubin. Statistical Analysis with Missing Data. 2nd ed. Hoboken, NJ: Wiley, 2002.

McHugh, Mary L. "Interrater Reliability: The Kappa Statistic." Biochemia Medica 22, no. 3 (2012): 276-282.

Pergamit, Michael, Mary Cunningham, and Devlin Hanson. "The Impact of Family Unification Housing Vouchers on Child Welfare Outcomes." American Journal of Community Psychology 60, no. 1-2 (2017): 103-113.

Pergamit, Michael, Mary Cunningham, Devlin Hanson, and Alexandra Stanczyk. “Does Supportive Housing Keep Families Together?” Supportive Housing for Child Welfare Families Research Partnership. Washington, D.C.: Urban Institute, May 2019. https://www.urban.org/sites/default/files/publication/100289/does_supportive_housing_keep_families_together_1.pdf.

U.S. Department of Health and Human Services, Administration for Children and Families. “Evaluation of the Life Skills Training Program Los Angeles County, California: Final Report.” Multi-Site Evaluation of Foster Youth Programs. Washington, D.C., July 2008. https://www.acf.hhs.gov/opre/resource/evaluation-of-the-life-skills-training-program-los-angeles-county.



1 LifeSet is a therapeutic case management intervention that provides youth and young adults leaving foster care, juvenile justice, and mental health systems with the intensive in-home support and guidance they need to move towards youth-defined goals in multiple domains of independent living including education, housing, employment and financial security, health and safety, and social connections and support.

2 MDEs using the same assumptions for an equally split sample size of 600 were 0.192 for administrative data and 0.208 for survey data.

3 OMB #0970-0422




