Strengthening Relationship Education and Marriage Services (STREAMS) Evaluation



OMB Information Collection Request

0970-0481

Supporting Statement

Part B

February 2016

Updated March 2017



Submitted by:

Office of Planning, Research and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


330 C Street, SW
Washington, D.C. 20201


Project Officer:

Samantha Illangasekare





CONTENTS

B1. Respondent universe and sampling methods

Target population

Sampling frame, sample design, and coverage of target population

Precision needed for key impact estimates

Expected response rates to participant surveys for the impact study

Expected item nonresponse rate for critical questions

B2. Procedures for collection of information

Process study data collection

Impact study data collection

B3. Methods to maximize response rates and deal with nonresponse

Process study

Impact study

B4. Tests of procedures or methods to be undertaken

B5. Individuals consulted on statistical aspects and individuals collecting and/or analyzing data

References

TABLES

B.1 Minimum detectable effect sizes

B.2 Expected response rates and number of responses



B1. Respondent universe and sampling methods

The Office of Planning, Research, and Evaluation (OPRE) within the Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services seeks approval to collect process and impact study data from six healthy marriage and relationship education (HMRE) programs funded by the Office of Family Assistance (OFA) within ACF. This information collection is being carried out as part of the Strengthening Relationship Education and Marriage Services (STREAMS) evaluation. The purpose of STREAMS is to measure the effectiveness and quality of HMRE programs, program components, and implementation factors designed to strengthen and improve the quality of romantic relationships. In September 2015, ACF awarded five-year grants to 46 HMRE grantees that plan to serve a mix of adults and youth. The STREAMS evaluation will include up to six sites chosen from among these grantees; some selected sites will serve youth, and others will serve adults. Proposed data collection activities in study sites are (1) process study data collection to document program implementation, including semi-structured interviews with grantee staff and community stakeholders, focus groups with program participants, a survey of program staff, and use of a management information system by grantee staff to record data on session adherence; and (2) impact study data collection to measure program impacts, including baseline and follow-up surveys of 3,600 individuals in sites serving youth in high schools and of 4,000 individuals in sites serving adults.

Target population

The STREAMS evaluation will examine HMRE programs in six sites selected from among the 46 HMRE grantees awarded funding by ACF in September 2015. Sites will be selected that are best suited to examine research questions that will fill gaps in the research literature and in the field and that address issues of particular interest to ACF. These issues include the effects of widely used curricula that have not yet been rigorously studied, effective strategies for combining relationship skills and economic stability services, and implementation factors that could influence program effects. Sites must also be able to generate sample sizes adequate to detect meaningful impacts and to implement a random assignment research design. STREAMS sites will not be selected to be representative of all HMRE grantees; instead, they will be chosen to address specific research questions of particular policy interest.

Sampling frame, sample design, and coverage of target population

Process study. The evaluation team will select respondents for process study instruments in STREAMS sites as follows:

  • Staff and stakeholder interviews. Researchers will purposively select respondents for these interviews using organizational charts, information about each staff member’s role at the host organization, and information about key community partners and stakeholders. Researchers will aim to interview all grantee directors, managers, and supervisors and a selection of frontline program staff. The evaluation team expects to select up to 25 respondents in each of six sites, or 150 total.

  • Focus groups. The evaluation team will conduct three focus groups, with 10 participants each, in each evaluation site. At each site, the team will randomly select participants who have attended at least three program activities in the month prior to the site visit. The team will select 12 participants to invite to each focus group to account for potential no-shows. We assume that four evaluation sites will serve adults and two will serve youth, which yields 120 focus group participants in sites serving adults (10 participants per focus group * 3 focus groups per site * 4 sites) and 60 focus group participants in sites serving youth (10 participants per focus group * 3 focus groups per site * 2 sites).

  • Staff survey. All staff who work directly with program participants, along with their supervisors, will be asked to complete a paper-and-pencil staff survey. The research team assumes that 20 staff members in each of the six evaluation sites (120 total) will meet this criterion.

  • Add-on to nFORM to document session adherence. All staff who facilitate group HMRE sessions in evaluation sites will be asked to complete session adherence forms after each session they facilitate. The research team assumes that eight staff members per site, or 48 total, will meet this criterion.

Impact study. In each of the six participating sites, the STREAMS impact study will be conducted with individuals eligible to receive the HMRE services provided by the grantee. In sites serving youth in high schools, these individuals will be students in schools and classrooms selected by the grantee to receive HMRE services. For example, the grantee may elect to deliver an HMRE curriculum to 9th grade students as part of a regular school health class; all students attending the class will be eligible for STREAMS. In sites serving adults, study participants will be individuals or couples who have voluntarily applied to receive the HMRE services provided by the grantee. The HMRE grant program requires that all supported services be provided to individuals on a voluntary basis. As a result, HMRE programming for adults is typically offered through organizations that specialize in delivering voluntary, community-based programming, such as non-profit social service agencies, faith-based organizations, hospitals, or university-based cooperative extension programs.

In each site, all eligible individuals will be considered for enrollment in STREAMS until a targeted sample size has been reached. The targeted sample size for each site was determined based on the precision needed for the key impact estimates (discussed below). For sites serving youth in high schools, the targeted sample size is 1,800 youth per site. For sites serving adults, the targeted sample size is 1,000 individuals or couples per site. To obtain the necessary sample sizes, study enrollment is expected to occur over a roughly two-year period from July 2016 through August 2018 (dependent on approval of this information collection request). In each site, the grantee may continue to provide HMRE services to similar individuals after the end of the STREAMS enrollment period. However, these additional individuals served through the grant program will not be part of the STREAMS evaluation sample.

Precision needed for key impact estimates

For the STREAMS impact study, each site will be analyzed separately, so relatively large samples are needed to answer the study research questions. Table B.1 shows the estimated minimum detectable effect sizes for the targeted sample sizes of 1,800 per site for sites serving youth in high schools and 1,000 per site for sites serving adults. As explained below, the targeted sample size is larger for sites serving youth in high schools than for sites serving adults to account for differences in how random assignment will be conducted and in the number of research groups to which study participants will be randomly assigned.

Table B.1. Minimum detectable effect sizes

Sample size                                          Continuous outcomes     Dichotomous outcomes
                                                     (effect size units)     (percentage point difference)

Sites serving youth in high schools
  1,800 students, two research groups                0.114                   5.7
  1,800 students, three research groups              0.140                   7.0

Sites serving adults
  1,000 individuals or couples, two research groups  0.154                   6.1

Notes: For sites serving youth in high schools, the estimates assume a sample of 100 classrooms, a 90 percent response rate to the follow-up survey, and an intraclass correlation coefficient of 0.02. For sites serving adults, the estimates assume an individual response rate of 80 percent.


For sites serving youth in high schools, random assignment will occur at the classroom or school level, and the sample will be randomly assigned to either two or three research groups depending on the specific research question being addressed. The estimated minimum detectable effect sizes presented in Table B.1 assume randomly assigning a sample of 100 classrooms to either two or three research groups. For a site with two research groups, the targeted sample size of 1,800 youth per site will allow for detecting an effect size of 0.114 on a continuous outcome such as attitudes toward healthy relationships or communication/conflict management skills. For a binary outcome such as the percentage of students correctly answering a factual question about the characteristics of healthy relationships, the targeted sample size will allow for detecting an impact of 5.7 percentage points. For a site with three research groups, the targeted sample size of 1,800 youth per site will allow for detecting an effect size of 0.140 on a continuous outcome and an impact of 7.0 percentage points on a binary outcome. These minimum detectable effect sizes are in line with the findings from the few prior evaluations of HMRE programming for high school students (e.g., Kerpelman et al. 2009).

For sites serving adults, participants will be randomly assigned as individuals and the design will feature two (not three) research groups. For these reasons, the sites serving adults have a smaller targeted sample size of 1,000 individuals or couples per site. This targeted sample size allows for detecting an effect size of 0.154 on continuous outcomes such as communication skills or co-parenting. For binary outcomes, such as whether the sample member has experienced an incident of intimate partner violence in the past year, the targeted sample size allows for detecting an impact of 6.1 percentage points. There is currently little rigorous research evidence on the effectiveness of HMRE programs that do not serve couples and instead serve adults as individuals. However, a recent pre-post analysis of one HMRE curriculum delivered to low-income adults found improvements on communication and conflict management of 25 to 30 percent of a standard deviation six months after the program ended (Antle et al. 2013). More rigorous research is available on the effectiveness of HMRE programs for couples. To date, these programs have been found to have relatively modest effects. However, the Oklahoma site in the Building Strong Families (BSF) evaluation had effect sizes of greater than 0.15 on multiple dimensions of relationship quality at the 15-month follow-up (Wood et al. 2010).
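
The minimum detectable effect sizes in Table B.1 follow from standard power calculations. As an illustrative sketch only (the exact figures in Table B.1 may also reflect assumptions, such as the degree of regression adjustment from baseline covariates, that are not spelled out in the table notes), the minimum detectable effect size for a classroom-randomized design and for an individually randomized design can be approximated as

    \mathrm{MDE}_{\text{clustered}} \approx M \sqrt{\frac{(1 - R^2)\left(\rho + (1 - \rho)/\bar{n}\right)}{P(1 - P)\,J}}, \qquad \mathrm{MDE}_{\text{individual}} \approx M \sqrt{\frac{1 - R^2}{P(1 - P)\,N}},

where J is the number of classrooms, \bar{n} is the average number of completed follow-up surveys per classroom, \rho is the intraclass correlation coefficient, P is the proportion of the sample assigned to a given research group, N is the number of completed follow-up surveys in an individually randomized site, R^2 is the share of outcome variance explained by baseline covariates, and M \approx 2.8 is the conventional multiplier for 80 percent power with a two-sided test at the 5 percent significance level.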

Expected response rates to participant surveys for the impact study

We expect a response rate of 100 percent for STREAMS baseline surveys, since completing a baseline survey is a requirement for inclusion in the study sample (Table B.2). For sites serving youth in high schools, we anticipate a 90 percent response rate to the one-year follow-up survey. This rate is consistent with response rates achieved in other federal evaluations examining similar school-based interventions at the one-year follow-up, such as the Evaluation of Adolescent Pregnancy Prevention Approaches (PPA) and the Personal Responsibility Education Program Multi-component Evaluation (PREP). For sites serving adults, we anticipate an 80 percent response rate to follow-up surveys. This rate is consistent with rates achieved in other federal evaluations of HMRE programs serving adults, such as BSF and the Parenting and Children Together (PACT) evaluation. See Section B3 for details on the proposed methods for maximizing response rates to follow-up surveys.

Table B.2. Expected response rates and number of responses

Data source                    Number in sample   Expected response rate   Expected number of responses
Baseline survey for youth      3,600              100%                     3,600
Follow-up survey for youth     3,600              90%                      3,240
Baseline survey for adults     4,000              100%                     4,000
Follow-up survey for adults    4,000              80%                      3,200

Expected item nonresponse rate for critical questions

Based on prior experience asking similar questions with similar populations, the evaluation team does not anticipate significant item nonresponse on surveys. The team will use audio computer-assisted self-interviewing (ACASI) for youth surveys and computer-assisted telephone interviewing (CATI) for adult surveys. The use of ACASI and CATI improves data quality and completeness because skip logic, appropriate wording variations, and consistency checks are programmed into the instrument. For sites serving adults, CATI interviewers will be trained to use appropriate probes and prompts to encourage responses to all questions. For school-based sites using ACASI, respondents will be provided with headsets so that they can listen to a recording of questions being read aloud, improving comprehension. In addition, evaluation team members will be on site when surveys are administered to answer questions that may arise.

B2. Procedures for collection of information

Process study data collection

We plan to conduct a multi-day site visit to each STREAMS site to collect process study data. We will use the topic guide for staff and stakeholder interviews (Instrument 1) to conduct interviews with program directors, managers, supervisors, community stakeholders, and frontline staff. These interviews will focus on the implementation and operation of HMRE programs and the community context in which they operate. All interviews will be conducted by a two-person team, with one member asking questions and the other typing near-verbatim notes on a laptop to capture key quotes and responses. With respondents’ permission, site visit teams will audio record interviews to later confirm direct quotes and other details.

During each site visit, we will conduct three focus groups with program participants, each lasting 90 minutes. In advance of the site visit, site visit teams will work with program staff to identify a convenient location and time for holding the focus groups. One month prior to each visit, the site visit team will randomly select program participants to invite to the focus group from among active participants listed in the nFORM MIS. To be eligible for focus group participation, participants must have attended three or more program activities within the past month. Evaluation team staff will contact participants selected for the focus group to invite them to participate, send a confirmation letter to each participant (see Attachment H), and place a reminder call to participants the day before each focus group. Focus group discussions will follow the focus group guides (Instruments 2 and 3).
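
The selection step described above amounts to a simple random sample from the set of active participants who meet the attendance criterion. The sketch below illustrates this logic in Python, assuming participant records can be exported from the nFORM MIS as a list of records; the field name and export format shown are hypothetical illustrations, not the actual nFORM schema.

    import random

    def select_focus_group_invitees(participants, min_activities=3,
                                    invites_per_group=12, groups=3, seed=None):
        """Randomly choose focus group invitees from active participants."""
        rng = random.Random(seed)
        # Keep only participants who attended at least `min_activities` program
        # activities in the month prior to the site visit.
        # ('activities_past_month' is a hypothetical field name.)
        eligible = [p for p in participants
                    if p["activities_past_month"] >= min_activities]
        # Invite 12 per group (rather than 10) to allow for no-shows.
        needed = invites_per_group * groups
        invitees = rng.sample(eligible, min(needed, len(eligible)))
        # Distribute the invitees across the three focus groups.
        return [invitees[i::groups] for i in range(groups)]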

The evaluation team will conduct staff surveys (Instrument 4) using a self-administered paper-and-pencil instrument. The team chose this approach for two reasons: (1) the instrument is designed to be brief and is free of complicated skip patterns, and (2) the expected sample size does not justify the costs of developing and maintaining an automated survey instrument. Prior to each site visit, the evaluation team will mail the staff survey to all direct service staff and their supervisors. The team will provide return envelopes and request that the survey be returned within four weeks. Site visit teams will attempt to collect any remaining surveys from program staff during the site visit.

Program staff who facilitate group HMRE sessions will be asked to complete the session adherence form (Instrument 5) after each session. The existing nFORM system (OMB no. 0970-0460) requires all facilitators funded through the current round of HMRE grants to report session attendance after each session. When facilitators in STREAMS sites use nFORM to complete this attendance report, they will be prompted to also complete the session adherence form. The adherence form will use check boxes and radio buttons for ease of data entry.

Impact study data collection

Survey data collection for the impact study will follow different procedures in sites serving adults and sites serving youth. These procedures are described below.

Sample intake in sites serving adults. In sites serving adults, program staff will meet with eligible and interested potential participants to enroll them in the study sample. They will use the introductory script (Instrument 6) to describe the study to potential enrollees. This script will highlight the purpose of the study, the baseline and follow-up measures, and random assignment. If the applicant wishes to continue with the sample enrollment process, the intake worker will call Mathematica’s Survey Operations Center to connect the applicant with a trained interviewer, who will administer the consent form and conduct the baseline survey (Instrument 10). Once the baseline survey is complete, the interviewer will ask the applicant to hand the phone back to the intake worker, who will then use the add-on component of nFORM (Instrument 7) first to confirm that the applicant is eligible for random assignment and then to conduct random assignment. Once the applicant has been randomly assigned, the worker will inform the applicant of his or her research status.
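
The internal mechanics of the nFORM random assignment step are not specified in this document. As a minimal, hypothetical sketch, individual-level random assignment to two research groups with a 50/50 allocation amounts to an independent random draw for each eligible applicant:

    import random

    # A cryptographically strong generator avoids predictable assignment sequences.
    _rng = random.SystemRandom()

    def randomly_assign(treatment_probability=0.5):
        """Assign one eligible applicant to the program group or the control group."""
        # Each applicant is assigned independently; with probability 0.5, the
        # sample splits roughly evenly between the two groups in expectation.
        return "program" if _rng.random() < treatment_probability else "control"

In practice, evaluation systems often instead draw from pre-generated assignment lists stratified by site to guarantee balanced group sizes; the independent-draw version above is only the simplest variant.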

Follow-up survey for adults. In sites serving adults, follow-up surveys will be completed 12 months after random assignment using CATI (Instrument 11). Prior to calling sample members to complete the survey, the evaluation team will send them an advance letter (Attachment P) notifying them that a Mathematica interviewer will be calling to ask that they complete an interview. Mathematica’s locating team will send postcard reminders (Attachment N) and text message reminders (Attachment O) to encourage response among hard-to-reach sample members. If a respondent cannot be reached by phone and cannot be found by in-house locating staff, the evaluation team will assign a field locator in the program’s community to contact the sample member in person. The field locator will then provide the sample member with a cell phone to call the Mathematica Survey Operations Center and complete the interview.

Sample intake in sites serving youth in high schools. In sites serving youth in high schools, schools or classrooms will be randomly assigned to two or three research groups. Prior to the beginning of programming, schools will provide the evaluation team with rosters of eligible youth. The evaluation team will then work with the school to obtain active consent from the parents of eligible youth (the parental consent form is included as Attachment E). Once the consent gathering process is complete, the evaluation team will coordinate with study schools to schedule baseline data collection. On the designated day, members of the evaluation team will report to the school to facilitate the baseline data collection process. Youth whose parents have consented to their participation in STREAMS will be asked to report to a designated classroom in their school to complete study enrollment. They will first be asked to sign a study assent form (Attachment F). Youth who assent to study participation will then complete the STREAMS baseline survey for youth in high schools (Instrument 8), administered via ACASI on a tablet device provided by either the program or the evaluation team. Two members of the evaluation team will be present during each administration to monitor activities and troubleshoot tablet issues as needed. Once youth have completed the survey, their responses will be automatically uploaded to Mathematica’s secure server. In each site, a make-up date will be scheduled for any youth unable to attend the initial baseline administration.

Follow-up survey for youth in high schools. In sites serving youth in high schools, follow-up surveys will be administered 12 months after random assignment (Instrument 9). Surveys will be administered via ACASI in schools, using methods similar to those used in these sites for baseline data collection. In most cases, youth will be enrolled at follow-up in the same schools in which they were enrolled at baseline, simplifying locating efforts and improving response rates. To ensure high response rates, the evaluation team will attempt to contact by telephone any youth who do not complete the survey during the in-school group administration, first sending advance letters to the youth and their parents informing them of the call (Attachment M; see Attachment L for the introductory script for these calls). In these cases, interviewers will read the questions from the ACASI survey over the telephone.

B3. Methods to maximize response rates and deal with nonresponse

Process study

The evaluation team will use a mix of strategies to ensure high response rates for process study data collection. To help ensure high participation among staff for interviews conducted during process study site visits, the evaluation team will coordinate with grantees to determine convenient dates for these visits and will work with grantees to develop a schedule that accounts for the availability of key program staff. To ensure high focus group attendance, evaluation team members will send reminders to selected sample members prior to each focus group and will offer a $25 gift card as a token of appreciation. To achieve high response rates to the staff survey, evaluation staff will follow up during site visits with staff members who have not yet completed the survey and ask that they do so; site visitors will provide an additional copy of the survey if needed and request that it be completed before the end of the site visit. To maintain high response rates to the session adherence forms that facilitators will complete through the nFORM add-on, the evaluation team will regularly monitor adherence data and follow up as needed to encourage facilitators to keep the information complete and up to date.

Impact study

The evaluation team will conduct baseline surveys at sample intake and follow-up surveys about 12 months later. As noted in Section B1, we anticipate that all sample members will complete baseline surveys because completing this survey is a condition of study participation. The evaluation team will take the following steps to ensure high response rates to the 12-month follow-up surveys:

  • Follow-up surveys in sites serving adults. In sites serving adults, follow-up surveys will be conducted using a CATI instrument administered by phone. To facilitate locating at the 12-month follow-up, detailed contact information will be gathered from sample members at baseline. In addition, all telephone interviewers will be thoroughly trained in both the specifics of the project and strategies for securing respondents’ cooperation and averting refusals. The team will include bilingual interviewers who can administer the instrument in Spanish when needed. Mathematica’s Survey Operations Center will be adequately staffed for both daytime and evening interviews in all U.S. time zones. Respondents will be offered a $25 gift card as a token of appreciation. Mathematica will use automated search tools and online locating services to locate hard-to-reach sample members. For the hardest-to-reach sample members, Mathematica will deploy field locators in the communities where study programs operate to locate sample members and, if needed, provide them with a cell phone to complete the survey. Mathematica has used these methods in other studies of HMRE programming, such as BSF and PACT, and achieved response rates above 80 percent.

  • Follow-up surveys for sites serving youth in high schools. In sites serving youth in high schools, follow-up surveys will be conducted primarily through group administration in schools. School-based HMRE programs often serve teens in the early years of high school; therefore, most students will be enrolled in the same schools a year later. For example, in ACF’s PREP evaluation, which is examining programs similar to the school-based programs to be examined in STREAMS, the two school-based sites are both achieving response rates above 90 percent for the 12-month follow-up, with more than 95 percent of completed surveys obtained through group administration in schools. To ensure high response rates, youth respondents in STREAMS sites will receive a $15 gift card as a token of appreciation. To further boost response rates, the evaluation team will follow up by telephone with sample members who do not complete the survey during the follow-up group administration (because they were absent, dropped out, transferred, or graduated) and ask that they complete the survey by phone. These respondents will be offered a $20 gift card as a token of appreciation. These same procedures are being used in the school-based PREP sites, which are achieving 12-month response rates above 90 percent.

B4. Tests of procedures or methods to be undertaken

As described in Supporting Statement Part A and in Attachments B and C, many of the items included in the youth and adult surveys are standardized scales or items adapted from existing surveys. In some cases, the study team developed new items to measure constructs for which existing measures are not currently available. These items were developed by drawing on phrasing and language used in survey questions fielded with similar populations.

The evaluation team has conducted telephone pretests of the youth baseline and follow-up surveys (Instruments 8 and 9), the adult baseline survey (Instrument 10), and the staff survey (Instrument 4) to ensure that questions are understandable, that they use language familiar to respondents, and that they are consistent with the concepts they aim to measure. All pretests were conducted with fewer than 10 people. The team has also used these pretests to identify typical instrumentation problems (such as question wording and incomplete or inappropriate response categories), to measure the response burden, and to confirm that there are no unforeseen difficulties in administering the instruments.

The evaluation team also conducted pretests of the adult follow-up survey. These pretests were conducted by telephone with nine people. The team used the pretests to measure the response burden and identify any opportunities to improve question ordering or wording. The pretests confirmed that the response burden falls within the estimates specified in Part A of this information collection request.

B5. Individuals consulted on statistical aspects and individuals collecting and/or analyzing data

Mathematica Policy Research is conducting this study under contract number HHSP233201500095G. Mathematica received input and guidance on the plans for the statistical analyses for this study from ACF staff. The Mathematica evaluation team is led by the following individuals:

Dr. Robert G. Wood
Co-Project Director
Mathematica Policy Research

Ms. Diane Paulsell
Co-Project Director
Mathematica Policy Research

Dr. Brian Goesling
Principal Investigator
Mathematica Policy Research

Mr. Shawn Marsh
Survey Director
Mathematica Policy Research

In the future, further input on analytic approaches may be sought from additional Mathematica staff and from outside consultants.

References

Antle, B., Sar, B., Christensen, D., Ellers, F., Barbee, A., & van Zyl, M. (2013). The impact of the Within My Reach relationship training on relationship skills and outcome for low-income individuals. Journal of Marital and Family Therapy, 39, 346-357.

Kerpelman, J. L., Pittman, J. F., Adler-Baeder, F., Eryigit, S., & Paulk, A. (2009). Evaluation of a statewide youth-focused relationships education curriculum. Journal of Adolescence, 32, 1359-1370.

Wood, R. G., McConnell, S., Moore, Q., Clarkwest, A., & Hsueh, J. (2010). Strengthening unmarried parents’ relationships: The early impacts of Building Strong Families. Princeton, NJ: Mathematica Policy Research.



