OMB No. 0970-XXXX Expiration XX/XX/20XX
Responding to Intimate Violence in Relationship Programs (RIViR)
Supporting Statement A
New Collection
November 2017
Office of Planning, Research, and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
Mary Switzer Building
330 C Street, SW
Washington, DC 20201
Table of Contents
Section
A.1 Necessity for the Data Collection
A.2 Purpose of Survey and Data Collection Procedures
A.3 Improved Information Technology to Reduce Burden
A.4 Efforts to Identify Duplication
A.5 Involvement of Small Organizations
A.6 Consequences of Less-Frequent Data Collection
A.8 Federal Register Notice and Consultation
A.9 Tokens of Appreciation for Respondents
A.10 Privacy of Respondents
A.11 Justification for Sensitive Questions
A.12 Estimates of Information Collection Burden
A.13 Cost Burden to Respondents or Record Keepers
A.14 Estimate of Cost to the Federal Government
A.16 Plan and Time Schedule for Information Collection, Tabulation, and Publication
A.17 Reasons Not to Display OMB Expiration Date
A.18 Exceptions to Certification for Paperwork Reduction Act Submissions
Tables
Table A.2.2. Data Collection Instruments and Description
Table A.2.3 Cross-walk of Research Questions and Instruments
Table A.12.1. Estimated Annualized Burden Costs and Total for 2-Year Data Collection
LIST OF ATTACHMENTS
A.1 Lead letter for parents
A.2 Recruitment script for adults
A.3 Recruitment script for youth 18 years and older
B.1 Adult consent form
B.2 Adult consent form script
B.3 Parent permission form
B.3s Parent permission form (Spanish translation)
B.4 Parent permission form script
B.4s Parent permission form script (Spanish translation)
B.5 Youth assent form
B.6 Youth assent script
B.7 Youth 18 years and older consent
B.8 Youth 18 years and older script
C.1 Post-screener questions
C.2 Locator section for adults
C.3 Contact information form for parents of youth younger than 18
D.1 Existing validated IPV and TDV screening tools
E.1 60 Day Federal Register Notice
F.1 IRB Approval Notice
LIST OF INSTRUMENTS
Instrument #1.1: IPV Screener #1
Instrument #1.2: IPV Screener #2
Instrument #1.3: IPV Screener #3
Instrument #2.1: TDV Screener #1
Instrument #2.2: TDV Screener #2
Instrument #2.3: TDV Screener #3
The Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services (HHS) seeks approval for data collection as a component of the Responding to Intimate Violence in Relationship Programs (RIViR) project. The goals of this research project are to develop improved theoretical frameworks, screening tools, and surrounding protocols for recognizing and addressing intimate partner violence (IPV) and teen dating violence (TDV) in healthy marriage and relationship education (HMRE) programs with the diverse, middle- to low-income populations they serve. Because we are studying HMRE programs, we refer to them as “healthy relationship” (HR) programs in this project (and consequently throughout this document). This project has already conducted several activities using existing data sources and materials to synthesize information about the current state of the field and to identify potential IPV and TDV screeners and protocols.
The project is now planning to test these IPV and TDV screeners and surrounding protocols in HR programs. This application seeks approval for this testing component of the RIViR project. We plan to test standardized quantitative tools, as well as open-ended scripts that create opportunities for disclosure of IPV or TDV, among adults and youth who participate in HR programs. This work will include data collection from approximately 1,200 HR program participants and will be completed in collaboration with approximately four HR grantee organizations funded by ACF’s Office of Family Assistance (OFA) to implement HR programs.
Gaps the Information Collection is Designed to Address
Intimate partner violence (IPV) and teen dating violence (TDV) have long-lasting and deleterious effects (Coker et al., 2002). Unfortunately, IPV and TDV are both prevalent in the U.S. (Breiding et al., 2014; Black et al., 2011; Taylor et al., 2014), and both research (McKay et al., 2015) and practice-based knowledge (Menard & Williams, 2005) indicate that IPV and TDV are common among healthy relationship (HR) program participants. HR programs, which typically involve healthy relationship education and skill-building and can include explicit information on IPV or TDV, provide a natural environment for individuals to discuss their personal relationships, including unhealthy relationship issues (Krieger et al., 2016), and can be associated with reductions in IPV among participants (Antle et al., 2011; Lundquist et al., 2014). Conversely, domestic violence advocates have raised the concern that HR programs could inadvertently discourage individuals from leaving abusive relationships (Leiwant, 2003) and thereby contribute to increased IPV. For these reasons, experts in both the IPV/TDV and HR fields agree that it is crucial that HR programs be prepared to recognize and address IPV and TDV in their programming (Menard & Williams, 2006; Derrington et al., 2010; Ooms et al., 2006).
Federal programs have acknowledged the importance of HR programs’ ability to recognize and respond to IPV and TDV. The Administration for Children and Families (ACF) Office of Family Assistance (OFA) has funded three cohorts of HR programs since 2006 and each cohort has had requirements related to addressing IPV and TDV in their programs. For example, current ACF HR grantees, funded in 2015, were required to show evidence in their grant applications of consultation with a local domestic violence program or coalition and encouraged to take a “comprehensive approach to addressing domestic violence” (ACF, 2015).
Very little research is available to guide such approaches, however (Clinton-Sherrod et al., 2016; McKay et al., 2016). Research-based publications in this area have been descriptive in nature and have not included assessments of guidelines for recognizing or addressing IPV or TDV in HR programs. The two available works in this area describe the implementation of IPV screening and surrounding protocols in one community-based HR program (Whiting et al., 2009) and in a set of twelve HR programs serving incarcerated and reentering men and their families (McKay et al., 2013).
In the absence of research-based guidance, practice-based recommendations have been developed to provide guidance to HR programs on how to recognize and respond to IPV and TDV. In collaboration with the National Healthy Marriage Resource Center (NHMRC), the National Resource Center on Domestic Violence developed a five-part resource packet for HR practitioners and administrators that includes sections on understanding domestic violence, building effective partnerships with local domestic violence programs, developing domestic violence protocols, screening and assessment for domestic violence, and responding to domestic violence disclosure (NHMRC, 2011, updated 2015). This work has also been informed by practitioner discussions that occurred during two inter-agency and inter-organizational meetings focused on addressing IPV/TDV in HR programs (Ooms et al., 2006; Derrington et al., 2010).
A commonality among these practice-based recommendations is that they all suggest that HR programs provide all program participants with information and education on IPV and TDV as well as safe opportunities to disclose abusive relationships, and that HR staff be prepared to support participants in making decisions about safe program participation and in seeking follow-up services from local domestic violence organizations. Many HR programs offer opportunities for participants to disclose IPV or TDV, either through one-on-one open-ended conversations or through screening instruments with closed-ended questions administered during program intake. However, HR programs vary in the types of IPV and TDV education and screening approaches they use and in the manner in which they use them (e.g., at what point during the program, by whom) (Krieger et al., 2016), and no empirical information is available to guide these decisions. To date, no IPV or TDV screening approaches or surrounding protocols have been empirically tested in HR settings and among HR populations.
While no IPV or TDV closed-ended screening tools have been tested among HR populations, there is evidence that these types of tools may be used effectively to provide HR participants with opportunities for IPV or TDV disclosure. Many instruments with closed-ended questions have been empirically validated to effectively screen adults for IPV in other populations and settings. Most validated IPV screeners have been tested among heterosexual women within medical settings, but some IPV screeners have been tested and validated in social service settings (with some similarities to HR programs), including parents participating in court-ordered family mediation (Pokman et al., 2014); heterosexual women in mental health, social service, and medical agencies (Jory, 2004); individuals in substance abuse treatment (Kraanen et al., 2013); women in crisis shelters (Sherin et al., 1998; Brown et al., 1996); and women seeking legal help for IPV (Bonomi et al., 2005). Likewise, some IPV screeners have been validated with populations that include sub-groups similar to those in HR program populations, including men (Goetz et al., 2006; Shakil et al., 2005), youth (Emelianchik-Key, 2011; Datner et al., 2007; Goetz et al., 2006), Spanish-speakers (Goetz et al., 2006; Paranjape et al., 2006; Bonomi et al., 2005), parents (Jones et al., 2005; Pokman et al., 2014; Eliason et al., 2005; Williams, 2012), individuals involved in the criminal justice system (Eliason et al., 2005; Williams, 2012), and individuals in same-sex relationships (Chan & Cavacuiti, 2008). Only one standardized TDV measurement instrument has been validated (Emelianchik-Key, 2011), and it was tested among a primarily white, heterosexual, and female youth population aged 13 to 21 (and its length makes it unsuitable for use as a screening tool).
Literature indicates that universal education on IPV and TDV, paired with open-ended conversations that provide individuals with opportunities to talk about their relationships, may be promising. Several intervention studies that included a universal education component suggest that it is perceived as important by those who receive it (Thompson et al., 1998), can lead to improved knowledge and self-efficacy regarding accessing IPV/TDV resources (Miller et al., 2016; Thompson et al., 1998), and could help to address barriers to disclosure (Othman et al., 2013). In a qualitative analysis of audiotaped conversations between patients and emergency health care providers, researchers found that IPV disclosure was more likely when providers probed about IPV experience, created open-ended opportunities for discussion, and were generally responsive or expressed empathy when a patient mentioned a psychosocial issue (for example, “stress”) (Rhodes et al., 2007). This available research on IPV and TDV closed-ended screeners, universal education, and open-ended conversations indicates that these strategies could be appropriate for recognizing and addressing (through referral) IPV and TDV among HR participants. The research and data collection proposed by OPRE in this application are needed to empirically test IPV and TDV screening tools and protocols and to provide HR programs with evidence-based recommendations on how to recognize and address IPV and TDV in their programs.
How the Information from this Study Will Further ACF Goals
This study is needed to achieve a central goal of the RIViR project for ACF: to identify, prioritize, and test IPV and TDV screeners and surrounding protocols in HR programs. More broadly, the study furthers one of ACF’s agency goals, which is to “promote safety and well-being of children, youth, and families,” as outlined in ACF’s 2015-2016 strategic plan (ACF, 2015). In line with this goal, ACF prioritizes the safety and well-being of HR program participants. Thus, the information that this study will produce will be used to help HR programs better serve (and avoid causing inadvertent harm to) individuals who have been or are currently in abusive relationships.
There are no legal or administrative requirements that necessitate the collection. ACF is undertaking the collection at the discretion of the agency.
The purpose of this component of the RIViR study is to test IPV and TDV screening tools among HR program participant populations in HR program settings. The study will assess the psychometric properties of the screening tools when used with HR program populations and compare how well each differentiates HR program participants who are experiencing IPV or TDV from those who are not. This information will be used to guide HR program staff in offering referrals to their local domestic violence program partners for full assessment and possible services. (None of the selected grantees has any legal obligation to report the IPV/TDV experiences that are the subject of these screening tools, nor any legal obligation to report abuse of parents by youth that could hypothetically be volunteered by youth in the course of study participation. Grantee staff who are subject to mandatory reporting, which varies by site and staff role, are obligated to report child abuse and neglect if volunteered by youth or adults in the course of study participation, but this is not a topic addressed in the study instruments.) We will test a total of six screening tools: four standardized closed-ended tools (two for IPV and two for TDV) and two open-ended universal education scripts (one each for IPV and TDV) that create opportunities for disclosure of IPV or TDV among adults and youth who participate in federally funded HR programs. We will test the IPV and TDV screeners among no more than 600 adults and 600 youth, respectively. Each study participant will be asked to complete three IPV or TDV screeners.
This work will be completed in collaboration with approximately four grantee organizations that have been funded by ACF’s OFA to implement HR programs. Data collection will commence after OMB approval (anticipated by February 2018). We anticipate that data collection will take place over the course of up to 24 months (see Table A.16.1 below for more details). This study has a single phase, there are no previously approved OMB collections related to it, and no other ACF studies address the same or similar research questions. With participants’ permission, this data collection will be supplemented with administrative data on HR program participant demographics available from OFA’s administrative data collection system (nFORM).
The research questions are included in Table A.2.1. As described in the study background section (see A.1.1), little evidence exists regarding the effectiveness of IPV and TDV screeners and surrounding protocols among HR program populations. Our research will examine the psychometric properties of selected IPV and TDV screeners in HR program populations and compare the open- and closed-ended IPV and TDV screeners with regard to their ability to differentiate program participants who are experiencing IPV and TDV from those who are not. We are specifically examining these two research questions because they address vitally important research gaps that will inform how HR programs recognize and address IPV and TDV among their program populations. In order to inform future HR programs in their efforts to address IPV and TDV in their programs, we are now collecting data on the use of IPV and TDV screeners and protocols within HR program settings and populations.
First, we will address the research question, “What are the psychometrics of common closed-ended IPV and TDV screeners as implemented in HR programs?” Psychometric testing examines the selected screeners’ “reliability” (the extent to which an instrument produces consistent results) and “validity” (how accurately an instrument measures what it is intended to measure). Determining reliability and validity is an essential step toward understanding whether and how these IPV and TDV screeners function among HR populations, and ultimately toward providing evidence-based recommendations to HR programs and practitioners on strategies for recognizing IPV and TDV in HR program populations. As mentioned in A.1.1, several studies have tested the psychometrics of IPV and TDV screeners among populations similar to HR populations, but no studies have examined the psychometrics of IPV or TDV screeners among HR program participants in the HR program setting.
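To make the reliability concept concrete, the sketch below computes Cronbach’s alpha, one common internal-consistency statistic, over hypothetical item-level screener responses. The function, variable names, and example data are illustrative assumptions only, not the study’s actual analysis code or data.

```python
# Minimal sketch of Cronbach's alpha, a common internal-consistency
# (reliability) statistic, computed on invented example data.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of numeric item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Example: five respondents answering a four-item screener scored 0-3.
responses = np.array([
    [0, 1, 0, 1],
    [2, 2, 3, 2],
    [1, 1, 1, 0],
    [3, 3, 2, 3],
    [0, 0, 1, 1],
])
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")
```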
Second, we will address the research question, “How well do open-ended IPV and TDV screening approaches compare to closed-ended screening approaches?” As noted in A.1.1, there is an absence of evidence on how open-ended IPV/TDV screening approaches compare to closed-ended approaches in producing the outcomes for which they were designed: identifying individuals who may need IPV/TDV-related help and, ultimately, helping individuals feel more informed and empowered regarding their options for disclosure and help-seeking (whether or not they opt to disclose). To address this research question, we will evaluate how the standardized IPV/TDV tools and open-ended approaches compare in their ability to differentiate participants who may need IPV/TDV-related help from those who do not. This will be assessed by comparing respondents’ disclosures of IPV/TDV experiences across the three different tools, as well as their post-screening knowledge, opinions, and perceptions.
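As a purely hypothetical illustration of one way paired disclosure outcomes from two screeners could be compared, the sketch below applies McNemar’s test for paired binary data. The counts are invented, and this is not the study’s specified analysis plan (the planned analyses are detailed in Supporting Statement B).

```python
# Hypothetical sketch: McNemar's test comparing disclosure (yes/no) on two
# screeners completed by the same respondents. Counts are invented.
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

# Rows: screener A disclosure (no, yes); columns: screener B disclosure (no, yes).
table = np.array([[310, 42],
                  [18, 230]])
result = mcnemar(table, exact=False, correction=True)
print(f"statistic={result.statistic:.2f}, p-value={result.pvalue:.4f}")
```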
Table A.2.1. Research Questions
Research Questions |
1. What are the psychometrics of common closed-ended IPV and TDV screeners as implemented in HR programs? |
2. How well do open-ended IPV and TDV screening approaches compare to closed-ended screening approaches? |
Our research approach and methodology were chosen to answer the project’s research questions and generate usable results that can be applied in HR programs, while placing the least possible burden on HR programs and participants. Grantee staff will recruit approximately 600 adult and 600 youth participants to participate in this study, for a total of 1,200 participants recruited from approximately four sites. Adult participants will be recruited to test the three IPV screeners, and youth participants will be recruited to test the three TDV screeners. The same participants will complete the three IPV or TDV screeners at three different time points. This number of proposed research participants is essential to the utility of our research; we conducted power calculations to determine how many individuals we would need to interview in order to have adequate power to detect differences in the screeners’ effectiveness with the least amount of program and participant burden. (Additional details on the power analysis and other aspects of our study design are included in Supporting Statement B.)
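For readers unfamiliar with such calculations, the sketch below shows the general form of a power analysis like the one referenced above. The effect size, significance level, and power target are illustrative assumptions, not the study’s actual inputs (which are detailed in Supporting Statement B).

```python
# Minimal sketch of a two-group power calculation using statsmodels.
# All inputs are illustrative assumptions, not the study's actual values.
from statsmodels.stats.power import NormalIndPower

analysis = NormalIndPower()
n_per_group = analysis.solve_power(
    effect_size=0.2,            # assumed small standardized difference
    alpha=0.05,                 # two-sided significance level
    power=0.80,                 # target power
    alternative="two-sided",
)
print(f"Required sample size per group: {n_per_group:.0f}")
```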
We are aware of two limitations of this study design. First, although all participants will be invited to participate in the study, there may be differences between individuals who choose to participate (and/or whose parents give them permission to participate) and the general population of HR participants to whom our findings should be generalizable. Second, as with all research on IPV or TDV screening, we must rely on individuals’ self-reports of IPV- and TDV-related experiences, and some individuals may not disclose IPV or TDV or may provide incorrect information about their IPV and TDV experiences. However, our research design will allow us to model (1) study selection and attrition biases, and (2) any differences in how well a particular screener elicits disclosures of IPV or TDV experiences compared to the other screeners.
OPRE has contracted with RTI to conduct this study. RTI will work with four HR grantees currently funded by ACF to deliver healthy marriage and relationship education programming. The four sites will be chosen for their capacity to manage the required volume of data collection and for their readiness (or “agency readiness”) to respond to IPV/TDV and manage potential safety concerns related to implementing IPV/TDV screening. To prepare for the site selection process, RTI reviewed funding applications for all current ACF healthy marriage and relationship education grantees to assess the following criteria:
Case flow: the ability of the site to recruit approximately 300 participants during a 9-month window
Opportunities for at least three independent encounters with participants as part of program intake and other program activities
Active, functioning partnership with local domestic violence program
Adequate domestic violence protocol (site has a protocol that includes tailored, site-specific content on all of the minimum recommended elements of a domestic violence protocol)
Ability to obtain local IRB oversight (site has a working relationship with a local Institutional Review Board that is willing to provide human subjects protection oversight for its role on the screener testing work)
Diverse, English-speaking program population
Inclusion of participants who indicate some IPV/TDV at intake (i.e., they do not categorically exclude such participants from all programming)
All program participants who are enrolled at the four HR grantee program sites during the study enrollment period will be invited to participate in the study. Recruitment will begin after IRB and OMB approval (anticipated February 2018) and will continue until target sample sizes are reached. Recruitment materials for parents of minor youth, adults, and youth 18 years and older are located in Attachments A.1, A.2, and A.3, respectively. (The content of these materials and the process by which they are disseminated to parents are distinct from those used by the two youth-serving grantees to inform parents about their children’s participation in the HR programs, which focus on the content and benefits of program participation for youth and are disseminated to most parents in person at school orientation events.)
Prior to and during study involvement, grantee project staff will emphasize to participants that participation in the study is voluntary and that participants may withdraw at any time. Participants will also be reminded that taking part in this study will have no bearing on the services they receive, and that declining to participate will not result in any punitive measures (particularly important to clarify for high school-aged youth). The voluntary nature of the study, as well as other key information about the study, will be explained in writing and verbally through informed consent forms and accompanying scripts. All information will be kept private. All potential adult participants will receive an informed consent form (Attachment B.1) that research staff will review with them (Attachment B.2). Parents must provide permission before their minor children are invited to participate (Attachments B.3-B.4). All potential minor youth participants who have received parental permission will receive an informed assent form (Attachment B.5) that research staff will explain to them (Attachment B.6). Youth who are 18 years or older will receive an informed consent form (Attachment B.7), which will be reviewed by staff (Attachment B.8).
Adult participants will complete the IPV screeners (Instruments 1.1, 1.2, and 1.3) one-on-one with HR staff during intake or program participation. Youth participants will complete the closed-ended TDV screeners using tablets during HR high school programming, and the open-ended TDV screener one-on-one with program staff (Instruments 2.1, 2.2, and 2.3). The first screener will be administered immediately after participant consent or assent is obtained. The second screener will be administered between 2 days and one month after the first screener, and the third screener will be similarly spaced. The survey system will be programmed to administer the instruments in a random order for each participant. (Instruments must be administered in random order so that order effects can be estimated and adjusted for in the analytic models. Otherwise, an instrument might perform better or more poorly because it was administered before or after another instrument, or because it was administered earlier or later in program participation, and these order effects could not be distinguished empirically from the properties of the instruments themselves.) All of the instruments will be programmed to be web-based and completed via tablets by the respondent (for the two closed-ended youth screeners) or by the HR program staff member administering the screener (for both open-ended screeners and the adult closed-ended screeners). Screeners administered out loud will take place in a private space such as the project office (for adults) or an office at the participant’s school (for youth). Adult study participants who opt to participate in all screening interactions as part of this study will complete a total of three screening interactions over the course of their study participation, one with each of the adult screeners (Instruments 1.1, 1.2, and 1.3). Youth study participants who opt to participate in all screening interactions will likewise complete a total of three screening interactions, one with each of the youth screeners (Instruments 2.1, 2.2, and 2.3). The screeners will be programmed so that a set of two post-screener items on gender identity and sexual orientation (the two demographic variables needed for our analysis that are not available from administrative data) is included in whichever of the three screeners is randomized to be administered first for a given respondent (Attachment C.1).
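As an illustration of the random-ordering requirement described above, the sketch below shows one hypothetical way a survey system could assign each participant a random instrument order so that order effects can later be estimated. The assignment logic and participant identifier are assumptions for illustration, not the actual survey-system configuration.

```python
# Hypothetical sketch: assign each participant one of the six possible
# orderings of the three screeners, seeded on the participant's study ID
# so the assignment is reproducible. Not the study's actual survey code.
import random
from itertools import permutations

INSTRUMENTS = ["1.1", "1.2", "1.3"]       # the three adult IPV screeners
ORDERS = list(permutations(INSTRUMENTS))  # all six possible administration orders

def assign_order(participant_id: str) -> tuple:
    """Return a reproducible random instrument order for one participant."""
    rng = random.Random(participant_id)   # deterministic per-participant seed
    return rng.choice(ORDERS)

print(assign_order("ADULT-0001"))         # e.g., ('1.2', '1.1', '1.3')
```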
In addition, screeners will be programmed such that the screener that is randomized to be administered third for a given respondent includes items that assess post-screening knowledge, opinions, and perceptions (Attachment C.1). These post-screener questions will appear at the end of whichever screening instrument is administered third and last: Instrument 1.1, 1.2, or 1.3 for adults, and Instrument 2.1, 2.2, or 2.3 for youth, depending in each case on the order in which the instruments are randomized for a given respondent. Adult and youth study participants who opt to participate in all screening interactions will each complete this module just once during their study participation. We will also conduct qualitative interviews regarding perceptions of the TDV screening tools with four study participants in each of the two youth-serving sites, and qualitative interviews regarding perceptions of the IPV screening tools with four study participants in each of the two adult-serving sites (i.e., tool feedback interviews). These interviews will help us to further understand, from the participant’s perspective, the extent to which the tools achieve the outcomes for which they were designed. (Since each of these two interviews will be completed with fewer than ten individuals, they are not included in this Information Collection Request.) To make unbiased recruitment for the qualitative interviews possible, a locator section (Attachment C.2) will be administered to all adults at the end of the first instrument to collect participant contact information. Parents of all youth will be asked to complete and return a contact information form (Attachment C.3), which will be distributed to them along with the permission form. RTI will use this contact information to re-contact a subset of participants after all three waves of screener administration are complete in order to recruit individuals from the two youth-serving sites for the TDV tool feedback interview and individuals from the two adult-serving sites for the IPV tool feedback interview. Potential participants will be purposively selected based on their answers to screener questions in order to achieve a sample at each site that includes participants who disclosed experiences of IPV/TDV on one or more screeners, as well as participants who did not disclose IPV/TDV on any of the screeners.
Table A.2.2 lists all data collection instruments by the title used to describe each instrument throughout this package (which matches the file name of the instrument document), in the same order as they are listed in the burden table in A.12. The screeners appear as Instruments 1.1 through 2.3; the post-screener questions, locator section, and parent contact information form appear as Attachments C.1 through C.3.
Table A.2.2. Data Collection Instruments and Description
Instrument | Description | Total Number of Respondents
1.1: IPV Screener 1 | Standardized IPV tool 1: Intimate Justice Scale (15 items) for physical violence and coercive control | 600
1.2: IPV Screener 2 | Standardized IPV tool 2: Universal Violence Prevention Screen (5 items) for physical violence and Women’s Experience with Battering (10 items) for coercive control | 600
1.3: IPV Screener 3 | Open-ended IPV tool based on existing and widely accepted IPV universal education guidelines and expert consultant input | 600
2.1: TDV Screener 1 | Standardized TDV tool 1: Safe Dates tool (32 items) for physical violence and coercive control, with expanded/revised monitoring items | 600
2.2: TDV Screener 2 | Standardized TDV tool 2: Conflict in Adolescent Dating Relationships Inventory (25 items) for physical violence and coercive control, with expanded/revised monitoring items | 600
2.3: TDV Screener 3 | Open-ended TDV tool based on existing and widely accepted TDV universal education guidelines and expert consultant input | 600
Post-screener questions (C.1) | Form that asks brief questions about gender identity and sexual orientation (administered as part of the first screening) and captures post-screening knowledge, opinions, and perceptions (administered as part of the third screening) | 600
Locator section for adults (C.2) | Form that includes brief questions on whether or not the study can recontact the participant and, if so, questions to collect their contact information (e.g., phone number, address, email address) | 600
Contact Information Form for Parents of Youth Younger Than 18 (C.3) | Form that includes brief questions on whether or not a parent gives permission for the study to recontact their minor child and, if so, questions to collect the child’s contact information (e.g., phone number, address, email address) | 600
Table A.2.3 directly connects each instrument back to the research questions. Our first research question, “What are the psychometrics of common closed-ended IPV and TDV screeners as implemented in HR programs?” requires testing two IPV and two TDV closed-ended screeners in HR program settings. Our second research question, “How well do open-ended IPV and TDV screening approaches compare to closed-ended screening approaches?” requires testing (and comparing results of) both closed-ended and open-ended screeners. We also will ask participants (or parents of minor participants) for permission to potentially recontact them to develop a qualitative understanding of their experiences (only a small subset of participants will be recontacted).
Table A.2.3 Cross-walk of Research Questions and Instruments
Research Questions | Instruments Used to Answer Research Questions
What are the psychometrics of common closed-ended IPV and TDV screeners as implemented in HR programs? | Instruments 1.1, 1.2, 2.1, and 2.2 (the closed-ended IPV and TDV screeners)
How well do open-ended IPV and TDV screening approaches compare to closed-ended screening approaches? | Instruments 1.1 through 2.3 (all closed- and open-ended screeners), plus the post-screener questions (Attachment C.1)
To inform the selection of the IPV and TDV screener instruments to test for this study, we first conducted a systematic literature review of existing validated IPV and TDV screening tools. Empirically validated tools, defined as those with a published measure of accuracy or validity (e.g., correlation with another known measure) or sensitivity greater than or equal to 50%, were included for further synthesis and review (see Attachment D.1 for a summary of the empirically validated IPV and TDV disclosure tools that resulted from the literature review). Because this search identified no TDV disclosure tools that met our validation criteria, we conducted a second, expanded search to include tools that had been field tested with youth but for which a full assessment of psychometric properties had not been published in the peer-reviewed literature. Finally, we conducted a search for protocols for universal education and open-ended IPV/TDV disclosure opportunities. While we did not identify any universal education and open-ended protocols that had been empirically tested, we did find research indicating that these types of approaches are promising (as summarized in section A.1.1). We consulted with expert panelists and academic partners (see section A.8), as well as federal partners, to identify any validated closed-ended IPV screeners, validated or field-tested closed-ended TDV screeners, or open-ended tools missing from our literature review. From this consultation, we added one missing closed-ended tool to our list and identified two open-ended tools on which to base the open-ended tools to be tested in this study.
After categorizing the existing validated IPV and TDV closed-ended tools, we assessed the extent to which each IPV and TDV tool would be appropriate for use among HR programs and populations. We looked at the focus of each screener (victimization or perpetration), the forms of IPV measured (e.g., emotional, physical, sexual abuse), the population(s) with which it was tested and validated, the settings in which it was validated, and the length of each screener (e.g., number of items). We prioritized IPV and TDV screeners that were short enough to feasibly be used in HR program settings, yet complete enough to cover essential IPV and TDV constructs, and that had been validated among sub-populations similar to HR program sub-populations and tested in similar service delivery-oriented settings. Using these criteria, our review resulted in a short list of standardized tools to vet with our panel of experts. The tools on the IPV screener short list included the Partner Violence Screen (Mills et al., 2006), Universal Violence Prevention Screen (Heron et al., 2003), Datner Measure (Datner et al., 2007), Intimate Justice Scale (Jory, 2004; Whiting et al., 2009), Women’s Experience with Battering (Smith et al., 1994), Psychological Maltreatment of Women Inventory (Tolman, 1999), and Intimate Partner Violence Control Scale (Bledsoe & Sar, 2011). The tools on the TDV screener short list included the Teen Screen for Dating Violence (Emelianchik-Key, 2011), Conflict in Adolescent Dating Relationships Inventory (Wolfe et al., 2001), the Safe Dates Evaluation Tools (Foshee et al., 2005), the Fragile Families Study scale (McLanahan et al., 2003), and the Abuse Assessment Screen (McFarlane et al., 1992). We also developed two sample protocols representing universal education and open-ended IPV and TDV screening approaches for testing in youth- and adult-serving healthy relationship programs. We based these sample protocols on available practice-based guidance (e.g., Chamberlain & Levenson, 2013), including input from practitioner experts.
To move forward with selecting the four closed-ended tools (two IPV and two TDV) to be included in this study, we shared these narrowed lists of standardized tools recommended for consideration with our expert panel, federal partners, and academic partners. We asked the panelists and partners to help us prioritize two standardized tools each for IPV and TDV to test in the adult and youth HR populations, respectively. We also shared our universal education and open-ended IPV and TDV screening approaches with the experts and partners to garner feedback on the appropriateness of the language, questions, and guidance to users.
The expert panelists, federal partners, and academic partners provided feedback on the literature review table and tool recommendations. One of our experts suggested a TDV tool that was not on our original list, the Youth Risk Behavior Survey (Centers for Disease Control and Prevention, 2017), which includes measures of physical and sexual violence. From this expert feedback, we selected the final set of standardized tools to be tested for this current study:
1.1: IPV Screener 1: Intimate Justice Scale (15 items) for physical violence and coercive control. (Instrument List 1.1)
1.2: IPV Screener 2: A combination of the Universal Violence Prevention Screen (5 items), to measure physical violence, and the Women’s Experience with Battering (10 items), to measure coercive control. (Instrument List 1.2)
2.1: TDV Screener 1: A combination of the Youth Risk Behavior Survey (2 items), to measure physical and sexual violence, and the Fragile Families measure (2 items), to measure coercive control. (Instrument List 2.1)
2.2: TDV Screener 2: Conflict in Adolescent Dating Relationships Inventory (25 items) for physical violence and coercive control, with expanded/revised monitoring items. (Instrument List 2.2)
Experts also suggested revisions to improve the open-ended screeners’ flow, accuracy, clarity, and accessibility to HR program participants. The final open-ended IPV and TDV screeners are Instruments 1.3 and 2.3.
Our data collection will utilize a web-based platform (a Voxco-programmed online survey) to collect data from participants in real time for the four closed-ended screeners, as well as for the supplemental module (see Attachment C.1; added at the end of whichever screener is administered third). All screener data will be collected using the technology (e.g., tablets) that participating HR programs are already using for their local evaluation and federally required data collection. The open-ended screeners require face-to-face conversation and therefore are not suitable for self-administration via a tablet or laptop. To save burden and avoid the privacy risks associated with paper-based data collection, however, data resulting from these open-ended screeners will also be entered electronically only. All data will be stored online, which will improve privacy protections (as opposed to paper forms that HR programs would have to scan or mail to RTI).
As described in section A.2.4, the RIViR team has conducted a systematic review of validated and non-validated closed-ended IPV and TDV screeners (Attachment D.1), as well as a literature search for evidence-based IPV and TDV universal education and open-ended screeners. This review, conducted in 2015 and updated in 2016 and again in 2017, underscored key shortcomings in existing research:
Most formally validated IPV/TDV screening tools have been tested in health care settings and only a few have been validated in non-health care delivery settings.
No closed-ended TDV screeners have been formally validated (due in part to the lack of an accepted “gold standard” measure against which to assess their properties).
No available IPV/TDV screening tools have been validated with HR program populations, although some have been validated with populations that include similar sub-populations.
Little empirical information is available on open-ended screeners (in any setting or with any population), and none is available on the use of open-ended screeners in HR programs.
To date, no studies have established the psychometric properties of IPV or TDV screeners among HR program populations nor compared the use of open- and closed-ended screeners in these populations.
Whiting and colleagues (2009) documented one HR program’s use of a previously validated closed-ended IPV screener, the Intimate Justice Scale (Jory, 2004). While the Intimate Justice Scale was used among HR program participants, Whiting and colleagues were not able to establish the psychometric properties of this closed-ended IPV screener.
In addition to reviewing published works, the RIViR project team contacted researchers at Brigham Young University who have compiled local evaluation data from several cohorts of previously funded HR program sites, including collecting information on whether and how IPV screening was conducted and what data resulted (Hawkins and Erickson, unpublished data). This consultation confirmed that prior HR programs (or their local evaluators) have not collected any data that could be used to answer this study’s research questions.
Finally, as noted in section A.1.1, HR programs currently have access to practice-based guidance on IPV and TDV screening, but research-based guidance has not been available.
Some or all of the HR programs included in this study will be small, community-based organizations. To minimize any burden on these organizations resulting from the data collection process, the study team will develop site-specific data collection protocols that work within each site’s staffing workflow and existing program activities, so that this data collection will not impact the organizations’ operations or ability to serve clients.
This data collection request covers the administration of three screeners to HR program participants at three time points; none of the tools is repeated at more than one time point. Testing of more than one screener is necessary to address our first research question (“What are the psychometrics of common closed-ended IPV and TDV screeners as implemented in HR programs?”) and our second research question (“How well do open-ended IPV and TDV screening approaches compare to closed-ended screening approaches?”). As detailed in SSB, Section B.2.4, the analyses to answer each of these research questions require that the same individuals complete multiple screeners.
To reduce participant burden (and to provide findings that are likely more feasible to implement in HR programs), we have selected IPV and TDV screeners that are brief and should take between 10 and 15 minutes each. While some IPV and TDV screening tools have over 100 items, our selected IPV and TDV screeners for testing range from 4 to 22 items. Our IPV and TDV open-ended screeners are also brief, and only include approximately four open-ended questions each.
There are no special circumstances for this data collection.
In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13 and OMB regulations at 5 CFR Part 1320 [60 FR 44978, August 29, 1995]), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on April 12, 2016 in Federal Register Vol. 81, No. 70, pages 21553-21554, and provided a 60-day period for public comment. A copy of this notice is included in Attachment E.1. No substantive comments were received during the 60-day notice period.
RTI also consulted with an expert panel, with expertise in areas including IPV services and program implementation, IPV screening, and HR program implementation, on the design of the study and selection of screeners.
We propose to offer a $5 token of appreciation per participant per screening tool in order to proactively minimize non-response bias. Since our analysis plan requires obtaining data from the same respondent using all three screeners, we propose a $5 bonus for completion of all three screening tools in order to minimize attrition bias. Limited empirical evidence exists to guide precise amounts of tokens of appreciation relative to study type and population (Singer & Ye, 2013); as such, our proposed token of appreciation of $5 is set at the lowest possible level to leverage the well-documented benefits of tokens of appreciation while minimizing the investment of government funds. The maximum total a single respondent will receive in screener completion incentives over the life of the study is $20.
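For clarity, the arithmetic behind that maximum is simply: (3 screeners × $5) + $5 completion bonus = $20 per respondent.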
In reviewing OMB’s guidance on the factors that may justify provision of tokens of appreciation to research participants, we have determined that tokens of appreciation are necessary for this data collection due to the following special considerations: (1) improved coverage of specialized respondents, (2) complex study design, (3) data quality, and (4) reduced survey costs.
A.9.1. Improved Coverage of Specialized Respondents
Sample calculations for the RIViR study suggest that the study must enroll approximately 75-90% (depending on the site) of the available sample in each site in order to accomplish our analytic objectives. The study’s research questions focus on IPV/TDV screening tool performance among the mostly low-income, minority populations served by federally funded HR programs. These populations are both small in absolute number and hard to reach (due to factors associated with low income). If we are unable to secure participation from a significant proportion of our sampling pool, we could be unable to achieve our study aims due to inadequate sample sizes and low resulting power. Research has shown that tokens of appreciation can increase response rates for populations that include racially and socioeconomically diverse participants (similar to the RIViR study sample), including a few studies specifically with low-income and racially diverse populations (James & Bolstein, 1990; Singer & Kulka, 2002) and individuals with lower education levels (Berlin et al., 1992).
A.9.2. Complex Study Design
Our analytic objectives require that participants complete three different screening tools over the course of their study participation. If participants are not motivated to continue participating and completing screening tools, the study will lack the statistical power to address its research questions. This concern is so critical in multi-wave studies that the U.S. Census Bureau now offers tokens of appreciation for several longitudinal panel surveys, including the Survey of Income and Program Participation (SIPP) and the Survey of Program Dynamics, based on findings from its own incentive experiments. For example, a series of multi-wave incentive experiments conducted as part of SIPP that compared $0, $10, $20, and $40 incentives found that response rates increased in proportion to the incentive offered and (of particular relevance for the RIViR project) that base-wave incentives improved participation in later waves of data collection.
Results like these suggest that, without a small token of appreciation to secure ongoing participation, the study would need to recruit and enroll more potential respondents up front in order to achieve sufficient numbers of responses for each wave and screening tool, a strategy that would add unnecessary burden for respondents and grantee staff. In addition, if more participants drop out of the study or fail to show up to complete follow-up screening tools, HR program staff will need to spend more time attempting to locate and follow up with those participants.
A.9.3. Data Quality
The data resulting from the proposed RIViR data collection must be of sufficient quality to address the study research questions. In particular, three aspects of data quality must be ensured in order to support the planned analyses: low missingness, minimal non-response bias, and minimal attrition bias.
Missingness. If we secure participation from an adequate number of respondents among the small available sample (see A.9.1) but participants are not sufficiently motivated to spend the time needed to complete all items on the screening tools, the quality of the data will be compromised by missingness (or item non-response). This could prevent the study from having the statistical power to address its research questions.
Non-response and attrition bias. Non-response and attrition bias could be a serious threat to the RIViR study’s validity and must be proactively addressed. Without a token of appreciation, differential non-response and differential attrition are likely, based on differences in intrinsic motivation to participate (Groves, Singer, & Corning, 2000) and differences in competing obligations (which could be associated with economic strain and therefore also associated with IPV [Cunradi, Caetano, & Schafer, 2002]). Intrinsic motivation is a particular issue in a survey on the topic of IPV experiences, because it could reasonably be related to a prospective respondent’s history of IPV victimization. Minimizing non-response and attrition among less-intrinsically-motivated or more-strained respondent groups is crucial, since their characteristics could be independently related to our analytic outcomes, and offering tokens of appreciation is a strong, empirically tested approach to achieving that.
It is widely recognized that, unless survey non-response occurs at random, low response rates in survey-based research lead to non-response bias (e.g., Rubin, 1976). The most effective way to minimize non-response bias is to design survey field approaches that maximize response rates (Massey & Tourangeau, 2013). According to “leverage-salience theory,” based on a wide body of research on survey incentives, tokens of appreciation can function to minimize non-response bias, because they specifically help to increase participation among those who are less intrinsically motivated to participate in a survey on a particular topic or who have heavier competing obligations (Singer & Ye, 2013). In a study in which participants will be asked questions about IPV victimization, it is reasonable to assume that the strength of sample members’ intrinsic motivation to participate and the extent of their competing obligations could each be independently related to the victimization experiences captured by the IPV screening tools this study will test. Proactively minimizing the potential for differential non-response according to these characteristics (such as between IPV victims and non-victims) is a major validity issue for the RIViR study.
Multiple empirical assessments, including experimental studies, have found that the provision of tokens of appreciation reduces non-response and non-response bias in both interviewer-administered and web-based surveys (such as those proposed for the RIViR study) and that monetary tokens of appreciation are more effective than other tokens or gifts at preventing survey non-response (e.g., Abreu & Winters, 1999; Greenbaum, 2000; Goldenberg, McGrath, & Tan, 2009; Goritz, 2006b; SAMHSA, 2014; Shettle & Mooney, 1999; Singer et al., 1999; Singer & Kulka, 2002). Evidence across studies indicates that tokens of appreciation minimize non-response primarily by decreasing refusals (Singer & Ye, 2013) and that they can help to minimize attrition bias in studies involving multiple waves of data collection (Singer et al., 1998), such as the RIViR study. For example, the National Institute of Child Health and Human Development-funded Healthy Outcomes of Pregnancy Education study, which enrolled pregnant African American and Latina women, found that differential attrition (in this case, the higher likelihood of study dropout among single, less educated, drug and alcohol using, and non-working women compared to their counterparts) was more pronounced in the group that was not given any monetary incentive than in the group that received a modest token of appreciation (El-Khorazaty et al., 2007).
A.9.4. Reduced Survey Costs
The costs associated with initial and follow-up contact attempts in a multi-wave data collection are substantial, and evidence suggests they can be reduced with a small, strategic investment in tokens of appreciation. Brick and colleagues’ (2014) incentive experiment found that a monetary incentive helped to reduce both the number of contact attempts and the total interviewer time required to secure participation, while still maintaining data quality. In an incentive experiment conducted as part of the National Crime Victimization Survey, researchers found that providing a $10 incentive to respondents decreased the cost per contact attempt by 34 percent. Other studies have found similar effects (National Survey on Drug Use and Health, OMB control number 0930-0231; National Survey of Family Growth, Cycle V, OMB control number 0920-0314; Kennet et al., 2005; Duffer et al., 1994).
A.9.5. Past Experience
Past experience on OMB-approved studies led by RTI and others suggests that the provision of tokens of appreciation among similar populations (including low-income and racially diverse populations) is crucial for securing initial and ongoing participation. In the Assets for Independence (AFI) Evaluation conducted by RTI (OMB control number 0970-0414), participants were provided $20 for completing baseline and follow-up hour-long surveys. The study increased the token of appreciation for the follow-up surveys to $50 in an effort to improve initially low response rates, which helped to avoid potential non-response bias. A recent incentive experiment conducted by RTI in a study that required a similar amount of total participation time from respondents, the Residential Energy Consumption Survey (OMB control number 1905-0092), a nationally representative household survey, found that offering a $20 token of appreciation produced a uniform, statistically significant decrease in non-response across study modes and protocols (Murphy, 2016).
The proposed information collection was reviewed and received contingent approval from RTI’s Office of Research Protection on August 16, 2016. Upon receipt of required revisions, the IRB provided formal approval on September 9, 2016. The IRB approval letter appears as Attachment F.1.
We will take the utmost measures to maximize participant privacy, particularly because of the sensitive nature of the questions and to protect the safety of all research participants. Before enrolling participants and administering the screeners, grantee project staff will participate in an in-person training with the RTI project team about participant privacy guidelines and screener administration procedures. RTI staff will serve as liaisons to the grantee projects throughout data collection to provide support and to ensure that data collection protocols are carefully followed.
Efforts will be made to ensure that no one other than the project staff administering the screener and the participant can view or hear responses during screener administration (i.e., the staff will be instructed to administer the verbal screeners in a private room with the door closed). During group administration with youth, we will try to ensure that no other youth or teachers can view survey responses; respondents will be spaced out around the room and privacy screens will be used to protect students’ tablets from the view of surrounding students as needed.
Screener responses will only be accessible to authorized RTI and HR program staff, and identifying information will be kept separate from screener responses by grantee project staff and RTI staff. Each participant will be given a unique identifier prior to data collection; identifying information collected in the locator section will be automatically separated from screener responses when transmitted and will be stored separately at RTI. All electronic data will be transmitted securely using an encrypted protocol (HTTPS) immediately upon completion of each screener and will be stored in RTI’s Enhanced Security Network and on RTI’s secure project shared drive. A summary of responses for each participant to each screener will be uploaded to a secure website for sharing with grantee project staff, to be used for making referral decisions. No personally-identifying data or participant information will be stored on the computers used to collect the data. HR program staff will store completed consent, assent, parent permission forms, and tokens of appreciation receipt forms in a locked filing cabinet in their agency office. They will scan these forms and upload them at least weekly to a secure website hosted by RTI.
Information will not be maintained in a paper or electronic system from which records are actually or directly retrieved by an individual’s personal identifier.
The goals of this study necessitate collecting data via IPV and TDV screeners that ask questions about emotional abuse, physical violence, and/or coercive control. These types of questions are necessary for identifying IPV and TDV, are included in some form in all IPV or TDV screeners, and therefore must be included in this data collection effort in order to assess the psychometric properties of and compare the effectiveness of IPV and TDV screening tools. All of the screeners include such questions.
To minimize the participant burden associated with answering sensitive questions, we chose closed-ended IPV and TDV screening tools (1.1, 1.2, 2.1, and 2.2) that have been previously validated and have fewer questions (15-32) than other IPV and TDV screeners. Based on expert consultant recommendations, the open-ended IPV and TDV screening instruments include less direct and explicit questions about IPV and TDV. Instead, the open-ended screeners provide universal education, in which participants are read information about IPV or TDV and then asked questions such as, “What are your thoughts on the information?” or “Does this sound like your relationship?”
Research indicates that women who have experienced IPV are amenable to and generally supportive of being asked questions about IPV in medical settings (Bacchus et al., 2002; Gielen et al., 2000) and in surveys (Black, Kresnow, Simon, Arias, & Shelley, 2006). Nonetheless, study participants will be notified that the screeners include sensitive questions, and during the informed consent process they will be reassured that their participation is voluntary and that their responses will be kept private to the extent permitted by law. Participants will be told that their decision to participate or not will have no effect on any services they receive from their HR program.
In addition, to assess differential non-response to the study screeners and to determine whether the screeners work differently for different sub-groups of youth and adults, we will collect demographic information on study participants, including sexual orientation and gender identity. Information on sexual orientation and gender identity is critical to accomplishing our analytic goals with regard to assessing coverage of the study population and effectiveness of the screeners being tested. As stated by OMB’s Federal Interagency Working Group on the Measurement of Sexual Orientation and Gender Identity, “At a time when sexual and gender minority (SGM) populations are becoming more visible in social and political life, there remains a lack of data on the characteristics and well-being of these groups. In order to better understand the diverse needs of SGM populations, more representative and better quality data need to be collected” (OMB, 2016).
These forms will be used for data collection for up to 2 years (24 months), depending on how long it takes to reach target data collection numbers. Respondents will be youth or adult participants in OFA-funded HR programs at four HR program sites across the nation. Table A.12.1 provides the annual burden associated with this effort.
Six hundred adult participants will complete the three IPV screeners, the locator section for adults, and the post-screener questions. Generally, HR programs serve low- to lower middle-income men and women (e.g., most participants in the Building Strong Families and Supporting Healthy Marriage demonstrations reported incomes of less than $25,000 per year). We estimate that adult participants will be primarily low income and that their hourly pay rates will stay the same throughout their study participation (up to 3 months). Therefore, the wage estimate for adult participants is based on an annual income of $40,180, which is 200% of the 2016 federal poverty level for a three-person household; this translates to an hourly rate of $19.32.
Six hundred youth will complete the three TDV screeners and the post-screener questions. We have used the federal minimum wage of $7.25 to estimate the hourly wage for youth participants because youth who are employed typically hold minimum-wage jobs. Six hundred parents of youth will complete the contact information form for parents of youth younger than 18. Because HR programs typically target youth in low-income areas, we estimate that the average hourly rate for parents of youth participants will be the same as for adult HR program participants: $19.32 an hour, based on 200% of the federal poverty level for a three-person household.
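The $19.32 hourly figure is consistent with dividing the annual income by a standard 2,080-hour work year (our inference; 40 hours per week for 52 weeks):

\[
\frac{\$40{,}180 \text{ per year}}{52 \text{ weeks} \times 40 \text{ hours/week}} = \frac{\$40{,}180}{2{,}080 \text{ hours}} \approx \$19.32 \text{ per hour}
\]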
Staff from four sites will be involved in recruiting participants, administering surveys, and maintaining data collection protocols and record-keeping. We base our estimate of site staff’s average hourly wage on the U.S. Bureau of Labor Statistics National Compensation Survey, 2010, which indicates that the average hourly wage for “social and community service managers” is $27.86.
Table A.12.1. Estimated Annualized Burden Costs and Total for 2-Year Data Collection
Activity | Total No. of Respondents | Annual No. of Respondents | No. of Responses per Respondent | Average Burden Hours per Response | Total Annual Burden Hours | Hourly Wage Rate | Total Annual Cost
1.1: IPV Screener 1 | 600 | 300 | 1 | .167 | 50 | $19.32 | $967.93
1.2: IPV Screener 2 | 600 | 300 | 1 | .167 | 50 | $19.32 | $967.93
1.3: IPV Screener 3 | 600 | 300 | 1 | .25 | 75 | $19.32 | $1,449.00
2.1: TDV Screener 1 | 600 | 300 | 1 | .167 | 50 | $7.25 | $363.23
2.2: TDV Screener 2 | 600 | 300 | 1 | .167 | 50 | $7.25 | $363.23
2.3: TDV Screener 3 | 600 | 300 | 1 | .25 | 75 | $7.25 | $543.75
Post-screener questions for adults (C.1) | 600 | 300 | 1 | .1 | 30 | $19.32 | $579.60
Post-screener questions for youth (C.1) | 600 | 300 | 1 | .1 | 30 | $7.25 | $217.50
Locator section for adults (C.2) | 600 | 300 | 1 | .1 | 30 | $19.32 | $579.60
Contact information form for parents of youth younger than 18 (C.3) | 600 | 300 | 1 | .1 | 30 | $19.32 | $579.60
Participant recruitment | 600 | 300 | 1 | .1 | 30 | $27.86 | $835.80
Administration of data collection protocol and record-keeping | 600 | 300 | 1 | .167 | 50 | $27.86 | $1,395.79
Total | N/A | N/A | N/A | N/A | 550 | N/A | $8,842.96
N/A = Not applicable.
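Note that the dollar figures in the table reflect unrounded annual burden hours (e.g., 300 respondents × .167 hours = 50.1 hours, displayed as 50), so the displayed hours multiplied by the wage rate may differ from the listed cost by a few cents. The short sketch below (illustrative only; it is not part of the study instruments) reproduces the arithmetic for a few representative rows:

```python
# Illustrative check of the burden-cost arithmetic in Table A.12.1.
# Annual cost = annual respondents x responses per respondent
#               x hours per response x hourly wage.
# Dollar figures use the unrounded hours (e.g., 50.1), while the
# "Total Annual Burden Hours" column displays rounded values.
rows = [
    # (activity, annual respondents, responses, hours per response, wage)
    ("IPV Screener 1", 300, 1, 0.167, 19.32),
    ("IPV Screener 3", 300, 1, 0.25, 19.32),
    ("TDV Screener 1", 300, 1, 0.167, 7.25),
]
for name, n, k, hours_per_response, wage in rows:
    annual_hours = n * k * hours_per_response
    cost = annual_hours * wage
    print(f"{name}: {annual_hours:.1f} hours, ${cost:,.2f}")
# IPV Screener 1: 50.1 hours, $967.93
# IPV Screener 3: 75.0 hours, $1,449.00
# TDV Screener 1: 50.1 hours, $363.23
```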
ACF does not anticipate additional costs to respondents other than time spent (captured in A.12.1, above).
The estimated cost to the federal government for the proposed data collection and analysis is $860,000. This figure includes labor hours and other direct costs (travel, photocopying, mailing, etc.) for both years of data collection. The annual cost is $430,000.
This is a new information collection.
The table below provides a timeline based on OMB approval in February 2017 with data collection beginning upon approval. Data collection will continue until sites have enrolled and administered screeners to sufficient numbers of respondents to provide adequate power for the planned analyses (see Supporting Statement B), but no longer than two years (through February 2019 assuming approval is received in February 2017). All dates are dependent on OMB approval.
Activities | Due Date
Data collection by HR program staff | January through December 2019
Analysis | January through September 2020
Final brief | December 2020
The OMB expiration date will be displayed on all necessary materials and documents.
There are no exceptions to the certification.
1 We define the term “universal education” as information on IPV or TDV that is proactively provided to all program participants. “Universal education” typically includes information about the warning signs of IPV and TDV, how to access help, and phone numbers for local and national hotlines.
2 These tools will be referred to throughout this OMB application as “screeners.”