OMB: 0990-0458


Supporting Statement for

Domestic Violence Housing First Demonstration Evaluation

Part B


B. Collection of Information Employing Statistical Methods

1. Respondent Universe and Sampling Methods

The Domestic Violence Housing First (DVHF) demonstration project is funded by the Bill & Melinda Gates Foundation and includes domestic violence agencies in the state of Washington. The DVHF demonstration project began in 2015 and will conclude at the end of 2019. The federal government is conducting an evaluation to determine the effectiveness of the DVHF demonstration. If data are not collected in a timely manner, we will lose this unprecedented opportunity to examine whether and how Domestic Violence Housing First affects the safety, housing stability, and well-being of survivors and their children.

All eligible clients receiving services at any of the four domestic violence agencies participating in the DVHF evaluation research constitute the universe for the study, and all will be invited to participate. We will recruit 320 participants – an anticipated 80 from each of the four agencies – over the course of 15 months of participant recruitment. Given prior experience in similar studies, we expect that at least 90% of clients who are invited will agree to participate. It will be necessary to include the entire universe of eligible clients in order to achieve a sufficient sample size for the planned analyses and also to ensure variability in the services received by study participants.

The sample of 320 will provide greater than 80% power at 2-tailed p < .05 for a minimum detectable difference of d=.25 SD (a small effect size) on outcome trajectories (both linear and quadratic) across four measurement points, assuming approximately 50% of the sample receive mobile advocacy and flexible financial assistance. (See section B2 for more in-depth explanation of power estimates.)

2. Procedures for the Collection of Information

Under the guidance of the study’s two Project Coordinators, advocates from each of the four participating domestic violence agencies will invite eligible clients to hear more about participating in this research study. Eligibility criteria include (1) being a recent survivor of intimate partner violence, (2) being homeless or at immediate risk of becoming homeless, (3) having entered services within the prior three weeks, and (4) speaking English, Spanish, or another language into which the interviews have been translated or for which an interviewer is available. Careful procedures will be followed, under the guidance of the Project Coordinators, to ensure that all eligible clients are offered the opportunity to participate in the study. For example, each Project Coordinator will contact each of their two agencies at least every other day and will ask their Point of Contact (POC) about new clients of the agency who meet eligibility requirements for the study. The Project Coordinator will ascertain with the POC whether the client has been asked to participate in the study and will make every effort to ensure that the client is approached about the study within 10 days of receiving services. The 10-day time frame has been chosen to ensure that clients are not approached about the research study while they are in immediate crisis; clients will remain eligible for study participation up to 21 days into their receipt of services from the agency.

Once a client agrees to hear more about the study, the Project Coordinator or another member of the research team will contact them, confirm that they are eligible for participation, and provide detailed information about the study and their rights as a research participant. The first interview will then be scheduled at a location that is private and convenient for the survivor. Initial interviews will be conducted in person. Subsequent interviews will be conducted in person unless the participant moves out of the area or prefers to be interviewed by telephone.

Interviews. Interviews will be conducted by either a Project Coordinator or another highly trained and supervised member of the research team. Interviewers will receive intensive training in safe, sensitive, careful interviewing of IPV survivors, based on trainings developed and used by the principal investigator in numerous prior studies. Interviewers will demonstrate competence before conducting interviews with research participants. Ongoing supervision will ensure consistency and attention to detail. All interviews will be scheduled at the convenience of the participant and will take place at a safe and confidential location of the survivor’s choosing (e.g., a private room in the agency, or their home if safe). Using established and trusted safety procedures (see Sullivan & Cain, 2004; Sullivan et al., 1996), project staff will contact the participant one day prior to the interview to confirm the appointment and answer any final questions. At the start of the first interview, the interviewer will review the purpose of the study, describe what the interview will involve, inform the survivor of their rights as a research participant, and answer any questions they may have. After the participant gives consent, the interviewer will proceed with the interview. If at any point the participant becomes distressed by the process, the interviewer will stop and respond with empathy. In the unlikely event (based on the principal investigator’s prior experience) that a survivor becomes highly distressed, they will be immediately referred to a mental health counselor at the recruitment site. Each participating agency has mental health counselors and clinical social workers with extensive experience providing services to multi-stressed individuals. If immediate attention is needed, the interviewer will contact the agency together with the participant and provide transportation to the agency. At the end of the interview, the interviewer will answer any questions, thank the participant, compensate them for their time, and discuss the logistics of the next interview.

All interview data will be captured electronically, directly onto laptop computers, using Qualtrics software. Electronic data capture has been found to be superior to paper surveys, as there are fewer data entry errors and the process is faster and less expensive (Lane, Heddle, Arnold, & Walker, 2006). Data are encrypted and downloaded directly onto a secure, password-protected server at Michigan State University, allowing data management and analysis to occur promptly and securely.

In-depth interviews will be conducted with all study participants, either in person or by phone, every six months over 24 months. Each interview lasts approximately 1.25 hours. Basic demographic information will be captured at the first interview, along with historical and baseline information about housing stability; economic stability; physical, emotional, and economic abuse; baseline measures of quality of life, mental health symptomatology, and substance abuse; and parents’ reports of children’s academic attendance and achievement as well as behavioral problems and socio-emotional skills. The second interview will include repeated administration of the baseline measures plus information about services received; subsequent interviews will include repeated administration of the baseline measures. The six-month interval was chosen to be long enough for change to occur but short enough that participants can recall events accurately. If data were collected less frequently, we would lose valuable information about event timing and causality.

Statistical Power. Estimates of statistical power were computed for the longitudinal multilevel analyses that will be used to model outcome trajectories over four time points. Repeated assessments of each participant will be modeled at level 1 of the multilevel model (MLM); both linear and quadratic slope terms will be included if needed to reflect acceleration or slowing of change over time. Type of service received (mobile advocacy and flexible funding vs. standard services) will be added to the model at level 2, allowing tests of the significance of trajectory differences between the two service types (i.e., by estimating Service Type × Slope interactions). These power estimates take into account the use of propensity score covariates, calculated to account for possible pre-existing differences related to type of service received, assuming that they account for as much as 30% of the variance in the outcome trajectory.
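As a rough illustration only (not part of the approved analysis protocol), a growth model of this form could be fit with standard mixed-model software along the following lines. The data frame df and the variable names (outcome, months, service, pscore, participant_id) are hypothetical placeholders for a long-format data set with one row per participant per wave.

    import statsmodels.formula.api as smf

    # Level 1: repeated assessments nested within participants (random intercept and linear slope).
    # Level 2: service type (mobile advocacy/flexible funding vs. standard services) and a
    #          propensity-score covariate predicting the trajectory terms.
    model = smf.mixedlm(
        "outcome ~ months + I(months**2) + service + service:months + service:I(months**2) + pscore",
        data=df,                      # hypothetical long-format data set
        groups=df["participant_id"],  # grouping variable: participant
        re_formula="~months",         # random intercept and random linear slope
    )
    result = model.fit(reml=True)
    print(result.summary())           # the Service Type x Slope terms test trajectory differences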

Hypothesis 1. (Survivors receiving mobile advocacy and flexible financial assistance will show greater improvement in housing stability, economic stability, safety, quality of life, and mental health and substance abuse compared to survivors receiving “standard services” that either do not include mobile advocacy or flexible funding, or include them only at minimal levels.) The sample of 320 will provide greater than 80% power at 2-tailed p < .05 for a minimum detectable difference of d=.25 SD (a small effect size) on outcome trajectories (both linear and quadratic) across time (Spybrook et al., 2011), assuming approximately 50% of the sample receive mobile advocacy and flexible financial assistance. This sample size will provide adequate power even if the proportion receiving mobile advocacy and flexible financial assistance is as low as 30%, with the minimum detectable difference in slopes rising to d=.38 SD, still a small-to-medium effect size. The anticipated minimum detectable difference in slopes of d=.25 SD translates into the following differences in raw-score metric, based on modal standard deviations from published studies of similar populations, where available: 8.50 points on the Community Composite Abuse Scale; 0.53 points on the Housing Instability Index; 1.50 points on the PHQ-9 depression scale; 1.15 points on the GAD-7 anxiety scale; 0.30 points on Quality of Life; 0.25 points on Social Support.
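For reference, each raw-score value above is simply the standardized minimum detectable difference multiplied by the measure’s standard deviation (raw-score MDD = d × SD). For example, the PHQ-9 figure implies a modal published standard deviation of approximately 6 points (0.25 × 6.0 = 1.50), and the GAD-7 figure implies a standard deviation of approximately 4.6 points (0.25 × 4.6 = 1.15).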

Hypothesis 2. (As parents’ housing stability and well-being increase, so too will children’s outcomes. Specifically, children will demonstrate positive changes over time in school attendance and achievement, behavioral problems, and social-emotional skills.) For child outcomes, the minimum detectable difference in slopes will be larger than for analyses involving parents (d=.43, assuming that 50% of their parents receive mobile advocacy and flexible financial assistance) due to the anticipated smaller sample size; this translates into a raw-score difference in slopes of 1.12 points on the Strengths and Difficulties total score.

Power will be lower for tests of whether child outcomes are mediated by parent outcomes, both because these tests involve indirect effects and because the sample of survivors with children will be somewhat smaller than the total of 320. Assuming that the standardized direct effects comprising the indirect effect (i.e., service type -> parent outcome and parent outcome -> child outcome) are both at least .21 and that the sample of participants with children is at least 150, power will exceed 80% to detect these mediated effects.
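By way of illustration only, under the product-of-coefficients framing of mediation, the standardized indirect effect is the product of the two direct paths; with both paths at the assumed minimum of .21, the indirect effect is .21 × .21 ≈ .044, and the 80% power figure above refers to detecting an indirect effect of at least that size with a subsample of approximately 150 participants with children.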

3. Methods to Maximize Response Rates and Deal with Nonresponse

To maximize response rates at each time point, we will use procedures similar to those that resulted in a 94% retention rate over a two-year follow-up period in the principal investigator’s prior studies. The first phase of the retention process consists of “setting the stage” by promoting trust with participants, as well as implementing reminders for future interviews, providing a phone line for participants to call or text if necessary, and clarifying compensation for participation. The second phase consists of implementing proactive and creative retention strategies (e.g., visiting participants at home). The final phase involves using social network and community-oriented strategies to contact participants. Participants will be contacted every 3 months to ascertain whether their contact information has changed or is expected to change, and we will ask for contact information for anyone in their lives who is likely to know how to find them over time, as well as permission to contact these individuals if necessary. We will also offer phone minutes or phones with paid minutes to participants to increase their ability to stay in touch with the research team. All retention strategies are designed to ensure participants’ safety and confidentiality.

Missing data will be minimized through the use of proven methods of participant retention and careful, face-to-face interviewing. In addition, one of the advantages of the mixed-effects analytic approach we will use is the ability to retain in the analysis all individuals with person-level data, including those with missing or mistimed interviews. It should be possible to include in the analyses all individuals who complete initial interviews. Pattern mixture modeling (Little, 2009) will be used to determine whether missing data affect study conclusions or are “ignorable” (i.e., conditionally missing at random). Ignorable missing data will be estimated using expectation maximization and multiple imputation procedures appropriate for longitudinal data (Enders, 2010). Sensitivity analyses will be used to examine the possible impact on study conclusions of any missing data found to be non-ignorable (Daniels & Hogan, 2008).
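As a rough illustration only, multiple imputation by chained equations could be carried out along the following lines; the data frame df, the placeholder OLS within-imputation model, and the variable names are hypothetical (the actual analysis would use the longitudinal growth model described above), and estimates are pooled across imputations using Rubin’s rules.

    import statsmodels.api as sm
    from statsmodels.imputation.mice import MICE, MICEData

    imp = MICEData(df)                                       # chained-equations imputation of the analysis variables
    mice = MICE("outcome ~ months + service", sm.OLS, imp)   # placeholder within-imputation model
    results = mice.fit(n_burnin=10, n_imputations=20)        # 10 burn-in cycles, 20 imputed data sets
    print(results.summary())                                 # estimates pooled across imputations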

4. Tests of Procedures or Methods to be Undertaken

Interview protocols have been programmed into Qualtrics, and pilot testing of the interviews is currently underway with a small (<10) sample. The pilot tests focus on the timing of the interviews, debugging the Qualtrics programming, and finalizing section transitions. Most measures have been used previously by the investigators and are expected to require only minor changes, if any.

5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

Individuals who have been consulted on design and analysis issues include the following:


Lisa Goodman, PhD

Professor, Counseling, Developmental & Educational Psychology Department

Campion Hall, Room 310

Boston College

140 Commonwealth Avenue

Chestnut Hill, MA 02467

Lisa.goodman@bc.edu

617-552-1725


Rubén Parra-Cardona, PhD

Associate Director, MSU Research Consortium on Gender-based Violence

Associate Professor, Human Development & Family Studies

3D Human Ecology Building

Michigan State University

E. Lansing, MI 48824

parracar@hdfs.msu.edu

517-432-2269


Individuals who designed the data collection:

Cris Sullivan, Ph.D.

Professor, Department of Psychology

Psychology Building

Michigan State University

East Lansing, MI 48824


Deborah Bybee, Ph.D.

Professor, Department of Psychology

Psychology Building

Michigan State University

East Lansing, MI 48824

bybee@msu.edu


Individual responsible for data collection:


Cris Sullivan, Ph.D.

Professor, Department of Psychology

Psychology Building

Michigan State University

East Lansing, MI 48824


Individual responsible for analyzing the data:

Deborah Bybee, Ph.D.

Professor, Department of Psychology

Psychology Building

Michigan State University

East Lansing, MI 48824

bybee@msu.edu









