
Evaluation of Programs to Provide Services to Persons Who Are Homeless with Mental and /or Substance Use Disorders

OMB: 0930-0339


The National Evaluation of the Substance Abuse and Mental Health Services Administration's (SAMHSA's) Homeless Programs

Supporting Statement


A. JUSTIFICATION


1. Circumstances of Information Collection


The Substance Abuse and Mental Health Services Administration's (SAMHSA) Center for Mental Health Services (CMHS) and Center for Substance Abuse Treatment (CSAT) are requesting approval from the Office of Management and Budget (OMB) for data collection activities for SAMHSA's Evaluation of Programs to Provide Services to Persons who are Homeless with Mental and/or Substance Use Disorders. This program is scheduled through September 2016 and will conduct a cross-program evaluation of the Projects for Assistance in Transition from Homelessness (PATH), Services in Supportive Housing (SSH), Grants for the Benefit of Homeless Individuals (GBHI), and Cooperative Agreements to Benefit Homeless Individuals (CABHI) grantee programs. The data collection activities described in this package include conducting telephone interviews, site visits with guided interviews, and web-based assessments:


  • Project Director Telephone Interview

  • Site Visit Guides

  • Evidence-Based Practice (EBP) Self-Assessment, Parts 1 & 2

  • Permanent Supportive Housing (PSH) Self-Assessment


SAMHSA's Programs to Provide Services to Persons who are Homeless with Mental and/or Substance Use Disorders (hereafter referred to as the Homeless Programs for convenience) are authorized under Sections 506 (GBHI and CABHI), 520A (SSH), and 521 (PATH) of the Public Health Service Act, as amended. The programs also address Healthy People 2020 Objectives: Mental Health and Mental Disorders: Treatment Expansion (Focus Areas: MHMD-8, 9, 10 and 12) and Substance Abuse: Screening and Treatment (Focus Area: SA-8). The Homeless Programs also support SAMHSA's Strategic Initiatives of Data, Outcomes and Quality, and Recovery Support.


Evaluation of the PATH program helps meet the PATH program reporting requirements under Section 528 of the Stewart B. McKinney Homeless Assistance Amendments Act of 1990, which requires a tri-annual report detailing the purpose of expended funds.


Background

Homelessness affects more than 3.5 million people in the United States (National Law Center on Homelessness & Poverty, 2009) and about 38% of those homeless are alcohol dependent and 26% abuse other drugs (Burt et al., 1999; National Coalition for the Homeless, 2009). In a general homeless population, about 32% of men and 36% of women are estimated to have co-occurring mental and addictive disorders (North, Eyrich, Pollio, & Spitznagel, 2004). Overall, in a national sample, about three-quarters (74%) of people admitted to shelters reported any alcohol, drug, or mental health problem in the year before shelter admission (Burt et al., 1999). The literature is replete with evidence suggesting that homelessness, substance use, and mental illness are closely associated and that the prevalence rates for the latter two problems are high among homeless individuals (Hiday, Swartz, Swanson, Borum, & Wagner, 1999; Mallett, Rosenthal, & Keys, 2005; Shelton, Taylor, Bonner, & Bree, 2009; Vangeest & Johnson, 2002).


The issues of substance use, mental, and co-occurring disorders among people who are homeless or at risk of homelessness can be effectively addressed through behavioral health treatment and housing interventions. The effectiveness of substance use treatment (e.g., Modified Therapeutic Communities, Motivational Enhancement Therapy, Service Outreach and Recovery) in producing abstinence from substance use and other positive outcomes, such as employment stability, treatment adherence, and reduced unprotected sex, has been well established (Ball et al., 2007; Borsari & Carey, 2000; Brown & Miller, 1993; Conrad et al., 1997; Drake, Yovetich, Bebout, Harris, & McHugo, 1997; Kertesz, Crouch, Milby, Cusimano, & Schumacher, 2009; Miller, Benefield, & Tonigan, 1993; Project MATCH Research Group, 1997; Rosenblum, Magura, Kayman, & Fong, 2005; Stephens, Roffman, & Curtin, 2000), as have the effects of integrated treatment for co-occurring disorders (e.g., Integrated Dual Disorders Treatment) on substance abuse, mental health, hospitalization, violence, and homelessness (Drake, McHugo, & Noordsy, 1993; Mueser, Drake, & Miles, 1997).


Housing interventions are most effective when combined with other services (Caton et al., 2007; Nelson, Aubry, & Lafrance, 2007); however, it is unclear which combination of housing models, services, and treatment yields the most robust outcomes with respect to housing stability, substance use, psychiatric symptomatology, employment, and other important outcomes. In addition to combining treatment and housing strategies, various common structural characteristics of services, systems, and program organization have been found in effective programs. In their review, Cheng and Kelly (2008) describe structural characteristics generally found in effective programs: interagency coalitions; interagency service delivery teams; interagency management information systems and client tracking systems; interagency agreements that formalize collaborative relationships; interagency applications for funds; uniform application, eligibility criteria, and intake assessments; and co-location of services.


Although the research literature on homelessness and its amelioration among individuals with substance abuse, mental illness, and co-occurring disorders is extensive, few studies have included non-HUD–funded programs to better describe prevalence of substance abuse, mental illness, and co-occurring problems; evaluation of subgroups of homeless individuals within a single study (with similar definitions, measures, and procedures); information on the needs of subpopulations across the continuum of homelessness; implementation and effectiveness of EBPs delivered specifically in homeless populations; fidelity of treatment and housing models implemented; cost-effectiveness in complex sites that employ multiservice interventions; and the value that comprehensive initiatives, such as those implemented by Homeless Program grantees, add to the overall treatment systems. There is also a dearth of empirical studies that look at performance measurement of homeless programs, benchmarking, and efficiency measures. Finally, there are few multisite studies of the sustainability of programs after cessation of federal funding and factors associated with sustainability.


Overview of SAMHSA’s Homeless Programs

Recognizing both the effectiveness of treatment and the societal costs incurred when individuals in need fail to receive treatment services, Congress funded SAMHSA's CMHS and CSAT to establish four programs: SSH, PATH, GBHI, and CABHI. Table 1 provides a comparison of key characteristics for each program.

Table 1. SAMHSA Homeless Programs and Evaluation Cohorts of Focus


GBHI
  • Number of Grantees: 216 grants awarded since 2001
  • Cohorts (Number of Grantees): Pre-2009 (168), 2009 (25), 2010 (23)*
  • SAMHSA Center: CSAT
  • Grant Type: Services Grant
  • Program Purpose: Expand & strengthen treatment and other services, including linkage to housing, for individuals and families who are homeless, chronically homeless, or at risk for homelessness & have a substance use disorder, mental disorder, or co-occurring disorders
  • EBPs Required to Implement: At least one EBP
  • Services Provided: Direct treatment for substance use and mental disorders; outreach; case management or other linkage strategies; and wraparound and recovery support services
  • Award Amount & Length: Up to $350,000 per year for up to 5 years

SSH
  • Number of Grantees: 62 grants awarded since 2007
  • Cohorts (Number of Grantees): 2007-2008 (14), 2009 (43), 2010 (5)*
  • SAMHSA Center: CMHS
  • Grant Type: Services Grant
  • Program Purpose: Help prevent or reduce chronic homelessness by funding services for individuals & families experiencing chronic homelessness and mental or co-occurring substance use disorders, in coordination with PSH programs
  • EBPs Required to Implement: At least one EBP
  • Services Provided: Mental health and substance abuse treatment; support services, including outreach, case management, and assistance in obtaining benefits; and other wraparound services
  • Award Amount & Length: Up to $400,000 per year for up to 5 years

CABHI
  • Number of Grantees: 31 grants awarded since 2011
  • Cohorts (Number of Grantees): 2011 (23), 2012 (8)*
  • SAMHSA Center: CSAT & CMHS
  • Grant Type: Cooperative Agreement
  • Program Purpose: Develop or expand the community infrastructure that integrates treatment & services for mental & substance use disorders & permanent housing for persons who are chronically homeless; enroll clients in Medicaid and other mainstream programs; & provide direct supports & treatment to those not Medicaid eligible
  • EBPs Required to Implement: At least one EBP
  • Services Provided: Behavioral health, housing support, and other recovery-oriented services; coordination of housing and support services; engagement/enrollment in Medicaid and other mainstream benefits
  • Award Amount & Length: Up to $500,000 per year for up to 3 years

PATH
  • Number of Grantees: 56 awarded annually to US States and Territories
  • Cohorts (Number of Grantees): Annually since 1990; 2010-2012 (56 per year)*
  • SAMHSA Center: CMHS
  • Grant Type: Formula Grant
  • Program Purpose: Assist individuals with serious mental illness (SMI) or co-occurring SMI and substance use disorders who are homeless or at risk of homelessness to obtain treatment, other supports, and linkage to permanent housing
  • EBPs Required to Implement: Not required to implement EBPs
  • Services Provided: Outreach; screening and diagnostic services; habilitation and rehabilitation services; mental health and alcohol or drug treatment services; case management; and other wraparound services
  • Award Amount & Length: For FY2012, ranges from $50,000 to $9 million

* Cohorts of focus marked above are included in the primary-level data collection activities for SAMHSA's Evaluation of Programs to Provide Services to Persons who are Homeless with Mental and/or Substance Use Disorders.

National Evaluation of SAMHSA’s Homeless Programs

The evaluation contractor has been tasked with conducting a cross-program, multi-site evaluation of SAMHSA's four Homeless Programs (GBHI, SSH, CABHI, and PATH) over a five-year period. The contractor previously led a cross-site evaluation of the GBHI program and a separate cross-site evaluation of the SSH program. In the course of both evaluations (during the second year of the GBHI evaluation and the first year of the SSH evaluation), SAMHSA decided to combine the two evaluation efforts and add the CABHI and PATH programs in order to identify commonalities across the four programs that may be used to compare the effectiveness of the grant programs and of models of service delivery.


Currently, SAMHSA's CMHS and CSAT monitor the performance of the Homeless Programs through the Government Performance and Results Act (GPRA) (OMB No. 0930-0208) and the National Outcomes Measures (NOMS) (OMB No. 0930-0285), which are both client-level assessments administered by grantees. Although the GPRA and NOMS data are sufficient for monitoring client outcomes, they are not sufficient for understanding the services and context in which each grantee operates.


2. Purpose and Use of Information

All of the SAMHSA Homeless Programs aim to reduce homelessness, increase housing stability, ameliorate substance use, and stabilize mental health functioning for populations with behavioral health needs. However, the programs differ in targeted subpopulations, approaches to providing services, specific services provided, and the settings in which services are provided. The Homeless Programs cross-program evaluation broadly aims to identify commonalities across the Homeless Programs' service delivery by examining which service models are delivered, with what outcomes, for which populations, and with what resulting comparative effectiveness and cost effectiveness. To compare programs, the evaluation will identify service models based on service approach (e.g., direct vs. referral), services (type of service and adherence to practice), housing types and models, types of partnerships, and factors leading to program sustainability. The resulting models will facilitate interpretation of client and program-level outcomes and the comparative effectiveness and cost-effectiveness evaluations. The evaluation questions (listed in Table 2 below) fall within three interrelated evaluation components: structure/process, outcome, and economic.


Evaluation Components

Structure and Process Component. Structure encompasses the resources available in a healthcare delivery system, representing the capacity to deliver quality care, but not the care itself. Process represents what is done with the client. The structure and process component of the evaluation, in part, defines grantee characteristics, which allows the overall evaluation to compare and contrast grantee programs. It describes implementation processes while identifying barriers and facilitators faced by grantee projects. This evaluation component also identifies the housing models and services included in grantee projects and the factors that shaped grantees' choices in selecting them. Data derived from these structure/process measures will be used to describe the Homeless Programs; to explain outcomes; in comparative effectiveness and cost-effectiveness analyses; and to aid in the interpretation of findings.


Outcome Component. The outcome component of the evaluation will focus on the effect of the Homeless Programs on client outcomes, accounting for grantee and project characteristics, service systems, and community contexts. The purpose of the outcome evaluation is to identify and measure post-project-participation findings across the broad array of outcomes expected to be influenced by the range of services provided by Homeless Programs grantees, either directly or through referral. Client-level data will be extracted from GPRA (OMB No. 0930-0208) and NOMS (OMB No. 0930-0285) data submitted by grantees to SAMHSA.

Both GPRA and NOMS are collected at client intake (baseline) and 6-month follow-up, providing longitudinal data that will be central to outcome analyses.


Without a comparison group, the outcome evaluation is not an impact study and does not intend to draw causal inferences or produce national estimates; instead, the analysis will help identify associations and possible effects which can be compared to findings in the existing literature. We will use analytic techniques to provide insights into factors associated with greater improvement in outcomes and to control for extraneous putative factors. First, as indicated by the reviewers, we will make use of the GPRA and NOMS longitudinal collection for pre-post comparisons (that is, comparing each participant to him or herself and indicating baseline to 6-month change). Second, in a retrospective comparison design, we have developed comparisons of the cohorts of focus for the client-level data (2009-2012) to prior SAMHSA homeless cohorts (2001-2008) to understand trends in the data over time. These comparisons will in part inform our interpretations of client outcomes within the context of changes in SAMHSA's Homeless Portfolio requirements and account for potential differences over time in the portfolios' performance with regard to NOMS and GPRA client outcomes. Third, we will explore the possibility of conducting propensity score matching with GPRA and NOMS data from CMHS and CSAT grant programs other than the Homeless Programs. Such programs serve individuals similar to those in the Homeless Programs (those with co-occurring mental and substance use disorders requiring treatment and wraparound services), but the projects themselves are not funded or designed to explicitly target or provide services to reduce homelessness or increase housing stability. Rather, these projects may target criminal justice involvement or HIV prevention. These SAMHSA programs also collect GPRA and NOMS data; therefore, their data contain the same elements as those for the Homeless Program grantees, with the same data collection procedures and time points (baseline and 6-month follow-up interviews). Additionally, all SAMHSA programs produce brief project descriptions of services and some population characteristics, which will be used to clarify similarities and differences in projects and overall program portfolios. Nonetheless, extreme care would be given to ensuring that individuals from these programs are meaningfully similar in terms of individual characteristics and the nature of the programs serving them. Depending on the viability of such samples, we would proceed with caution and would likely apply quasi-experimental methods such as propensity score matching and conduct an array of sensitivity analyses.


Such comparison data may provide utility in several ways. First, comparisons with similar individuals from other programs will provide context for the changes in the pre-post outcomes we will estimate in our population. We will be able to assess the likelihood of other biases, such as regression to the mean, secular trends, and cyclical outcome patterns. Although we may not be able to draw rigorous statistical conclusions about point estimates, such evidence is important in building a case for the validity of our study results. For example, it would be important context for our study results if we found equally large improvements in homelessness for both our study population and other, unrelated program populations. Second, we may discover that certain subgroups from our Homeless Programs and from comparison programs support a more formal, rigorous statistical analysis. Such an analysis would likewise rely on propensity score or other quasi-experimental methods. Also, it would focus on achieving internal validity for specific service/program models and populations rather than representing our primary evaluation results.


In contrast to these data from similar SAMHSA programs, we have considered using data from other sources but do not think that doing so would be viable or useful. We would determine the exact matching method following preliminary analyses. We have an agreement in place with SAMHSA to obtain GPRA and NOMS data from other grantee programs, if desired. In any event, we will stop short of concluding that the Homeless Programs caused any observed improvements in outcomes.
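
To make the matching step concrete, the sketch below shows one conventional way such nearest-neighbor propensity score matching could be implemented. It is an illustration only; the column names (treated, pscore, and the baseline covariates) are hypothetical stand-ins, not actual GPRA or NOMS data elements.

```python
# Illustrative sketch only: nearest-neighbor propensity score matching of
# Homeless Programs clients (treated=1) to clients from other SAMHSA grant
# programs (treated=0). All column names are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def match_comparison_group(df: pd.DataFrame, covariates: list) -> pd.DataFrame:
    # 1. Estimate the propensity score: P(treated | baseline covariates).
    model = LogisticRegression(max_iter=1000)
    model.fit(df[covariates], df["treated"])
    df = df.assign(pscore=model.predict_proba(df[covariates])[:, 1])

    treated = df[df["treated"] == 1]
    control = df[df["treated"] == 0]

    # 2. Match each treated client to the nearest comparison client
    #    on the estimated score.
    nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
    dist, idx = nn.kneighbors(treated[["pscore"]])

    # 3. Discard poor matches using an illustrative caliper of 0.05 on the
    #    probability scale, then return the matched sample for pre-post
    #    outcome comparisons and sensitivity analyses.
    keep = dist.ravel() <= 0.05
    matched_controls = control.iloc[idx.ravel()[keep]]
    return pd.concat([treated[keep], matched_controls])
```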


Economic Component. The economic component of the evaluation connects significantly to all other aspects of the evaluation by incorporating results from the structure/process and outcome components. The economic evaluation focuses on obtaining cost metrics, which include grantee costs incurred during the fiscal year in which the site visit falls (e.g., FY 2013, 2014, 2015), labor hours in the past month, partner funding for services provided to clients in the grantee project, and estimates of staff labor in a typical week. These data allow for comparison across the grantee programs, measuring costs and cost effectiveness and determining the factors that affect them.


Combined, the three evaluation components will provide SAMHSA with the information it needs to understand critical aspects of the Homeless Programs: how the programs are implemented in various contexts, the effects on targeted outcomes, and the associated costs. They will also provide information on the comparative effectiveness (e.g., how implementation of different types or combinations of services is associated with improved outcomes) and cost effectiveness (e.g., how costs are associated with improved outcomes) of the various approaches.


We use the term “comparative effectiveness” in a broad sense. By comparative effectiveness, we do not mean that we will conduct controlled studies of competing interventions to determine which has the greatest impact on outcomes. Rather, we will use the natural variation in program implementation among grantees and explore which programs or service models are associated with the greatest improvement in outcomes. It is important to note that we will apply increased rigor when analyzing whether outcomes vary by program model, service, or component. Although we do not have a “pure” placebo group of individuals, randomized controlled trials for combinatorial treatments do not always include a placebo. Rather, they focus on the relative effects of alternative treatments and their interactions. Like those studies, the programs and services we are evaluating have a substantial evidence base (when implemented appropriately). From this perspective, our analyses will focus on the relative pre-post changes among a variety of service combinations. In this way, we eliminate some of the bias from permanent unobserved individual heterogeneity and from natural patterns like regression to the mean. Also, programs vary in their models and services in ways that are not necessarily correlated with population characteristics. Although not as rigorous as randomization, similarities and differences across groups of programs can allow us to isolate certain model components or services for analysis. By carefully assessing stage of implementation and level of fidelity to practice and service components through the proposed data collection, we will also be able to conduct analyses for clusters of practices to assess the relationship of low, medium, and high practice or service implementation adherence to client outcomes. It is also possible to apply instrumental variables, propensity score matching, and other quasi-experimental methods to the comparative effectiveness analysis to further test for and reduce bias due to any confounding from correlation between individual and program characteristics.
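
A minimal sketch of the multilevel (HLM-style) pre-post specification this implies appears below, assuming clients nested within grantee projects, a baseline-adjusted 6-month outcome, and grantee-level service-model and fidelity indicators; every variable name is hypothetical rather than an actual GPRA/NOMS field.

```python
# Illustrative sketch only: a mixed-effects model of baseline-to-6-month
# change with a random intercept per grantee project. Variable names are
# hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("client_outcomes.csv")  # hypothetical analysis file

# Outcome at 6 months, adjusted for its baseline value (a standard pre-post
# specification); 'service_model' is a typology category derived from the
# PD Interview, and 'fidelity' is the low/medium/high implementation-
# adherence rating described above.
model = smf.mixedlm(
    "nights_housed_6mo ~ nights_housed_baseline + C(service_model) + C(fidelity)",
    data=df,
    groups=df["grantee_id"],  # random intercept for each grantee project
)
print(model.fit().summary())
```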


In addition to filling SAMHSA's needs for specific information on the Homeless Programs, the evaluation will also make a broader contribution to the literature and the behavioral health services field. The primary sources of data for this evaluation are the Project Director Telephone Interview (referred to as the PD Interview), Site Visit Guides, Evidence-Based Practice (EBP) Self-Assessment Parts 1 and 2, and Permanent Supportive Housing (PSH) Self-Assessment. The information collected by each directly contributes to each evaluation component described above and to specific evaluation questions (described in detail below). Table 2 lists selected analytic goals, the evaluation questions, and the corresponding data collection tools.

With regard to our plans for broadly comparing programs or service models on their associations with outcomes, see in particular EQ3 and EQ4.


Table 2. Analytic Goals, Evaluation Questions and Data Collection Tools

Analytic Goal: Evaluate the process of programs; how process differs from proposed programs and why; and the barriers and facilitators faced in implementing programs.
Evaluation Questions:
  • EQ1: To what extent have grantees achieved program implementation goals?
  • EQ2: What factors affect grantee success in meeting implementation goals?
Data Collection Tools:
  • PD Interview
  • Site Visit Guides
  • EBP Self-Assessment
  • PSH Self-Assessment

Analytic Goal: Determine the program outcomes, such as number of clients reached effectively, and identify program/contextual factors which contributed to positive client outcomes.
Evaluation Questions:
  • EQ3: What degree of change in targeted outcomes do clients experience?
  • EQ4: What client-level and program/system factors are associated with positive outcomes (e.g., reduce chronic homelessness, improve housing retention, improve behavioral health status) and with poorer outcomes?
Data Collection Tools: The PD Interview, Site Visit Guides, EBP Self-Assessment, and PSH Self-Assessment provide the grantee-level data required to aggregate grantees and for mediational models that aid in interpretation of client-level data.

Analytic Goal: Examine the implementation of evidence-based practices and the comparative effectiveness of evidence-based practices.
Evaluation Questions:
  • EQ4: What client-level and program/system factors are associated with positive outcomes (e.g., reduce chronic homelessness, improve housing retention, improve behavioral health status) and with poorer outcomes?
Data Collection Tools:
  • EBP Self-Assessment
  • PSH Self-Assessment

Analytic Goal: Evaluate the comparative effectiveness of Homeless Programs service models and the cost effectiveness of these programs and the services offered.
Evaluation Questions:
  • EQ5: What is the comparative effectiveness and cost effectiveness of the service models?
Data Collection Tools:
  • PD Interview
  • Site Visit Guide – Cost Questionnaire
  • PSH Self-Assessment

Analytic Goal: Examine the sustainability of programs.
Evaluation Questions:
  • EQ6: What program and system-level factors are associated with sustaining the project after grant funding ends? [Not applicable to PATH]
Data Collection Tools:
  • PD Interview
  • Site Visit Guides


Measures Collected Through the Evaluation

The PD Interview, Site Visit Guides, EBP Self-Assessment, and PSH Self-Assessment collect discrete information developed to ensure that, collectively, all required information is collected to meet SAMHSA's goals of 1) evaluating the four Homeless Programs, 2) conducting comparative effectiveness analyses of the service models indicative of these programs, and 3) informing SAMHSA of the resulting promising models. Figure 1 depicts how the tools fit together for data collection.


Figure 1. Data Collection Tools


1. Project Director Telephone Interview

The PD Interview (Attachment 1) is a semi-structured interview designed to systematically collect key grantee project characteristics that will directly inform the structure/process evaluation component. The PD Interview also provides essential contextual data for the outcome and economic components of the evaluation by documenting the partnerships and services each grantee includes in its project. Select questions that are not relevant to certain Homeless Programs (e.g., PATH) will not be asked of that program, which will be reflected in automatic skip-out patterns.


The PD Interview instrument is adapted from a project director interview guide that was developed and administered under a previous GBHI evaluation that focused on the 2009 cohort, led by the same contractor. Adaptations were made for the Homeless Programs PD Interview to better align the questions with the SSH, CABHI and PATH programs, including expanding the Housing section and dropping the Sustainability and Local Evaluation sections for PATH grantees.


The PD Interview is composed of the following sections: Grantee Agency and Project Characteristics, Target Population, Stakeholders/Partners, Services, EBPs/Best Practices, Housing, Project Organization and Implementation, Sustainability, Local Evaluation, Technical Assistance and Lessons Learned. Table 3 provides a brief description of each section, the Evaluation Questions that are answered, and source materials.


Table 3. PD Interview Sections, Evaluation Questions and Source Materials

Grantee Agency and Project Characteristics

Asks for general information on a grantee’s agency and project setup, including funding sources used, client count targets, geographic areas targeted and general information on the type of staff used.

Items: 1–16
Evaluation Questions: Question 2
Source: GBHI FY2009 Evaluation PD Interview

Target Population

Asks for information on the grantee's recruitment process, any demographic-based admission criteria, and any targeted populations by race/ethnicity, age, sex, health and treatment status, mental health and substance abuse severity, homeless status, and other special population categories.

Items: 17–20
Evaluation Questions: Question 2
Source: GBHI FY2009 Evaluation PD Interview

Stakeholders and Partners

Asks the grantee to identify the type of partners and partnerships used to implement their project and their integration into the project.

Items: 21–32
Evaluation Questions: Questions 1 and 2
Source: GBHI FY2009 Evaluation PD Interview

Services

Identifies the clinical and wraparound services available to project clients and the structure in which they are available: how many clients receive the service, who provides the service, where the service is provided, how the service is paid for, and the length of time that clients can receive the service.

Items: 33–34
Evaluation Questions: Question 4
Source: GBHI FY2009 Evaluation PD Interview

Evidence Based Practices/Best Practices

Identifies the primary clinical and non-clinical Evidence Based Practices (EBPs). It outlines which EBPs were proposed by grantee projects and which EBPs were ultimately implemented, how many clients receive the EBPs, who provides the EBPs and where the EBPs are provided. Questions also cover the phase of implementation, fidelity monitoring, and identification of barriers and facilitators to implementation.

Items: 37–39
Evaluation Questions: Question 1
Source: GBHI FY2009 Evaluation PD Interview

Housing

Asks what types of housing are included in the project. For each type of housing implemented by the grantee, this section asks about the project’s focus in terms of housing clients, the type of housing sites available, type of funding used, degree of client choice, the type of housing support services provided, restrictions on placing clients in housing, the degree of separation between housing and services, and housing philosophy adopted by the project.

Items: 40–46
Evaluation Questions: Question 1
Source: GBHI FY2009 Evaluation PD Interview

Project Organization and Implementation

Asks the grantee respondent to rate staff experience, partner support during implementation, and implementation and operation effectiveness, and to identify barriers to service delivery.

Items: 47–49
Evaluation Questions: Questions 1 and 2
Source: GBHI FY2009 Evaluation PD Interview

Sustainability (GBHI, SSH and CABHI grantees)

Establishes a baseline understanding of how grantees plan to sustain their projects after SAMHSA funding ends by identifying sustainability plans, activities and the partners involved in sustainability efforts.

Items: 50–57
Evaluation Questions: Question 6
Source: GBHI FY2009 Evaluation PD Interview

Technical Assistance

Asks if the grantee has made any Technical Assistance requests through SAMHSA and, if yes, asks for the type of TA provided and whether it impacted initial and/or ongoing program implementation.

Items: 58–62
Evaluation Questions: Question 2
Source: GBHI FY2009 Evaluation PD Interview

Local Evaluation (GBHI, SSH and CABHI grantees)

Identifies the type of evaluation conducted by the grantee project, the type of data collected and from whom, and its planned use.

Items: 63–70
Evaluation Questions: Questions 2, 4 and 6
Source: GBHI FY2009 Evaluation PD Interview

Lessons Learned

Asks grantees to briefly describe one lesson learned for each of the following categories: serving their target population, project implementation, implementing EBPs, partner collaboration, and sustainability.

Items: 71
Evaluation Questions: Question 2
Source: GBHI FY2009 Evaluation PD Interview


2. Site Visit Guides

The purpose of the grantee Site Visits is to collect detailed qualitative information and economic data on project activities conducted by the selected grantees and their partners, which will directly inform the structure/process and economic evaluation components. The qualitative data will also provide essential information for the outcome evaluation component by documenting the interventions provided to clients and the implementation, barriers, facilitators, challenges, and successes for each SAMHSA Homeless grant project visited.


Site Visit Guides consist of semi-structured discussions with six types of respondents: (1) Opening Session/Project Director and Management Staff (e.g., grantee project directors, project managers/coordinators); (2) Case Managers, Treatment and Housing Staff/Providers (e.g., clinical treatment staff, support services staff, case managers, housing providers, etc.); (3) Stakeholders (e.g., primary partners and other key stakeholders); (4) Evaluators; (5) Clients (project participants); and (6) Financial Staff. This approach allows information to be collected from multiple perspectives, giving a fuller picture of the grant project; the interviews are complementary but not redundant.


As with the PD Interview, the Site Visit Guides (Attachment 2) are adapted from site visit guides that were developed and administered under a previous GBHI cross-site evaluation. Adaptations were made to better align the questions with the SSH, CABHI, and PATH programs and to reduce and consolidate questions based on grantee and site visitor feedback. The Site Visit Guides will be customized to each grantee, as some questions may not be relevant to all grantees. The guides are structured as discussions, with written questions to be used as a general guide but adjusted depending on the interviewees' experience and understanding of the grantee project. The topics covered in each discussion guide are reviewed below.


Opening Session/Project Director Discussion Guide

The purpose of this session is severalfold: (1) to ensure the site visitors understand the grantee agency and its relationship to the program and community homeless services; (2) to gain a solid overview of the project by reviewing (and revising as needed) the organizational chart, project logic model, and client flow chart; (3) to understand the treatment and other services implemented by the project, barriers and facilitators to project implementation, and lessons learned; and (4) to obtain an overview of project staffing and sustainability.


Table 4. Opening Session/Project Director Discussion Guide: Sections, Evaluation Questions and Source Materials

Overview of Grantee and Partner Agencies

Provides qualitative data on the grantee’s mission, role in the local treatment system, how the grant program fits into the grantee’s other work and whether the grantee has any other SAMHSA grants.

Items: 1–3
Evaluation Questions: Question 2
Source: GBHI FY2009 Site Visit Protocol

Community Context

Identifies key characteristics of the grantee's local treatment and service system, including the resources available, the services (clinical and recovery support) typically available, and the gaps in services. The local resources available for housing services are also identified, along with the barriers to obtaining housing and gaps in housing services. To better understand how the grantee operates within the treatment and service system, a set of questions asks about the grantee's relationship with the system, and whether the grantee participates in local efforts to end homelessness, is aware of 10-year plans to end homelessness, and is involved in the homeless continuum of care.

Items: 1–4
Evaluation Questions: Question 2
Source: GBHI FY2009 Site Visit Protocol

Brief Project Overview

Focuses on how the grant project and grantee are organized to provide services to clients, including a review of the grantee’s project logic model. The section also includes questions on how the grantee relates to its partners and stakeholders in the context of providing grant services, whether there have been any challenges and how partners/stakeholders have collaborated with the grantee.

Items: 1–8
Evaluation Questions: Question 1
Source: GBHI FY2009 Site Visit Protocol

Target Population

Identifies the grantee’s target population and criteria for enrollment, which will be used to categorize the grantees and better understand any changes made to the target population.

Items: 1–6
Evaluation Questions: Questions 1 and 2
Source: GBHI FY2009 Site Visit Protocol

Client Flow

Provides a detailed, step-by-step schematic of the grantee project from the client perspective. It is critical in helping to establish a solid understanding of the day-to-day workings of the project, how services are delivered and how clients move through the project over time. Questions cover how the grantee identifies, recruits and screens clients, how and what services are typically provided to clients and how housing services are integrated into the project.

Items: 1–15
Evaluation Questions: Question 1
Source: GBHI FY2009 Site Visit Protocol

Systems and Client Outcomes

Asks the grantee how project implementation impacts the grantee’s agency and treatment system. The section also asks for grantee input on tracking client outcomes, which outcomes should be tracked and if the GPRA/NOMS questions are useful measures for the client outcomes of interest.

Items: 1–5
Evaluation Questions: Question 4
Source: GBHI FY2009 Site Visit Protocol; additional questions developed around system outcomes based on CABHI grantee requirements

Barriers, Facilitators and Innovations

Asks the grantee to discuss barriers and facilitators faced by their project, how barriers were or are being addressed, and any innovations developed. Grantees are asked specifically about barriers and facilitators to implementation, service delivery and innovations developed in response to challenges.

Items: 1–3
Evaluation Questions: Question 2
Source: GBHI FY2009 Site Visit Protocol

Lessons Learned

Allows the evaluation to collect data on the various grantee strategies used to successfully implement their project which will provide valuable information to SAMHSA for performance monitoring and future grantees. Grantees are specifically asked about lessons learned regarding project organization, target population, client outcomes, overall system outcomes and any changes they would make if they started the project over again.

Items: 1–3
Evaluation Questions: Question 2
Source: GBHI FY2009 Site Visit Protocol

Sustainability

Asks about the grantee’s sustainability plans including whether they have a written plan, the project elements they would seek to sustain, how grant project sustainability fits within the agency’s strategic goals, how sustainability fits within community and service system goals including HUD Consolidated Plans or Continuum of Care, how involved partners are in sustainability plans, and plans to use evaluation and/or data to promote sustainability.

Items: 1–9
Evaluation Questions: Question 6
Source: GBHI FY2009 Site Visit Protocol

Project Staffing

Explores whether and how the grant project was impacted by staffing issues such as hiring delays, staff turnover, staff alignment with the target population and staff training.

Items: 1–6
Evaluation Questions: Questions 1 and 2
Source: GBHI FY2009 Site Visit Protocol

Evidence Based Practices (EBPs)/Best Practices and Training

Focuses on the main treatment EBPs used by the grantee, asking interviewees to describe the main components, any modifications, additional funding and staff requirements to implement the EBP/Best Practice. Additional questions ask grantees to describe the types of training and technical assistance received around EBPs. Identifying the EBPs implemented by grantees and better understanding what adaptations were made to fit the grantee’s local context is central to documenting how well grantees are meeting SAMHSA’s expectations in implementing EBPs.


This section is to be asked only of the PD and Program Manager and only if the grantee agency is responsible for implementing the main Project EBP.

Items: 1–10
Evaluation Questions: Question 1
Source: GBHI FY2009 Site Visit Protocol

SAMHSA Training

Asks whether the grantee has requested and received any training from SAMHSA and, if so, its impact on the project.

Items: 1–2
Evaluation Questions: Question 2
Source: GBHI FY2009 Site Visit Protocol

Permanent Supportive Housing Questions

A subset of grantees is expected to provide services within a Permanent Supportive Housing (PSH) setting. The questions in this section ask these grantees to describe the components of PSH (choice, services, payment, special needs, legal rights, readiness requirements, client control over unit, and client input) within the context of their project, which will help provide qualitative confirmatory information on the degree to which grantees implement PSH.

Items: 1–10
Evaluation Questions: Question 1
Source: GBHI FY2009 Site Visit Protocol


The following section and Table 4a clarify the differences between the PD Interview and the site visit Project Director Interview. The PD Interview is the primary program-level data collection tool that informs the process study (and will provide the program-level variables needed for outcome analyses). This telephone interview will be implemented with all 127 grantees (25 completed under the previous evaluations and 2 additional piloted) and 56 States and Territories, ensuring common information (collected under common protocols) for the process evaluation and for relevant program-level outcome variables. As these programs are funded for three to five years, and given Project Directors' investment in the correct description and categorization of their programs, Project Directors have not considered the time burden excessive.


Site visits are conducted with a select group of grantees rather than all funded projects: up to 15 of 56 PATH programs and up to 60 of 127 grantees will be visited (as described on page 33). As described below, the sequence of data collection is not order dependent; rather, these are separate cross-sectional data collection efforts, and the data collected from the Project Director Telephone Interview and from the Site Visit Project Director Interview are non-overlapping and complementary. Further, the site visit interview with Project Directors is a conversation that may differ for each site (questions serve only as probes, used as needed) and is specific to the unique purpose of the site visit: a site-specific program portrait and validation for the purpose of generalization around services implementation.


As outlined in Tables 3 and 4, the PD Interview and the Project Director Discussion Guide (PD Discussion Guide) share common topic areas which are complementary in design and purpose. The PD Interview typically collects quantitative data that provide the basic characteristics of the grantee's program in a standardized format across all grantee programs. As described, these data will be used to develop typologies across all 127 SSH/GBHI/CABHI grantees and 56 PATH States and Territories. The PD Discussion Guide, to be implemented with up to 75 SSH/GBHI/CABHI/PATH grantees, allows grantees to discuss particular characteristics, how they came to be, why they are important, and the challenges and/or successes associated with each characteristic. For example, the PD Interview identifies the number and type of agencies the grantee partners with, while the PD Discussion Guide explores why a partner was selected, what they bring to the project, and the challenges and/or successes in working with that partner. Similarly, the PD Interview allows for systematic identification of each grantee's primary EBPs, while the PD Discussion Guide questions identify how an EBP was implemented, the modifications needed, and its fidelity to core components. In general, the site visit questions, unlike the PD Interview questions, are discussion prompts and guides that are asked as needed and as relevant to a particular site. The data from the PD Interview questions will be used both for the process component, including development of typologies, and for the outcome component, providing grantee-level variables for HLM models and frontier analysis to indicate practice/service model effectiveness (and to be used with client outcomes in effectiveness and cost-effectiveness analyses). The data from the site visit protocol interviews will be used to develop case studies to inform SAMHSA and the field of unique lessons learned in implementing particular service models for particular subpopulations and settings, and to validate, and thus be able to generalize from, the site visits to sites not visited that have similar service models. Table 4a (page 15) summarizes how each common topic area between the PD Interview and the PD Discussion Guide complements the other while yielding distinct information.
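
The evaluation documents do not prescribe an algorithm for constructing these typologies; as one plausible illustration, the sketch below clusters coded PD Interview characteristics into candidate service models. The feature names are entirely hypothetical codings, not actual PD Interview items.

```python
# Illustrative sketch only: deriving grantee service-model typologies by
# clustering coded PD Interview responses (k-means shown as one plausible
# choice; the evaluation does not specify a method).
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

grantees = pd.read_csv("pd_interview_coded.csv")  # hypothetical file
features = [
    "direct_vs_referral",   # service approach, coded 0/1
    "n_partner_agencies",   # count of partner agencies
    "housing_first_score",  # degree of Housing First orientation
    "n_ebps_implemented",   # number of EBPs in place
]

# Standardize features so no single scale dominates the clustering.
X = StandardScaler().fit_transform(grantees[features])
grantees["service_model"] = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

# The resulting label becomes the grantee-level variable feeding the HLM
# and cost-effectiveness analyses described above.
print(grantees.groupby("service_model")[features].mean())
```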



Table 4a. Common Topic Areas between the PD Interview and PD Discussion Guide

PD Interview: Grantee Agency and Project Characteristics (collects quantitative data)
PD Discussion Guide: Overview of Grantee and Partner Agencies (validation; in-depth qualitative discussion)

PD Interview: Target Population (collects quantitative data)
PD Discussion Guide: Target Population (validation; in-depth qualitative discussion)

PD Interview: Stakeholders and Partners (collects quantitative data)
PD Discussion Guide: Overview of Grantee and Partner Agencies (validation; in-depth qualitative discussion)

PD Interview: Evidence Based Practices/Best Practices (collects quantitative data)
PD Discussion Guide: Evidence Based Practices (EBPs)/Best Practices and Training, as relevant (validation; in-depth qualitative discussion; reviews fidelity components)

PD Interview: Housing (collects quantitative data)
PD Discussion Guide: Permanent Supportive Housing (PSH) Questions, as relevant (validation; in-depth qualitative discussion; reviews fidelity components)

PD Interview: Project Organization and Implementation (collects quantitative data)
PD Discussion Guide: Project Staffing, if not pre-filled (validation; in-depth qualitative discussion)

PD Interview: Sustainability (collects quantitative data)
PD Discussion Guide: Sustainability (validation; in-depth qualitative discussion)

PD Interview: Technical Assistance (collects quantitative data)
PD Discussion Guide: SAMHSA Training (validation; in-depth qualitative discussion)

PD Interview: Lessons Learned
PD Discussion Guide: Lessons Learned (in-depth qualitative discussion)





Case Managers, Treatment, and Housing Staff/Provider Discussion Guide

The purpose of this session is to collect detailed information on the program services and housing provided to clients from the staff delivering services. The discussion guide focuses on service implementation, alignment of services with client needs, barriers and facilitators, and lessons learned related to housing, treatment, and case management/wraparound services. These questions cover all of the types of services that may be provided under the SAMHSA funding, but sections will be administered only as relevant to the program.


Table 5. Case Managers, Treatment, and Housing Staff/Provider Discussion Guide: Sections, Evaluation Questions and Source Materials

Overview of Treatment, Case Management, & Housing Providers

As many Homeless Program grantees use partner case management, treatment, and housing agencies, understanding how each of these providers fits into the grantee project is essential. If the grantee provides case management, treatment, or housing services directly (i.e., not via partners), this section will be covered in the Opening Session/Project Director Interview. Questions focus on the nature of the collaboration between the grantee and the partner(s), the partner(s)' role in delivering services, the integration of treatment services with housing and wraparound services, and whether and how the grant has impacted the partner agency.

Items: 1–12
Evaluation Questions: Questions 1 and 2
Source: GBHI FY2009 Site Visit Protocol

Client Flow

Provides a detailed, step-by-step schematic of the case management, treatment and housing services from the client perspective. It is critical in helping to establish a solid understanding of the day-to-day workings of the project, how services are delivered and how clients move through the project over time. Questions cover how the grantee identifies, recruits and screens clients, how and what services are typically provided to clients and how housing services are integrated into the project.


Here, the client flow section complements the client flow section in the Opening Session/Project Director discussion by obtaining the provider’s perspective, which will provide additional important details on how clients enter, receive and exit case management, treatment and housing.

Items: 1–15
Evaluation Questions: Question 1
Source: GBHI FY2009 Site Visit Protocol

Evidence Based Practices (EBPs) and Best Practices

Focuses on the main treatment EBPs used by the providers, asking interviewees to describe the main components, any modifications, additional funding and staff requirements to implement the EBP/Best Practice. Additional questions ask grantees to describe the types of training and technical assistance received around EBPs. Identifying the EBPs implemented and adaptations made to fit the grantee’s local context is central to documenting how well grantees are meeting SAMHSA’s expectations in implementing EBPs.

Items: 1–8
Evaluation Questions: Question 1
Source: GBHI FY2009 Site Visit Protocol

Permanent Supportive Housing EBP Questions

A subset of grantees is expected to provide services within a Permanent Supportive Housing (PSH) setting. The questions in this section ask grantees to describe the components of PSH (choice, services, payment, special needs, legal rights, readiness requirements, client control over unit, and client input) within the context of their project, which will help provide qualitative information on the degree to which housing providers implement PSH.

Items: 1–10
Evaluation Questions: Question 1
Source: GBHI FY2009 Site Visit Protocol

Alignment of Services with Client Needs

Focuses on the degree to which services align with client needs and asks how service planning occurs, whether client strengths are identified, and whether and how clients are given choices in their treatment/services received. Housing providers are asked specific questions regarding treatment/recovery planning and the contact they have with clients.

Items: 1–11
Evaluation Questions: Question 1
Source: GBHI FY2009 Site Visit Protocol

Client Outcomes

Asks for provider input on tracking client outcomes, which outcomes should be tracked and if the GPRA/NOMS questions are useful measures for the client outcomes of interest.

Items: 1–5
Evaluation Questions: Question 3
Source: GBHI FY2009 Site Visit Protocol

Barriers, Facilitators and Innovations

Allows for providers to discuss the various barriers and facilitators faced by their project, how barriers were or are being addressed and any innovations developed. Providers are asked specifically about barriers and facilitators to implementation, service delivery and innovations developed in response to challenges.

Items: 1–8
Evaluation Questions: Question 2
Source: GBHI FY2009 Site Visit Protocol

Lessons Learned

Allows the evaluation to collect data on the various grantee strategies used to successfully implement their project which will provide valuable information to SAMHSA for performance monitoring and future grantees. Providers are specifically asked about lessons learned regarding client flow/resource use, use of EBPs, alignment with client needs, client outcomes, system outcomes and housing accessibility.

Items: 1
Evaluation Questions: Question 2
Source: GBHI FY2009 Site Visit Protocol


Stakeholder Discussion Guide

The purpose of the stakeholder discussion is to learn about projects from the perspective of the associated partner providers, key stakeholders and local funders. The discussion aims to learn about the agencies/providers involved in the project, the ways in which they are involved, and their perspectives on how the project has been implemented, its impact on and contribution to the community, and efforts made toward sustainability. Some grantees may not have external providers or stakeholders but instead may be working with other departments within their agency (their internal partners) for treatment, wraparound services and/or housing. In these cases, the internal partners would participate in an abbreviated stakeholder discussion.


Table 6. Stakeholder Discussion Guide: Sections, Evaluation Questions and Source Materials

Overview of Associated Providers Involved with the Project

Allows the interviewer to better understand how each partner fits into the overall service system and their specific role in the grantee’s project. Questions ask about the services provided, client population typically served, geographic area targeted, and experience with SAMHSA grants and the grantee agency.

Items: 1–2
Evaluation Questions: Question 2
Source: GBHI FY2009 Site Visit Protocol

Relationship between Associated Providers/Key Stakeholders/Local funders & the Project

Collects information on how the grantee and its partners collaborate and how partners may collaborate with each other. Questions help identify the mechanisms for collaboration, such as stakeholder committees, community consortiums, or other formal meetings.

Items: 1–4
Evaluation Questions: Questions 1 and 2
Source: GBHI FY2009 Site Visit Protocol

Associated Providers/Key Stakeholders/Local Funders Perspective on Services and Client Outcomes

Provides the partner’s perspective on services provided through the grantee project and the impact the project has on clients. Questions ask whether the project is serving the targeted population as intended, whether there are similar services available to clients besides the grantee’s project, and the project’s effect on client outcomes.

Items: 1–2
Evaluation Questions: Questions 1 and 2
Source: GBHI FY2009 Site Visit Protocol

Systems Change

Identifies any change to the service system, such as policies, housing markets, and funding streams, due to the grantee’s project.

Items: 1–2
Evaluation Questions: Question 4
Source: GBHI FY2009 Site Visit Protocol

Barriers, Facilitators & Innovations

Asks stakeholders to discuss barriers and facilitators faced by the project, how barriers were or are being addressed, and any innovations developed. Stakeholders are asked specifically about barriers and facilitators to implementation, service delivery and innovations developed in response to challenges.

Items: 1–3
Evaluation Questions: Question 2
Source: GBHI FY2009 Site Visit Protocol

Project Sustainability Activities

Asks questions about the partner’s involvement in sustainability plans, including whether they have participated in any meetings, reviewed evaluation reports and/or data, and the partner’s overall perspective on sustainability planning. Partners are also asked how sustaining the grantee’s project would fit into the overall system service, which components are the most important to sustain, and what impact not sustaining the project would have on clients and the community.

Items: 1–5
Evaluation Questions: Question 6
Source: GBHI FY2009 Site Visit Protocol


Evaluator Discussion Guide

The purpose of the evaluator discussion guide is to understand the grantee project’s local evaluation and quality assurance activities; how the evaluation is incorporated into sustainability activities, project implementation, and EBP/PSH fidelity; and lessons learned.


Table 7. Evaluator Discussion Guide: Sections, Evaluation Questions and Source Materials

Evaluation Overview & Integration Into Project

Asks the grantee evaluator to describe the local evaluation and how it is integrated with the project’s planning, management, clinical meetings, sustainability planning, quality assurance and feedback to clients.

Items: 1–2
Evaluation Questions: Question 2
Source: GBHI FY2009 Site Visit Protocol

Process Evaluation

Asks about the process evaluation component of the local evaluation, if a process evaluation is being conducted. The questions focus on the overall aim of the process evaluation, how client participation is measured, how housing received by clients is tracked, how data are collected on project sustainability and how the project uses data to improve.

Items: 1–5
Evaluation Questions: Questions 1, 3 and 4
Source: GBHI FY2009 Site Visit Protocol

GPRA/NOMS Data and Outcome Evaluation

Asks how GPRA/NOMS data are collected, what is most useful about the GPRA/NOMS data, and whether GPRA/NOMS data are matched with other locally collected outcome data. If additional outcome data are collected, the measures and plans to use the data are discussed.

Items: 1–8
Evaluation Questions: Question 3
Source: GBHI FY2009 Site Visit Protocol

Evaluation Analysis and Reporting

Collects information about the local evaluator’s role in reporting findings, how the evaluation is managed and what, if any, main findings are available so far.

Items: 1–3
Evaluation Questions: Questions 3 and 4
Source: GBHI FY2009 Site Visit Protocol

Fidelity Assessment

Asks the evaluator whether and how fidelity assessments are conducted for the primary EBPs being implemented in the project.

Items: 1–3
Evaluation Questions: Question 1
Source: GBHI FY2009 Site Visit Protocol


Client Focus Group Discussion Guide

The purpose of this session is to learn about the grantee project from the client perspective. The questions for the group begin with basic information about the participating clients, such as length of involvement with the project, history of homelessness, and prior participation in services similar to those provided by the project. The remaining questions focus on the types of services clients have received including housing, their satisfaction with these services compared to previous experiences, other services available in the community that are similar, and policy or program recommendations for other projects focused on reducing homelessness.


Table 8. Client Focus Group Discussion Guide: Sections, Evaluation Questions and Source Materials

Descriptive Client Information
Collects basic information on the clients participating in the group and includes questions on how long a client has been in the project, experience with other projects, and past incidents of homelessness and experiences with treatment/housing services.
Items: 1-2 | Evaluation Questions: Questions 1 and 2 | Source: GBHI FY2009 Site Visit Protocol

Services Clients Receive through the Project
Asks clients how they became involved in the project, what type of services they have received, various requirements/restrictions to participate in the project, services they feel they need but have not received, whether they have had to pay for services and whether they received assistance in accessing benefits.
Items: 1-12 | Evaluation Questions: Question 1 | Source: GBHI FY2009 Site Visit Protocol

Housing for Clients
Asks clients about the assistance they have received in obtaining housing, the process they had to go through, the type of housing they currently have and their plans once they finish the grantee project.
Items: 1-4 | Evaluation Questions: Question 1 | Source: GBHI FY2009 Site Visit Protocol

Client Satisfaction and Recommendations for Change
Asks clients what they liked or did not like about the project, what they would change about the project, how satisfied they are with the housing services, and their opinion about the project staff. The section concludes by asking what outcomes the clients have experienced and whether they have any recommendations to help address barriers or challenges they have experienced.
Items: 1-10 | Evaluation Questions: Question 1 | Source: GBHI FY2009 Site Visit Protocol


Financial Staff Cost Questionnaire

The questionnaire is designed to collect resource use and economic information from the grantee and includes costs incurred during the fiscal year in which the site visit falls (e.g., FY 2013, 2014, 2015), labor hours in the past month, and partner funding for services provided to clients in the grantee project. Questions ask for estimates of staff labor in a typical week; while such estimates may be less precise than time diaries or data extraction from a staff time reporting system, the method allows reliable labor data to be collected uniformly across all grantees with minimal staff burden. The questionnaire will be completed by the project director or other designated staff with the assistance of a financial officer, as needed. To help reduce grantee burden, sections will be pre-populated using information collected during a document review process and updated as needed by the site visitor during the cost interview. The cost and labor allocation data are used to calculate project- and service-level costs; combined with partner funding information, they provide a fuller picture of the full cost of implementing grant projects, which helps inform sustainability and future funding decisions made by SAMHSA. A sketch of the service-level cost calculation follows.
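As an illustration of how the labor allocation and wage data can be combined into service-level costs, the following is a minimal sketch under assumed inputs; the staff types, hours, and wages are hypothetical placeholders, not figures from the SASCAP cost module.

```python
# A minimal sketch of service-level labor costing: weekly hours per service
# category multiplied by staff wages, then rolled up by service category.
# All staff types, hours, and wages below are hypothetical.

weekly_hours = {
    "Case Manager": {"outreach": 10, "housing support": 15, "counseling": 5},
    "Counselor": {"outreach": 2, "housing support": 3, "counseling": 25},
}
hourly_wage = {"Case Manager": 17.13, "Counselor": 22.50}  # hypothetical wages

service_cost: dict[str, float] = {}
for staff, services in weekly_hours.items():
    for service, hours in services.items():
        service_cost[service] = service_cost.get(service, 0.0) + hours * hourly_wage[staff]

for service, cost in sorted(service_cost.items()):
    print(f"{service}: ${cost:,.2f} per week")
```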


Table 9. Financial Staff Cost Questionnaire: Sections, Evaluation Questions and Source Materials

Ongoing Costs
Collects economic data at a grantee project level, focusing on costs incurred in service delivery. Questions ask about the costs associated with labor (e.g., employees, contracted), building space costs, depreciation, supplies and materials, any miscellaneous costs, and administrative overhead.
Items: 1-10.4 | Evaluation Questions: Questions 2 and 5 | Sources: GBHI FY2009 Site Visit Protocol; Substance Abuse Services Cost Analysis Program (SASCAP™; trademark RTI International) cost module

Labor Allocation
Collects economic data at the service level, which allows the costs of specific services to be calculated. Estimates are provided for the number of hours per week grantee staff typically work on each unique service category defined in the Labor Allocation section. For each staff type (e.g., Counselors, Case Managers), the average wage is also collected.
Items: 1-5 | Evaluation Questions: Questions 1 and 5 | Sources: GBHI FY2009 Site Visit Protocol; Substance Abuse Services Cost Analysis Program (SASCAP™; trademark RTI International) cost module

Partner Services
Collects basic funding information on the services provided by the grantee's partners and other service system stakeholders by asking the grantee to identify the funding sources for services not funded by SAMHSA Homeless Grant Project funding.
Items: 1-2 | Evaluation Questions: Questions 2 and 5 | Source: GBHI FY2009 Site Visit Protocol


3. Evidence-Based Practice (EBP) Self-Assessment

The EBP Self-Assessment (Attachment 3) is a web-based survey completed by the grantee to collect information on the services implemented in grantee projects that have a demonstrable evidence base and are appropriate for the target population. The data will describe the interventions received by project clients/consumers, providing the ability to aggregate by practice, assess the relationship of specific services to project effects, and inform the interpretation of results. This data collection tool supports the process, outcome and economic evaluations by identifying which EBPs are implemented and how grantees achieve EBP implementation. The tool also directly answers Evaluation Question 1.


The EBP Self-Assessment tool is divided into two parts. Part 1 collects qualitative information on general implementation of grantees’ primary behavioral health treatment EBPs. This part will be administered to all Homeless Programs grantees, except PATH grantees. Part 2 collects implementation data on a selected group of EBPs and will be administered only to grantees using the selected EBPs in their projects.


The EBP Self-Assessment Part 1 was developed from three instruments specifically designed and used to examine factors that influence EBP implementation: the General Organizational Index ([GOI]; Bond et al., 2009); the State Health Authority Yardstick ([SHAY]; Finnerty et al., 2009); and the Installation Stage Assessment ([ISA]; Fixsen & Blase, 2010). Table 10 provides a detailed listing of the source material for each question. The questions on the EBP Self-Assessment have been adapted to better fit the grantee context and simplified, where appropriate, to reduce grantee burden.


Table 10. EBP Self-Assessment Part 1 Items and Source Materials

Readiness to Implement & Leadership (13 items)
Items 1-2: Populations in which EBP is used
Items 3-4, 10: Experience with EBP/priority of EBP's implementation
Item 5: Adapted from the ISA (Fixsen & Blase, 2010); Aarons, Hurlburt & Horwitz, 2011
Items 6, 8, 11-12: Adapted from the ISA
Items 7, 9, 13-15: Adapted from the SHAY (Finnerty et al., 2009)

Funding (4 items)
Item 16: McGovern et al., in press
Items 17-18: Adapted from the SHAY
Item 19: Aarons, Hurlburt & Horwitz, 2011

Hiring, Training & Supervision (8 items)
Items 20, 22, 24-25: Adapted from the SHAY
Item 21: Adapted from the ISA
Item 23: Adapted from the GOI (Bond et al., 2009)
Item 26: Adapted from the GOI & ISA
Item 27: McGovern et al., in press

Fidelity/Outcomes Monitoring & Performance Improvement (12 items)
Items 28-30: Adapted from the GOI
Items 31-33, 37, 39: Adapted from the SHAY; 32, 37 & 39 also adapted from GOI & ISA
Items 34-36, 38: Degree of implementation fidelity and plans to maintain over time

Overall Barriers/Facilitators (2 items)
Items 40-41: Summary of factors and sub-factors above


The EBP Self-Assessment Part 2 focuses on a selected group of EBPs: Assertive Community Treatment (ACT), Integrated Dual Disorders Treatment (IDDT), Illness Management and Recovery (IMR), Supported Employment (SE) and Critical Time Intervention (CTI). These EBPs were selected because they have a SAMHSA EBP Fidelity Toolkit, are well-defined and measurable project models (rather than practice strategies), and are being used by at least 14 grantee projects. The cut-off of 14 grantee projects was selected because a natural clustering was observed at that level and to ensure a sample size sufficient for aggregation and analysis; the next most widely implemented EBPs with developed fidelity toolkits were implemented by only 7 grantees. Limiting the scope helps balance respondent burden with the need to collect information on EBPs that is comparable across a number of grantee projects. Information provided during the PD Interview will be used to identify the grantees who are implementing the selected EBPs, and this subset of grantees will be invited to complete Part 2 of the EBP Self-Assessment.


Each EBP has its own module with questions designed to collect both quantitative and qualitative data. These modules are based on standardized fidelity checklists with tested scoring systems that determine the degree of implementation fidelity for each EBP assessed (McHugo et al., 2007). The selected practices have a SAMHSA EBP Fidelity Tool Kit (ACT: DHHS Publication No. SMA-08-4344; IDDT: DHHS Publication No. SMA-08-4366; IMR: DHHS Publication No. SMA-09-4462; and SE: DHHS Publication No. SMA-08-4364) or a well-tested fidelity scale (CTI) documented in the National Registry of Evidence-based Programs and Practices (NREPP), from which the Part 2 self-assessment questions were developed, retaining the key fidelity dimensions/components of each EBP. At the end of each module is a question about dimensions of the EBP that the grantee may have found difficult to implement and two questions about specific modifications that may have been made to the EBP by the local grantee project. This information will be used to produce recommendations for SAMHSA and the field regarding needed future research and practice improvements. Table 11 details the specific source for each question item in Part 2 of the EBP Self-Assessment.


Table 11. EBP Self-Assessment Part 2 Items and Source Materials

Assertive Community Treatment (ACT)/Intensive Case Management (ICM) Module (41 items)
Items 1-18: Human Resources questions from the ACT SAMHSA Toolkit/Fidelity Checklist
Items 19-27: Organization Boundaries questions from the ACT SAMHSA Toolkit/Fidelity Checklist
Items 28-38: Nature of Services questions from the ACT SAMHSA Toolkit/Fidelity Checklist
Items 39-41: Adaptations and Challenges

Integrated Dual Disorders Treatment (IDDT) Module (42 items)
Items 1-6: Staffing questions from the IDDT SAMHSA Toolkit/Fidelity Checklist
Items 7-39: Service delivery questions from the IDDT SAMHSA Toolkit/Fidelity Checklist
Items 40-42: Adaptations and Challenges

Illness Management and Recovery (IMR) Module (25 items)
Items 1-2: Staffing questions from the IMR SAMHSA Toolkit/Fidelity Checklist
Items 3-8: Programmatic questions from the IMR SAMHSA Toolkit/Fidelity Checklist
Items 8-22: Assignment and Service type questions from the IMR SAMHSA Toolkit/Fidelity Checklist
Items 23-25: Adaptations and Challenges

Supported Employment (SE) Module (28 items)
Items 1-4: Staffing questions from the SE SAMHSA Toolkit/Fidelity Checklist
Items 5-10: Organization questions from the SE SAMHSA Toolkit/Fidelity Checklist
Items 11-25: Services questions from the SE SAMHSA Toolkit/Fidelity Checklist
Items 26-28: Adaptations and Challenges

Critical Time Intervention (CTI) Module (33 items)
Items 1-7: Program Structure/Staffing questions (NREPP and fidelity scale developers)
Items 8-10: Early Engagement questions (NREPP and fidelity scale developers)
Items 11-18: Assessment/Treatment Planning questions (NREPP and fidelity scale developers)
Items 19-23: Outreach/Early Linking questions (NREPP and fidelity scale developers)
Items 24-30: Nature/Length of Services questions (NREPP and fidelity scale developers)
Items 31-33: Adaptations and Challenges


4. Permanent Supportive Housing (PSH) Self-Assessment

The PSH Self-Assessment (Attachment 4) is a web-based survey completed by the grantee to identify the extent to which grantees implementing PSH models meet the relevant dimensions of PSH. To help reduce grantee burden, the PSH Self-Assessment will be administered only to grantees implementing PSH models.


The PSH Self-Assessment provides data that will help answer Evaluation Question 1 by further identifying the types of housing and housing support services implemented and by assessing achievement of implementation goals, including linkage to housing, provision of supports to ensure housing stability and housing services delivered through EBPs. To operationalize this Evaluation Question, the instrument collects data to answer four sub-questions: 1) which dimensions are most frequently met across programs, 2) which dimensions are least frequently met across programs, 3) which dimensions have the most variability among programs, and 4) which program characteristics are associated with the outcomes indicated by sub-questions 1, 2 and 3? A sketch of how these sub-questions could be answered from the collected data follows.
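The following minimal sketch shows how the first three sub-questions could be answered from the self-assessment data; the dimension names come from Table 12, but the 0/1 "dimension met" indicators are invented placeholders, not evaluation results.

```python
# A minimal sketch of per-dimension summary statistics across grantees:
# the rate at which each PSH dimension is met (sub-questions 1 and 2)
# and its variability (sub-question 3). Data values are hypothetical.
import statistics

# met[dimension] = 0/1 "dimension met" indicators, one per grantee project.
met = {
    "Choice of Housing": [1, 1, 0, 1, 1],
    "Rights of Tenancy": [1, 0, 0, 1, 0],
    "Access to Housing": [1, 1, 1, 1, 0],
}

stats = {
    dim: (statistics.mean(vals), statistics.pvariance(vals))
    for dim, vals in met.items()
}

most_met = max(stats, key=lambda d: stats[d][0])       # sub-question 1
least_met = min(stats, key=lambda d: stats[d][0])      # sub-question 2
most_variable = max(stats, key=lambda d: stats[d][1])  # sub-question 3

print(f"Most frequently met: {most_met}")
print(f"Least frequently met: {least_met}")
print(f"Most variable: {most_variable}")
```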

The PSH Self-Assessment tool was developed using the SAMHSA PSH toolkit (DHHS Publication No. SMA-10-4509) as a primary resource. Additional resources included the Pathways Housing First Fidelity Scale-ACT version (Tsemberis, 2010) and the Full Service Partnership (FSP) Practices Scale (Gilmer et al., 2010). These resources were used to construct a comprehensive self-assessment instrument for PSH, which includes seven dimensions from the SAMHSA PSH toolkit and one Service Philosophy module from the Pathways Housing First Fidelity Scale and FSP Practices Scale that will be administered to all grantees implementing a PSH model. An ACT team module from the Pathways Housing First Fidelity Scale and FSP Practices Scale is also included but applies only to projects with behavioral health teams. By incorporating items from the Housing First Fidelity Scale and FSP Scale, the PSH Self-Assessment tool will capture the variability of the PSH model among the SSH, GBHI and CABHI grantees using reliable, accurate and efficient sources of information (Gilmer & Katz, 2012; Macnaughton, Goering, & Nelson, 2012; Stergiopoulos et al., 2012). Table 12 lists each of the nine dimensions, the number of questions in each dimension, and the sources used.


Table 12. PSH Self-Assessment Items and Source Materials

Choice of Housing (6 items)
Items 1-6: SAMHSA PSH Toolkit

Functional Separation of Housing & Services (6 items)
Items 7-12: SAMHSA PSH Toolkit

Decent, Safe, & Affordable Housing (2 items)
Items 13-14: SAMHSA PSH Toolkit

Housing Integration (4 items)
Items 15-19: SAMHSA PSH Toolkit

Rights of Tenancy (9 items)
Items 20-29: SAMHSA PSH Toolkit

Access to Housing (15 items)
Items 30-44: SAMHSA PSH Toolkit

Flexible, Voluntary Services (18 items)
Items 45-62: SAMHSA PSH Toolkit

Service Philosophy (26 items)
Items 63-88: Pathways Housing First Fidelity Scale-ACT version (Tsemberis, 2010) and Full Service Partnership (FSP) Practices Scale (Gilmer et al., 2010)

ACT Team Module (6 items)
Items 89-94: Pathways Housing First Fidelity Scale-ACT version (Tsemberis, 2010) and Full Service Partnership (FSP) Practices Scale (Gilmer et al., 2010)


Collected data will be used to understand the extent to which grantees are implementing key dimensions of PSH (Evaluation Question 1). Additionally, the data will provide valuable contextual information at the service level through which to help interpret client-level outcome data (Evaluation Questions 3 and 4). Finally, the data will be used to form aggregations of grantees, inform the service models, and support comparative effectiveness and cost-effectiveness analyses (Evaluation Question 5).


3. Use of Information Technology

PD Interview

The PD Interview is designed as a telephone-based interview. Respondents will be read questions by the interviewer, who will then record each response in a web-based data entry form that allows for automated skip patterns and fill-ins based on prior responses to certain questions. This approach reduces administration time, thereby reducing burden on the grantee. The completed PD Interview will be reviewed by the interviewer for accuracy.


Site Visits

As the Site Visit Guides are designed as in-person semi-structured discussions, use of information technology will be limited. In-person site visits collect data that are not collected well through questionnaires or telephone interviews, including complex qualitative data on program implementation and the barriers, facilitators, challenges and successes encountered by each grantee program. Conducting these discussions in person also helps ensure that the most accurate information is obtained from each grantee, as interviewers can ask immediate follow-up questions and more easily gauge the respondent's mood and opinions. Being on-site also allows interviewers to see the facilities the grantees use, meet clients in the focus group and observe services. Site visit discussions will be digitally recorded and additional notes will be taken on a computer. All recordings will be stored on RTI's secure servers and destroyed once de-identified transcriptions are completed.


EBP Self-Assessment and PSH Self-Assessment

The EBP and PSH Self-Assessments will be programmed as web-based surveys to be completed by the grantees. Using a web instrument allows for automated data checks, skip procedures and fill-ins based on prior responses to certain questions, which will significantly reduce the burden among subsets of respondents and increase data quality; a sketch of this kind of skip logic follows. This method also uses automated data entry and greatly reduces the possibility of data entry error. Respondents unable to complete the web instrument can elect to receive a paper version through the mail; hard copy forms will utilize specialized technology as appropriate (e.g., scannable TeleForm), which will reduce data entry burden and errors. Both web-based surveys will comply with the requirements of Section 508 of the Rehabilitation Act.
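The sketch below illustrates the general idea of response-driven skip logic; the question identifiers and rule format are hypothetical and do not represent the actual survey software's configuration.

```python
# A minimal sketch of automated skip logic: a question is administered only
# if its predicate over prior responses holds. Question IDs are hypothetical.

responses = {"q1_implements_psh": "no"}  # hypothetical prior answer

skip_rules = {
    "q2_psh_units": lambda r: r.get("q1_implements_psh") == "yes",
    "q3_other_services": lambda r: True,  # always asked
}

def next_questions(all_questions, responses):
    """Return the questions to administer, applying skip rules to prior answers."""
    return [q for q in all_questions if skip_rules.get(q, lambda r: True)(responses)]

print(next_questions(["q2_psh_units", "q3_other_services"], responses))
# -> ['q3_other_services']  (q2 is skipped: the grantee does not implement PSH)
```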


Data will be stored electronically and will be accessible by evaluation team members who will be made aware of what data are available. This reduces the possibility of any unnecessary contact with participants to collect redundant information.


4. Effort to Identify Duplication


Data collected through the PD Interview, Site Visit Guides and the EBP and PSH Self-Assessments are specific and unique to each grantee program. As such, the data collection tools do not duplicate any other data collection effort. Additionally, the Homeless Programs evaluation team conducted an extensive literature review to confirm that data collected through these tools would not duplicate any ongoing national or state-level data collection efforts. Data collected in this evaluation are not available from other sources and are unique because of the scale and breadth of the initiative's implementation: nationwide, across a spectrum of programs, and across a broad cross-section of populations.


5. Involvement of Small Entities


There will not be a significant impact on small entities. Grantees in the SAMHSA Homeless Programs include state agencies, local service providers, and tribal organizations, and some may be small entities; however, the PD Interview, Site Visit Guides and EBP and PSH Self-Assessments are designed to include only the most pertinent information needed to effectively carry out the evaluation. Additionally, in accepting GBHI and CABHI funds, grantees in these programs agreed to participate fully in all SAMHSA-approved evaluation activities.


6. Consequences If Information Collected Less Frequently


The PD Interview, Site Visit Guides and EBP and PSH Self-Assessments will be conducted with grantee project directors, key management staff, clinical staff and clients/consumers once during the evaluation. Because each instrument is administered only once per grantee, the information cannot be collected less frequently; without this one-time collection, SAMHSA would lack the program-level data needed to carry out the cross-program evaluation.


7. Consistency with the Guidelines in 5 CFR 1320.5(d)(2)


This information collection fully complies with the guidelines in 5 CFR 1320.5(d)(2).

8. Consultation Outside the Agency


The notice required by 5 CFR 1320.8(d) was published in the Federal Register on Monday, June 17, 2013 (78 FR 36204-36205). One comment, provided by the Trevor Project, was received during the public comment period. The Trevor Project suggested that respondents be allowed to select multiple response categories for the question on gender and that a question on sexual orientation be added. Both suggestions are appreciated and understood by SAMHSA. The data collection effort under consideration for OMB approval collects data only on grantee programs and organizations and does not collect client-level data; therefore, we are not able to incorporate the suggested changes in the instruments under review. However, when the client-level data collection efforts for this evaluation come up for OMB review and renewal, the suggestions by the Trevor Project will be fully considered.


Representatives from each SAMHSA Center are developing consensus recommendations to implement LGBT identity questions through the GPRA-mandated performance monitoring data systems, specifically in the Common Data Platform to be implemented in 2014. SAMHSA will spend part of this calendar year developing the questions and preparing the relevant supporting statement for an OMB package submission.


There are a number of reasons to accelerate efforts to collect LGBT-specific data. First and foremost, those in the LGBT community are at risk, often at much greater risk than their sexual majority peers, for behavioral health issues. Second, these data are being called for by non-profit agencies engaged with and advocating for the LGBT community, as well as by Federal, state, and local governments and other organizations that currently serve, or would like to serve, them. Third, there are pragmatic issues as well: the disparity impact statements currently required from grantees do not allow SAMHSA to report LGBT health-related disparities. Adding these questions would also support the new RFA disparities impact statement requirement and immediately enhance the ability to report on the service population at the grant, program, Center, and Agency level. In order to best serve this population, we need to ensure that data are of the highest quality.



With regard to the current evaluation, extensive use has been made of experts in the areas of homelessness, substance abuse, and mental health research, including current and previous Homeless Programs grantees, to provide guidance on the design and analysis of the cross-program evaluation. An expert technical panel meeting was held in July 2012 to review various aspects of the cross-program evaluation, including the overall evaluation design and the approach to evaluating housing models and EBPs. The experts provided feedback on all aspects of the evaluation, and their comments and suggestions were incorporated into the development of the surveys. The list of experts is provided in Exhibit 1.


Exhibit 1. Technical Panel Members

Margarita Alegría, Ph.D.
Harvard Medical School; Director and Professor of Psychiatry, Center for Multicultural Mental Health Research, Cambridge Health Alliance
120 Beacon Street, 4th Floor, Somerville, MA 02143
Phone: (617) 503-8447; E-mail: malegria@charesearch.org

Gary Bond, Ph.D.
Professor of Psychiatry, Dartmouth Psychiatric Research Center
Rivermill Commercial Center, 85 Mechanic Street, Suite B4-1, Lebanon, NH 03766
Phone: (603) 448-0263; E-mail: gary.bond@dartmouth.edu

Brian Dates, M.A.*
Director of Evaluation and Research, Southwest Counseling Solutions
1700 Waterman, Detroit, MI 48209
Phone: (313) 841-7442; E-mail: bdates@swsol.org

Louis Kurtz, M.Ed.*
Acting Division Director, Division of Behavioral Health, Kentucky Department for Behavioral Health, Development and Intellectual Disabilities
100 Fair Oaks Lane 4E-D, Frankfort, KY 40621
Phone: (502) 564-4456; E-mail: louis.kurtz@ky.gov

William McAllister, Ph.D.
Senior Research Fellow, Institute for Social and Economic Research and Policy; Associate Director, Center for Homelessness Prevention Studies, Columbia University
420 West 118th St, MC 3355, New York, NY 10027
Phone: (212) 854-5781; E-mail: wm134@columbia.edu

Stephen Metraux, Ph.D.*
Associate Professor, Department of Health Policy & Public Health, University of the Sciences in Philadelphia
600 South 43rd Street, Philadelphia, PA 19104-4495
Phone: (215) 596-7612; E-mail: s.metraux@usp.edu

Roger H. Peters, Ph.D.
Professor, Department of Mental Health Law and Policy, Louis de la Parte Florida Mental Health Institute, University of South Florida
13301 North Bruce B. Downs Boulevard, Tampa, FL 33612-3807
Phone: (813) 974-9299; E-mail: peters@fmhi.usf.edu

Alan Rabideau
Consumer and Family Member Consultant
112 Kincheloe Drive, Kincheloe, MI 49788
Phone: (906) 495-7158; E-mail: jawenodee_inini@yahoo.com

Michael Rowe, Ph.D.
Associate Clinical Professor of Psychiatry and Co-Director, Program for Recovery & Community Health, Yale University School of Medicine
319 Peck Street, Building 1, New Haven, CT 06513
Phone: (203) 764-8690; E-mail: michael.rowe@yale.edu

David Smelson, Psy.D.*
Professor of Psychiatry, University of Massachusetts Medical Center; Director, Translational Research, Edith Nourse Rogers Memorial Veterans Hospital
55 Lake Avenue North, Worcester, MA 01655
Phone: (508) 856-3768 x15122; E-mail: david.smelson@umassmed.edu

Sally J. Stevens, Ph.D.*
Executive Director, Southwest Institute for Research on Women; Professor, Department of Gender and Women's Studies, College of Social and Behavioral Sciences, University of Arizona
925 North Tyndall Avenue, Tucson, AZ 85721
Phone: (520) 626-9558; E-mail: sstevens@email.arizona.edu

Sam Tsemberis, Ph.D.*
Founder and CEO, Pathways to Housing, Inc.; Clinical Assistant Professor of Psychiatry, Columbia University Medical Center
55 West 125th Street, 10th Floor, New York, NY 10027
Phone: (212) 289-0000 x1101; E-mail: stsemberis@pathwaystohousing.org

* Also current and prior Homeless Programs grantees


9. Payment to Respondents


No cash incentives or gifts will be given to respondents.


10. Assurance of Confidentiality


Contractor staff are trained on handling sensitive data and the importance of privacy. In addition, the contractor is in the process of submitting the PD Interview, Site Visit Guides, EBP and PSH Self-Assessments and the consent forms for each to the contractor's Institutional Review Board (IRB) (Federal Wide Assurance #3331). In keeping with 45 CFR 46, Protection of Human Subjects, the procedures for data collection, consent, and data maintenance are formulated to protect respondents' rights and the privacy of the information collected.


Concern for privacy and protection of respondents' rights will play a central part in the implementation of all study components; as such, personal identifying information such as birthdates, Social Security Numbers or other similar data are not collected. Respondents will be assigned a unique identification number, which will be linked to the respondent's name only in a database separate from the data collected. Only the evaluation team will have access to the database linking the identification number and respondent name, which will be stored on a password-protected, Pointsec-encrypted secure server. Additionally, this data collection effort does not intend to make national estimates. The privacy protection measures for each data collection tool are described below.


PD Interview

Prior to beginning the PD Interview, the respondent will be read a script for consent (Attachment 5) that informs the respondent of their rights, including the right to not answer any question, and asks for their verbal consent to participate in the interview. Information collected by the PD Interview will only be reported in aggregate and individual respondents will not be identified.


Site Visits

Prior to beginning a site visit discussion, the respondent(s), including both project staff and clients/consumers, will be read and provided a copy of the consent form (Attachments 6 & 7) that informs them of their rights, including the right to not answer any question, and asks for their written consent to participate in the discussion and for the discussion to be recorded. Recordings will be used to ensure that information is correctly captured across multiple interviews and consistently captured across site visitors, and to correct and clarify brief written notes as needed as part of the data quality assurance process. Recordings will only be accessible to the contractor, will be stored on password-protected secure servers and will be destroyed once de-identified notes are completed. Information collected by the site visit interviews is only reported in aggregate and individual respondents will not be identified.


EBP Self-Assessment and PSH Self-Assessment

Prior to beginning the web-based surveys the respondent will review a brief statement of consent for the EBP Self-Assessment (Attachment 8) and for the PSH Self-Assessment (Attachment 9) that informs them of their rights, including the right to not answer any question, and that completing the self-assessment is voluntary. Additionally, information collected by the EBP Self-Assessment and the PSH Self-Assessment is only reported in aggregate and individual respondents will not be identified.


For all data collection activities, the contractor will use passwords to safeguard all project directories and analysis files containing completed survey data to ensure that there is no inadvertent disclosure of study data. Strict procedures, as required by the contractor’s IRB, will be followed for protecting the privacy of respondents’ information and for obtaining their informed consent.


11. Questions of a Sensitive Nature


The majority of the information reported by respondents during the PD Interview, Site Visits and EBP and PSH Self-Assessments is not sensitive personal information, as the interviews focus only on programmatic details of the grantee's project.


The data collection tools will be reviewed by the contractor’s IRB (FWA #3331) and data collection for each tool will not begin until it is approved or exempted.


Site Visit Guides do include a client focus group and service observations during which sensitive questions and topics may be discussed. Informed consent will be obtained for participation in the client focus group, and answering any question asked during the focus group is voluntary. Participants will be assured that they may stop participating in the focus group at any time without penalty from the grantee project. If a participant experiences any distress, the focus group facilitator will connect them, with the participant's permission, with someone from the grantee project whom they can speak with. Participants are also asked not to provide their full names, to maintain their privacy. During a service observation, clients are asked for their consent to be observed during the session, but clients are not asked questions of any kind during a service observation. Additionally, service observations are not audio recorded and notes are written after the observation is complete.


12. Estimates of Annualized Hour Burden


PD Interview

A total of 158 grantees are expected to complete the PD Interview. This expectation is based on the full participation of grantees in the GBHI 2010, SSH 2009-2010, and CABHI 2011-2012 cohorts (n=102) and the 56 PATH grantees. Each interview is expected to take 3.5 hours with one response per grantee; this includes time to schedule the interview, review the PD Interview sent to the grantee beforehand, and conduct the interview.


As noted in Attachment 1, 34 questions will be prefilled, accounting for 48% of the questionnaire. The percentage does not include front page questions A1-A13 and sub-questions (e.g., 5a). Prefilling the questionnaire will reduce interview time by 48%, or 1.44 hours, of the 3-hour interview; the adjusted interview length is 1.56 hours, and with the half hour of preparation time the total burden estimate is 2.1 hours (see the arithmetic sketch below). To estimate grantee burden, the interview was tested with contractor staff who have previously been involved in implementing SAMHSA-funded Homeless Programs. A similar questionnaire (in length and topics covered) was used to evaluate GBHI 2009 grantees, and all 25 grantee Project Directors completed the full questionnaire with no break-offs, refusals, or missing items. The response during the prior evaluations and recent piloting has been very positive. After each interview, Project Directors were asked for feedback on the interview itself; this feedback prompted revisions for the new measure under review (e.g., clarifying questions, removing redundant questions, expanding areas relevant to housing models). Additionally, some Project Directors requested that the interview be held in two one-hour calls, and the protocol allows for this.
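The burden arithmetic above can be summarized as follows; this is a minimal sketch assuming a 3-hour base interview plus 0.5 hours of preparation, consistent with the 3.5-hour unadjusted estimate.

```python
# A minimal sketch of the PD Interview burden arithmetic described above.
base_interview_hours = 3.0   # assumed base interview length
prep_hours = 0.5             # scheduling and advance review
prefill_reduction = 0.48     # 34 prefilled questions = 48% of the questionnaire

adjusted_interview = base_interview_hours * (1 - prefill_reduction)  # 1.56 hours
total_burden = adjusted_interview + prep_hours                       # 2.06 hours

print(f"Adjusted interview: {adjusted_interview:.2f} h")
print(f"Total burden estimate: {total_burden:.2f} h (reported as 2.1 h)")
```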


All Project Director respondents felt that the topics and types of questions were necessary to ensure that the evaluation captures the unique aspects of each project, gathers sufficient common information across all grantees to aggregate information for similar models, develops a typology, and assigns projects to the correct model for analysis. Project Directors also noted that while a two-hour interview was a substantial amount of time, this time was anticipated as part of their work on the project and was actually small given that it is a one-time interview during the three- to five-year project funding period. Some Project Directors also noted that the length was not an obstacle for them, as they were used to these types of interviews as well as to developing 2-3 hour assessment batteries for the clients they serve to ensure they had accurate information. Since the submission to OMB, the contractor has also piloted two additional interviews with current projects. This piloting confirmed the feedback received in the prior evaluations and the new burden estimate: the implemented revisions were approved, comfort with the format and the acceptability of the length were confirmed, and agreement was received as to the utility of this specific data collection. Again there were no break-offs or negative feedback. Project Directors also reiterated that thinking about the questions was useful for clarifying their project, that they believed there was substantial benefit in participating in the evaluation, that they felt the interview time was part of their paid and usual work activities, and that they wanted to ensure that their project was appropriately categorized, described and analyzed, which they thought would occur through the questions asked on the Project Director Interview. Finally, four webinars on the data collection and evaluation were held (in 2011 and early 2012) with the Homeless Programs grantees (PATH, SSH, GBHI and CABHI), during which questions were answered and the data collection activities, including the Project Director Interview and site visit protocols, were reviewed. Grantees consistently endorsed the program-level data collection activities and felt that the time burden of these activities was acceptable. Further, they appreciated that calls, site visits and other activities would be scheduled at their convenience and over more than one sitting as desired. Exhibit 2 presents estimates of the annualized burden and cost for the PD Interview. The total cost of the time respondents will spend completing this interview is $15,807 (total burden hours × the estimated Bureau of Labor Statistics average hourly wage for project directors). The PD Interview will be administered once to each grantee and no follow-ups will be conducted.




Site Visit Guides

A total of 75 grantees are expected to participate in site visits and be interviewed through the Site Visit Guides over three years. Twenty site visits per year will be conducted with a sample of the GBHI 2010 (approximately 4-5 grantees), SSH 2009-2010 (approximately 6 grantees), and CABHI 2011-2012 (approximately 9-10 grantees) grantees and 5 site visits per year will be conducted with a sample of the PATH grantees, per the SAMHSA evaluation contract.

Participants will include grantee project directors, financial staff, evaluators, clinical treatment staff, support services staff, case managers, housing providers, primary partners and other key stakeholders, and project participants. The number of respondents will vary across projects; based on the contractor's site visit experience during the previous GBHI cross-site evaluation, the following average numbers of respondents will participate in each discussion: 10 respondents in the Opening Session/Project Director Interview, 15 respondents in the Case Manager, Treatment, Housing Staff/Provider Interview, 7 respondents in the Stakeholder Interview, 3 respondents in the Evaluator Interview, 12 respondents in the Client Focus Group, and 3 respondents in the Cost Interview. Site Visits will be administered once to each selected grantee and no follow-ups will be conducted.


As the estimated time burden, number of respondents and respondents’ average hourly wage (per the Bureau of Labor Statistics) vary by Site Visit Guide, the annualized burden and cost is broken out for each discussion guide in Exhibit 2. The time burden and number of respondents per interview were estimated based on experience conducting site visits for the previous GBHI evaluation.


Time estimates for the Opening Session/Project Director Interview (see Attachment 2) were updated following five pilot site visits. The average scheduled time for the session was 3.2 hours and included approximately 0.7 hours for an Opening Session during which the agenda and logistics for the three-day site visit were confirmed or adjusted, grantees were introduced to the site visit team and provided an overview of the evaluation, and grantees were given an opportunity to share information about their program prior to beginning the Project Director Interview. Exhibit 2 has been updated to separate the Opening Session and the Project Director Interview burden estimates so that the time needed to complete the Project Director Interview is clear. The Opening Session, as briefly noted above, is not required of grantees and does not use a discussion guide; instead, grantees make adjustments to the agenda and logistics if desired, can ask questions about the overall evaluation or request a brief in-person overview, and are invited to share what they consider the most important aspects of their program with the site visit team prior to beginning the formal interview. The average session time was further reduced because the EBPs/Best Practices & Training and the Permanent Supportive Housing EBP Questions sections duplicated questions asked during the Case Manager, Treatment, Housing Staff/Provider Interview. In total, the Project Director Interview protocol is estimated at a maximum of 2 hours, including time for directions and informed consent.


Two of the Site Visit Guide sections, Evaluator and Cost, will not be conducted with the PATH grantees because they do not pertain to the PATH program. The annualized cost to respondents for each of the discussion guides is as follows: $4,251 for the Opening Session, $16,195 for the Project Director Interview, $12,848 for the Case Manager, Treatment, Housing Staff/Provider Interview, $12,506 for the Stakeholder Interview, $2,858 for the Evaluator Interview, $3,263 for the Client Focus Group, and $5,717 for the Cost Interview.


EBP Self-Assessment, Parts 1 & 2

A total of 127 grantees are expected to complete the EBP Self-Assessment – Part 1. This expectation is based on the full participation of all grantees in the following cohorts: GBHI 2009-2010, SSH 2009-2010, and CABHI 2011-2012. Each self-assessment is expected to take 35 minutes (0.58 hours) with one response per grantee; this includes time for reviewing instructions, searching existing data sources, gathering the data needed, and completing the self-assessment. To estimate burden time, the self-assessment was tested with contractor staff who have previously been involved in implementing EBPs in populations similar to those found in the Homeless Programs (e.g., homeless, history of/current substance abuse disorders, mental health disorders, etc.). Exhibit 2 presents estimates of the annualized burden and total cost, $3,509, based on this testing.


A total of 87 grantees are expected to complete the EBP Self-Assessment – Part 2. This expectation is based on the number of GBHI 2009-2010, SSH 2009-2010, and CABHI 2011-2012 grantees who are implementing at least one of the 5 EBPs selected for an in-depth assessment, per information extracted from all of the Homeless Programs grant applications. Each self-assessment is expected to take 30 minutes (0.5 hours) with one response per grantee; this includes time for reviewing instructions, searching existing data sources, gathering the data needed, and completing the self-assessment. To estimate burden time, the self-assessment was tested with contractor staff who have previously been involved in implementing the 5 selected EBPs (ACT, IDDT, IMR, SE and CTI) in populations similar to those found in the Homeless Programs (e.g., homeless, history of/current substance abuse disorders, mental health disorders, etc.). Exhibit 2 presents estimates of the annualized burden and total cost, $2,072, based on this testing.


The EBP Self-Assessment, Parts 1 & 2 will be administered once to each selected grantee and no follow-ups will be conducted.


PSH Self-Assessment

The PSH Self-Assessment is expected to be completed by a total of 100 grantees. This expectation is based on the number of GBHI 2009-2010, SSH 2009-2010, and CABHI 2011-2012 grantees who are implementing a PSH model, per information extracted from all of the Homeless Programs grant applications. Each self-assessment is expected to take 40 minutes (0.67 hours) with one response per grantee; this includes time for reviewing instructions, searching existing data sources, gathering the data needed, and completing the self-assessment. To estimate burden time, the self-assessment was tested with contractor staff who have previously been involved in implementing PSH in populations similar to those found in the Homeless Programs (e.g., homeless, history of/current substance abuse disorders, mental health disorders, etc.). Exhibit 2 presents estimates of the annualized burden and total cost, $3,192, based on this testing. The PSH Self-Assessment will be administered once to each selected grantee and no follow-ups will be conducted.


Exhibit 2. Annualized Cross-Program Data Collection Burden

Instrument/Activity | Number of Respondents | Responses per Respondent | Total Number of Responses | Hours per Response | Total Burden Hours | Average Hourly Wage | Total Respondent Cost(a)
Project Director Telephone Interview | 158 | 1 | 158 | 2.1 | 331.8 | $47.64 | $15,807
Opening Session (informal session scheduled with the Project Director Interview) | 250(b) | 1 | 250 | 0.7 | 175 | $32.39 | $5,668
Project Director Interview | 250(b) | 1 | 250 | 2 | 500 | $32.39 | $16,195
Case Manager, Treatment, Housing Staff/Provider Interview | 375(c) | 1 | 375 | 2 | 750 | $17.13 | $12,848
Stakeholder Interview | 175(d) | 1 | 175 | 1.5 | 262.5 | $47.64 | $12,506
Evaluator Interview | 60(e) | 1 | 60 | 1 | 60 | $47.64 | $2,858
Client Focus Group | 300(f) | 1 | 300 | 1.5 | 450 | $7.25 | $3,263
Cost Interview | 60(g) | 1 | 60 | 2 | 120 | $47.64 | $5,717
EBP Self-Assessment Part 1 | 127 | 1 | 127 | 0.58 | 73.66 | $47.64 | $3,509
EBP Self-Assessment Part 2 | 87 | 1 | 87 | 0.5 | 43.5 | $47.64 | $2,072
PSH Self-Assessment | 100 | 1 | 100 | 0.67 | 67 | $47.64 | $3,192
TOTAL | 1,048(h) | | 1,942 | | 2,502 | | $67,828

(a) Total respondent cost is calculated as hourly wage × hours per response × total number of responses.
(b) 10 respondents × 25 site visits per year = 250 total respondents
(c) 15 respondents × 25 site visits per year = 375 total respondents
(d) 7 respondents × 25 site visits per year = 175 respondents
(e) 3 respondents × 20 site visits per year = 60 respondents
(f) 12 respondents × 25 site visits per year = 300 respondents
(g) 3 respondents × 20 site visits per year = 60 respondents
(h) Estimated number of total unique respondents; some respondents, such as project directors, will overlap across the data collection activities.
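To make the formula in footnote (a) concrete, the following minimal check reproduces a few rows of Exhibit 2 using figures taken directly from the table above.

```python
# A minimal check of the Exhibit 2 cost formula from footnote (a):
# total respondent cost = hourly wage × hours per response × total responses.
rows = [
    # (instrument, total responses, hours per response, hourly wage)
    ("Project Director Telephone Interview", 158, 2.1, 47.64),
    ("Opening Session", 250, 0.7, 32.39),
    ("Cost Interview", 60, 2.0, 47.64),
]

for name, responses, hours, wage in rows:
    cost = wage * hours * responses
    print(f"{name}: {responses * hours:.1f} burden hours, ${cost:,.0f}")
# Matches the table: $15,807; $5,668; $5,717 (rounded to whole dollars).
```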


13. Estimates of Annualized Cost Burden to Respondents


There are no respondent costs for capital or start-up or for operation or maintenance.


14. Estimates of Annualized Cost to the Government


The total estimated cost to the government for the data collection is $1,567,481. This includes approximately $1,506,938 for programming; contractor labor for conducting interviews and site visits and for analyzing and reporting data; travel for site visits; and housing and maintaining data. Approximately $20,181 per year represents SAMHSA costs to manage/administer the survey, equal to 20% of one employee's time (GS-13, $100,904). The annualized cost is approximately $522,494; a worked check of this arithmetic follows.
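The sketch below verifies the government cost figures, assuming the three-year data collection period shown in Exhibit 3.

```python
# A minimal check of the government cost arithmetic, assuming a three-year
# evaluation period (consistent with the data collection schedule).
contractor_cost = 1_506_938
gs13_salary = 100_904
samhsa_annual = round(0.20 * gs13_salary)     # 20% of one GS-13: ~$20,181/year
total = contractor_cost + 3 * samhsa_annual   # ~$1,567,481
annualized = total / 3                        # ~$522,494

print(f"SAMHSA annual: ${samhsa_annual:,.0f}")
print(f"Total: ${total:,.0f}")
print(f"Annualized: ${annualized:,.0f}")
```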


15. Changes in Burden


This is a new collection of information.


16. Time Schedule, Publications, and Analysis Plan


Time Schedule: Exhibit 3 outlines the key time points for the study and for the collection of information. The requested period also allows for training and start-up activities associated with the preparation for data collection.


Exhibit 3. Time Schedule for Data Collection

Prepare for data collection, including document review: April 2013 – July 2013
Obtaining OMB approval for data collection: Pending
Year 1 Site Visits: September 2013
Project Director Interview: October 2013 – March 2014
Year 2 Site Visits: October 2013 – September 2014
EBP Self-Assessments: November 2014 – September 2014
PSH Self-Assessments: November 2014 – September 2014
Year 3 Site Visits: October 2014 – September 2015
All Data Collection Completed: January 2016
Data analysis: August 2013 – June 2016
Dissemination of findings (interim reports, presentations, manuscripts, final report): September 2013 – September 2016


Publications: The Homeless Programs evaluation is designed to produce knowledge about the implementation and effects of the Homeless Programs. It is therefore important to prepare and disseminate reports, concept papers, documents, and oral presentations that clearly and concisely present project results so that they can be appreciated by both technical and nontechnical audiences. The contractor will:


  • Produce rapid-turnaround interim analysis papers, briefs, and reports while data collection is ongoing;

  • Prepare and submit monthly technical progress reports, semi-annual briefings and annual progress reports;

  • Prepare special reports in concert with SAMHSA and expert panel input;

  • Prepare a final cross-program findings report, including an executive summary;

  • Deliver presentations at professional and federally sponsored conventions and meetings; and

  • Disseminate reports and materials to entities inside and outside of SAMHSA.


Analysis:

The Homeless Evaluation uses a combination of qualitative and quantitative analysis to assess structure and process. Assessing structure and process is a key element of evaluating any behavioral healthcare program or system. Structure represents capacity: it encompasses the resources available in a system and can apply to individual practitioners, groups of practitioners, organizations, agencies and programs. Process represents the development of a project as well as the services that are provided to the client. Economic data are also collected specifically to document the resources needed to implement a program's process and services, which are used in cost-benefit and cost-effectiveness calculations. Combined, structure and process data provide the context to interpret client- and system-level outcome data.


Qualitative Data

While the PD Interview and the EBP and PSH Self-Assessments provide valuable qualitative data, the majority comes from the Site Visit Guides. Qualitative analysis focuses on describing the characteristics of the grantee organizations and their partnerships, the systems within which the projects are embedded, relationships with stakeholders, characteristics of the target population, project planning, services provided including implementation of EBPs, the types and models of housing integrated into the project, and project outcomes including sustainability. Descriptive analyses of these measures provide information on implementation of the Homeless Programs at the grantee level. Qualitative analysis helps identify common trends and themes across grantees and projects and will especially focus on identifying barriers and facilitators to implementing project activities and the solutions grantees found to common challenges.


Qualitative narrative data (from Site Visit Guide transcriptions) will be subject to content analysis. Discussion guides will also include structured questions with closed-ended responses that can be quantified and analyzed accordingly; narrative information that does not lend itself easily to quantitative coding will be transcribed and uploaded into ATLAS.ti, a software package used for coding qualitative data. We will use a grounded theory approach to guide our coding process, as sketched below. First, all lines of text will be subject to open coding, where codes are expressed in the present progressive tense. Second, open codes will be reduced into a set of axial codes. Finally, theoretical codes will be used to structure the presentation of the qualitative findings. Such analyses will reveal common themes across the data. We expect the following themes to emerge from the qualitative analyses: changes to grantee plans; the types of barriers, challenges and responses encountered during implementation; facilitators to implementation and operation; and collaboration among grantees, their partners, and other community agencies and organizations. If supported by the data, common themes, including barriers and facilitators, will be explored more deeply to identify possible mediators and moderators. Findings will be presented in narrative text. Commonalities identified across grantees may be incorporated into analyses testing the association between structural and process variables and client and project outcomes through Hierarchical Linear Modeling (HLM), which is discussed below.
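The following is a minimal sketch of the open-to-axial code rollup described above; the code names and groupings are invented for illustration and are not the study's codebook, and the actual coding will be done in ATLAS.ti by trained analysts.

```python
# A minimal sketch of grounded-theory code reduction: open codes (present
# progressive tense) assigned to transcript segments are rolled up into
# broader axial categories and counted. All codes below are hypothetical.
from collections import Counter

open_codes = [
    "struggling_with_staff_turnover", "building_landlord_partnerships",
    "adapting_ebp_to_rural_context", "struggling_with_staff_turnover",
    "leveraging_peer_support", "building_landlord_partnerships",
]

# Axial coding: reduce open codes into a smaller set of categories.
axial_map = {
    "struggling_with_staff_turnover": "implementation barriers",
    "adapting_ebp_to_rural_context": "implementation barriers",
    "building_landlord_partnerships": "facilitators/collaboration",
    "leveraging_peer_support": "facilitators/collaboration",
}

axial_counts = Counter(axial_map[c] for c in open_codes)
for category, n in axial_counts.most_common():
    print(f"{category}: {n} coded segments")
```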


Quantitative Data

The PD Interview and the EBP and PSH Self-Assessments primarily collect quantitative data, which are used for both descriptive statistics and statistical analysis. Descriptive analyses and tables will report key statistics, such as means, sample sizes, standard errors, and t- and χ²-test results, where appropriate. The basic approach will pool data across grantees and programs. When appropriate, findings will be presented separately for key project characteristics (e.g., type of housing integrated into the project; population targeted; type, number and combination of EBPs offered). Attachment 10 includes table shells in which descriptive grantee data may be reported. Statistical analysis, especially HLM, can identify associations between measures of structure and process and individual client access to core services, individual client outcomes, client perceptions of care, and project sustainability. Importantly, within this framework we can test the extent to which programmatic, clinical, and contextual characteristics moderate and/or mediate changes in client outcomes over time. Change in client outcomes will be measured through the longitudinal GPRA and NOMS baseline and follow-up data and is central to the outcome component.



Statistical analysis, such as HLM, will be used to estimate the association between grantee characteristics collected with the PD Interview, Site Visit Guides (including cost data) and the EBP and PSH Self-Assessments and mean change in client-level outcomes between baseline and follow-up. HLM is appropriate for these analyses because this modeling approach allows the analyst to control for the clustering of clients within grantees and of grantees within programs. The HLM framework allows the model to be adjusted for client characteristics and contextual factors, namely the grantee characteristics collected through the data collection tools. These adjusted mean changes will provide easy-to-understand estimates of possible program effects. Although these estimates are not intended to be causally interpreted, we do intend to compare them to estimates for similar models and populations in the scientific literature to confirm that they are within the ranges we would expect, conditional on the level of adherence to the models observed for each grantee. These estimates form a baseline for exploring how project decisions and characteristics alter service delivery and outcomes. In this way, variation among the grantees and programs will be used to analyze the 'key ingredients' of models for achieving different outcomes, such as linking clients to certain types of housing. Although clients are not randomized to control groups, we plan to assess the comparative effectiveness (broadly defined) of alternative programs and service models by applying approaches similar to those used in comparative effectiveness analyses of observational health and healthcare data, which incorporate findings from rigorous trials. However, our analyses will not be as definitive as comparative effectiveness analyses based on randomized controlled trials, and we will temper our conclusions accordingly. In addition, the portfolio of quasi-experimental approaches we may employ to improve the accuracy of our inference both within and across programs includes propensity score matching, instrumental variable methods and regression discontinuity designs. As appropriate, subgroup analyses will be conducted in which the data will be stratified by program type or client type to assess whether outcomes differ among the different types of programs (e.g., grantees using an SSH model, grantees offering certain EBPs) or for different types of clients (e.g., veterans, women). Attachment 10 includes table shells in which potential results can be reported. A minimal illustration of such a clustered model follows.
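As one illustration of the clustered modeling described above, the sketch below fits a two-level random-intercept model using statsmodels' MixedLM; the variable names and data are hypothetical placeholders, and the actual analyses may use specialized HLM software and richer covariate sets.

```python
# A minimal sketch of a two-level model: clients nested within grantees,
# with a random intercept per grantee to account for clustering.
# All variable names and values are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "outcome_change": [2.1, 1.4, 0.8, 3.0, 1.1, 0.4, 2.6, 1.9],  # baseline-to-follow-up change
    "ebp_count":      [2,   2,   1,   3,   1,   1,   3,   2],    # grantee characteristic
    "grantee_id":     ["g1", "g1", "g2", "g2", "g3", "g3", "g4", "g4"],
})

# groups= specifies the clustering unit; a random intercept is fit per grantee.
model = smf.mixedlm("outcome_change ~ ebp_count", data=df, groups=df["grantee_id"])
result = model.fit()
print(result.summary())
```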


Economic Data

The Site Visit Guides also include a cost questionnaire to collect data to estimate the costs and calculate the cost-effectiveness of the Homeless Programs grants at the client, grantee, and system levels. For this evaluation, analysis focuses on estimating the cost and cost-effectiveness of the Homeless Programs as a mature program so that they can be directly compared with other models of treatment delivery and with the cost bands specified by the NOMS performance measurement initiative. To do this, the costs of implementing Homeless Programs services are separated from the costs of developing and revising the Homeless Programs protocols and from the costs of administering the Homeless Programs grant project. The cost analysis will provide both dollar estimates and estimates of the amount of resources used so that the results can be applied to different circumstances and prices. The economic evaluation will also identify the key drivers of cost, allowing decision makers to identify critical cost components of the intervention. The detailed economic study will also facilitate sensitivity analysis, which assesses the degree to which conclusions are robust to changes in key assumptions.


A cost-effectiveness analysis will combine the estimates of program costs with the estimate of the program outcome for each of a limited number of outcomes. In addition, we will use estimates of the costs of key outcomes to conduct a limited cost-benefit analysis, which will help stakeholders in the provider system understand the degree to which a program is justified fiscally. This approach, often termed a cost offset or a comparative cost analysis, is common when evaluating interventions that influence health outcomes and has recently been applied to homeless populations (e.g., Basu et al., 2012). A worked sketch of these calculations follows.
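The sketch below illustrates the cost-effectiveness and cost-offset calculations in their simplest form; all dollar and outcome figures are invented placeholders, not evaluation results.

```python
# A minimal sketch of an incremental cost-effectiveness ratio (ICER) and a
# cost-offset calculation. All figures are hypothetical placeholders.

program_cost_per_client = 6_500.0     # hypothetical annual program cost
comparison_cost_per_client = 4_000.0  # hypothetical comparison-condition cost

# Hypothetical outcome: additional days stably housed per client per year.
program_outcome = 120.0
comparison_outcome = 75.0

# ICER: extra dollars spent per extra unit of outcome achieved.
icer = (program_cost_per_client - comparison_cost_per_client) / (
    program_outcome - comparison_outcome
)
print(f"ICER: ${icer:.2f} per additional day stably housed")

# Cost offset: avoided downstream costs (e.g., reduced shelter and emergency
# service use) netted against the incremental program cost.
avoided_costs_per_client = 3_200.0  # hypothetical
net_cost = (program_cost_per_client - comparison_cost_per_client) - avoided_costs_per_client
print(f"Net cost per client after offsets: ${net_cost:.2f}")
```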


The analysis plan described above was informed by past evaluations, and the current evaluation is expected to improve future studies. Within the context of addressing the specific evaluation questions (see pages 7-8) for this evaluation, the prior two evaluations informed the development and revision of data collection procedures and measures and identified the areas that required further inquiry (e.g., housing models, implementation of evidence-based practices) to best address gaps in knowledge in the field, as well as to attempt to answer the evaluation questions. The prior evaluations also informed the development of the current evaluation questions and framework to address the differences and commonalities among the SAMHSA Homeless Programs portfolios and their funded projects' service models, as well as to produce meaningful new information for the field. While there is substantial literature on homelessness, there are no prior large-scale evaluations of a homeless treatment population for whom service receipt includes programmatic requirements for implementing evidence-based practice housing and other service models. The prior evaluations also familiarized the contractors with the homeless programs, ensuring streamlined questions and data collection and identifying areas where program differences would need to be controlled for in models that aggregate across portfolios. This knowledge has ensured the ability to complete substantial work on operationalizing terms, organizing a framework with key dimensions to be used for outcome analytic models (using both primary and secondary data), refining analytic models and ensuring dissemination of information to inform SAMHSA, grantees and the field.


The previous GBHI cost evaluation directly informed the data collection and analytic approach taken by the cost component proposed for the Homeless Program Evaluation. The prior cost evaluation identified additional data for collection that were not originally included in the scope of the evaluation; including these data in the Homeless Program Evaluation cost component will increase rigor and contribute to the field. The Partner Services section (see page 18 in the Supporting Statement) was specifically added to more systematically collect information on the wide variety of funding sources used by grantees and their partners to create comprehensive programs for homeless clients. This information was partially collected during the previous GBHI evaluation through document review, but its usefulness warranted inclusion in the Site Visit Guides.


For the impact data collection and analysis, combining the GBHI and PATH evaluations significantly broadens the evaluation's ability to more fully conceptualize program models. The inclusion of additional programs and grantee cohorts (e.g., GBHI, GBHI-CABHI and SSH) increases the data available to develop and test conceptual program models. The inclusion of PATH programs allows for further testing of program models with a greater understanding of the programs that provide services to homeless individuals. This increased scope and breadth will also further the evaluation's contribution to the field.


As the current evaluation is finalized, SAMHSA and its contractor(s) will evaluate the scientific rigor achieved through alternative and related analysis methods (e.g., propensity score matching and other quasi-experimental methods). This will yield insights in two ways. First, it will help identify the pros and cons of these analytic methods for future impact studies at SAMHSA and in the field, and it will inform where new or different approaches need to be implemented. SAMHSA may use this information to inform new funding announcements with regard to the types of models, services, and requirements that may be needed, based on evaluation findings, to more effectively implement services for the vulnerable populations served by SAMHSA's programs. Second, we expect our results will inform the field on the relationships between services and program models for the population and relevant sub-populations (e.g., homeless families, homeless veterans, homeless youth and adults involved in the justice system). Specifically, new hypotheses and questions will emerge that will prompt additional studies with specific designs. Finally, SAMHSA will explore opportunities to build upon this innovative and needed evaluation in future study.



17. Display of Expiration Date


OMB approval expiration dates will be displayed.


18. Exceptions to Certification for Statement


There are no exceptions to the certification statement. The certifications are included in this submission.

