Program Officer/Project Officer
Joel Dubenitz, Ph.D. – Social Science Analyst
U.S. Department of Health and Human Services
Office of the Assistant Secretary for Planning and Evaluation
200 Independence Avenue SW, Washington DC 20201
Respondent Universe and Sample Selection Methods
Overall, we expect to obtain data from 2,080 unique AOT and non-AOT respondents through both primary, interview-based data collection and secondary, administrative data. We also plan to obtain data from an additional 365 unique respondents: family members of AOT recipients (n=347) and clinicians and other staff associated with the AOT programs (n=18). Below we provide more detailed information on the number of expected respondents (N=2,445) for each data collection effort.
Structured Client Interviews: We will conduct structured client interviews with all AOT program participants at the six in-depth sites. We will also conduct similar interviews with non-AOT participants at six to-be-identified comparison sites. Based on information from each of the AOT programs, we expect each site to intake an average of 173 participants over two years, and we will target comparison sites that can provide a similar number of respondents. We anticipate that participants will be interviewed, on average, four times over the course of the evaluation: at baseline, at 6 months, at 12 months, and 6 months following discharge from AOT. The following table provides more detailed figures on the total expected number of respondents to the structured client interview.
Site^a,b | Intervention Group | Comparison Group | Total
AltaPointe Health Systems | 210 | 210 | 840
Cook County Health and Hospital System | 200 | 200 | 800
Hinds County Mental Health Commission | 150 | 150 | 600
Doña Ana County | 80 | 80 | 320
ADAMHSBCC | 200 | 400 | 800
ODMHSAS | 200 | 400 | 800
Total | 1,040 | 1,040 | 2,080
a As noted in Supporting Statement A, the six sites noted above participated in the AOT Implementation Evaluation. However, these same six sites may not be part of the AOT Outcome Evaluation. The AOT HHS Advisory Committee will identify the six in-depth outcome sites.
b All funded sites, including the six sites identified for the outcome evaluation, will provide secondary, administrative data to supplement primary data collections. As detailed in Supporting Statement A, local evaluators at each of the sites will oversee linkage of survey data to secondary, administrative data in order to ensure that all datasets will be stripped of identifiable information prior to sharing with RTI.
Identifying comparison groups: To address questions that are focused on the impact of AOT, the evaluation design calls for the identification of a valid comparison group and the collection of common data elements on members of that comparison group (i.e., elements from the structured client interview and secondary data sources). To identify comparison sites, we will start with targeted assessments for each of the sites to test the feasibility of inter- and intra-county comparison groups, where feasibility is based on an available population of consumers who are receiving similar services, providers who are willing to partner in collecting client-level data, and availability of high quality secondary data sources. This approach, described in further detail below, was developed in consultation with AOT experts, mental health treatment experts, and HHS AOT program advisory committees during a technical advisory group meeting convened to guide the evaluation design for this project.
We will use a three-step approach to ensure that the comparison population is comparable at the community, clinic, and individual levels, thus making our subsequent analyses robust to many sources of unobserved confounding. Below we provide a brief overview of each step, including how we will assess comparability and where we will obtain data elements for assessing comparability.
Step One. For each of the selected case study settings, we will identify a comparison community where AOT is not being implemented. We will assess the degree of comparability of these communities based on the following characteristics: (1) sociodemographic composition (e.g., median household income), (2) market characteristics (e.g., number of community mental health centers per 100,000 persons), and (3) prevalence of need for serious mental illness (SMI) treatment. Measurements for these characteristics will come from census data, SAMHSA’s Behavioral Health Treatment Services Locator tool (https://findtreatment.samhsa.gov), or from synthetic county and state estimates of SMI prevalence produced by Professor Charles E. Holzer III.
Step Two. From each of these comparison communities, we will identify at least one comparison clinic. We will assess the degree of comparability of these clinics based on the following characteristics: (1) types of services provided (e.g., ACT, non-ACT), (2) facility type (e.g., outpatient mental health), and (3) average client case-mix. Service offerings and facility type data can be obtained from SAMHSA’s Behavioral Health Treatment Services Locator tool and the Area Health Resources Files (AHRF). Client case-mix measures will be obtained from Medicaid databases once those data have been acquired.
Step Three. Finally, from each of the comparison clinics, we will identify a sample of comparable consumers. We will assess the degree of comparability based on an assignment algorithm we previously used in our evaluation of New York’s AOT program. The essential criteria are: (1) diagnosis codes of schizophrenia, bipolar, or affective disorders for an inpatient admission, (2) two or more psychiatric admissions during the baseline period, (3) 14 or more inpatient days during the baseline period, and (4) having received some intensive services such as ACT during the baseline period. We will also consider excluding consumers with few encounters at comparison clinics. Data to support identifying these criteria will primarily come from Medicaid claims and/or encounter data.
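As an illustration, the four essential criteria above could be applied as a simple filter over baseline claims records. The field names, diagnosis labels, and sample records below are hypothetical placeholders, not the evaluation's actual Medicaid data layout or the New York assignment algorithm itself.

```python
# Hypothetical sketch of the comparison-consumer assignment criteria; all
# field names and values are illustrative placeholders.
BASELINE_DX = {"schizophrenia", "bipolar", "affective"}

def meets_comparison_criteria(consumer):
    """Apply the four essential criteria to one consumer's baseline record."""
    return (
        any(dx in BASELINE_DX for dx in consumer["inpatient_diagnoses"])  # criterion 1
        and consumer["psychiatric_admissions"] >= 2                       # criterion 2
        and consumer["inpatient_days"] >= 14                              # criterion 3
        and consumer["received_intensive_services"]                       # criterion 4 (e.g., ACT)
    )

consumers = [
    {"inpatient_diagnoses": ["schizophrenia"], "psychiatric_admissions": 3,
     "inpatient_days": 20, "received_intensive_services": True},
    {"inpatient_diagnoses": ["anxiety"], "psychiatric_admissions": 1,
     "inpatient_days": 5, "received_intensive_services": False},
]
eligible = [c for c in consumers if meets_comparison_criteria(c)]
```

In practice these flags would be derived from Medicaid claims and/or encounter data during the baseline period, and an additional screen could drop consumers with few encounters at comparison clinics.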
Although our comparison group strategy is designed to produce a highly similar population of consumers receiving treatment in similar settings, some differences between groups will likely remain. Additionally, because it is not feasible to consider all consumer characteristics in the three-step approach outlined above, we may observe important differences on several additional characteristics, such as age, sex, race/ethnicity, patient history, and illness severity. To account for these differences, we will use propensity score methods. Specifically, we will use inverse probability of treatment weighting (IPTW), which fits a probabilistic model for the probability that an individual is a member of the intervention group as a function of all observed confounders. The resulting weights will be used both to assess whether the weighted groups have similar characteristics and in the outcome regression modeling. This combination of IPTW and regression modeling is doubly robust to model misspecification, as only one model needs to be correctly specified (i.e., either the probabilistic model predicting intervention membership or the outcome regression model). This approach was successful in our prior AOT evaluation, but we can explore newer alternatives (e.g., entropy balancing) if we find that IPTW does not perform sufficiently well.
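The weighting step can be sketched as follows. The propensity model coefficients, the single covariate, and the records are invented for illustration; the actual evaluation would estimate the propensity model from all observed confounders rather than use fixed coefficients.

```python
import math

# Illustrative IPTW sketch with one observed confounder x (e.g., baseline
# illness severity); all values below are hypothetical.
records = [
    {"x": 0.2, "treated": 0}, {"x": 0.4, "treated": 0}, {"x": 0.5, "treated": 0},
    {"x": 0.6, "treated": 1}, {"x": 0.7, "treated": 1}, {"x": 0.8, "treated": 1},
]

def propensity(x, b0=-2.0, b1=4.0):
    """Hypothetical logistic model for P(intervention group | x)."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

def ipt_weight(rec):
    """Intervention units get weight 1/p; comparison units get 1/(1 - p)."""
    p = propensity(rec["x"])
    return 1.0 / p if rec["treated"] else 1.0 / (1.0 - p)

def group_mean(recs, treated, weighted):
    """Mean of x in one group, optionally IPT-weighted."""
    subset = [r for r in recs if r["treated"] == treated]
    w = [ipt_weight(r) if weighted else 1.0 for r in subset]
    return sum(wi * r["x"] for wi, r in zip(w, subset)) / sum(w)

# Balance check: weighting should shrink the between-group gap in x.
raw_gap = group_mean(records, 1, False) - group_mean(records, 0, False)
weighted_gap = group_mean(records, 1, True) - group_mean(records, 0, True)
```

In the planned analyses, the same weights would also enter the outcome regression models, yielding the doubly robust property described above.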
Family Satisfaction Survey: We will randomly sample from the population of AOT program participants at each of the six in-depth sites, targeting one-third of each site-specific participant population, and administer the survey to a family member of each sampled participant. The table below provides more detailed figures of the expected number of respondents to the family satisfaction survey.
Site | Number of Respondents
AltaPointe Health Systems | 70
Cook County Health and Hospital System | 67
Hinds County Mental Health Commission | 50
Doña Ana County | 27
ADAMHSBCC | 67
ODMHSAS | 67
Total | 347
Cost Questionnaires: We will administer a separate cost questionnaire to at least three respondents from each of the six AOT sites on two occasions, once annually. The three respondents will be representatives of the site’s psychiatric hospital, court system, and community treatment provider, respectively. Questionnaires will be tailored to each type of respondent. The following table provides more detailed figures of the expected number of respondents to the cost questionnaires.
Site | Number of Respondents
AltaPointe Health Systems | 3
Cook County Health and Hospital System | 3
Hinds County Mental Health Commission | 3
Doña Ana County | 3
ADAMHSBCC | 3
ODMHSAS | 3
Total | 18
Docket Case Monitoring Form: One respondent at each site, expected to be an AOT local evaluator, will enter information about each program participant’s court hearings into the Docket Case Monitoring Form, recording a response for each hearing from the petition hearing to the final hearing terminating an AOT order.
Site | Number of Respondents
AltaPointe Health Systems | 1
Cook County Health and Hospital System | 1
Hinds County Mental Health Commission | 1
Doña Ana County | 1
ADAMHSBCC | 1
ODMHSAS | 1
Total | 6
AOT Characteristics Form: The AOT characteristics form will be completed on a monthly basis by a local evaluator at each site. The table below shows the number of respondents across sites.
Site | Number of Respondents
AltaPointe Health Systems | 1
Cook County Health and Hospital System | 1
Hinds County Mental Health Commission | 1
Doña Ana County | 1
ADAMHSBCC | 1
ODMHSAS | 1
Total | 6
Procedures for Collection of Information
Structured client interview. We will provide training to local evaluators and/or site clinicians to conduct face-to-face interviews with clients. While site clinicians may be used, we will stipulate that non-treating clinicians conduct these interviews. Interviews will be conducted every six months, from baseline (entry to AOT) through 6 months following termination of the order. We will aim to sample from the entire universe of AOT participants and individuals encountered at comparison sites. All respondents will be asked to provide informed consent.
Family satisfaction survey. We will administer a survey to family members of a random sample of AOT program participants. Within each site, we will draw a simple random sample of all AOT participants.
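The within-site draw can be sketched as below; the participant IDs and random seed are placeholders, and only two sites are shown, with roster sizes and one-third targets echoing the family satisfaction survey table above.

```python
import random

# Illustrative within-site simple random sampling; IDs and seed are
# placeholders, roster sizes echo the tables above.
site_participants = {
    "AltaPointe Health Systems": [f"AP-{i:03d}" for i in range(210)],
    "Doña Ana County": [f"DA-{i:03d}" for i in range(80)],
}

rng = random.Random(2016)  # fixed seed so the draw is reproducible
samples = {
    site: rng.sample(roster, k=round(len(roster) / 3))  # target one-third per site
    for site, roster in site_participants.items()
}
```

Sampling independently within each site keeps the family satisfaction survey sample proportional to each site's participant population.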
Cost questionnaire. We will provide respondents with electronic worksheets to record information about the costs of implementing and maintaining their AOT program. Respondents will have a short period of time to complete as much of the worksheet as they can. After they return the worksheet, we will review the cost data they provide, along with other financial information available in budgets and federal financial reports. We will then schedule a follow-up interview to discuss their responses and fill in any information each respondent was unable to provide. We will sample from the entire universe of AOT program administrators, court representatives, and state psychiatric hospital representatives to collect organization-specific costs associated with implementing and maintaining an AOT program.
Docket case monitoring form. We will provide respondents with an electronic worksheet and instructions for when and how to record each response. Respondents will be asked to record information on each observed court hearing that takes place.
AOT characteristics form. We will provide respondents with an electronic form to record information about the civil and legal processes of their AOT program. Local evaluators will complete this form once per month for the duration of the outcome evaluation.
Methods to Maximize Response Rates and to Deal with Nonresponse
As a condition of the grant funding that AOT sites receive, programs are mandated to help with collecting data for the purposes of this evaluation. There are also funds in our evaluation budget to help support data collection activities that AOT grantees conduct on behalf of the evaluation. We will collaborate closely with local evaluators and other grantee personnel to help ameliorate issues they face in achieving a sufficient response rate to the structured client interview and family satisfaction survey. For the cost questionnaire, we will work closely with respondents to ensure that they are able to complete the worksheet and schedule follow-up interviews with us in a timely fashion.
Tests of Procedures or Methods to Be Undertaken
We will not do any formal testing of our procedures prior to conducting data collection. All of the instruments and protocols are similar to those used without difficulty in the study team’s prior evaluation work.
Consultants on Statistical Aspects of the Design and People Who Will Collect and Analyze the Information
In October 2016, ASPE awarded task order #16-233-SOL-00683 to RTI International and its subcontractors, Duke University and Policy Research Associates (PRA), to design and conduct the implementation and outcome evaluations. The study team includes Richard Van Dorn, Kiersten Johnson, and Will Parish of RTI International, Marvin Swartz of Duke University, and Hank Steadman and Brian Case of PRA. The study team designed the evaluation in conjunction with the HHS advisory committee and a technical advisory group. The evaluation will be conducted by RTI and subcontractors under contract with ASPE.
Author: Brian Case
File Created: 2021-01-21