The Office of the Assistant Secretary for Planning and Evaluation (ASPE) at the U.S. Department of Health and Human Services (HHS) is requesting Office of Management and Budget (OMB) approval for new qualitative and survey data collection activities to support its evaluation of the Certified Community Behavioral Health Clinic (CCBHC) demonstration program. Quantitative data collection activities with their respective OMB approval numbers are described further in the final paragraph of this section.
Section 223 of the Protecting Access to Medicare Act (Pub.L. 113–93; PAMA) authorized the CCBHC demonstration to allow states to test a new strategy for delivering and reimbursing a comprehensive array of services provided in community behavioral health clinics. The demonstration aims to improve the availability, quality, and outcomes of outpatient services provided in these clinics by establishing a standard definition for CCBHCs and developing a new Medicaid prospective payment system (PPS) in each state that accounts for the total cost of providing nine types of services to all people who seek care. The PPS in each state is designed to provide CCBHCs with the financial support and stability necessary to deliver these required services. The demonstration also aims to incentivize quality through quality bonus payments to clinics and requires CCBHCs to report quality measures and costs. The demonstration was originally authorized for two years.
In December 2016, HHS selected eight states to participate in the demonstration based on the ability of their CCBHCs to (1) provide the complete scope of services described in the certification criteria and (2) improve the availability of, access to, and engagement of clients with a range of services. In August 2020, HHS announced that Kentucky and Michigan would begin participating in the demonstration as a result of the demonstration's expansion under the Coronavirus Aid, Relief, and Economic Security Act (Pub.L. 116–136).
PAMA mandates that HHS submit reports to Congress about the Section 223 demonstration that assess (1) access to community-based mental health services under Medicaid in the area or areas of a state targeted by a demonstration program as compared to other areas of the state, (2) the quality and scope of services provided by certified community behavioral health clinics as compared to community-based mental health services provided in states not participating in a demonstration program and in areas of a demonstration state that are not participating in the demonstration, and (3) the impact of the demonstration on the federal and state costs of a full range of mental health services (including inpatient, emergency, and ambulatory services). The ability of ASPE to provide this information to Congress requires a rigorously designed and independent evaluation of the CCBHC demonstration.
In 2016, ASPE began a five-year mixed-methods evaluation of the first two years of the demonstration to address the PAMA requirements and describe implementation successes and challenges to inform annual reports to Congress. Data collection activities for the original evaluation were covered under a separate OMB approval (0990-0461). As the demonstration has continued in the original states and has expanded to others, ASPE will further evaluate the implementation and outcomes of the demonstration in accordance with PAMA. The current evaluation will focus on the extension period in the original seven states that continued in the demonstration (Pennsylvania did not continue) and the full demonstration period for the two new states (Kentucky and Michigan) that entered the demonstration in late 2021. The primary goals of the current evaluation are to provide information to inform future reports to Congress and to further assess implementation of the demonstration in each state.
In accordance with the current evaluation contract timeline, the evaluation will take place over 48 months (the evaluation contract began September 30, 2021 and ends September 29, 2025). Evaluation data collection that requires OMB approval will not begin until ASPE receives final OMB approval.
To learn about the implementation of the CCBHC program and address the topics of access, quality and scope of services, and costs as required by PAMA, the study team will use a mixed-methods approach to assess:
The structures and processes states and CCBHCs implemented to increase access to care, and the impact of these investments on access to care relative to care provided elsewhere in demonstration states;
Changes in the quality and scope of services and care coordination provided by CCBHCs as a result of the demonstration and how the scope of services provided to CCBHC clients compares with the services available in other service settings; and
The impact of the CCBHC demonstration on Medicaid services and their associated state and federal costs.
The qualitative and survey components of the evaluation are designed to help ASPE understand the services offered by CCBHCs; state and clinic perceptions of the costs of these services; the extent to which CCBHCs experience challenges with demonstration implementation; their experience reporting and using quality measures and improving quality of care; and the perspectives of state officials, CCBHCs, and clients on the overall successes and challenges associated with the CCBHC model. The evaluation will also include an impact study that examines changes in service use and costs of serving CCBHC clients relative to comparison groups and examines impacts of the CCBHC demonstration on access and quality. The evaluation will address specific questions regarding costs, quality, and impacts through analysis of Medicaid claims data obtained from CMS; cost reports that states will submit to CMS under a separate OMB approval (OMB 0938-1148, CMS 10398); quality measures that states are required to submit to SAMHSA under a separate OMB approval (OMB 0938-1148, CMS 10398); data from the National Substance Use and Mental Health Services Survey collected under a separate OMB approval (0930-0386); and CMS-64 forms states submit to CMS under a separate OMB approval (0938-1265). This submission therefore focuses solely on new primary qualitative and survey data collection efforts.
Section 223 of PAMA requires the Secretary of HHS to provide annual reports to Congress that include an assessment of access to community-based mental health services under Medicaid, the quality and scope of CCBHC services, and the impact of the demonstration on federal and state costs of a full range of mental health services. ASPE’s prior evaluation of the first two years of the CCBHC demonstration informed past reports to Congress. The data collected under this submission will help ASPE address the PAMA topics to inform future reports to Congress. Each proposed data collection instrument is described below, along with how, by whom, and for what purpose the collected information will be used. Table A.1 provides additional detail about how the content areas in each data collection instrument will be used to address the PAMA requirements.
All of the primary data collection efforts for this evaluation will be virtual because virtual data collection 1) is the most efficient method (more resources can be dedicated to data collection because there are no travel expenses); 2) provides additional flexibility in sampling (there is no need to select CCBHCs that are geographically close together, because traveling to a specific region for site visits is not necessary); and 3) is prudent in light of the COVID-19 pandemic. Increased use of virtual platforms during the COVID-19 pandemic has improved respondents' comfort levels with communicating virtually or by telephone and with navigating technological platforms. Virtual data collection also may be more convenient for respondents, as they can participate in interviews or focus groups from any location. The instruments ASPE is seeking approval for are as follows:
State official telephone interview protocols. The evaluation team will use the telephone interview protocols to conduct three rounds of semi-structured interviews with state officials to gather information about states' progress at different stages of demonstration implementation. Each round of interviews will include state Medicaid and behavioral health officials (both mental health and substance abuse officials) and will have a slightly different focus, reflecting the stage of implementation at the time of the interviews. To limit respondent burden, for the seven original demonstration states, the current evaluation will ask respondents to reflect on the implementation period not covered by the first evaluation (i.e., after the end of the second demonstration year).
To capitalize on the specific perspectives state leadership can offer, a key focus of these interviews in each year of data collection will be states’ demonstration plans, investments, and support as well as external policy and program context. The first round of interviews, conducted in early 2023, will use the evaluation year two protocol and include questions that align with the PAMA topic areas of access, quality, and scope of services. For example, for all states we will inquire about the types of care management and care coordination provided and changes in the quality of care over time.
The second round of interviews, conducted in the spring of 2024, will use the evaluation year three protocol and address questions from all PAMA areas (access, quality and scope of services, costs, and cross-cutting issues). The third round of interviews, conducted in winter 2025, will use the evaluation year four protocol and focus largely on outcomes and final implementation experiences. For example, they will ask for state officials' perceptions of CCBHCs' impacts on the use of services, seek final reflections and updates on the steps CCBHCs took to increase access to care, and ask about CCBHCs' ability to maintain the required services over time and about any challenges encountered.
CCBHC leadership interview protocol. The evaluation team will conduct one telephone interview with CCBHC leadership (CCBHC project director/chief executive officer, CCBHC medical director, and any other key leaders suggested by the CCBHC) per CCBHC at up to 15 CCBHCs in the third evaluation year (fall 2023/winter 2024). Interviews will include CCBHCs from a diverse group of states that together encompass the demonstration's different payment models, organizational structures, and implementation contexts. Interviews will cover most evaluation questions, including those related to access, quality and scope of services, costs, and cross-cutting topics, and will gather reflections on the CCBHC model, including implementation successes and challenges, from those directly implementing the model and serving clients.
Client focus group protocol. The evaluation team will conduct up to eight virtual focus groups with clients receiving CCBHC services. Up to four focus groups will be held toward the end of the third evaluation year (spring 2024) and the remaining groups will be completed at the beginning of the fourth evaluation year (fall 2024). Each focus group will include up to five clients for a total of approximately 40 clients. Recruitment efforts will target clients from the same CCBHCs that are included in the CCBHC leadership interviews. Recruiting clients from the same CCBHCs that participate in telephone interviews is an efficient way to help ensure this data collection activity also reflects a balance of CCBHC characteristics. The evaluation team will hear firsthand from clients regarding the PAMA topics—for example, respondents will reflect on whether they have perceived improvements in access to care and whether they are satisfied with their current access to care.
CCBHC survey. The CCBHC project director or other designated staff at each CCBHC will complete an online survey in the second and third evaluation years (about February 2023 and 2024). The survey will gather key information over time about clinics’ operations and how their structures, procedures, and services align with the CCBHC certification criteria and support the goals of the demonstration. For example, the survey will collect information about staffing, scope of services, and accessibility; use of health information technology; relationships with other providers for the purposes of service delivery and care coordination; and quality reporting and improvement activities. The survey will include structured fields to gather comparable information from each CCBHC, using prompts and preset response categories such as check boxes. The survey will include skip patterns to reduce burden on CCBHC respondents and improve consistency of data collection.
Table A.1. Data collection activities, by data source
| Data source | Mode, timing, and respondent | PAMA topic area (PTs) | Content | Analysis |
| --- | --- | --- | --- | --- |
| Qualitative data sources | | | | |
| State official telephone interviews | In Years 2, 3, and 4 of the evaluation, the evaluation team will conduct telephone interviews with state Medicaid and behavioral health officials. | PT1, PT2, PT3 | (1) Demonstration oversight, support, and implementation; (2) access to care; (3) service use patterns; (4) COVID-19 effects; (5) scope of services and coordination of care; (6) EHR/HIT; (7) primary care; (8) crisis services; (9) quality of care; (10) reporting and use of data to improve quality; (11) cost, payment, and PPS; (12) policy initiatives; (13) implementation successes and challenges; (14) sustainment/expansion activities | Descriptive analyses |
| CCBHC leadership telephone interviews | In Year 3 of the evaluation, the team will conduct telephone interviews with CCBHC leadership (CCBHC project director/chief executive officer, CCBHC medical director, and any other key leaders suggested by the CCBHC). | PT1, PT2, PT3 | (1) Access to care; (2) service use patterns; (3) COVID-19 effects; (4) scope of services; (5) collaborations and relationships with other organizations; (6) care management and care coordination; (7) EHR/HIT capacity; (8) primary care; (9) crisis services; (10) quality of care; (11) reporting and use of data to improve quality; (12) cost, payment, and PPS; (13) state supports and policy initiatives; (14) implementation successes and challenges; (15) sustainability plans | Descriptive analyses |
| Client focus groups | In Years 3 and 4 of the evaluation, the team will conduct focus groups with clients. | PT1, PT2 | (1) Access to care; (2) service use patterns; (3) scope of services; (4) care management and coordination; (5) primary care; (6) quality of care | Descriptive analyses |
| CCBHC survey | During Years 2 and 3 of the evaluation, all CCBHCs will submit the CCBHC survey to the evaluation team. | PT1, PT2, PT3 | (1) CCBHC staffing; (2) accessibility; (3) care coordination; (4) scope of services; (5) data sharing, quality, and other reporting; (6) costs; (7) sustainability | Descriptive analyses |
The evaluation is expected to be completed in four years. Table A.2 shows the schedule of data collection activities covered by this OMB request.
Table A.2. Timeline for the data collection
| Data source | Dates |
| --- | --- |
| State official telephone interviews | Winter 2023, Spring 2024, Winter 2025 |
| CCBHC leadership telephone interviews | Fall/Winter 2023-2024 |
| CCBHC client focus groups | Spring 2024, Fall 2024 |
| CCBHC survey | Winter 2023, Winter 2024 |
CCBHCs will submit survey responses through Confirmit, a secure, online survey platform. Focus groups with CCBHC clients will be held virtually using WebEx or Zoom in virtual meeting rooms. The evaluation team will monitor the attendance and participation of all individuals in the virtual meeting room during each focus group.
In formulating the evaluation design, ASPE has carefully considered ways to minimize burden by supplementing existing data sources with targeted primary data collection. To this end, the evaluation incorporates the following approach:
Using data from existing sources while conducting supplemental primary data collection: To the extent possible, information regarding demonstration implementation will be gathered through a review of available sources, including, for example, state demonstration and planning grant applications; Section 1115 demonstration waiver applications; the annual National Substance Use and Mental Health Services Survey; CMS-64 reports; cost reports and quality measures states will submit as part of demonstration requirements; and Medicaid claims. However, the level of detail and consistency of the information provided in these source documents and other data sources will likely vary from state to state and may not fully address the PAMA requirements. To supplement data gathered from these sources, ASPE is requesting OMB clearance to conduct telephone interviews with state officials and CCBHC staff, conduct an online survey, and hold virtual focus groups with CCBHC clients. The evaluation team will use the information gathered from telephone interviews to clarify and fill in gaps in the data gathered from the survey and document review. The team will conduct telephone interviews with clinic leadership in up to 15 CCBHCs to have in-depth discussions about access, quality and scope of services, costs, and cross-cutting topics. To minimize respondent burden, interview and survey questions for each data source will be tailored to reflect the expertise and insight offered by each respondent type.
The CCBHCs in the participating states vary in size, from small entities to large provider organizations. The qualitative data collection protocols have been designed to minimize burden on these entities and on CCBHC clients who participate in focus groups. The evaluation team will make every effort to schedule telephone interviews and focus groups at the convenience of these respondents and participants. The evaluation team will request the minimum amount of information from CCBHCs that is required to evaluate the CCBHC demonstration effectively.
Each of the data sources provides information needed for the evaluation. If the data are not collected, the evaluation team will not have adequate information to address the PAMA requirements. The inclusion of all planned data sources is needed to obtain information about demonstration implementation and impacts on quality and costs.
CCBHC leadership interviews will take place only once during the evaluation. If they are not conducted, the evaluation team will not have adequate information to evaluate whether implementation is consistent with PAMA or to ensure that the Secretary can provide Congress with the information mandated by PAMA. CCBHCs will submit surveys twice during the course of the evaluation; repeated reports are needed to examine changes in access, scope of services, and other demonstration requirements over time. Similarly, the evaluation team will conduct interviews with state officials at three points during the demonstration to understand the evolution of demonstration administration and investments; implementation successes and challenges; and changes in access to, costs of, and quality of care over time. Lastly, it is essential to obtain information directly from the clients of CCBHC services to understand how implementation of the model affects their access to care and experiences with care. The evaluation team is conducting one round of focus groups with clients to obtain their perspectives after CCBHCs have had an opportunity to gain experience implementing the new model.
This information collection fully complies with 5 CFR 1320.5(d)(2).
This is a new data collection. The 60-day notice was published in the Federal Register on June 3, 2022 (87 FR 33798; pages 33798-33799). No comments were received.
If a CCBHC is willing to assist with the logistics of participant recruitment for client focus groups, the clinic will receive an honorarium of $1,000 for their assistance. Each client who participates in a focus group will receive a $50 gift card. State official and CCBHC staff participation in other data collection activities will be carried out in the course of their employment; no additional compensation will be provided outside of their normal pay.
The Privacy Act does not apply to this data collection. Participants will not be asked about, nor will they provide, individually identifiable health information. The data produced through this project will not contain nor be retrievable by personal identifiers.
Before the start of state official interviews, CCBHC leadership interviews, and client focus groups, the evaluation team will remind all respondents that the information gathered will be used for evaluation purposes only and will not be attributed to any individual. Responses are not expected to contain private information, and they will be aggregated to the extent possible so that individual answers will not be identifiable. Because of the limited number of respondents interviewed per state and CCBHC, however, it might be possible to infer individual responses from reports. (For example, there may be only one state Medicaid official participating per state. Similarly, for states with few CCBHCs, it may be possible to infer which CCBHCs were selected for interviews.) For each state official and CCBHC leader interviewed, the evaluation team will collect name, professional affiliation, and title, but not Social Security numbers, home contact information, or similar information that could identify the respondent directly. The reports from the evaluation will not contain the names of respondents or the names of CCBHCs that participated in the interviews.
For clients participating in the focus group, the evaluation team will provide instructions for clients to log into the technological platform in a way that ensures only their first name will be displayed. The team will only use first names during the focus group (or, in the case of groups with two members with the same first name, will only use the first initial of a participant’s last name). Prior to agreeing to participate in the focus group, clients will have reviewed and signed consent forms that explain and assure the privacy of the data. The evaluation team will revisit the issue of data privacy when they cover the ground rules at the beginning of the focus group. Further, the team will emphasize that respondents should not share anything they hear during the discussion with anyone outside of the focus group. In order to provide gift cards and communicate with respondents, the evaluation team will gather email addresses. They will also obtain participants’ names and signatures as part of the consent process, but these data will not be associated with the data provided by participants. The project will not collect any other information that could identify participants directly.
Before each interview and focus group, the evaluation team will ask all respondents for permission to record the conversation, solely for the purpose of filling in any gaps in the research notes. Only the evaluation team will have access to the recordings, which will be destroyed at the conclusion of the evaluation. If a respondent does not wish to have the interview recorded, the interviewer will take notes instead. The evaluation team will maintain the recordings and the interview and focus group notes in a secure electronic folder that only a minimum number of evaluation staff members may access.
The evaluation team will not ask state officials or CCBHC leaders any questions of a sensitive nature. However, they will ask them for their honest viewpoints on aspects of the demonstration that may or may not be working as planned. The evaluation team will assure them that answers to the interview questions will not be attributed to them in reports. All data privacy and security procedures described in the previous section will apply to the information collected.
By definition, focus group respondents will be receiving care for mental health conditions and/or substance use disorders and be asked to reflect on the care they receive to help manage these conditions. However, the evaluation team will not ask sensitive questions such as questions about diagnoses or symptoms. Recruitment materials will give participants a sense of the topics to be covered and the steps the evaluation team will take to protect the privacy of the data and security of the information shared, so respondents will be able to make an informed decision to participate based on their comfort levels at the outset. Additionally, many people receiving mental health and substance use disorder treatment participate in group therapies, and are therefore accustomed to sharing information in group settings.
As part of an opening discussion of ground rules for the focus groups, the evaluation team will explain that 1) respondents do not need to answer any question they do not want to, 2) the evaluation team will not discuss the conversation with the organizations that provide care or services, so participation and responses will not have any effect on respondents’ care, 3) there are no “right” or “wrong” answers, and 4) while there are no formal breaks, respondents should feel free to get up any time they need to (among other ground rules). The ground rules should help ensure everyone feels comfortable and safe participating and help to reinforce the voluntary nature of participation throughout the entire data collection.
Table A.3 provides estimates of the average annual burden for collecting the proposed information. Below we provide details on the total time and cost burdens for each of the separate data collection activities.
Interviews with state officials: The evaluation team will conduct semi-structured telephone interviews with state Medicaid and behavioral health officials in each demonstration state in three evaluation years:
Interview with state officials, each lasting one hour (9 states x 3 officials x 1 hour x 3 interview years)
CCBHC leadership interviews: The evaluation team will conduct telephone interviews with CCBHC staff in year three of the evaluation (fall/winter 2023-2024) for up to 15 CCBHCs.
Interview with CEO/Medical Director and any other key leadership staff recommended by CCBHC, lasting one hour (15 clinics x 2 respondents x 1 hour x 1 interview year)
Focus groups with clients: The evaluation team will conduct one round of eight virtual focus groups with up to five clients of CCBHC services; four will occur in the third evaluation year (spring 2024) and four in the fourth evaluation year (fall 2024).
Focus group with five clients of CCBHC services, each lasting one hour (8 focus groups x 5 clients x 1 hour x 1 round of focus groups)
CCBHC survey: The evaluation team will ask all CCBHCs, including those from the original and new demonstration states, to participate in a brief survey in the second and third evaluation years.
Survey of 74 CCBHCs with one respondent per site, lasting approximately three hours (1 survey x 74 CCBHCs x 3 hours x 2 survey years)
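The annualized figures in Table A.3 can be derived from the per-activity formulas above. The following sketch assumes the totals are annualized over OMB's standard three-year clearance period, with fractional respondent counts rounded up to the next whole respondent; this annualization convention is our assumption, since the document states only the resulting values.

```python
import math

CLEARANCE_YEARS = 3  # assumption: standard three-year OMB clearance period


def annualized_burden(total_responses, hours_per_response, hourly_wage):
    """Annualize total responses over the clearance period (rounded up to
    whole respondents), then compute annual burden hours and hour cost."""
    respondents = math.ceil(total_responses / CLEARANCE_YEARS)
    hours = respondents * hours_per_response
    cost = round(hours * hourly_wage, 2)
    return respondents, hours, cost


# (total responses over the clearance period, hours per response, hourly wage)
activities = {
    "State official interviews":   (9 * 3 * 3, 1, 63.36),  # 9 states x 3 officials x 3 rounds
    "CCBHC leadership interviews": (15 * 2 * 1, 1, 50.13), # 15 clinics x 2 respondents x 1 round
    "CCBHC client focus groups":   (8 * 5 * 1, 1, 31.31),  # 8 groups x 5 clients x 1 round
    "CCBHC survey":                (74 * 1 * 2, 3, 52.45), # 74 clinics x 2 survey years
}

for name, (responses, hrs, wage) in activities.items():
    r, h, c = annualized_burden(responses, hrs, wage)
    print(f"{name}: {r} respondents, {h} hours, ${c:,.2f}")
```

Under these assumptions the sketch reproduces each row of Table A.3, and the rows sum to the table's totals of 101 respondents, 201 burden hours, and $10,517.86.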
Table A.3. Estimated annualized burden hours
| Type of Respondent | No. of Respondents | No. Responses per Respondent | Average Burden per Response (in Hours) | Total Burden Hours | Average Hourly Wage | Total Hour Cost Burden |
| --- | --- | --- | --- | --- | --- | --- |
| State official interviews | 27 | 1 | 1 | 27 | $63.36a | $1,710.72 |
| CCBHC leadership interviews | 10 | 1 | 1 | 10 | $50.13b | $501.30 |
| CCBHC client focus groups | 14 | 1 | 1 | 14 | $31.31c | $438.34 |
| CCBHC survey | 50 | 1 | 3 | 150 | $52.45d | $7,867.50 |
| TOTAL | 101 | | | 201 | | $10,517.86 |
a State government, professional and related category (https://www.bls.gov/news.release/ecec.t03.htm)
b Occupational Outlook Handbook: Medical and Health Services Managers (https://www.bls.gov/ooh/management/medical-and-health-services-managers.htm#tab-1)
c Average hourly and weekly earnings of all employees on private nonfarm payrolls by industry sector, seasonally adjusted (https://www.bls.gov/news.release/empsit.t19.htm)
d BLS category of clinical, counseling, and school psychologists at outpatient care centers (https://www.bls.gov/oes/current/oes193031.htm)
There will be no capital, start-up, operation, maintenance, or purchase costs incurred by the respondents participating in data collection for the evaluation.
We estimate that two ASPE employees will be involved for 10 percent of their time. Annual costs of ASPE staff time are estimated to be $22,000. Additional costs include the contract awarded for these evaluation activities by ASPE ($496,838.00 over four years, or an annualized cost of $124,209.50). The total estimated average cost to the government per year is $146,209.50.
This is a new data collection.
The evaluation team will incorporate aggregate results from the evaluation in text and charts in the following documents, which will serve as the basis for annual reports to Congress developed by ASPE:
A first annual report, due in August 2022, that will cover: 1) state and CCBHC responses to COVID-19; 2) implementation findings related to access and scope of services; and 3) costs in the first year of the extension period and changes in costs over time
A second annual report, due in July 2023, that will cover: 1) the policy environment in which the CCBHC demonstration is operating; and 2) implementation findings related to access, scope of services, and quality, including performance on quality measures in the first year of the extension period and changes in performance over time
A third annual report, due in August 2024, that will cover: 1) implementation findings related to access, scope of services, and quality; 2) costs in the second year of the extension and expansion period and changes in costs over time; and 3) impacts of the CCBHC demonstration on measures of access, quality, and costs
A fourth annual report, due in August 2025, that will cover: 1) implementation findings related to access, scope of services, quality, and costs, including performance on quality measures in the second year of the extension period and changes in performance over time; and 2) impacts of the CCBHC demonstration on measures of access, quality, and costs
Table A.4 provides an overview of the evaluation tasks and in which years the evaluation team will conduct the tasks. This generally translates to the evaluation years in which the team will collect, analyze, and report on the various data sources. However, there are some instances when information will be gathered and analyzed from a certain data source on a specific research question, but reported in a subsequent year, so that the information can be synthesized across data sources.
ASPE may also incorporate the aggregate results from the cross-site evaluation into journal articles, scholarly presentations, and congressional testimony related to the outcomes of the CCBHC demonstration.
Table A.4 Evaluation tasks timeline
| Data collection timeline | Evaluation year 1 (2021-2022) | Evaluation year 2 (2022-2023) | Evaluation year 3 (2023-2024) | Evaluation year 4 (2024-2025) |
| --- | --- | --- | --- | --- |
| Development of evaluation plan and instrumentation | X | | | |
| OMB Submission | X | | | |
| State document review (e.g., demonstration applications, 1115 waivers) | X | | | |
| Interviews with state officials | | X | X | X |
| CCBHC leadership interviews | | | X | |
| Client focus groups | | | X | X |
| CCBHC survey | | X | X | |
We are requesting no exemption.
There are no exceptions to the certification. These activities comply with the requirements in 5 CFR 1320.9.
List of attachments
A. State Official Interview Protocol - Year Two
B. State Official Interview Protocol - Year Three
C. State Official Interview Protocol - Year Four
D. CCBHC Leadership Interview Protocol
E. CCBHC Client Focus Group Protocol
F. CCBHC Client Focus Group Consent Form
G. CCBHC Year One Survey Template
H. CCBHC Year Two Survey Template
CCBHC
Extension Evaluation, OMB No. 0990-NEW