Evaluation of the Children’s Health Insurance Program Reauthorization Act of 2009 (CHIPRA) Quality Demonstration Grant Program
OMB# 0935-0190
Revision of Previously Approved Protocol #0935-0190
Agency for Healthcare Research and Quality (AHRQ)
B. Collections of Information Employing Statistical Methods
1. Respondent Universe and Sampling Methods
2. Information Collection Procedures
3. Methods to Maximize Response Rates
4. Tests of Procedures
5. Statistical Consultants
B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
1. Respondent Universe and Sampling Methods
The information collected under this request is not based on probability samples and may not be generalizable beyond the states included in the demonstration. No sampling, imputation, or other statistical estimation techniques are used.
The first round of interviews was completed in August 2012 under the previously approved information collection request (OMB# 0935-0190). For the first round, interview respondents were selected based on a review of grantee applications and final operational plans and in consultation with key project staff. Under this revision to the approved information collection request, we intend to conduct a second round of interviews. Because the second round builds on the information gained in the first round, we plan to interview the same respondents, unless they have had little or no project involvement since 2012, as well as others who may have become involved since 2012. The total number of interviews needed to yield a comprehensive, multi-faceted understanding of project implementation will range considerably, from 20 to 40, depending on the number, scope, complexity, and nature of the projects in a given state. Below are the details on the selection criteria for the different interview respondent types.
Key project staff (up to 4 respondents in each state). Key project staff will include the project director, project manager, principal investigator, and/or medical director. We will review semi-annual progress reports submitted by grantees/demonstration States to determine whether key staff have changed since the first round of interviews.
Other implementation personnel (up to 16 respondents in each state). Other implementation staff are those involved in the day-to-day implementation of grant-funded projects and will include state agency employees, provider trainers, health information technology vendors, and/or project consultants. We will review semi-annual progress reports and consult with key project staff to select up to 16 staff members, many of whom may also have been interviewed in 2012. In many States, we will be able to interview all staff involved in day-to-day operations. In States with more than 16 staff members, we will consult with the project director to select the respondents who are most knowledgeable about or involved in the demonstration.
External stakeholders (up to 8 respondents in each state). The stakeholders are likely to be familiar with CHIPRA projects and may serve on advisory panels or workgroups but will not be involved in day-to-day project implementation. We will consult with key project staff to develop a list of up to 8 external stakeholders to interview, many of whom may also have been interviewed in 2012. Stakeholders selected for interviews will (1) be knowledgeable about the demonstration and/or interested in the quality of care provided by their state’s Medicaid and CHIP programs and (2) represent a range of stakeholder groups in the state such as consumer advocacy groups, professional associations, human services agencies, and large health systems.
Health care organization staff (up to 12 respondents in each state). Health care organization staff will be actively participating in demonstration grant activities. AHRQ’s contractor will attempt to interview the same staff members interviewed during the first round of site visits to assess how implementation progressed over the course of the demonstration. For the first round of visits, States furnished a list of all health care organizations participating in their grant-funded projects. If more than 12 organizations were included on the list, AHRQ’s contractor used it to select organizations to represent a range of specified criteria, such as provider type, size, and location. If organizations selected in round one are not available or are no longer participating, we will select replacement organizations that fit similar criteria. At each selected organization, we will interview the clinical and administrative staff members (up to three) most directly involved in project implementation.
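For context, the respondent caps above imply the following maximum number of second-round interviews per demonstration State. The short Python sketch below is illustrative only; it simply tallies the figures already given in this section.

```python
# Illustrative tally of the per-State respondent caps described above.
respondent_caps = {
    "key project staff": 4,
    "other implementation personnel": 16,
    "external stakeholders": 8,
    "health care organization staff": 12,
}

max_interviews_per_state = sum(respondent_caps.values())
print(max_interviews_per_state)  # 40, the upper end of the 20-40 range noted above
```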
The focus group sample was designed to provide insights into the key evaluation questions rather than to be statistically representative. AHRQ’s evaluation contractor is using a purposive, nested sampling design to select focus group participants for up to 20 focus groups in 5 States. Our sampling strategy has also been specifically designed to answer evaluation questions related to disparities by comparing the experiences of English- and Spanish-speaking Medicaid and CHIP beneficiaries among CHIPRA demonstration sites. The following selection criteria will be applied at each of three levels in sequential order:
States: We selected States that are sufficiently advanced in the implementation of their demonstrations to increase the likelihood that patients have experienced and can speak to the impact of these demonstrations. In addition, we targeted States with a sufficient number of Spanish speakers to conduct focus groups with Spanish-speaking beneficiaries. States were also targeted to include a variety of care models, including 4 States that selected the patient-centered medical home (PCMH) and 1 State that selected the school-based health center (SBHC) as the care model for their demonstrations. Finally, to the extent possible, we selected States that are not participating in another major quality initiative in order to minimize the confounding influence of such initiatives on our results. We selected Oregon, Utah, Florida, and South Carolina for the PCMH focus groups and New Mexico for the SBHC focus groups.
Health Care Organizations: The contractor will consult with key project staff in each selected State to determine the practice or clinic characteristics needed to represent a relevant range of experiences in that State. These characteristics will include participation and level of engagement in the CHIPRA quality demonstration, proportion of Spanish-speaking patients, organization size, organization location (including rural/urban), and type of organization (FQHC, independently owned, system owned, etc.). In each PCMH State, we will select practices to allow for 2 Spanish-language focus groups and 2 English-language focus groups. Depending on the State, more than one practice or school-based health clinic may be selected for recruitment for a single focus group, or the same practice or school-based health clinic may be the recruitment site for multiple focus groups. Therefore, the number of practices selected may not necessarily equal the number of focus groups conducted. The provider site recruitment letters are included as Attachment W.
Participants: Focus group participants will be recruited through flyers posted at the CHIPRA demonstration sites where they receive their care and through letters handed out to parents and adolescents as they arrive for care (see Attachments L and Q for parent and adolescent focus group recruitment materials). The flyers and letters will include a toll-free number that potential participants can call to take part in a telephone screening in English or Spanish. As part of the call, we will determine whether the person is within the target population for the focus groups (see Attachments M and R for parent and adolescent telephone screening scripts) and, if they are eligible, conduct a pre-focus group interview (see Attachments N and S for parent and adolescent pre-focus group interview guides). If these relatively passive recruitment strategies are not successful, AHRQ’s evaluation contractor will ask practices or SBHCs to mail a postcard to parents or adolescents in their databases (see Attachments L and Q). Through the telephone screening, we will identify individuals who are parents of Medicaid or CHIP beneficiaries in PCMH States, or who are adolescent beneficiaries in SBHC States, and for whom the participating provider is the beneficiary’s usual source of care. Some individuals may be excluded if they appear to be potential outliers (for example, serial focus group participants who have taken part in one or more focus groups in the past year). For each focus group, 8-10 participants will be recruited. A maximum of 160 parents will participate in 16 focus groups across the 4 States implementing PCMH-focused demonstration projects. Up to 40 adolescents will participate in 4 focus groups in the one State with an SBHC demonstration project.
Exhibit 1 illustrates our sampling design, which will allow us to make comparisons within and across States in order to answer multiple evaluation questions.
Exhibit 1. Focus Group Sampling Design
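As an illustrative aid (not part of the sampling protocol), the Python sketch below tallies the focus group counts implied by the design summarized in Exhibit 1 and described above.

```python
# Illustrative tally of the nested focus group design described above.
pcmh_states = 4                    # Oregon, Utah, Florida, South Carolina
parent_groups_per_state = 2 + 2    # 2 Spanish-language + 2 English-language groups
sbhc_states = 1                    # New Mexico
adolescent_groups_per_state = 4
max_participants_per_group = 10    # 8-10 participants recruited per group

parent_groups = pcmh_states * parent_groups_per_state          # 16 parent groups
adolescent_groups = sbhc_states * adolescent_groups_per_state  # 4 adolescent groups

print(parent_groups + adolescent_groups)               # 20 focus groups across 5 States
print(parent_groups * max_participants_per_group)      # up to 160 parent participants
print(adolescent_groups * max_participants_per_group)  # up to 40 adolescent participants
```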
2. Information Collection Procedures
Semi-structured interviews and focus groups will be used for this data collection effort.
We will use the same information collection procedures for the proposed 2014 semi-structured interviews as we used for the previously approved interviews completed in 2012. Each interview will be conducted with an individual respondent by a two-member interview team. The interview guides (included in Attachments B, D, E, F, G, H, I, and J) will be customized based on the scope and nature of the projects in a given state. The interview guides address detailed questions about project implementation and impact that do not lend themselves to self-administered questionnaires or other quantitative data collection methods.
Interview appointments will be scheduled well in advance. All interview respondents will be sent the invitation request (included in Attachment C) by email. AHRQ’s contractor will then follow up with non-respondents every three days, alternating phone and email contacts. If the respondent does not respond to any follow-up attempt within three weeks and the contractor cannot identify a reason for the non-response (e.g., the respondent is out of the office), the contractor will stop attempting contact. Respondents who agree to be interviewed will be sent a confirmation email (included in Attachment C) one week prior to their scheduled interview.
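For illustration, the sketch below lays out one possible follow-up calendar consistent with the procedure just described (contacts every three days, alternating modes, stopping after three weeks). The choice of which mode comes first is an assumption, since the procedure specifies only that phone and email contacts alternate.

```python
# Hypothetical follow-up calendar for a non-respondent, assuming the first
# follow-up is by phone (the alternation order is not specified above).
from itertools import cycle

modes = cycle(["phone", "email"])
schedule = [(day, next(modes)) for day in range(3, 22, 3)]  # every 3 days, through day 21
print(schedule)
# [(3, 'phone'), (6, 'email'), (9, 'phone'), (12, 'email'),
#  (15, 'phone'), (18, 'email'), (21, 'phone')]
```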
The focus group guides (Attachments K and P) will be used to facilitate the conversation with 8-10 focus group participants. Focus group participants will be encouraged to respond to questions asked by the focus group facilitator or comments made by other participants. The process for focus group recruitment is described in the previous section.
Quality Control Procedures. AHRQ’s evaluation contractor has designated a team of experienced qualitative researchers to collect and analyze the interview and focus group data described in this statement. The team leaders will host a training session so that all researchers involved in data collection employ uniform, high-quality methods and are thoroughly familiar with the data collection instruments. All interviews and focus groups will be conducted by two-person teams (a lead facilitator and a note taker) and will be digitally recorded (audio only) if respondents consent.
3. Methods to Maximize Response Rates
The in-depth interview data collection is not based on probability samples, and a response rate does not apply to this activity. However, in awarding grants to demonstration States, CMS stipulated that States cooperate fully in the cross-state demonstration evaluation, including participation in in-depth interviews. Given this, and AHRQ’s experience conducting the prior round of interviews, AHRQ expects a high level of participation from key project staff, other implementation personnel, and external stakeholders. To further ensure the cooperation of respondents, contractor staff will attempt to minimize individual burden and develop interview schedules that respect site constraints and pressures. The strategies proposed to encourage interview participation under the revised information collection request are the same as those used under the previously approved package.
Minimize individual burden. Willingness of respondents to participate in in-person interviews may hinge on the time these meetings require. To minimize the burden, guides are designed to gather information that is as complete as possible in as little time as possible. AHRQ’s contractor has developed separate discussion guides for each respondent type so that respondents are not asked about activities or issues that are not applicable to them or the state in which they work. In addition, interviewers will meet with interview respondents in person in their own offices or at a location of their choice.
Develop interview schedules that respect site constraints and pressures. The project team will work with each site to determine logistics and a schedule for the in-person interviews. The schedule will avoid conflict with other activities and allow individuals to find time in their calendars to spend with contractor staff.
Offer additional accommodations for providers. While AHRQ expects a high degree of participation from all respondent types, we expect providers may be less readily available for in-person interviews than other respondent types. AHRQ will offer additional accommodations to this respondent type to increase the likelihood of their participation. We will offer to meet with providers outside of clinical hours, restrict the interview to 30 minutes if 45 minutes is not acceptable, and conduct the interview by telephone if the respondent says that would be more convenient.
AHRQ expects recruitment for focus groups to be more difficult than for interviews. To increase participation in focus groups, AHRQ’s contractor will provide an incentive to practices to help recruit participants, provide an incentive to focus group participants, and host groups at locations that are convenient for participants.
The research team will work with state project staff to identify potential physician practices to serve as recruitment sites for the focus groups. These practices may be more highly engaged in CHIPRA demonstration activities than practices that are not identified or that do not agree to assist with recruitment. Practices will be offered a $500 gift card for their assistance with recruitment, identification of a convenient meeting space, and logistical support for the focus group.
Each adult focus group participant will receive a $50 gift card for their participation in the focus group. Adolescent participants will receive a $25 gift card.
Focus groups will be held at locations that are convenient for participants (for example, near bus lines and parking—locations to be selected with input from State demonstration and practice staff) as well as held during convenient times such as during lunchtime or in the evenings. Parent focus groups may be held at libraries, YMCA community centers, or a hotel meeting space. Adolescent focus groups will be held at school at the end of the school day.
4. Tests of Procedures
The interview protocols we plan to use for the second round of interviews proposed under this revised information collection request reflect lessons learned during our first round of interviews. Our pilot procedures for the first round of interviews are described below. In addition, we will pilot test the 2014 interview protocols with respondents in North Carolina. AHRQ’s objectives during the pilot visit are to assess whether (1) the interviews successfully build on information gained during the first round of interviews, (2) interviewers can collect the information needed in the allotted time, (3) respondents can readily understand and answer the interview questions, (4) interviews flow sensibly from topic to topic, and (5) the questions yield thoughtful, candid responses.
In preparation for the first round of interviews, AHRQ conducted pilot tests of the protocols for key project staff and participating health care organization staff. These protocols were selected for pretesting because project staff and health care organization staff are the respondent types most essential to the study and because those protocols are the basis for the other protocols (external stakeholders, other implementation personnel).
The pretests were conducted as individual telephone interviews with a total of seven respondents. (Because of limited resources and time, the agency could not conduct the pretests in person, although the actual interviews were conducted in person.) Pretest respondents were selected to represent a range of demonstration States (Alaska, Utah, Florida, Illinois, Massachusetts, Oregon, and Pennsylvania) and activities in all five grant categories.
The organization of the protocols attached to this supporting statement directly reflects the pretest results. (Because of the overlap in protocol content across respondent types, we applied the insights gained from the pretested protocols to those for the other respondent types.) Specifically, the protocol for key project staff consists of a set of general questions to be addressed to a principal investigator or medical director and sets of category-specific questions to be addressed to other key project staff, such as a project manager or director. This approach ensures AHRQ will capture both broad, contextual information and specific, technical information while making the most effective use of each respondent’s time. All other protocols consist of core and supplemental sections. The core sections contain the high-priority questions that the pretests suggest most respondents will be able to answer in the allotted interview time. The supplemental sections contain lower-priority questions that interviewers will be trained to select from if the respondent answers the core questions in less than the allotted interview time.
The recruitment and confirmation emails attached to this supporting statement also reflect insights from the pretests. Specifically, pretest respondents suggested it would be helpful to know in advance the types of questions they will be asked, how confidentiality will be handled, the identity of the research sponsors, and whether the interviews will be audio recorded.
5. Statistical Consultants
AHRQ has contracted with Mathematica Policy Research, the Urban Institute, and AcademyHealth to conduct the evaluation of the CHIPRA quality demonstration grants. Table 1 identifies the individuals at these organizations who were consulted regarding the qualitative methods used in this project.
Table 1. Individuals Consulted Regarding Qualitative Methods of Evaluation
Name | Title | Email Address | Phone Number
Grace Ferry (Mathematica) | Research Analyst | gferry@mathematica-mpr.com | 202-250-3571
Danna Basson (Mathematica) | Survey Researcher | dbasson@mathematica-mpr.com | 510-830-3713
Cindy Brach (AHRQ) | Project Officer | cindy.brach@ahrq.hhs.gov | 301-427-1444
Tennille Brown (AHRQ) | Project Officer | tennille.brown@ahrq.hhs.gov | 301-427-1664
Rachel Burton (Urban Institute) | Research Associate | rburton@urban.org | 202-261-5825
Kelly Devers (Urban Institute) | Senior Fellow | kdevers@urban.org | 202-261-5905
Leslie Foster (Mathematica) | Senior Researcher | lfoster@mathematica-mpr.com | 510-830-3709
Ian Hill (Urban Institute) | Senior Fellow | ihill@urban.org | 202-261-5374
Henry Ireys (Mathematica) | Senior Fellow | hireys@mathematica-mpr.com | 202-554-7536
Lisa Simpson (AcademyHealth) | President and CEO | lisa.simpson@academyhealth.org | 202-292-6747
Dana Peterson (Mathematica) | Researcher | dpeterson@mathematica-mpr.com | 510-830-3713