RECOVERY: Increasing Adoption of Patient Centered Behavioral Health Research by Primary and Behavioral Health Providers and Systems
Supporting Statement
A. Justification
The Substance Abuse and Mental Health Services Administration, Center for Behavioral Health Statistics and Quality (SAMHSA/CBHSQ) is requesting approval from the Office of Management and Budget (OMB) for the data collection activities for the “RECOVERY: Increasing Adoption of Patient Centered Behavioral Health Research by Primary and Behavioral Health Providers and Systems” project (hereafter referred to as the CER—comparative effectiveness research—project). In this request, SAMHSA is seeking OMB approval for data collection activities involving the administration of 7 surveys, of which each individual participating in this evaluation will complete no more than 5:
Baseline Survey (two versions; see Attachments A and B)
Director Version
Staff Version
Followup Survey (two versions; see Attachments C and D)
Director Version
Staff Version
Dissemination Evaluation Survey of the Packet (see Attachment E)
Dissemination Evaluation Survey of the Implementation Webinar (see Attachment F)
Dissemination Evaluation Survey of the Coaching Webinar (see Attachment G)
The CER project is authorized under Title VIII of the American Recovery and Reinvestment Act (ARRA) of 2009.
Several recent seminal reports (CBO, 2007; IOM, 2009) have underscored that the mechanisms by which CER—also referred to as patient-centered health research—is disseminated can have important implications for the treatment and care of patients, particularly those with mental and/or substance use disorders. While ARRA allocates $1.1 billion for investment in conducting and synthesizing CER, both Congressional and Administration officials recognize the importance of allocating a small portion of these funds to efforts that identify and support dissemination strategies and mechanisms for promoting implementation of CER results by intended audiences (e.g., hospitals, healthcare systems, employers and managed care organizations, State and community-based providers).
Ensuring the maximum return on anticipated investments in CER within the mental health and substance abuse fields falls within SAMHSA's purview. Therefore, from SAMHSA's perspective, it is essential to systematically improve knowledge of how CER results can be most effectively packaged, disseminated, and implemented by a range of mental health and substance abuse organizations and other front-line providers.
As part of an ARRA-funded project submitted by SAMHSA and approved by the Department of Health and Human Services (HHS) Coordinating Committee on Comparative Effectiveness Research, the objective of this evaluation is to assess the effect of different strategies for disseminating and promoting adoption of patient-centered health research results among behavioral health providers and organizations, and primary care providers and organizations responsible for delivering behavioral health services. This project seeks to generate important knowledge to address three fundamental questions:
Under what (if any) conditions or circumstances is packaging and providing the results of patient-centered health research to behavioral healthcare organizations and providers sufficient to encourage the adoption and/or integration of these results into existing practice?
What (if any) added benefits in the adoption and implementation of patient-centered health research are realized from using additional technical assistance (TA) methods (e.g., Webinars, consultation)?
How does the specific content of a particular result (or results) of behavioral health patient-centered health research influence or otherwise interact with contextual and organizational factors within entities seeking to adopt and/or implement these results?
To address these questions, CBHSQ will use an experimental evaluation design devised to examine the influence of two different dissemination strategies on the decision to adopt a patient-centered health practice among community health and community behavioral health centers (Figure 1). Fifty of each type of organization will be recruited for participation in the evaluation, for a total of 100 organizations.
In the evaluation, Motivational Interviewing (MI) will be disseminated to community health organizations and community behavioral health organizations. MI is a counseling approach that attempts to increase the patient’s/consumer’s awareness of the potential problems caused, consequences experienced, and risks faced as a result of the particular behavior in question. MI is a client-centered directive approach designed to enhance intrinsic motivation to change by exploring and resolving ambivalence. The practice of MI is adaptive, not prescriptive, so it can be provided in a flexible manner to meet the specific needs of diverse populations and settings. Although the practice was initially developed to address problem drinking behavior, it has been more recently adapted for use with drug-addicted populations, psychiatric populations, and other aspects of behavioral health.
Community health and community behavioral health organizations will be matched as closely as possible into pairs based upon a number of factors that relate, for example, to their size, available resources, geographical location, and populations served. Each pair will then be randomly assigned to an evaluation group: Evaluation Group 1 (exposed to dissemination strategy 1) or Evaluation Group 2 (exposed to dissemination strategy 2).
Dissemination strategy 1 consists of the dissemination of an informational packet that provides information related to core components, adaptations, evidence of effectiveness (including CER), and resources related to implementation and reimbursement. Dissemination strategy 2 involves the dissemination of the same informational packet, plus participation in two Webinars, one focusing on implementation (i.e., detailed information on MI and potential barriers and facilitators) and the other on coaching (i.e., interactions with an MI program coach to ask questions and receive feedback on implementation strategies).
Figure 1. Experimental Evaluation Design Used by Project
Data will be obtained from several data collection instruments, including a baseline survey, followup survey, and three dissemination evaluation surveys (packets, implementation Webinar, and coaching Webinar). The primary purpose of the data collection is evaluation. The data collected will enable CBHSQ to document and examine the effect of the two dissemination strategies described above on the decision to adopt MI as a treatment approach supported by patient-centered health research. The collected data will also allow for an examination of contextual factors, both organizational and individual, that influence the decision to adopt a new evidence-based treatment approach. Ultimately, the findings of this evaluation will help inform future dissemination strategies aimed at increasing adoption rates for practices supported by patient-centered behavioral health research. The findings will also inform the design of future strategies aimed at maximizing facilitators and overcoming barriers to the adoption of evidence-based treatment options for individuals with mental health or substance use problems.
The following describes the data collection instruments that will be used to inform this evaluation project.
Baseline Survey (Attachments A and B): The purpose of the baseline survey is to obtain information related to organizational factors, individual factors, and current stage in the adoption-decision process before implementation of the intervention (i.e., dissemination strategies). Items for this survey were obtained from previously validated tools and include:
Survey of Structure and Operations (Texas Christian University, 2006)
Organizational Readiness for Change Treatment Staff Version (Texas Christian University, 2005)
Organizational Readiness for Change Treatment Director Version (Texas Christian University, 2002)
Survey Instrument for Measuring Organizational Barriers to Implementing Evidence-Based Practices (Haug, Shopshire, Tajima, Gruber, & Guydish, 2008)
Management Strategies to Support Evidence-Based Practices (Haug et al., 2008)
Organizational Readiness and Capacity Assessment (Allred, Markiewicz, Amaya-Jackson, Putnam, Saunders, Wilson, et al., 2005)
The Evidence-Based Practice Attitude Scale (Aarons, 2004)
Adoption of Clinical Practices Scale (McGovern, Fox, Xie, & Drake, 2004)
The baseline survey will be administered to a sample of administrators and healthcare providers at each organization participating in the evaluation. Depending on the respondent's role within the organization, one of two versions of the survey (director version or staff version) will be administered. The two versions differ only on the Organizational Readiness for Change scale, and the Survey of Structure and Operations Scale will be included only in the version presented to health center directors.
Followup Survey (Attachments C and D): Followup surveys will be administered at two time points: within 1 month of receiving the intervention and 9 months after receiving the intervention. The followup survey contains all the questions included in the baseline survey and, like the baseline survey, is administered in separate staff and director versions. Additional questions are included in the followup survey related to consumer involvement in the decisionmaking process. Questions on consumer involvement were modified from work examining the principles and indicators of successful consumer involvement in National Health Service research (Boote, Barber, & Cooper, 2006; Barber, Boote, & Cooper, 2007). The purpose of the followup survey is to collect information related to organizational factors, individual factors, consumer involvement, and the adoption-decision process in order to determine whether these elements changed after administration of the intervention, to determine the degree to which factors changed based on intervention exposure (packet only versus packet and Webinars), and to evaluate the influence of contextual factors and consumer involvement on the decision to adopt.
Dissemination Evaluation Surveys (Attachments E, F, and G): The purpose of the dissemination evaluation surveys is to obtain feedback from participants regarding each of three dissemination strategies (packet, implementation Webinar, and coaching Webinar). This feedback will provide data necessary to evaluate the quality and participant perceptions of each strategy. The time line and specific components of each survey are described below:
Packet: Feedback related to the packet will be solicited 1 week after dissemination of the packet. The survey evaluates the perceived quality, presentation, and helpfulness of the packet.
Implementation Webinar: Feedback from participants will be solicited 3 days after participation in the Implementation Webinar. This survey evaluates the perceived quality of information and presentation, the quality of the Webinar platform, and overall helpfulness of the Webinar.
Coaching Webinar: Feedback will be solicited 3 days after participation in the Coaching Webinar. This survey evaluates the perceived quality of information and presentation, quality of the Webinar platform, and overall helpfulness of the Coaching Webinar.
Organizations will submit their responses for all surveys via Qualtrics, a Web-based survey platform for designing, distributing, and analyzing surveys. Electronic submission will reduce unnecessary burden and streamline use of the data for analysis. Qualtrics supports customized survey design features, such as skip patterns and missing-data prompts, that decrease the burden on participants and increase the validity of the results. Participants can also stop and return to their survey at a later time using an individually created password, and they will be asked to complete the surveys within 2 weeks. Qualtrics can export data in many formats, including SPSS, Excel, Word, PDF, and PowerPoint. The surveys were designed to comply with the requirements of Section 508 of the Rehabilitation Act to ensure accessibility to people with disabilities; Qualtrics' built-in accessibility checker was used during survey design to verify this. The first screen respondents will see is identical for each survey and displays the OMB control number, expiration date, and response burden statement. A copy of this screen is provided in Attachment H.
A considerable amount of new research has been, and continues to be, conducted in CER and the identification of “best practices” in the area of mental health. While this focus on research signifies an important step forward in the move toward evidence-based behavioral health practice, it is not enough to ensure the dissemination, adoption, and implementation of these practices. Research indicates that even for practices that have demonstrated extensive evidence of effectiveness, many mental health programs do not provide evidence-based practices (EBPs) to the clients who need them most (HHS, 2000; Torrey, Drake, Nixon, Burns, Flynn, Rush, et al., 2001). Many healthcare innovations ultimately fail because of “the gap that is frequently left unfilled between the point where innovation-development ends and diffusion planning begins” (Orlandi, Landers, Weston, & Haley, 1990).
Despite calls for increased efforts in the evaluation of dissemination strategies for EBPs (Fixsen, Naoom, Blase, Friedman, & Wallace, 2005; Greenhalgh et al., 2004; Rubenstein & Pugh, 2006), particularly in mental health and trauma-related services (Frueh et al., 2009), the knowledge base in this area remains small. Much of the research on dissemination and implementation has focused on the fields of medicine, education, and managed care; much less effort has been expended examining the nuances associated with the dissemination of EBPs in mental health care settings (Fixsen et al., 2005; Gold et al., 2006; Greenhalgh, Robert, Bate, Kyriakidou, MacFarlane, & Peacock, 2004). Furthermore, while research has been conducted examining the numerous contextual factors that influence the implementation of EBPs (e.g., Estabrooks, Floyd, Scott-Findlay, O'Leary, & Gushta, 2003; Frueh, Grubaugh, Cusack, & Elhai, 2009; Gold, Glynn, & Mueser, 2006; Greenhalgh et al., 2004), few studies have examined the interrelationships between these factors. The influence of these factors on the decision to implement EBPs also has not been adequately explored.
Based on a review of the extant literature and consultation with experts in the field of dissemination science, mental health service delivery, and organizational decisionmaking, it was concluded that no other previous or current projects have collected this type of information in ways that will address the proposed evaluation questions. Rather, the information collected for this project will complement other efforts in this area of inquiry and improve the knowledge base regarding what works for the dissemination of behavioral health services supported by patient-centered health research.
There is not a significant burden on small businesses or small entities or on their workforces.
If the data collection activities are not conducted as described in this document, it will impede CBHSQ’s ability to assess the effects of the CER project and diminish efforts in understanding the effectiveness of different types of dissemination strategies on the adoption of patient-centered behavioral health interventions. The purpose of collecting followup data at two time points (at 1 month and 9 months) is to assess the potential short-term and longer-term effects of the dissemination strategies. Organizations and individuals tend to experience a “honeymoon effect” immediately after receiving new information when their intentions to implement the new information are strongest (Ashforth, 2001; Boswell, Boudreau, & Tichy, 2005; Helmreich, Sawin, & Carsrud, 1986). Collecting information at the two followup time points will enable us to test for this effect and obtain a more accurate picture of the decisionmaking process.
This data collection complies fully with 5 CFR 1320.5(d)(2).
Consultations on the evaluation design, sample design, data sources, dissemination strategies, and participant materials occurred during the evaluation design phase and continue to take place as the evaluation design is being finalized. The purpose of these consultations is to ensure the integrity of the evaluation design and the relevance of the data collection activities, and to maximize the likelihood that the findings of this evaluation will generate valuable information regarding the adoption of evidence-based behavioral interventions in community healthcare settings.
During the evaluation design phase, the SAMHSA project officer, with the support of the contracting agency on this project, MANILA Consulting Group Inc. (MANILA), convened an Expert Panel Meeting on October 14, 2010, to discuss the purpose of the evaluation, the selected EBPs for implementation, and possible dissemination strategies. This panel offered valuable recommendations regarding the evaluation design, the outcome measures that should be assessed, and the processes involved in the implementation and adoption of EBPs in community health center settings.
Since the Expert Panel Meeting, the SAMHSA project officer has had regular meetings with MANILA staff to revise the evaluation design based on feedback obtained at the meeting. SAMHSA staff and Expert Panel members who have provided guidance on the present evaluation are listed below:
Kevin D. Hennessy, Ph.D.
Senior Advisor
Center for Behavioral Health Statistics and Quality
Science to Service Coordinator, SAMHSA
Jeffrey A. Buck, Ph.D.
Senior Advisor for Behavioral Health
Center for Strategic Planning
Centers for Medicare and Medicaid Services
David A. Chambers, D.Phil.
Associate Director
Dissemination and Implementation Research
National Institute of Mental Health (NIMH)
Timothy Cuerdon, Ph.D.
Senior Advisor
Research and Evaluation Group
Office of Research, Development, and Information
Centers for Medicare and Medicaid Services
A. Seiji Hayashi, M.D., M.P.H.
Chief Medical Officer
Bureau of Primary Health Care
Health Resources and Services Administration (HRSA)
Cherry Lowman, Ph.D.
Coordinator
Health Services Research Program
National Institute on Alcohol Abuse and Alcoholism (NIAAA) Division of Treatment and Recovery Research (DTRR)
Jennifer L. Magnabosco, Ph.D.
Principal Investigator and Projects Director
Veterans Affairs Center for Implementation Practice and Research Support and Health Services Research and Development (HSR&D) Center of Excellence
Center for the Study of Healthcare Provider Behavior
Harold I. Perl, Ph.D.
Senior Lead for Behavioral Research, Dissemination, and Training
Center for the Clinical Trials Network
National Institute on Drug Abuse (NIDA)
Alexander F. Ross, Sc.D.
Senior Health Policy Analyst
Office of Special Health Affairs
HRSA
Charlotte A. Mullican, M.P.H.
Senior Advisor for Mental Health Research
Center for Primary Care, Prevention, and Clinical Partnerships
Agency for Healthcare Research and Quality (AHRQ)
Sonia Tyutyulkova, M.D., Ph.D.
Medical Officer
AHRQ
Lori Ashcraft, Ph.D.
Executive Director
Recovery Opportunity Center
Rhonda Robinson Beale, M.D.
Chief Medical Officer
OptumHealth Behavioral Solutions
Bruce L. Bird, Ph.D.
President and CEO
Vinfen Corporation
Stephen Day, M.S.W.
Executive Director
Technical Assistance Collaborative, Inc.
Michael Franczak, Ph.D.
Senior Vice President
Marc Center
Howard H. Goldman, M.D., Ph.D.
Professor of Psychiatry
University of Maryland School of Medicine
John A. Morris, M.S.W.
Executive Director
Annapolis Coalition on the Behavioral Health Workforce
Sandra F. Naoom, M.S.P.H.
Doctoral Candidate, Research, Measurement and Evaluation
Associate Director
National Implementation Research Network
Frank Porter Graham Child Development Institute
University of North Carolina—Chapel Hill
Gary R. Bond, Ph.D.
Professor
Dartmouth Medical School
Jean Campbell, Ph.D.
Research Associate Professor
University of Missouri-St. Louis
Missouri Institute of Mental Health
Jeff Capobianco, M.A.
Research Investigator
University of Michigan School of Social Work
Patricia Nemec, Ph.D.
Independent Trainer and Consultant
Nemec Consulting
Gary Oftedahl, M.D.
Chief Knowledge Officer
Institute for Clinical Systems Improvement
Manuel Paris, Psy.D.
Associate Professor of Psychiatry
Deputy Director, Hispanic Services, Connecticut Mental Health Center
Yale University School of Medicine
Mark Salzer, Ph.D.
Professor and Chair of Rehabilitation Sciences
Temple University
Ron Schraiber, M.A.
Mental Health Analyst III
Los Angeles County Department of Mental Health
David L. Shern, Ph.D.
President and CEO
Mental Health America
David Hughes, M.A.
Vice President
Human Services Research Institute
David M. Stevens, M.D., F.A.A.F.P.
Director, Quality Center, National Association of Community Health Centers
Research Professor, Department of Health Policy, School of Public Health and Health Services
George Washington University
Charles S. Ingoglia, M.S.W.
Vice President
National Council for Community Behavioral Healthcare
H. Stephen Leff, Ph.D.
Associate Professor
Harvard Medical School Department of Psychiatry at the Cambridge Health Alliance
Senior Vice President, HSRI
The SAMHSA project officer and MANILA have also had extensive consultations with the two subcontracting organizations on this contract, the National Association of Community Health Centers (NACHC) and the National Council for Community Behavioral Healthcare (NCCBH). These meetings helped to clarify the particular patient-centered health practice that would be best suited for dissemination through this evaluation and ultimately led to the decision to disseminate MI.
All surveys were also pilot tested with study staff and 8 community health or behavioral health organizations in order to gather feedback regarding the usability, clarity, and completion time of the surveys. Questions were revised based on this feedback. The hours per response used to calculate our estimates of annualized hour burden were also based on results of this pilot testing.
No payment is being provided to respondents.
All individual data will be collected in accordance with the Privacy Act of 1974 (5 U.S. Code [U.S.C.] 552a), SAMHSA Participant Protection requirements, and other Federal and HHS regulations on the protection of human subjects (e.g., 5 U.S.C. 301; 42 U.S.C. 289(a)).
This study has been submitted to and approved by the MANILA Institutional Review Board (IRB; Attachment I). The evaluation team will continue to work closely with the MANILA IRB to ensure that human subject protections are maintained. For data collection activities, personal identifiers will be collected (i.e., name, place of employment, and job position) to link the data collected at various time points, organize the data by organization, and examine the influence of job position on various responses. Once the data files have been linked, the names will be removed. No respondent identifiers will be made available from the evaluation. When reporting data, the evaluation team will use organization codes rather than organization names, and the data will be aggregated so that responses cannot be attributed to any individual or organization.
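As a purely illustrative sketch of this linking and de-identification step, the following Python fragment links records for the same respondent across time points, replaces organization names with codes, and drops direct identifiers before aggregation; all column names, values, and code formats are hypothetical and are not the project's actual data management procedures.

```python
import pandas as pd

# Hypothetical survey export; column names and values are illustrative only.
responses = pd.DataFrame({
    "name":           ["A. Smith", "A. Smith", "B. Jones"],
    "organization":   ["Center X", "Center X", "Center Y"],
    "job_position":   ["Director", "Director", "Provider"],
    "time_point":     ["baseline", "followup_1", "baseline"],
    "adoption_stage": [2, 3, 1],
})

# Link records belonging to the same respondent across time points.
responses["respondent_id"] = responses.groupby(["name", "organization"]).ngroup()

# Replace organization names with codes, then drop direct identifiers.
org_codes = {org: f"ORG-{i:03d}"
             for i, org in enumerate(sorted(responses["organization"].unique()), 1)}
responses["organization_code"] = responses["organization"].map(org_codes)
deidentified = responses.drop(columns=["name", "organization"])

# Report only aggregated results, grouped by organization code.
print(deidentified.groupby(["organization_code", "time_point"])["adoption_stage"].mean())
```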
Each respondent will be given an assurance of privacy, and the project will protect the privacy of respondents. The privacy statement will state that participation in the evaluation is strictly voluntary and individuals have the right to refuse to complete any of the surveys. Respondents will be assured the information will be reported only in aggregate form in reports, that their names and other personal identifiers will not be associated with their answers, and that no one will have access to this information except as may be required by law, regulation, or subpoena, or unless permission is given by the respondent.
Qualtrics, the Web-based survey platform that will be used, has SAS 70 certification and meets the rigorous privacy standards imposed on health care records by the Health Insurance Portability and Accountability Act (HIPAA). All Qualtrics accounts are password protected, and all data are protected with real-time data replication.
All survey data will be housed on a secured network drive on a file server located in the MANILA data center. A complete Information Technology (IT)/Security Plan, consistent with the National Institute of Standards and Technology Special Publication 800-18, Revision 1, Guide for Developing Security Plans for Federal Information Systems, was submitted to SAMHSA on December 3, 2010. All data files on multiuser systems will be under the control of a database manager, with access limited to project staff on a "need-to-know" basis only.
No questions of a sensitive nature are included in the data collection instruments for this project.
To identify individuals for participation in the evaluation, organizations will be asked to identify staff who are key to the decisionmaking process within their organization; as a result, the number and position of respondents will vary across organizations. Accordingly, the type and number of respondents presented in the table below represent an "average" decisionmaking team at a community health organization, consisting of five members (one director, one administrator, and three providers).
The estimated burden for data collection is 920 hours across 500 participants. Using the median hourly wage estimates reported in the Bureau of Labor Statistics (BLS) May 2009 National Occupational Employment and Wage Estimates and a loading rate of 25%, the estimated total cost to respondents is $61,715.45. A breakdown of these estimates is provided in Table 1.
Table 1. Estimated Burden for Data Collection
Form Name | No. of Respondents | No. of Responses per Respondent | Hours per Response | Total Hour Burden | Estimated Hourly Wage | Total Hour Cost

Health Center Directors¹
Baseline Survey | 100 | 1 | 0.50 | 50 | $96.59 | $4,829.50
Followup Survey | 100 | 2 | 0.50 | 100 | $96.59 | $9,659.00
Dissemination Evaluation Survey of the Packet | 100 | 1 | 0.17 | 17 | $96.59 | $1,642.03
Dissemination Evaluation Survey of the Implementation Webinar | 50 | 1 | 0.17 | 8.5 | $96.59 | $821.02
Dissemination Evaluation Survey of the Coaching Webinar | 50 | 1 | 0.17 | 8.5 | $96.59 | $821.02
Director Subtotal | 100 | | | 184 | | $17,772.57

Health Center Administrators²
Baseline Survey | 100 | 1 | 0.50 | 50 | $54.68 | $2,734.00
Followup Survey | 100 | 2 | 0.50 | 100 | $54.68 | $5,468.00
Dissemination Evaluation Survey of the Packet | 100 | 1 | 0.17 | 17 | $54.68 | $929.56
Dissemination Evaluation Survey of the Implementation Webinar | 50 | 1 | 0.17 | 8.5 | $54.68 | $464.78
Dissemination Evaluation Survey of the Coaching Webinar | 50 | 1 | 0.17 | 8.5 | $54.68 | $464.78
Administrator Subtotal | 100 | | | 184 | | $10,061.12

Practitioners³
Baseline Survey | 300 | 1 | 0.50 | 150 | $61.38 | $9,207.00
Followup Survey | 300 | 2 | 0.50 | 300 | $61.38 | $18,414.00
Dissemination Evaluation Survey of the Packet | 300 | 1 | 0.17 | 51 | $61.38 | $3,130.38
Dissemination Evaluation Survey of the Implementation Webinar | 150 | 1 | 0.17 | 25.5 | $61.38 | $1,565.19
Dissemination Evaluation Survey of the Coaching Webinar | 150 | 1 | 0.17 | 25.5 | $61.38 | $1,565.19
Practitioner Subtotal | 300 | | | 552 | | $33,881.76

Total | 500 | | | 920 | | $61,715.45
¹ Based on the BLS labor category Chief Executives.
² Based on the BLS labor category Medical and Health Services Managers.
³ Based on an average of wage estimates for three types of providers: General Practitioner (MD), Physician Assistant, and Registered Nurse.
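For clarity, the short Python sketch below reproduces the arithmetic behind one illustrative row of Table 1 (health center directors completing the followup survey). The unloaded wage shown is simply back-calculated from the $96.59 loaded rate and the 25% loading rate stated above; it is not quoted independently from BLS.

```python
# Reproduce one row of Table 1: directors completing the followup survey.
respondents = 100          # number of health center directors
responses_each = 2         # the followup survey is administered twice
hours_per_response = 0.50  # hours per response, from pilot testing
loaded_wage = 96.59        # loaded hourly wage used in Table 1
loading_rate = 0.25        # loading rate stated in the text

base_wage = round(loaded_wage / (1 + loading_rate), 2)            # implied unloaded wage: 77.27
total_hours = respondents * responses_each * hours_per_response   # 100.0 hours
total_cost = round(total_hours * loaded_wage, 2)                  # $9,659.00

print(f"Total hour burden: {total_hours}, implied base wage: ${base_wage}, cost: ${total_cost}")
```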
There are no capital, startup, operations, or maintenance costs to respondents associated with this project.
The estimated cost to the Government for the overall conduct of this evaluation is $1,226,402. This includes $1,166,402 for a 2-year contract (No. 283-10-0358) for evaluation activities associated with this particular project, plus approximately $30,000 per year in SAMHSA costs for one employee (GS-15) to devote 20% of his or her time to managing and overseeing the evaluation activities. Accordingly, the annualized cost to the Government is approximately $613,201.
This is a new collection of information.
Figure 2 outlines the key time points for the evaluation, collection and analysis of data, and the dissemination of findings.
Data collected during the present evaluation are intended to produce new knowledge about the effect of different dissemination strategies on the decision of whether to adopt a new patient-centered behavioral health intervention. These data will also shed light on the many factors that influence this decision. Given this, a crucial component of this project is the preparation and dissemination of reports, manuscripts, presentations, and other documents that clearly describe the results of the project so they are readily understood by the intended audience. Specific publication activities will include:
Annual Report outlining the data collection activities and an analysis of the baseline survey questions
Final Report including a detailed and comprehensive analysis of the survey questions and other relevant information gathered during the evaluation
Short Reports highlighting key evaluation findings and intended for dissemination to entities inside and outside CBHSQ, and for technical and nontechnical audiences
Briefing Materials for presentation to senior HHS, Administration, and Congressional audiences
Figure 2. Time Line for Program Activities
Survey data will be collected and stored in a dedicated Structured Query Language (SQL) database. This SQL database will be housed and maintained by MANILA data management staff. Quantitative and qualitative data sets will be exported from the database and imported into SPSS (quantitative data) or EZ-Text (qualitative data). Quantitative data will be obtained through multiple-choice and Likert-type scaled responses. Qualitative data will also be captured through several open-ended questions.
Data Cleaning: Following the completion of the survey, data will be imported into an SQL database. The survey administrator will manually screen for inadequately completed survey responses. If more than 25% of the required item responses are missing on a questionnaire, the participant who submitted it will be excluded from the analysis. During the data cleaning phase, CBHSQ will also examine and categorize text responses for each of the questions with “other” text response options. If a text response could be classified clearly into one of the predefined categories, CBHSQ will recode the response to that category.
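A minimal Python sketch of these two cleaning rules is shown below; the data frame, item names, and recode mapping are illustrative assumptions rather than the project's actual cleaning code.

```python
import pandas as pd

# Hypothetical survey responses; column names and values are illustrative only.
df = pd.DataFrame({
    "respondent_id": [1, 2, 3],
    "q1": [4, None, 5],
    "q2": [3, None, None],
    "q3": [5, None, 2],
    "q4": [2, 1, 4],
    "other_text": ["Motivational Interviewing", "peer support group", "CBT"],
})
required_items = ["q1", "q2", "q3", "q4"]

# Rule 1: exclude respondents missing more than 25% of required item responses.
missing_share = df[required_items].isna().mean(axis=1)
cleaned = df[missing_share <= 0.25].copy()

# Rule 2: recode "other" free-text answers that clearly fit a predefined category.
recode_map = {                      # illustrative mapping built during manual review
    "motivational interviewing": "MI",
    "cbt": "CBT",
}
key = cleaned["other_text"].str.strip().str.lower()
cleaned["other_recoded"] = key.map(recode_map).fillna(cleaned["other_text"])
print(cleaned)
```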
Data analysis will include descriptive statistics (frequencies, proportional frequencies, means, modes, standard deviations, and the number of nonresponders) as well as formal statistical comparisons using cross tabulations of the data (chi-square tests and measures of association, namely the gamma statistic for ordinal comparisons and Cramér's V for comparisons of nominal/categorical variables).
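As an illustration of these comparisons, the following Python sketch computes a chi-square test and Cramér's V for a hypothetical cross tabulation of evaluation group by adoption decision; the counts are invented solely to demonstrate the calculation.

```python
import numpy as np
from scipy import stats

# Hypothetical cross tabulation: evaluation group (rows) by adoption decision (columns).
#                  adopted  not adopted
table = np.array([[18,      32],    # Group 1: packet only
                  [27,      23]])   # Group 2: packet plus Webinars

chi2, p_value, dof, expected = stats.chi2_contingency(table)

# Cramer's V = sqrt(chi2 / (n * (min(rows, cols) - 1))).
n = table.sum()
min_dim = min(table.shape) - 1
cramers_v = np.sqrt(chi2 / (n * min_dim))

print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.3f}, Cramer's V = {cramers_v:.2f}")
```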
Aside from addressing the main evaluation objective—determining whether the identified dissemination strategies improve the likelihood that a patient-centered intervention will be adopted and implemented—CBHSQ will also examine the effect of followup time to determine whether any initial enthusiasm shown for adopting and implementing a CER-based intervention diminishes with time. In addition to examining differences in adoption and implementation outcomes, CBHSQ will examine interactions (both between and within evaluation groups) between outcomes and a number of predefined characteristics of the enrolled organizations and the populations they serve. To do this, CBHSQ will identify, prior to the onset of the evaluation, the characteristics believed to be important moderators and will develop models that include them.
Analysis of complex sample survey data will take into account characteristics of the sample design, including stages of sample selection, clustering, stratification, and unequal probabilities of selection. CBHSQ’s analysis will begin with a series of exploratory univariate analyses; distributions will be determined and necessary transformations performed to ensure that the data can be modeled using general linear modeling techniques. A series of univariate and multivariate models will then be developed taking into account the clustered nature of the data (repeated measures). Exactly which techniques CBHSQ will use depends entirely on the type of data examined (e.g., continuous, dichotomous, count) and the distribution characteristics of the data, but they are likely to include generalized estimating equations (GEE) and mixed-model linear regression analysis.
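A minimal sketch of such a model, using Python's statsmodels package, appears below: a GEE with repeated measures clustered within organizations and a group-by-time interaction. The simulated data, variable names, and exchangeable working correlation are illustrative assumptions, not the project's analytic specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Simulate illustrative repeated-measures data: 100 organizations, 3 time points each.
rng = np.random.default_rng(0)
n_orgs, n_times = 100, 3
data = pd.DataFrame({
    "org_id": np.repeat(np.arange(n_orgs), n_times),
    "group": np.repeat(rng.integers(0, 2, n_orgs), n_times),  # 0 = packet only, 1 = packet + Webinars
    "time": np.tile(np.arange(n_times), n_orgs),              # 0 = baseline, 1 and 2 = followups
})
# Hypothetical binary outcome: decision to adopt MI, with probability rising by group and time.
logit = -1.0 + 0.6 * data["group"] + 0.4 * data["time"] + 0.3 * data["group"] * data["time"]
data["adopted"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# GEE with an exchangeable working correlation to account for clustering within organizations.
model = smf.gee(
    "adopted ~ group * time",
    groups="org_id",
    data=data,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = model.fit()
print(result.summary())
```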
The expiration date for OMB approval will be displayed.
This collection of information involves no exceptions to the Certification for Paperwork Reduction Act Submissions.