OMB: 0930-0327


RECOVERY: Increasing Adoption of Patient Centered Behavioral Health Research by Primary and Behavioral Health Providers and Systems


Supporting Statement


B. Statistical Methods


1. Respondent Universe and Sampling Methods


The protocol for this evaluation calls for the participation of 50 community health centers and 50 community behavioral health centers. These 100 centers will be recruited from the membership of MANILA’s two subcontracting organizations, the National Association of Community Health Centers (NACHC) and the National Council for Community Behavioral Healthcare (NCCBH). The combined membership of these two organizations exceeds 2,600 community health organizations.


Invitations to participate in the evaluation will be sent via email to the entire membership of the NACHC and the NCCBH. These invitations will provide the rationale for the evaluation, an overview of the evaluation protocol, and details of the expected burden associated with participation. While CBHSQ cannot anticipate the number of community health centers that will volunteer, based on the previous experience of NACHC and NCCBH, CBHSQ expects this number to be approximately 260 to 300. This pool of volunteer community health centers will define the sampling frame for this project.


Participation in this evaluation is voluntary; it is therefore important to know whether the sampling frame is representative of the entire membership of NACHC and NCCBH. For example, the community health centers that volunteer for the evaluation are likely to be “early adopters” of a new evidence-based practice (EBP). To assess representativeness, NACHC and NCCBH will query their membership databases to determine whether differences exist between the centers that volunteered and those that did not. Variables considered in this analysis will include size of organization (number of individuals served, number of practitioners employed, etc.), demographics of individuals served, geographic location (State, urban, suburban, rural, etc.), and financial capacity.
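As a rough illustration, such a volunteer/non-volunteer comparison might proceed along the following lines. The project specifies SAS and SPSS for its analyses, so this Python sketch is purely illustrative; the file name and field names (volunteered, n_served, n_practitioners, region) are hypothetical stand-ins for the actual NACHC/NCCBH database fields.

```python
# Sketch: compare volunteer vs. non-volunteer centers on membership-database
# variables. All file and field names are hypothetical.
import pandas as pd
from scipy import stats

members = pd.read_csv("membership.csv")   # hypothetical NACHC/NCCBH export
vol = members[members["volunteered"] == 1]
non = members[members["volunteered"] == 0]

# Continuous variables (e.g., organization size): two-sample t-test
for var in ["n_served", "n_practitioners"]:
    t, p = stats.ttest_ind(vol[var].dropna(), non[var].dropna(), equal_var=False)
    print(f"{var}: t={t:.2f}, p={p:.3f}")

# Categorical variables (e.g., geographic location): chi-square test
table = pd.crosstab(members["region"], members["volunteered"])
chi2, p, dof, _ = stats.chi2_contingency(table)
print(f"region: chi2={chi2:.2f}, df={dof}, p={p:.3f}")
```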


Once the sampling frame has been established, information from NACHC and NCCBH will be used to group the community health centers and community behavioral health centers by services provided, size, financial capacity, demographics of the population served, and similar characteristics. The purpose of this procedure is to identify pairs of centers that are as similar to one another as possible; one possible matching approach is sketched below.
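The document does not specify a matching algorithm; the following Python sketch shows one common approach (greedy nearest-neighbor pairing on standardized characteristics) purely for illustration. The file, the "id" column, and the matching variables are hypothetical.

```python
# Sketch: greedy nearest-neighbor pairing of centers on standardized
# characteristics. File, "id" column, and feature names are hypothetical.
import numpy as np
import pandas as pd
from scipy.spatial.distance import pdist, squareform

centers = pd.read_csv("sampling_frame.csv")   # one row per volunteer center
features = ["n_served", "n_practitioners", "annual_budget"]
z = (centers[features] - centers[features].mean()) / centers[features].std()

dist = squareform(pdist(z.values))            # pairwise Euclidean distances
np.fill_diagonal(dist, np.inf)                # a center cannot pair with itself

pairs, used = [], set()
n = len(centers)
for i in np.argsort(dist, axis=None):         # consider closest pairs first
    a, b = divmod(int(i), n)
    if a != b and a not in used and b not in used:
        pairs.append((centers.loc[a, "id"], centers.loc[b, "id"]))
        used.update({a, b})
```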


Once pairs of similar health centers have been identified, 50 pairs will be randomly selected for inclusion in the evaluation. Each pair in the sampling frame will be assigned a number, and SAS will be used to randomly generate a list of 50 numbers from this frame; the corresponding pairs will constitute the sample for this evaluation. Once the sample of 50 matched pairs is established, one center from each pair will be randomly assigned (again using SAS) to the intervention group, with the other center serving as the comparison.
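The document specifies SAS for this randomization; the Python sketch below merely illustrates the same two-stage logic (random selection of 50 pairs, then within-pair random assignment). The center identifiers and the seed are hypothetical.

```python
# Sketch: two-stage randomization for a matched-pair design.
import random

random.seed(20110502)                          # fixed seed, for reproducibility

# Hypothetical stand-in for the matched pairs produced in the previous step.
pairs = [(f"center_{2 * i}", f"center_{2 * i + 1}") for i in range(130)]

selected = random.sample(pairs, k=50)          # stage 1: draw 50 pairs

assignments = {}
for center_a, center_b in selected:            # stage 2: assign within pair
    intervention, comparison = random.sample([center_a, center_b], k=2)
    assignments[intervention] = "intervention"
    assignments[comparison] = "comparison"
```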


Each organization will be asked to nominate individuals typically involved in the decisionmaking process pertaining to the adoption and implementation of clinical practice. Given the heterogeneity in size and capacity of the health centers that represent the membership of NACHC and NCCBH, CBHSQ expects the number and position of nominated respondents will vary considerably across organizations; however, CBHSQ anticipates that an “average” decisionmaking team at a community health center will consist of one director, one administrator, and three healthcare providers.

2. Information Collection Procedures


Survey data will be collected from individuals involved in the adoption decision at each of the 100 participating health centers using a Web-based survey platform (Qualtrics), at three time points: baseline (prior to exposure to the dissemination strategy), 1 month postexposure, and 9 months postexposure.


At each data collection point, respondents will be notified by email (see Attachment J) that the Web-based survey is open, and a hyperlink to the survey Web site will be provided. The survey will be accessible to respondents 24 hours a day for a total of 2 weeks at each of the three data collection time points. Upon entering the survey, invited respondents will view an introduction page that explains the survey objectives and stresses the importance of participation. The introduction page will be followed by a page with specific instructions on how to complete the survey. Respondents will respond to the survey items by clicking on precoded options for closed-ended items and typing in “text boxes” for open-ended items. The survey also contains a “skip logic” pattern that ensures respondents are required to complete only the questions relevant to them.
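Skip logic is configured within the Qualtrics survey editor rather than programmed directly; the toy routing table below is only a conceptual illustration of the mechanism, with hypothetical questions and routes.

```python
# Toy illustration of "skip logic": the answer to one item determines which
# item is shown next. Questions and routes are hypothetical.
questions = {
    "q1": ("Does your center currently use any EBP? (yes/no)",
           {"yes": "q2", "no": "q3"}),
    "q2": ("Which EBPs are in use?", {None: "q3"}),
    "q3": ("How many providers does your center employ?", {None: None}),
}

current = "q1"
while current is not None:
    text, routes = questions[current]
    answer = input(f"{text} ").strip().lower()
    current = routes.get(answer, routes.get(None))  # route on answer, else default
```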


Following data collection, survey responses will be compiled and formally assessed for data quality to produce a finalized database for statistical analyses. Surveys that contain incomplete data will be flagged, and the data management team will be notified. Incomplete response data pose a substantial threat to confident interpretation and generalization of the evaluation results; consequently, any respondent who submits an incomplete survey will be contacted by phone and asked to complete and resubmit it. CBHSQ will exclude all surveys in which respondents answered fewer than 25% of the questions. To minimize incomplete data, respondents will be prompted when a missing entry is detected and asked to confirm whether they intended to leave the entry blank.
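A minimal sketch of how the flagging and the 25% exclusion rule might be implemented is shown below; the export file and column names are hypothetical, and item columns are assumed to hold missing values for unanswered questions.

```python
# Sketch: flag incomplete surveys and apply the 25% completion rule.
# File and column names are hypothetical; unanswered items are assumed NaN.
import pandas as pd

responses = pd.read_csv("survey_export.csv")      # hypothetical survey export
item_cols = [c for c in responses.columns if c.startswith("q")]

completion = responses[item_cols].notna().mean(axis=1)
responses["flag_incomplete"] = completion < 1.0   # follow up by phone
responses["exclude"] = completion < 0.25          # answered <25% of questions

print(responses[["flag_incomplete", "exclude"]].sum())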


Survey data will be collected and stored in a dedicated SQL database housed and maintained by MANILA data management staff. Quantitative and qualitative data sets will be exported from the database and imported into SPSS (quantitative data) or EZ-Text (qualitative data).


As noted above, EZ-Text (developed for the Centers for Disease Control and Prevention by MANILA) will be used for coding and analysis of qualitative response data. For open-ended qualitative responses, a preliminary code list will be defined and then revised, as necessary, based on a review of the data to capture any additional common themes. Coding will be performed by two evaluation associates. Intercoder agreement will be tested by double-coding an initial set of responses; once 80% agreement has been reached, coding will proceed.
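A minimal sketch of the percent-agreement check against the 80% threshold follows; the codes shown are hypothetical. (Chance-corrected statistics such as Cohen's kappa are a common alternative, but the document specifies percent agreement.)

```python
# Sketch: percent agreement between two coders on a double-coded set of
# responses, checked against the 80% threshold. Codes are hypothetical.
def percent_agreement(codes_a, codes_b):
    assert len(codes_a) == len(codes_b)
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return matches / len(codes_a)

coder1 = ["barrier", "cost", "training", "cost", "leadership"]
coder2 = ["barrier", "cost", "training", "barrier", "leadership"]

agreement = percent_agreement(coder1, coder2)
verdict = "proceed" if agreement >= 0.80 else "reconcile and re-code"
print(f"Agreement: {agreement:.0%} -> {verdict}")
```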

3. Methods to Maximize Response Rates


Based on the extensive experience of the subcontractors (NACHC and NCCBH) in working with community and behavioral health organizations on similar projects, SAMHSA anticipates achieving an 80%–85% response rate for each survey administered. Although the particular surveys included in this project have not been used previously, prior research using the specific instruments included in the surveys reports response rates ranging from 82% to 96%, providing additional support for the projected response rate (Aarons, 2004; Haug et al., 2008; McGovern et al., 2004).

Because this evaluation is not anonymous, efforts can be made to maximize response rates and minimize the impact of nonresponse bias. A key aspect of the approach to maximizing response rates at each data collection point is to make numerous and varied contacts with each of the five respondents within each of the 100 sampled health centers. The sequence of contacts is as follows:


  • Preletter (via email). This letter introduces the project to each respondent and informs them that an invitation to take the first of three to five surveys will arrive soon.

  • Initial invitation (via email). An invitation containing a hyperlink to the survey instrument will be sent to each participant at each of the 100 included health centers, approximately 5 days after the preletter and approximately 3 weeks before exposure to the dissemination package.

  • Second invitation (via email). One week after the initial invitation, nonrespondents will be identified and sent a second invitation, again containing a hyperlink to the survey instrument.

  • Telephone followup. Telephone followup will be initiated for all nonrespondents 10 days after the second invitation. The purpose of this call is to ensure that each enrollee has the opportunity to respond within the 2-week data collection window.


The sequence of events described above will be repeated at each of the three data collection periods. At the end of each data collection phase, a report showing the response characteristics of each participating health center will be generated and sent, via the subcontractors, to the primary point of contact at that center. This will allow each point of contact to examine the organization’s response characteristics and make efforts to improve response rates. A sketch of such a report appears below.
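A minimal sketch of an end-of-wave response report follows; the roster file and its fields (center_id, responded coded 0/1) are hypothetical.

```python
# Sketch: end-of-wave response report by health center.
# Roster file and fields are hypothetical; "responded" is assumed coded 0/1.
import pandas as pd

roster = pd.read_csv("respondent_roster.csv")     # one row per invited respondent
report = (roster.groupby("center_id")["responded"]
                .agg(invited="count", completed="sum"))
report["response_rate"] = report["completed"] / report["invited"]
report.to_csv("wave1_response_report.csv")        # sent to each point of contact
```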

CBHSQ intends to assess nonresponse bias as a function of the different stakeholder groups it targets. The email lists do not contain descriptive information that would identify the stakeholder group or groups with which an individual email address is associated. CBHSQ does know, however, that certain lists are predominantly associated with particular stakeholders (e.g., the American Medical Association [AMA] list will be made up largely of physicians and non-National Guideline Clearinghouse users; the National Network of Libraries of Medicine list will have a preponderance of medical librarians). CBHSQ may therefore be able to detect response bias by examining differences in response rates by the list from which an email address was drawn, and then triangulating those patterns to determine whether nonresponse is more or less likely among different types of member organizations (those representing providers, researchers, nonusers, etc.).
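One way such a list-level check might be carried out is sketched below; the data layout is hypothetical, and a chi-square test is one reasonable choice for comparing response rates across source lists.

```python
# Sketch: test whether response rates differ by source email list, as a rough
# check for list-level nonresponse bias. Data layout is hypothetical;
# "responded" is assumed coded 0/1.
import pandas as pd
from scipy.stats import chi2_contingency

frame = pd.read_csv("invitations.csv")            # columns: source_list, responded
table = pd.crosstab(frame["source_list"], frame["responded"])
chi2, p, dof, _ = chi2_contingency(table)

print(table.assign(rate=table[1] / table.sum(axis=1)))
print(f"chi2={chi2:.2f}, df={dof}, p={p:.3f}")
```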


4. Tests of Procedures


To test the Web-based survey procedures, MANILA will conduct a pretest of the Web-based survey with a subsample of no more than nine voluntary respondents, drawn from experts serving on the project’s Technical Advisory Panel and from members of the evaluation team. During pretesting, CBHSQ will ask each volunteer to take the survey in the presence of a MANILA evaluation analyst and to “think aloud” when answering each question. This allows the evaluation analyst to examine the thought processes of the respondent as he or she reads, interprets, and decides on an answer. The results of the pretesting will be used to refine the survey prior to field-testing. If fine-tuning of the survey instrument is required, OMB will be notified in a memorandum accompanied by a copy of the final version of the Web-based survey.


CBHSQ will also test the data-capture procedures to ensure the Web-based survey captures and renders responses correctly. Two members of the project team will do this by manually completing 10 surveys on hard copy, entering the same responses through the online data entry component, and comparing the outputs to ensure all data were captured correctly. A sketch of such a comparison appears below.
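A minimal sketch of the comparison step follows; the file names and respondent key are hypothetical, and both files are assumed to cover the same 10 test surveys and the same items.

```python
# Sketch: field-by-field comparison of manually keyed hard-copy surveys
# against the database export. File names and the key column are hypothetical;
# both files are assumed to have identical respondents and item columns.
import pandas as pd

manual = pd.read_csv("hardcopy_entry.csv").set_index("respondent_id").sort_index()
export = pd.read_csv("db_export.csv").set_index("respondent_id").sort_index()

mismatches = manual.compare(export)               # empty if capture is faithful
if mismatches.empty:
    print("All 10 test surveys captured correctly.")
else:
    print(mismatches)
```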

5. Statistical Consultants


The primary contractor (MANILA) will have overall responsibility for implementation and execution of the project, including data collection and analysis.


The primary contractor for this project is:


MANILA Consulting Group, Inc.

Gary Hill, Ph.D., Project Manager

1420 Beverly Road, Suite 220

McLean, VA 22101

571-633-9400, ext. 208

ghill@manilaconsulting.net


The project officer for the Federal Government is:

Kevin D. Hennessy, Ph.D.

Senior Advisor

Center for Behavioral Health Statistics and Quality

Substance Abuse and Mental Health Services Administration

One Choke Cherry Road, Room 7-1041

Rockville, MD 20857

240-276-2234

kevin.hennessy@samhsa.hhs.gov


References


Aarons, G. A. (2004). Mental health provider attitudes toward adoption of evidence-based practice: The Evidence-Based Practice Attitude Scale (EBPAS). Mental Health Services Research, 6(2), 61–74.


Allred, C., Markiewicz, J., Amaya-Jackson, L., Putnam, F., Saunders, B., Wilson, et al. (2005). The organizational readiness and capacity assessment. Durham, NC: UCLA-Duke National Center for Child Traumatic Stress.


Ashforth, B. E. (2001). Role transitions in organizational life: An identity-based perspective. Mahwah, NJ: Erlbaum.


Barber, R., Boote, J. D., & Cooper, C. L. (2007). Involving consumers successfully in NHS research: A national survey. Health Expectations, 10, 380–391.


Boote, J., Barber, R., & Cooper, C. (2006). Principles and indicators of successful consumer involvement in NHS research: Results of a Delphi study and subgroup analysis. Health Policy, 75, 280–297.


Boswell, W. R., Boudreau, J. W., & Tichy, J. (2005). The relationship between employee job change and job satisfaction: The honeymoon-hangover effect. Journal of Applied Psychology, 90, 882–892.


CBO (Congressional Budget Office). (2007). Research on the comparative effectiveness of medical treatments. Publication No. 2975.


Estabrooks, C. A., Floyd, J. A., Scott-Findlay, S., O’Leary, K. A., & Gushta, M. (2003). Individual determinants of research utilization: A systematic review. Journal of Advanced Nursing, 43(5), 506–520.


Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa, FL: The National Implementation Research Network.


Frueh, B. C., Grubaugh, A. L., Cusack, K. J., & Elhai, J. D. (2009). Disseminating evidence-based practices for adults with PTSD and severe mental illness in public-sector mental health agencies. Behavior Modification, 33(1), 66–81.


Gold, P. B., Glynn, S. M., & Mueser, K. T. (2006). Challenges to implementing and sustaining comprehensive mental health service programs. Evaluation & the Health Professions, 29(2), 195–218.


Greenhalgh, T., Robert, G., Bate, P., Kyriakidou, O., Macfarlane, F., & Peacock, R. (2004). How to spread good ideas: A systematic review of the literature on diffusion, dissemination and sustainability of innovations in health service delivery and organization. London: National Coordinating Centre for NHS Service Delivery and Organization R&D.


Haug, N. A., Shopshire, M., Tajima, B., Gruber, V., & Guydish, J. (2008). Adoption of evidence-based practices among substance abuse treatment providers. Journal of Drug Education, 38(2), 181–192.


Helmreich, R. L., Sawin, L. L., & Carsrud, A. L. (1986). The honeymoon effect in job performance: Temporal increases in the predictive power of achievement motivation. Journal of Applied Psychology, 71(2), 185–188.


HHS (Department of Health and Human Services), U.S. Public Health Service. (2000). Mental health: A report of the Surgeon General. Washington, DC: Author.


IOM (Institute of Medicine). (2009). Initial national priorities for comparative effectiveness research. Washington, DC: Institute of Medicine.


McGovern, M. P., Fox, T. S., Xie, H., & Drake, R. E. (2004). A survey of clinical practices and readiness to adopt evidence-based practices: Dissemination research in an addiction treatment system. Journal of Substance Abuse Treatment, 26, 305–312.


Orlandi, M., Landers, C., Weston, R., & Haley, N. (1990). Diffusion of health promotion innovations. In Glanz, K., Lewis, M. F., & Rimer, B. K. (Eds.), Health behavior and health education: Theory, research and practice (pp. 228–313). San Francisco: Jossey-Bass.


Rubenstein, L. V., & Pugh, J. (2006). Strategies for promoting organizational and practice change by advancing implementation research. Journal of General Internal Medicine, 21, S58–S64.


Texas Christian University, Institute of Behavioral Research. (2002). The organizational readiness for change: Treatment director version (TCU ORC-D). Retrieved November 3, 2010, from www.ibr.tcu.edu.


Texas Christian University, Institute of Behavioral Research. (2005). The organizational readiness for change: Treatment staff version (TCU ORC-S). Retrieved November 3, 2010, from www.ibr.tcu.edu.


Texas Christian University, Institute of Behavioral Research. (2006). Survey of structure and operations (TCU SSO). Retrieved November 3, 2010, from www.ibr.tcu.edu.


Torrey, W. C., Drake, R. E., Dixon, L., Burns, B. J., Flynn, L., Rush, A. J., et al. (2001). Implementing evidence-based practices for persons with severe mental illnesses. Psychiatric Services, 52, 45–50.

List of Attachments


Attachment A: Baseline Survey, Director Version

Attachment B: Baseline Survey, Staff Version

Attachment C: Followup Survey, Director Version

Attachment D: Followup Survey, Staff Version

Attachment E: TA Evaluation Survey of the Packet

Attachment F: TA Evaluation Survey of the Training Webinar

Attachment G: TA Evaluation Survey of the Coaching Webinar

Attachment H: Copy of Survey Screen Displaying OMB Requirements

Attachment I: Institutional Review Board Letter of Approval

Attachment J: Email Correspondence to Participants




