SUPPORTING STATEMENT
Part A
Assessing the Impact of the National Implementation of TeamSTEPPS Master Training Program
OMB CONTROL NO. 0935-0170
Version: December 17, 2013
Agency for Healthcare Research and Quality (AHRQ)
Table of Contents
A. Justification
1. Circumstances that make the collection of information necessary
2. Purpose and use of information
3. Use of Improved Information Technology
4. Efforts to Identify Duplication
5. Involvement of Small Entities
6. Consequences if Information Collected Less Frequently
7. Special Circumstances
8. Consultation outside the Agency
9. Payments/Gifts to Respondents
10. Assurance of Confidentiality
11. Questions of a Sensitive Nature
12. Estimates of Annualized Burden Hours and Costs
13. Estimates of Annualized Respondent Capital and Maintenance Costs
14. Estimates of Annualized Cost to the Government
15. Changes in Hour Burden
16. Time Schedule, Publication and Analysis Plans
17. Exemption for Display of Expiration Date
List of Attachments
A. Justification
1. Circumstances that make the collection of information necessary
The mission of the Agency for Healthcare Research and Quality (AHRQ), as set out in its authorizing legislation, the Healthcare Research and Quality Act of 1999 (see Attachment A), is to enhance the quality, appropriateness, and effectiveness of health services, and access to such services, through the establishment of a broad base of scientific research and through the promotion of improvements in clinical and health systems practices, including the prevention of diseases and other health conditions. AHRQ shall promote health care quality improvement by conducting and supporting:
1. research that develops and presents scientific evidence regarding all aspects of health care; and
2. the synthesis and dissemination of available scientific evidence for use by patients, consumers, practitioners, providers, purchasers, policy makers, and educators; and
3. initiatives to advance private and public efforts to improve health care quality.
Also, AHRQ shall conduct and support research and evaluations, and support demonstration projects, with respect to (A) the delivery of health care in inner-city areas, and in rural areas (including frontier areas); and (B) health care for priority populations, which shall include (1) low-income groups, (2) minority groups, (3) women, (4) children, (5) the elderly, and (6) individuals with special health care needs, including individuals with disabilities and individuals who need chronic care or end-of-life health care.
As part of its effort to fulfill these mission goals, AHRQ, in collaboration with the Department of Defense’s (DoD) TRICARE Management Activity (TMA), developed TeamSTEPPS® (Team Strategies and Tools for Enhancing Performance and Patient Safety) to provide an evidence-based suite of tools and strategies for training health care professionals in teamwork-based patient safety. In 2007, AHRQ and DoD coordinated the national implementation of the TeamSTEPPS program. The main objective of this program is to improve patient safety by training a select group of stakeholders, such as Quality Improvement Organization (QIO) personnel, High Reliability Organization (HRO) staff, and healthcare system staff, in teamwork, communication, and patient safety concepts, tools, and techniques, and ultimately by helping to build a national infrastructure for supporting teamwork-based patient safety efforts in healthcare organizations and at the state level. The implementation includes training Master Trainers in various health care systems who are capable of stimulating the utilization and adoption of TeamSTEPPS in their health care delivery systems, providing technical assistance and consultation on implementing TeamSTEPPS, and developing various channels of learning (e.g., user networks, educational venues) for continued support and improvement of teamwork in health care. During this effort, AHRQ has trained a corps of 2,400 participants to serve as the Master Trainer infrastructure supporting national adoption of TeamSTEPPS. Participants in training become Master Trainers in TeamSTEPPS and are afforded the opportunity to observe the tools and strategies provided in the program in action. In addition to developing a corps of Master Trainers, AHRQ has also developed a series of support mechanisms for this effort, including a data collection Web tool, a TeamSTEPPS call support center, and a monthly consortium to address challenges encountered by implementers of TeamSTEPPS.
Participants applied to the program as teams representing their organization and were accepted as training participants after having completed an organizational readiness assessment. Due to the differences among the types of organizations participating in the program, each participant has a different potential to apply tools and concepts within and/or beyond their home organizations. For example:
Healthcare System staff (or implementers) from hospitals, home health agencies, nursing homes, large physician practices, and other direct care organizations are more likely than other participants to implement the TeamSTEPPS materials on a daily basis and will be more likely to affect specific work processes being conducted within an organization. As a result, health care system participants are likely to have a focused and specific impact within that organization only.
QIO/HRO/Hospital Association/State Health Department participants (or facilitators) will be more likely to have both an in-depth and a broad impact if they use the TeamSTEPPS materials to assist a particular organization in its patient safety activities, as well as to provide general patient safety guidance to a large number of organizations.
To clarify the differences among the participants, a logic model has been developed (see Attachment B) that highlights the roles of the different types of participants, the types of activities in which they are likely to engage post-training, and the potential outcomes that may stem from these activities. The logic model served as a guide for developing questions for a Web-based questionnaire and qualitative interviews to ensure that participant and leadership feedback is captured as thoroughly and accurately as possible.
To understand the extent to which this infrastructure of patient safety knowledge and skills has been created, AHRQ will conduct an evaluation of the National Implementation of TeamSTEPPS Master Training program. The goals of this evaluation are to examine the extent to which training participants have been able to:
1) implement the TeamSTEPPS products, concepts, tools, and techniques in their home organizations, and
2) spread that training, knowledge, and those skills to their organizations, local areas, regions, and states.
To achieve these goals the following two data collections will be implemented:
1) Training participant questionnaire (see Attachment C) to examine post-training activities and teamwork outcomes as a result of training from multiple perspectives. The questionnaire is directed to all master training participants. Items will cover post-training activities, implementation experiences, facilitators and barriers to implementation encountered, and perceived outcomes as a result of these activities. Advance notice, invitations to participate, reminder e-mails, and thank you letters to respondents are included in Attachment D for the participant questionnaire.
2) Semi-structured interviews (see Attachment E) will be conducted with members of organizations that participated in the TeamSTEPPS Master Training program. Information gathered from these interviews will be analyzed and used to draft a “lessons learned” document that will capture additional detail on the issues related to participants’ and organizations’ abilities to implement and disseminate TeamSTEPPS post-training. The organizations will vary in terms of type of organization (e.g., QIO or hospital association versus healthcare system) and region (i.e., Northeast, Midwest, Southwest, Southeast, Mid-Atlantic, West Coast). In addition, we will strive to make the site visits representative by ensuring that the distribution of organizations mirrors the distribution of organizations in the Master Training population. For example, if only one out of every five organizations in that population is a QIO, we will ensure that a maximum of two organizations in the site visit sample are QIOs. The interviews will more accurately reveal the degree of training spread for the organizations included. Interviewees will be drawn from qualified individuals serving in one of two roles (i.e., implementers or facilitators). The interview protocol will be adapted for each role based on the respondent group and, to some degree, for each individual based on their training and patient safety experience. Attachment F contains the informed consent form that each participant will be required to sign prior to beginning the interview.
This study is being conducted by AHRQ through its contractor, the Health Research & Educational Trust (HRET), pursuant to AHRQ’s statutory authority to conduct and support research on healthcare and on systems for the delivery of such care, including activities with respect to the quality, effectiveness, efficiency, appropriateness and value of healthcare services and with respect to quality measurement and improvement. 42 U.S.C. 299a(a)(1) and (2).
2. Purpose and use of information
The National Implementation of TeamSTEPPS program represents a new approach to team training for AHRQ. This program focused on disseminating teamwork, communication, and patient safety information, building skill sets, and ultimately fostering a national network of individuals who support and promote the adoption of TeamSTEPPS. To meet the objective of creating a national infrastructure, AHRQ first required that each organization’s application include a readiness assessment and a letter of endorsement and support from each team member’s Chief Executive Officer (CEO) or equivalent regarding the organization’s participation in the program. In addition, participants were selected, in part, on the expectation that they would later disseminate their newly acquired knowledge and skills to others in their home organizations or in other organizations locally, regionally, or state-wide. The program involved training teams of participants, led by representatives from either Quality Improvement Organizations (QIOs) or individual hospitals or healthcare systems.
As a result of the time and commitment invested over the three years of program implementation, AHRQ seeks to learn the extent to which it has succeeded in spreading TeamSTEPPS and forming the requisite infrastructure to support ongoing and future patient safety efforts.
The final product for this evaluation will be a report that documents the background, methodology, results including any patterns or themes emerging from the data, limitations of the study, and recommendations for future training programs and tool development. The results of this evaluation will help AHRQ understand the extent to which participants and participating organizations have been able to employ various TeamSTEPPS tools and concepts and the barriers and facilitators they encountered. This information will help guide AHRQ in developing and refining other patient safety tools and future training programs for patient safety and other areas.
3. Use of Improved Information Technology
In order to reduce respondent burden, the training participant questionnaire will be administered via the Web. The contact lists acquired by AHRQ and DoD will be used to develop the questionnaire distribution lists. Each potential respondent will receive a minimum of four e-mail contacts to encourage participation (i.e., an advance notice of the questionnaire, an initial invitation to complete the questionnaire, and two follow-up e-mails to remind respondents to complete the questionnaire).
Using an on-line system for data collection rather than a paper-based questionnaire makes completing and submitting the questionnaire less time-consuming for respondents. Any skip patterns included in the questionnaire (i.e., questions that are only appropriate for a proportion of the respondents) will be automatically programmed into the Web-based form of the questionnaire, thereby eliminating any confusion during questionnaire completion. In addition, the contractors can also ensure that important items are not inadvertently skipped or ignored by setting software requirements to ensure proper completion of questionnaires based on specific respondent selections.
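For illustration, the sketch below shows one way that skip-pattern and required-item logic of this kind can be encoded. It is a minimal sketch only: the question identifiers, response options, and rules are hypothetical and do not represent the actual questionnaire items or the survey software that will be used.

```python
# Minimal, hypothetical sketch of skip-pattern and required-item logic.
# Question IDs, wording, and rules are illustrative placeholders only.

QUESTIONS = {
    "q1_used_tools": {
        "text": "Have you used any TeamSTEPPS tools since training?",
        "required": True,
    },
    "q2_which_tools": {
        "text": "Which tools have you used?",
        "required": True,
        # Skip pattern: shown only if q1_used_tools was answered "yes".
        "show_if": ("q1_used_tools", "yes"),
    },
}

def visible_questions(responses):
    """Question IDs a respondent should see once skip patterns are applied."""
    visible = []
    for qid, question in QUESTIONS.items():
        condition = question.get("show_if")
        if condition is None or responses.get(condition[0]) == condition[1]:
            visible.append(qid)
    return visible

def missing_required(responses):
    """Required items that are visible to this respondent but left unanswered."""
    return [qid for qid in visible_questions(responses)
            if QUESTIONS[qid]["required"] and not responses.get(qid)]

# A respondent who answers "no" never sees the follow-up item; a respondent
# who answers "yes" cannot submit until the follow-up item is completed.
print(missing_required({"q1_used_tools": "no"}))   # []
print(missing_required({"q1_used_tools": "yes"}))  # ['q2_which_tools']
```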
4. Efforts to Identify Duplication
AHRQ’s interagency agreement with the DoD included conducting the training sessions and training evaluation. As part of this agreement, DoD and AHRQ have collected traditional participant feedback via end-of-course questionnaires. These efforts were used to make revisions and improve the program and are not redundant with the proposed evaluation effort. This study takes the next step and builds upon the information already collected by AHRQ and DoD by first enabling the development of a logic model that depicts the relationship between participants’ patient safety roles, possible activities in which they may engage post-training, and resulting outcomes. Second, this information provides the basis by which questions to participants and their organizational leaders could be streamlined to enhance AHRQ’s understanding of the linkages between roles, activities, and outcomes. For example, participant responses to previous interview questions have been leveraged to reformulate open-ended questions into closed-ended response options, further reducing the burden on the respondents. Items have been designed to minimize redundancy with the data already collected and delve more deeply into post-training issues.
5. Involvement of Small Entities
No small businesses will be involved in this study.
6. Consequences if Information Collected Less Frequently
This request is for a one-time data collection effort.
7. Special Circumstances
This request is consistent with the general information collection guidelines of 5 CFR 1320.5(d)(2). No special circumstances apply.
8. Consultation outside the Agency
As required by 5 CFR 1320.8(d), a notice was published in the Federal Register on August 27, 2013, for 60 days (see Attachment G). No comments were received. The following individuals outside the agency were consulted:
American Institutes for Research
1. Deborah Milne
2. David P. Baker
3. Margarita Hurtado
4. Laura Steighner
5. Kristin Carman
Department of Defense, Office of the Secretary of Defense, TRICARE Management Activity
1. Heidi King
University of Central Florida and Booz Allen Hamilton
1. Michael Rosen
9. Payments/Gifts to Respondents
Respondents to the Web-based questionnaire and the semi-structured interviews will not receive any gifts or payments in exchange for their participation.
10. Assurance of Confidentiality
Individuals and organizations will be assured of the confidentiality of their replies under Section 934(c) of the Public Health Service Act, 42 USC 299c-3(c). They will be told the purposes for which the information is collected and that, in accordance with this statute, any identifiable information about them will not be used or disclosed for any other purpose.
This statutory protection is likely to withstand a Freedom of Information Act (FOIA) request for information that would identify a person or establishment.
Names and business contact information for TeamSTEPPS Master Training participants will be taken from a pre-established records system belonging to AHRQ and DoD. This records system was developed through participating organizations’ applications to the program and subsequent communications related to the training program. The list provided will include each training participant’s name, e-mail address, organizational affiliation, state, and job title.
This information will be used solely to recruit and contact respondents. An HRET staff member will contact those individuals by e-mail to inform them of the study, its purpose, and request their participation. The invitation e-mail and questionnaire responses will be stored on a secure server at HRET.
Identifiable questionnaire responses and semi-structured interview responses will be accessible only to members of the HRET project team all of whom have signed affidavits of nondisclosure. Responses will be de-identified using ID numbers corresponding to each respondent. HRET project staff will be instructed that any downloaded or print versions of the questionnaire results or interview notes are only to be shared within the HRET project team, and are to be de-identified (using ID numbers instead of names and/or organizational or state affiliations). All de-identified data files will be stored on HRET’s secure server, which is password-protected. Only HRET project staff will be able to access the data files and server.
The data files containing identifiable information will be destroyed at the end of the project. All remaining data files will include de-identified information and will be provided to AHRQ at the end of the contract.
Results will be reported in the aggregate, and no information collected by HRET will be shared with persons outside the project. Reports on questionnaire results or semi-structured interviews will not reveal the identity of the respondents unless they provide specific, written permission authorizing its release.
The explanations regarding confidentiality provided to respondents are included in the e-mail correspondence requesting their participation. The explanations are also included in the introductory page of the questionnaire. These statements assure respondents that the information they provide will be “treated in a confidential manner” by HRET researchers and AHRQ. The text for both of these communications is shown below.
Text for both the initial screen of the Web-based questionnaire and the e-mail correspondence: Please note that all of your information will remain confidential and that all information provided to AHRQ as a result of this questionnaire will be reported at the aggregate level to ensure your confidentiality.
11. Questions of a Sensitive Nature
Questionnaire items and semi-structured interviews do not require respondents to provide information of a sensitive nature as defined by OMB and DHHS or to provide information such as Social Security numbers or Medicare/Medicaid numbers. HRET has developed an introduction to the questionnaire that includes aspects of informed consent, such as a description of the research objectives, a discussion of the importance of respondents’ input and experiences, details concerning how the data will be used, and an explanation of confidentiality protections. The introduction will be positioned at the beginning of the questionnaire. Continuing to complete the questionnaire will indicate the respondent’s consent.
12. Estimates of Annualized Burden Hours and Costs
Exhibit 1 shows the estimated annualized burden hours for the respondents’ time to participate in the study. Semi-structured interviews will be conducted with a maximum of 9 individuals from each of 9 participating organizations and will last about one hour each. The training participant questionnaire will be completed by approximately 10 individuals from each of about 240 organizations and is estimated to require 20 minutes to complete. The total annualized burden is estimated to be 881 hours.
Exhibit 2 shows the estimated annualized cost burden based on the respondents’ time to participate in the study. The total cost burden is estimated to be $38,923.
Exhibit 1. Estimated annualized burden hours
Form Name | Number of respondents | Number of responses per respondent | Hours per response | Total burden hours
Semi-structured interview | 9 | 9 | 60/60 | 81
Training participant questionnaire | 240 | 10 | 20/60 | 800
Total | 249 | NA | NA | 881
Exhibit 2. Estimated annualized cost burden
Form Name | Number of respondents | Total burden hours | Average hourly wage rate* | Total cost burden
Semi-structured interview | 9 | 81 | $44.18 | $3,579
Training participant questionnaire | 240 | 800 | $44.18 | $35,344
Total | 249 | 881 | NA | $38,923
* Based upon the mean of the average wages for all health professionals (29-0000) for the training participant questionnaire and for executives, administrators, and managers for the organizational leader questionnaire presented in the National Compensation Survey: Occupational Wages in the United States, May 2012, U.S. Department of Labor, Bureau of Labor Statistics. http://www.bls.gov/oes/current/oes_nat.htm#37-0000
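The figures in Exhibits 1 and 2 follow directly from the assumptions stated above (9 interview organizations with 9 one-hour interviews each, about 240 questionnaire organizations with roughly 10 respondents each at 20 minutes per response, and an average hourly wage of $44.18). The short sketch below simply restates that arithmetic, using only the numbers given in this section.

```python
# Reproduces the burden arithmetic in Exhibits 1 and 2; all inputs are the
# figures stated in this section of the supporting statement.

HOURLY_WAGE = 44.18  # average hourly wage rate used in Exhibit 2

# form name: (number of respondents, responses per respondent, hours per response)
collections = {
    "Semi-structured interview": (9, 9, 60 / 60),
    "Training participant questionnaire": (240, 10, 20 / 60),
}

total_hours = 0.0
total_cost = 0.0
for name, (respondents, responses_each, hours_per_response) in collections.items():
    burden_hours = respondents * responses_each * hours_per_response
    cost = burden_hours * HOURLY_WAGE
    total_hours += burden_hours
    total_cost += cost
    print(f"{name}: {burden_hours:.0f} hours, ${cost:,.0f}")

print(f"Total: {total_hours:.0f} hours, ${total_cost:,.0f}")
# Expected output: 81 and 800 hours ($3,579 and $35,344), totaling 881 hours
# and $38,923, matching Exhibits 1 and 2.
```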
13. Estimates of Annualized Respondent Capital and Maintenance Costs
Capital and maintenance costs would include the purchase of equipment, computers or computer software or services, or storage facilities for records as a result of complying with this data collection. This data collection imposes no such costs, and there are no direct costs to respondents other than their time to participate in the study.
14. Estimates of Annualized Cost to the Government
The total contractor cost to the government to administer the one-time questionnaire, conduct nine site visits, and analyze and present all results is estimated to be $181,521. As shown in Exhibit 3a, this amount includes costs for developing the data collection tools ($24,889), collecting the data ($108,667), analyzing the data ($35,061), and reporting the findings ($12,903).
Exhibit 3a. Estimated Total Contractor Cost
Cost Component | Total Cost
Project Development | $24,889
Data Collection Activities | $108,667
Data Processing and Analysis | $35,061
Publication of Results | $12,903
Total | $181,521
A Social Science Analyst (the Project Officer) will be responsible for project management and oversight. This will include oversight of the one-time data collection approach and review of the report of summarized results. The estimated cost to the Federal Government for these activities is provided in Exhibit 3b. The hourly salary for a Social Science Analyst at the GS-15 grade level, Step 10, is $75.28. Federal hourly salary information is available on the OPM website at https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2014/DCB_h.pdf.
Exhibit 3b. Federal Government Personnel Cost
Activity | Federal Personnel | Hourly Rate | Estimated Hours | Cost
Data Collection Oversight | Social Science Analyst | $75.28 | 6 | $451.68
Review of Results | Social Science Analyst | $75.28 | 4 | $301.12
Total | | | | $752.80
The estimated total annualized cost for this activity is $182,274. This cost includes contractor costs ($181,521) and Federal personnel costs ($752.80).
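As a simple cross-check on Exhibits 3a and 3b, the sketch below recomputes the Federal personnel cost from the stated hourly rate and estimated hours and adds the contractor total as reported in Exhibit 3a; all inputs are taken from this section.

```python
# Recomputes the Federal personnel cost (Exhibit 3b) and the total annualized
# cost to the government; the contractor total is taken as stated in Exhibit 3a.

GS15_STEP10_HOURLY = 75.28      # Social Science Analyst hourly rate
CONTRACTOR_TOTAL = 181_521      # Exhibit 3a total, as stated

federal_hours = {"Data Collection Oversight": 6, "Review of Results": 4}
federal_cost = sum(hours * GS15_STEP10_HOURLY for hours in federal_hours.values())

print(f"Federal personnel cost: ${federal_cost:,.2f}")                    # $752.80
print(f"Total annualized cost: ${CONTRACTOR_TOTAL + federal_cost:,.2f}")  # about $182,274
```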
15. Changes in Hour Burden
This is a new collection of information.
16. Time Schedule, Publication and Analysis Plans
The time schedule for the data collection, data analysis, and final report preparation is presented in Exhibit 4.
Exhibit 4. Timeframe for data collection, analysis, and preparation of final report
Data Collection and Analysis | Timeframes
Conduct semi-structured interviews | Immediately upon OMB approval
Administer training participant questionnaire | Immediately upon OMB approval
Analyze data | 60 days from end of data collection
Prepare final report | 90 days from end of data analysis
HRET will analyze the survey data to identify trends in usage of the TeamSTEPPS curriculum, as well as the perceived impact of the program on organizational outcomes. To that end, we propose a three-phase analysis that includes (1) ensuring the quality of the data collected, (2) conducting descriptive analyses, and (3) conducting comparisons of specific master trainer types and cohorts. These phases of analyses are described below.
To ensure maximum integrity of the results, we will conduct several data screening and checking procedures (Tabachnick & Fidell, 1996). Specifically, we will perform data quality checks by searching for deviant response ranges, anomalous response patterns, excessive missing data, extreme outliers, and highly skewed or irregular distributions. From these analyses, we will flag, correct, and/or eliminate faulty data or data of poor measurement quality. For example, excessive missing data is an indicator of poor data quality; we will identify respondents who fail to respond to more than 10 percent of the questions on the Web survey and review their pattern of responses more carefully to determine whether, for example, we should eliminate the respondent’s data. Any strategies that result in the elimination of data would first be discussed with AHRQ representatives and then fully documented in the final report.
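To make these screening steps concrete, the sketch below shows one possible implementation of the missing-data check and basic distribution screening using Python and pandas. It is illustrative only; the item columns, the threshold handling, and the data layout are assumptions rather than the final analysis code.

```python
# Illustrative data-screening sketch; column names and data layout are
# hypothetical and do not reflect the actual survey data file.
import pandas as pd

def flag_excessive_missing(df: pd.DataFrame, item_columns: list,
                           max_missing_frac: float = 0.10) -> pd.DataFrame:
    """Return respondents whose share of unanswered items exceeds 10 percent,
    for manual review before any data are eliminated."""
    missing_frac = df[item_columns].isna().mean(axis=1)
    return df.assign(missing_frac=missing_frac).loc[missing_frac > max_missing_frac]

def screening_statistics(df: pd.DataFrame, item_columns: list) -> pd.DataFrame:
    """Per-item range, skewness, and missingness used to spot deviant ranges,
    highly skewed distributions, and other data quality problems."""
    items = df[item_columns]
    return pd.DataFrame({
        "min": items.min(),
        "max": items.max(),
        "skew": items.skew(),
        "pct_missing": items.isna().mean() * 100,
    })
```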
For the descriptive analyses, HRET will employ the following approach: (1) compute a number of descriptive statistics for each variable measured by the survey; (2) develop early warning data protocols (specific statistical analyses to indicate significant variability, low response rates, or error in the data); (3) conduct item analyses; and (4) conduct comprehensive group and subgroup analyses.
HRET will calculate frequency distributions, means, and standard deviations for each closed-ended item included in the survey and combinations of related items that focus on a particular variable or issue. In addition, we will calculate these statistics for each subgroup represented in the sample (e.g., year of training attendance) and conduct analyses to identify subgroup differences. Frequency distributions will show the percentage of people who responded to each response option for each item included in the protocol. Means and standard deviations will be used to examine the relative importance of different items and item combinations that measure specific issues associated with each survey. Finally, standard deviations will be used to examine the level of agreement among respondents regarding issues that are identified as important.
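A minimal sketch of these descriptive analyses is shown below, again assuming survey responses in a pandas data frame; the item names and the grouping column (a hypothetical training_year variable standing in for year of training attendance) are placeholders.

```python
# Illustrative descriptive-analysis sketch; item and subgroup column names
# are hypothetical placeholders.
import pandas as pd

def item_descriptives(df: pd.DataFrame, items: list) -> pd.DataFrame:
    """Means and standard deviations for closed-ended items."""
    return pd.DataFrame({"mean": df[items].mean(), "std": df[items].std()})

def item_frequencies(df: pd.DataFrame, item: str) -> pd.Series:
    """Percentage of respondents selecting each response option for one item."""
    return df[item].value_counts(normalize=True).mul(100).round(1)

def subgroup_descriptives(df: pd.DataFrame, items: list,
                          group_col: str = "training_year") -> pd.DataFrame:
    """Means and standard deviations for each subgroup (e.g., training year)."""
    return df.groupby(group_col)[items].agg(["mean", "std"])
```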
A few items will be open-ended as a means of following up on closed-ended items to obtain richer detail on unique activities being conducted post-training. We will record and compile the individual responses to the open-ended items and examine the results for any themes or patterns of interest. If appropriate, codes will be defined based on the themes identified and the open-ended responses will be coded into closed-ended categories, which will then be tabulated. Otherwise, the results will be summarized in a memo, as described in the proposed analyses for the semi-structured interviews.
The data gathered from the survey will allow us to examine differences among the roles that master trainers may fill. For example, we will compare each TeamSTEPPS tool and strategy by averaging the usefulness ratings of each tool across all participants and then conducting t-tests to assess whether any subgroup differences are statistically significant. In addition to comparing tool usefulness, we will also compare tool usage and perceived impact, and will analyze variations in these characteristics by training participant type, again using t-tests to test for differences by type. For example, we will analyze how useful a specific tool is for implementers as opposed to those master trainers whose roles are primarily to facilitate implementation or training of TeamSTEPPS.
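The sketch below illustrates one way such a comparison might be run, shown here as an independent-samples (Welch's) t-test using scipy; the role labels and rating column names are hypothetical placeholders, and the final analysis approach may differ.

```python
# Illustrative subgroup comparison; the "role" and rating column names are
# hypothetical, and Welch's t-test is shown as one common choice.
import pandas as pd
from scipy import stats

def compare_roles(df: pd.DataFrame, rating_col: str, role_col: str = "role") -> dict:
    """Compare mean usefulness ratings between implementers and facilitators."""
    implementers = df.loc[df[role_col] == "implementer", rating_col].dropna()
    facilitators = df.loc[df[role_col] == "facilitator", rating_col].dropna()
    t_stat, p_value = stats.ttest_ind(implementers, facilitators, equal_var=False)
    return {
        "implementer_mean": implementers.mean(),
        "facilitator_mean": facilitators.mean(),
        "t_statistic": t_stat,
        "p_value": p_value,
    }
```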
17. Exemption for Display of Expiration Date
AHRQ does not seek this exemption.
List of Attachments
Attachment A: Healthcare Research and Quality Act of 1999
Attachment B: Logic Model of Post-TeamSTEPPS Master Training Patient Safety Activities and Outcomes by Participant Role
Attachment C: Training participant questionnaire
Attachment D: Advance notice, invitation, reminder notices, and thank you letters for training participant questionnaire
Attachment E: Semi-structured interview guide
Attachment F: Interview informed consent form
Attachment G: Federal Register Notice