SUPPORTING STATEMENT
Part A
Workflow Assessment for Health IT Toolkit Evaluation
Version: September 14, 2012
Agency for Healthcare Research and Quality (AHRQ)
A. Justification
1. Circumstances That Make the Collection of Information Necessary
2. Purpose and Use of Information
3. Use of Improved Information Technology
4. Efforts to Identify Duplication
5. Involvement of Small Entities
6. Consequences if Information Collected Less Frequently
7. Special Circumstances
8. Federal Register Notice and Outside Consultations
8.a. Federal Register Notice
8.b. Outside Consultations
9. Payments/Gifts to Respondents
10. Assurance of Confidentiality
11. Questions of a Sensitive Nature
12. Estimates of Annualized Burden Hours and Costs
13. Estimates of Annualized Respondent Capital and Maintenance Costs
14. Estimates of Annualized Cost to the Government
15. Changes in Hour Burden
16. Time Schedule, Publication and Analysis Plans
17. Exemption for Display of Expiration Date
List of Attachments
A. Justification

1. Circumstances That Make the Collection of Information Necessary

The mission of the Agency for Healthcare Research and Quality (AHRQ), set out in its authorizing legislation, the Healthcare Research and Quality Act of 1999 (see http://www.ahrq.gov/hrqa99.pdf), is to enhance the quality, appropriateness, and effectiveness of health services, and access to such services, through the establishment of a broad base of scientific research and through the promotion of improvements in clinical and health systems practices, including the prevention of diseases and other health conditions. AHRQ shall promote health care quality improvement by conducting and supporting:
1. research that develops and presents scientific evidence regarding all aspects of health care; and
2. the synthesis and dissemination of available scientific evidence for use by patients, consumers, practitioners, providers, purchasers, policy makers, and educators; and
3. initiatives to advance private and public efforts to improve health care quality.
Also, AHRQ shall conduct and support research and evaluations, and support demonstration projects, with respect to (A) the delivery of health care in inner-city areas, and in rural areas (including frontier areas); and (B) health care for priority populations, which shall include (1) low-income groups, (2) minority groups, (3) women, (4) children, (5) the elderly, and (6) individuals with special health care needs, including individuals with disabilities and individuals who need chronic care or end-of-life health care.
In particular, AHRQ is a lead Federal agency in developing and disseminating evidence and evidence-based tools on how health IT can improve health care quality, safety, efficiency, and effectiveness.
Understanding clinical work practices and how they will be affected by practice innovations such as implementing health IT has become a central focus of health IT research. While much of the attention in health IT research and development has been directed at the technical issues of building and deploying health IT systems, there is growing consensus that deployment of health IT has often had disappointing results and that, while technical challenges remain, greater attention is needed to sociotechnical issues and the problems of modeling workflow. As Karsh et al. note, there is a need for “substantial research on how clinical work is actually done and should be done.”1
Most health IT implementation evidence, however, comes from large hospitals and health systems,2 while the experiences of small- and medium-sized practices are under-reported.3 Many primary care practices are small- to medium-sized, single-specialty practices not aligned with a larger health system. Practices with limited resources typically rely on local IT support and guidance, which, based on information collected through membership surveys and on experience from past research, is often provided by a family member of a practice staff member. Because implementation is costly and complex, it is particularly important to study the success of health IT implementation in these settings. Even after this investment, systems will not necessarily have the desired effects unless they are implemented well: under-resourced practices may be more likely to find that health IT tools are cumbersome, do not integrate third-party information, or do not contain the accurate and complete patient data clinicians need for prescribing, as McKibbon et al. found for e-prescribing applications.4
Implementing health IT in practice is clearly costly in time and effort, and less is known about these issues in small- and medium-sized practices, where the impact of improved or disrupted workflows may have especially significant consequences because of limited resources. Practices would derive great benefit from effective tools for assessing workflow during many types of health IT implementation, such as creating disease registries, collecting quality measures, using patient portals, or implementing a new electronic health record system. To that end, in 2008, AHRQ funded the development of the Workflow Assessment for Health IT toolkit (Workflow toolkit). Through this toolkit, end users should obtain a better understanding of the impact of health IT on workflow in ambulatory care at each of the following stages of health IT implementation: (1) determining system requirements, (2) selecting a vendor, (3) preparing for implementation, and (4) using the system post-implementation. They should also be able to use the publicly available workflow tools and methods effectively before, during, and after health IT implementation while recognizing commonly encountered issues in health IT implementation. In the current project, AHRQ is conducting an exploratory evaluation to ensure that the newly developed Workflow toolkit is useful to small- and medium-sized ambulatory care clinic managers, clinicians, and staff.
The evaluation will consist of field assessments of Workflow toolkit use in 18 small- and medium-sized practices and of feedback gathered from the two Health IT Regional Extension Centers (RECs) that are providing support to some of these practices. The evaluation will address the issue of system validation as classically defined in software engineering: determining whether the software or system actually meets the requirements of the user to perform the relevant tasks. The evaluation will answer the following question:
Do decisions change? Specifically, do user decisions about workflow assessment change, and do user decisions about health information technology (health IT) implementation change?
To answer this question, the proposed exploratory evaluation will examine the usefulness of the Workflow toolkit in small- and medium-sized practices. The evaluation will be conducted with 18 practices affiliated with one of two Practice-based Research Networks (PBRNs), in Oregon and Wisconsin, and with the Health IT Regional Extension Centers (RECs) in those States. Participants will be recruited who agree to use the Workflow toolkit in their specific health IT project for a minimum of 10 weeks. This will provide an opportunity to observe use of the Workflow toolkit among its intended end users, who are best positioned to provide critical feedback to improve its functionality.
To address this question, we will conduct the following activities and data collections:
Creation of Clinic Study Team: Each participating practice will form a small team, referred to as a Clinic Study Team, that will participate in the Pre-Workflow Toolkit Interview, use the Workflow toolkit and participate in Observations, and participate in the Post-Workflow Toolkit Interview. Each team will include a maximum of 14 individuals and may include the following types of respondents: clinicians, office managers, front office staff, medical assistants or nurses, nurse care managers, social workers, health educators, information technology specialists, and/or quality improvement directors (see Attachment J).
Pre-Workflow Toolkit Interview: These will consist of semi-structured interviews with practice staff and with three specialists from each Health IT Regional Extension Center. The interviews are designed to examine knowledge of, attitudes toward, and barriers to and facilitators of workflow assessment for implementation of health IT. Respondents will be asked to define workflow; to explain how an understanding of workflow is important to the practice or REC and to health IT implementation; and to describe previous experience with health IT implementation and the effect of this implementation on work processes in their practice (for practices) or for their clients (for RECs). See Attachment A for the practice-focused interview guide and Attachment B for the REC-focused interview guide. These interview guides illustrate the depth and breadth of topics intended for coverage in each discussion.
Observations: A member of the project staff will join each Clinic Study Team, or the three specialists at each of the two RECs, as a participant-observer and will meet with the team at times to be determined by the team, but at least every two weeks after the Pre-Workflow Toolkit Interview, for at least four visits. During these visits, project staff will participate in, and keep field notes on, the practice’s or REC’s workflow assessment activities. See Attachment C.
Usage Logs: As part of their workflow assessment process, Clinic Study Teams and REC staff will be asked to meet weekly. For weekly meetings at which a project staff member is not present, Clinic Study Teams and REC staff will keep a record of workflow assessment activities, including use of the Workflow toolkit, recording in a free-form journal the purpose and results of each activity as well as issues that arose in the process. See Attachment D.
Post-Workflow Toolkit Interview: This final data collection will consist of individual semi-structured interviews with practice staff and with three specialists from each Health IT Regional Extension Center. These interviews will (a) re-examine respondents’ knowledge of and attitudes about workflow assessment; (b) explore use of the Workflow toolkit, including, for practices, the perceived impacts on clinicians, practice staff, the practice, and patients and, for RECs, the workflows that their client practices discussed; and (c) assess overall impressions of the usefulness of the Workflow toolkit as well as any suggested changes. See Attachment E for the practice-focused guide and Attachment F for the REC-focused guide. These interview guides illustrate the depth and breadth of topics intended for coverage in each discussion.
As noted above, data collection will be conducted at a total of 20 organizations over the project’s six-month data collection period. These organizations include 18 small- to medium-sized ambulatory care practices and two Health IT Regional Extension Centers. The composition of personnel involved will vary across practices but will include at least a clinician, a scheduler, a nurse or medical assistant, and an office manager. Where such roles exist within a practice and can contribute to the assessment, we will also include nurse care managers, social workers, health educators, information technology specialists, and quality improvement personnel.
This study is being conducted by AHRQ through its contractors, the Oregon Rural Practice-based Research Network (ORPRN) and the Wisconsin Research & Education Network (WREN), pursuant to AHRQ’s statutory authority to conduct and support research on health care and on systems for the delivery of such care, including activities with respect to the quality, effectiveness, efficiency, appropriateness and value of health care services and with respect to quality measurement and improvement. 42 U.S.C. 299a(a)(1) and (2).
2. Purpose and Use of Information

The outcome of the evaluation will be a report including recommendations for enhancing and improving the Workflow toolkit. The report will provide results about the perceived usefulness of the Workflow toolkit. Results will be produced separately for practices and RECs as well as for both user groups combined. The report will also include specific suggestions on how to revise the Workflow toolkit to make it more useful to its intended audiences.
3. Use of Improved Information Technology

AHRQ will collect data through an established qualitative evaluation methodology, which includes in-person interviews and observations of study participants. Because most interview questions are open-ended to allow for in-depth exploration of issues, electronic submission of responses is not a viable option.
In addition, to reduce reporting burden on participants, we will use Google Analytics to provide supplementary data about direct use of the Workflow toolkit over the Web, including tracking page views, time spent on page and on site, content, and navigation analysis to examine how participants are using the Workflow toolkit Web site.
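For illustration only, the kind of supplementary usage data described above (page views, time on page and on site, and navigation paths) could be retrieved programmatically from Google Analytics. The sketch below is a hypothetical example using the Google Analytics Core Reporting API (v3) Python client; the authorized `service` object and the view ID are assumptions for the sketch, not part of the study protocol.

```python
# Hypothetical sketch: summarize Workflow toolkit page usage via the Google
# Analytics Core Reporting API (v3). Assumes `service` is an already-authorized
# client built with the google-api-python-client library; the view (profile) ID
# below is a placeholder.

def summarize_toolkit_usage(service, view_id="ga:00000000",
                            start="2012-10-01", end="2013-03-31"):
    """Return (page path, pageviews, average time on page) rows for the study window."""
    response = service.data().ga().get(
        ids=view_id,                              # placeholder Analytics view ID
        start_date=start,
        end_date=end,
        metrics="ga:pageviews,ga:avgTimeOnPage",  # page views and time-on-page metrics
        dimensions="ga:pagePath",                 # break results out by toolkit page
        sort="-ga:pageviews",                     # most-viewed pages first
    ).execute()
    return response.get("rows", [])
```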
4. Efforts to Identify Duplication

The Workflow toolkit under evaluation in this proposed project was published by AHRQ in July 2011. To date, AHRQ has not conducted a systematic evaluation of this toolkit and is not aware of any other entity conducting a similar evaluation.
5. Involvement of Small Entities

As noted above, this evaluation will directly involve staff from 18 small- and medium-sized practices. AHRQ will target participation from practices that do not have a strong affiliation with a health system and that are privately owned businesses, as the Workflow toolkit was designed to benefit these end users in particular.
Study participation is voluntary, and AHRQ has designed a participation schedule that is intended to minimize the impact of the Workflow toolkit evaluation on the practices. Interviews and observations will be scheduled at times convenient for practice staff. The interview protocols consist of the minimum questions required for the study purposes. The established interview time limits for each respondent type will be respected, and the interviews will not exceed one hour. Observations will be designed to minimize interruptions to regular workflow and will occur, at the staff’s convenience, during 1- to 2-hour work sessions of the Clinic Study Teams, for a maximum of 2 hours at each of four visits over the study period. Similar interview guides and observation methods have been used successfully with practices like those included in this evaluation.
6. Consequences if Information Collected Less Frequently

This is a one-time collection.
7. Special Circumstances

This request is consistent with the general information collection guidelines of 5 CFR 1320.5(d)(2). No special circumstances apply.
8. Federal Register Notice and Outside Consultations

8.a. Federal Register Notice

As required by 5 CFR 1320.8(d), notice was published in the Federal Register on March 9, 2012 for 60 days and again on May 21, 2012 for 30 days (see Attachment G). One comment was received (see Attachment I).
8.b. Outside Consultations

AHRQ consulted with the following experts on various aspects of the design of the data collection effort, including the key research question, approaches to identifying and recruiting practices, methods of data collection and analysis, and protocol development:
Pascale Carayon, PhD, Procter & Gamble Bascom Professor in Total Quality, College of Engineering, University of Wisconsin, Madison
Ann Hundt, PhD, Center for Quality and Productivity Improvement, University of Wisconsin, Madison
Peter Hoonakker, PhD, Center for Quality and Productivity Improvement, University of Wisconsin, Madison
Tosha Wetterneck, MD, MS, Associate Professor of Medicine, Department of Medicine, University of Wisconsin School of Medicine and Public Health
Supporting Statements Parts A and B along with all the attachments were also shared with Ned Ellington from the Office of the National Coordinator for Health IT for review.
9. Payments/Gifts to Respondents

The practices in this project are subcontractors to the Oregon Rural Practice-based Research Network (see Attachment H). As such, they will invoice for activities related to the project, including pre-evaluation and evaluation activities. The pre-evaluation activity is establishing the Clinic Study Teams. Evaluation activities include participating in Pre-Workflow Toolkit Interviews; attending weekly meetings of the Clinic Study Team; using the toolkit; maintaining a log of toolkit use and issues; and participating in Post-Workflow Toolkit Interviews.
We estimate the cost of each subcontract as approximately $8,005 based on 248 hours of labor at $32.28 per hour. The hourly rate is calculated based on weighted salaries for the four main roles we will invite to participate in the project: clinician, office manager, scheduler, and nurse or medical assistant.
10. Assurance of Confidentiality
Individuals and organizations will be assured of the confidentiality of their replies under Section 934(c) of the Public Health Service Act, 42 USC 299c-3(c). They will be told the purposes for which the information is collected and that, in accordance with this statute, any identifiable information about them will not be used or disclosed for any other purpose without their prior consent.
The study will collect information from respondents about the usefulness of the Workflow toolkit. Beyond basic contact information, it will not collect information about the respondent or any other individual in the establishment. AHRQ will collect the respondent’s name, organizational affiliation, organizational phone number, and role; this information will be used for respondent tracking purposes and for clarification call-backs. All electronic files will be password protected and accessible only from within a secured network. Electronic files containing study data from the PBRNs will be transmitted for data management and analysis to ORPRN, the contractor leading data collection and analysis. These files will be encrypted and will be transmitted through a secure messaging portal on the Oregon Health & Science University Web site. Paper files will be sent via certified mail or delivered by hand to project staff. When not in use by project staff, all printed information or materials that could be used to identify participants in the study will be stored in locked cabinets that are accessible only to project team members.
All respondent involvement will be voluntary. Informed consent will be obtained from each respondent from each organization prior to participation. Respondents will be informed that: (1) the project team will not share their name, their organization’s name, or copies of the interview notes with anyone outside of the team; and (2) respondent comments may be included in reports, but will not be attributed to specific individuals or organizations.
All project team members are required to complete human subjects training coursework through Institutional Review Boards.
11. Questions of a Sensitive Nature

No questions of a sensitive nature will be asked.
12. Estimates of Annualized Burden Hours and Costs

Exhibit 1 shows the estimated annual burden hours for each respondent’s time to participate in this evaluation. Each practice will convene a “Clinic Study Team” consisting of no more than 14 individuals; this process will take approximately 8 hours per practice, or about 35 minutes per person. The Pre-Workflow Toolkit Interview will be completed by a total of up to 258 persons (up to 14 per practice and 3 per REC) and requires 30 minutes. Up to four observations will be conducted for up to 258 persons, and each is estimated to take two hours. Ten usage logs will be completed by up to 258 persons (one per week of study activity), and completing a single usage log should take no longer than 15 minutes. The Post-Workflow Toolkit Interview will be completed by a total of up to 258 persons and requires 30 minutes.
The total annual burden is estimated to be 3,114 hours.
Exhibit 2 shows the estimated annual cost burden associated with the organizations' time to participate in this research. The total annual burden is estimated to be $97,131.
Exhibit 1. Estimated Annualized Burden Hours
Data Collection | Maximum Number of Respondents | Number of Responses per Respondent | Max. Hours per Response | Total Burden Hours
Creation of Clinic Study Team | 252 | 1 | 35/60 | 147
Pre-Workflow Toolkit Interview | 258 | 1 | 30/60 | 129
Observations | 258 | 4 | 2 | 2,064
Usage Logs | 258 | 10 | 15/60 | 645
Post-Workflow Toolkit Interview | 258 | 1 | 30/60 | 129
Total | 1,284 | NA | NA | 3,114
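The burden-hour figures in Exhibit 1 follow directly from multiplying the counts in each row (respondents × responses per respondent × hours per response). A minimal sketch of that arithmetic, using the exhibit’s own values, is shown below as a cross-check.

```python
# Cross-check of the Exhibit 1 burden-hour arithmetic:
# total hours = respondents x responses per respondent x hours per response.
rows = [
    ("Creation of Clinic Study Team",   252,  1, 35 / 60),
    ("Pre-Workflow Toolkit Interview",  258,  1, 30 / 60),
    ("Observations",                    258,  4, 2.0),
    ("Usage Logs",                      258, 10, 15 / 60),
    ("Post-Workflow Toolkit Interview", 258,  1, 30 / 60),
]

total = 0.0
for name, respondents, responses, hours_per_response in rows:
    burden = respondents * responses * hours_per_response
    total += burden
    print(f"{name}: {burden:,.0f} hours")

print(f"Total annual burden: {total:,.0f} hours")  # prints 3,114 hours, matching Exhibit 1
```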
Exhibit 2. Estimated Annualized Cost Burden
Data Collection | Maximum Number of Respondents | Total Burden Hours | Average Hourly Wage Rate* | Total Cost Burden
Creation of Clinic Study Team | 252 | 147 | $32.28 | $4,745
Pre-Workflow Toolkit Interview | 258 | 129 | $32.28 | $4,164
Observations | 258 | 2,064 | $32.28 | $64,044
Usage Logs | 258 | 645 | $32.28 | $20,014
Post-Workflow Toolkit Interview | 258 | 129 | $32.28 | $4,164
Total | 1,284 | 3,372 | NA | $97,131
*The hourly wage for the participants across the four data collections (pre-workflow toolkit interviews, observations, usage logs, and post-workflow toolkit interviews) is based upon a weighted mean of the average hourly wages for Family and General Practitioners (1.5; $87.84 per hour); office managers (1.0; $35.18 per hour); front office staff (1.0; $15.15 per hour); medical assistants or nurses (1.5; $24.36 per hour); nurse care managers (1.5; $33.57 per hour); social workers (0.1; $24.44 per hour); health educators (0.1; $25.12 per hour); information technology specialists (0.25; $23.43 per hour); quality improvement directors (0.25; $25.12 per hour); and technical staff (1.0; $33.14 per hour) for Oregon and Wisconsin, from the U.S. Department of Labor, Bureau of Labor Statistics, May 2010 National Occupational Employment and Wage Estimates for the United States, Occupational Employment Statistics (OES), Washington, D.C., http://bls.gov/oes/2010/may/www.bls.govoessrcst.htm (accessed November 2011).
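For clarity, the weighted mean referred to in the footnote has the standard form shown below, where the first parenthesized value for each role is taken as its weight and the second as its average hourly wage; this is given only to define the calculation, not to restate the resulting rate.

```latex
% Standard weighted-mean form assumed for the wage-rate footnote above:
%   n_i = weight assigned to role i (first value in parentheses)
%   w_i = average hourly wage for role i (second value in parentheses)
\bar{w} = \frac{\sum_i n_i \, w_i}{\sum_i n_i}
```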
13. Estimates of Annualized Respondent Capital and Maintenance Costs

Capital and maintenance costs would include the purchase of supplies, computer software or services, travel, postage, and storage facilities for records as a result of complying with this data collection. There are no such costs for this collection; there are no direct costs to respondents other than their time to participate in the study.
14. Estimates of Annualized Cost to the Government

The estimated total cost to the Federal Government for this project is $793,456 over a 27-month period from September 23, 2011 to December 22, 2013. The estimated average annual cost is $352,646. Exhibit 3 provides a breakdown of the estimated total and average annual costs by category.
Exhibit 3. Estimated Total and Annual Cost* to the Federal Government
Cost Component | Total Cost | Annualized Cost
Project Management and Coordination Activities | $96,449 | $42,866
Develop Research and Recruitment Plans | $78,383 | $34,837
Compliance with PRA | $12,267 | $5,452
Obtaining IRB Approval | $10,254 | $4,557
Develop Data Analysis Plan | $18,246 | $8,109
Conduct Evaluation | $534,401 | $237,512
Data Analysis and Final Report | $23,554 | $10,468
Ensure 508-Compliant Deliverables | $19,902 | $8,845
Total | $793,456 | $352,646
*Costs are fully loaded including overhead and G&A.
15. Changes in Hour Burden

This is a new collection of information.
16. Time Schedule, Publication and Analysis Plans

Time schedule and publication plans. The anticipated schedule for this project is shown in Exhibit 4. Once clearance from the Office of Management and Budget is obtained, AHRQ will begin identifying appropriate respondents and scheduling and conducting evaluation activities.
Exhibit 4. Anticipated Schedule
Activity | Estimated Timeline Following OMB Clearance
Recruit for Field Evaluation | Month 1
Conduct Field Evaluation | Months 2-6
Analyze Results | Months 7-11
Brief AHRQ on Results | Month 11
Submit Final Report on Results | Month 13
Analysis plans. Project staff will employ the immersion-crystallization approach5 to qualitative data analysis, in an iterative process that begins at the outset of data collection and continues throughout the data collection period. The ORPRN team led by Dr. Gorman will meet weekly to conduct preliminary analyses of the field notes, interview recordings and notes, and any project documents. Analysis sessions will assess and ensure data quality, and analyze data to address the research question. Analysis will also attempt to identify:
How understanding of workflow changed
How understanding of the importance of assessing workflow changed
How the practices use the Workflow toolkit
Whether the Workflow toolkit is regarded as useful
Atlas.ti software (Version 5.0) will be used to store, code and search the interview data for analysis. Data reduction will be achieved by summarizing coded interview data from Atlas.ti in data tables and practice summaries, which will then be analyzed to refine themes, align them with the evidence supporting each finding, and identify respondent disagreements and disconfirming evidence.
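As one concrete illustration of this data reduction step (a sketch only, not a specification of the project’s Atlas.ti workflow), coded quotations exported from Atlas.ti could be tallied into a code-by-practice summary table. The CSV file name and column names below are hypothetical.

```python
# Hypothetical sketch of the data-reduction step: tally coded interview excerpts
# by practice and code, as a starting point for the data tables and practice
# summaries described above. Assumes a CSV export of Atlas.ti quotations with
# "practice", "code", and "quotation" columns (file name and layout are illustrative).
import csv
from collections import Counter

def code_by_practice_counts(path="coded_quotations.csv"):
    """Return a Counter keyed by (practice, code) with excerpt counts."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[(row["practice"], row["code"])] += 1
    return counts

if __name__ == "__main__":
    for (practice, code), n in sorted(code_by_practice_counts().items()):
        print(f"{practice}\t{code}\t{n}")
```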
17. Exemption for Display of Expiration Date

AHRQ does not seek this exemption.
References

1. Karsh B-T, Weinger MB, Abbott PA, Wears RL. Health information technology: fallacies and sober realities. J Am Med Inform Assoc. 2010 Oct;17(6):617-623.
2. Chaudhry B, et al. Systematic review: impact of health information technology on quality, efficiency, and costs of medical care. Ann Intern Med. 2006;144(10):742-752.
3. Carayon P, Karsh B-T, Cartmill RS, Hoonakker P, Hundt AS, Krueger D, Thuemling T, Wetterneck TB, et al. Incorporating Health Information Technology Into Workflow Redesign: Request for Information Summary Report. Rockville, MD: Agency for Healthcare Research and Quality; 2010. AHRQ Publication.
4. McKibbon K, et al. Enabling Medication Management Through Health Information Technology. Rockville, MD: Agency for Healthcare Research and Quality; 2011. p. 1-951.
5. Cohen D, Crabtree B. “Qualitative Research Guidelines Project.” Robert Wood Johnson Foundation, July 2006. http://www.qualres.org/HomeImme-3829.html
List of Attachments

Attachment A: Pre-Workflow Toolkit Interview Guide – Practice
Attachment B: Pre-Workflow Toolkit Interview Guide – REC
Attachment C: Workflow Toolkit Activities and Perspectives Observation Log
Attachment D: Workflow Assessment Usage Log
Attachment E: Post-Workflow Toolkit Interview Guide – Practice
Attachment F: Post-Workflow Toolkit Interview Guide – REC
Attachment G: Federal Register Notice
Attachment H: List of Subcontractor Practices
Attachment I: Response to Public Comment
Attachment J: Creation of Clinic Study Teams