Supporting Statement B
Area Health Education Centers (AHEC) Project on the Mental and Behavioral Health and Substance Abuse Issues of Veterans/Service Members and Their Families
OMB Control No. 0915-NEW
1. Respondent Universe and Sampling Methods
The total respondent universe is 150 AHEC Center grantees. AHEC centers providing continuing education offerings will ask all participants to respond to the initial evaluation of the continuing education offering. The expected response rate is 82 percent, based on results from similar evaluations.1,2
A randomized sample of 14 percent of initial respondents will be contacted via email within 30 days of the training and asked to respond to a follow-up evaluation. Respondents will be selected by each training site and offering using the following protocol:
Random Selection Example: 100 participants total, 14 (14 percent) to be selected randomly.
1. Number the units in the population from 1 to N (our example: 1 to 100).
2. Decide on the sample size n that you want or need (our example: 14).
3. k = N/n = the interval size (our example: 100/14 ≈ 7).
4. Randomly select an integer between 1 and k (our example: 7).
5. Take every kth unit (our example: start with the 7th name and then select every 7th name until the 14 desired names are obtained).
6. When you reach the end of the list of names, go back to #1 and continue counting to complete the selection of the desired number of names.
Reference - http://www.socialresearchmethods.net/kb/sampprob.php
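A minimal sketch of this systematic selection procedure is shown below, assuming a simple numbered roster of participant names. The function name and roster format are illustrative only and are not part of the grantee protocol.

import math
import random

def systematic_sample(roster, n):
    """Select n units from a numbered roster using systematic random selection."""
    N = len(roster)
    k = max(1, math.floor(N / n))              # step 3: interval size, e.g., 100/14 ~ 7
    start = random.randint(1, k)               # step 4: random integer between 1 and k
    selected = []
    position = start - 1                       # convert to a 0-based list index
    while len(selected) < n:
        selected.append(roster[position % N])  # step 6: wrap to the top of the list if needed
        position += k                          # step 5: take every kth unit
    return selected

# Example matching the protocol above: 100 participants, 14 (14 percent) selected.
participants = ["Participant {}".format(i) for i in range(1, 101)]
print(systematic_sample(participants, 14))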
Since surveys will be administered to all participants at baseline as a requirement for continuing education credit, the proportion of respondents responding to each item and a summative item (i.e., total score) will be calculated from this sample. After 30 days, a subset of respondents will be asked to complete a "Post" survey to self-report their use of the learned approaches. The post survey questions parallel the baseline survey. However, the post survey cannot be linked to the baseline survey since no identifiers are used in this evaluation study. Thus, comparisons will be made at the aggregate level, using the proportion responding to each item and the total score.
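As an illustration of this aggregate comparison, the sketch below computes the proportion responding to each item and a summative total score from unlinked baseline and 30-day counts. All item names and counts are hypothetical placeholders, not actual evaluation data.

# Hypothetical aggregated counts: (number endorsing the item, number responding).
baseline_counts = {"item_1": (82, 100), "item_2": (75, 100), "item_3": (60, 100)}
followup_counts = {"item_1": (120, 140), "item_2": (112, 140), "item_3": (98, 140)}

def item_proportions(counts):
    """Return the proportion responding to each item plus a summative total score."""
    props = {item: endorsed / total for item, (endorsed, total) in counts.items()}
    props["total_score"] = sum(props.values()) / len(props)  # mean of the item proportions
    return props

print(item_proportions(baseline_counts))
print(item_proportions(followup_counts))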
To estimate the minimum number of observations needed at 30 days, a difference-in-proportions power table was generated (http://statpages.org/proppowr.html). The table below summarizes possible differences in proportion and the required sample size per group, assuming different increases expected from the continuing education session. For example, using 30% as the baseline, an expected increase of 20% would require a minimum of 107 per group, whereas an increase of 40% would require a minimum of 56.
For purposes of the proposed evaluation, a moderate increase of 30% is assumed. Thus, a minimum of 97 follow-up responses per group would be needed to calculate the proportions. However, oversampling is needed since some participants may not respond to the survey request. Thus, a 40% oversample will be done (97 × 1.4 ≈ 136), resulting in a minimum sample size of approximately 140. An illustrative sample-size calculation follows the table below.
Baseline | 30-day | Difference | Required per group with Continuity Correction Formula
30% | 50% | 20% | 107
30% | 60% | 30% | 97
30% | 70% | 40% | 56
30% | 80% | 50% | 18
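The sketch below shows one common way to approximate these per-group sample sizes, using the normal-approximation formula for two proportions with the Fleiss continuity correction. The table above was generated with the statpages.org calculator, whose exact formula and default settings may differ, so the figures produced here are approximate; the sketch assumes a two-sided alpha of .05 and 80% power.

from statistics import NormalDist
import math

def n_per_group(p1, p2, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sided test of two proportions."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    n = ((z_a * math.sqrt(2 * p_bar * (1 - p_bar))
          + z_b * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         / (p1 - p2) ** 2)
    n_cc = n / 4 * (1 + math.sqrt(1 + 4 / (n * abs(p1 - p2)))) ** 2  # continuity correction
    return math.ceil(n_cc)

for post in (0.50, 0.60, 0.70, 0.80):
    print("baseline 30% vs 30-day {:.0%}: about {} per group".format(post, n_per_group(0.30, post)))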
Participants randomly selected to take part in the follow-up evaluation will be contacted once by email. If a response is not received within two business days, a follow-up phone call will be placed and the evaluation will be conducted via telephone. The expected response rate is 62 percent.2,3 If participants do not respond to the email within two business days or to the follow-up phone call, additional respondents will be contacted until a full sample of 14 percent of training participants has responded.
Once both evaluations have been completed by the AHEC Center, the Center will aggregate the results and submit a CE Participant Evaluation Report and a CE Participant Follow-up Evaluation Report to HRSA following each CE offering (anticipated 1,050 responses per year).
2. Procedures for the Collection of Information
Data will be collected using Microsoft Word forms. Data collection forms in Word format (Attachment A, Forms 1 and 2) will be provided to AHEC grantees to facilitate the collection and reporting of data. Individual AHEC grantee sites will aggregate evaluation and follow-up evaluation data and submit the aggregated data to HRSA for compilation and analysis. Grantees will be instructed to submit evaluation reports following each CE offering. HRSA will review evaluation reports and provide error reports to grantees for incorrect or missing data.
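As an illustration of the kind of review that could generate such error reports, the sketch below flags missing or inconsistent values in one aggregated record. The field names and checks are hypothetical and are not drawn from the data collection forms.

REQUIRED_FIELDS = ["grantee_id", "offering_date", "n_participants", "n_responses"]

def error_report(record):
    """Return a list of data problems found in one aggregated evaluation record."""
    problems = ["missing value: " + field
                for field in REQUIRED_FIELDS
                if record.get(field) in (None, "")]
    n_part, n_resp = record.get("n_participants"), record.get("n_responses")
    if isinstance(n_part, int) and isinstance(n_resp, int) and n_resp > n_part:
        problems.append("responses exceed participants")
    return problems

# Example: a record with a missing date and an implausible response count.
example = {"grantee_id": "AHEC-042", "offering_date": "", "n_participants": 35, "n_responses": 40}
print(error_report(example))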
Respondents are asked to respond to similar evaluations for most continuing education offerings provided by AHEC grantees and other continuing education offerors and will be notified about the potential for being contacted for follow-up evaluation at the time of registration for the continuing education offering.
3. Methods to Maximize Response Rates and Deal with Nonresponse
Based on similar studies published in the literature, HRSA anticipates a response rate of 82 percent for the initial evaluation of the continuing education offering. Follow-up evaluation response rates have been shown to be lower, at around 62 percent, in similar studies with comparable follow-up timeframes. To maximize the response rate, grantees will follow up on non-responses with a phone call within two business days of the initial follow-up email.
4. Tests of Procedures or Methods to be Undertaken
Data collection forms have been pilot tested with a small number (fewer than 10) of AHEC grantees. Forms and methodology have been revised based on feedback received from pilot test participants, including the wording and sequence of evaluation questions. No significant issues were raised during pilot testing.
Once data are collected, HRSA will perform descriptive statistical analyses to determine aggregate rates of self-reported knowledge gain, intent to change practice, and practice implementation. HRSA will also perform two-tailed t-tests at a significance level of p = .05 to compare rates of knowledge gain and practice change between the evaluation completed immediately following the continuing education offering and the follow-up evaluation.
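A minimal sketch of such a comparison is shown below, treating each grantee's aggregated rate as one observation in a two-tailed independent-samples t-test. The rates are hypothetical placeholders, and the use of per-grantee rates as the unit of analysis is an assumption, since the document does not specify how the t-tests will be structured.

from scipy import stats

# Hypothetical per-grantee rates (each value is one grantee's aggregated rate).
immediate_rates = [0.78, 0.82, 0.75, 0.88, 0.80, 0.79, 0.85]  # immediately after the offering
followup_rates = [0.55, 0.62, 0.58, 0.70, 0.60, 0.57, 0.66]   # at the 30-day follow-up

t_stat, p_value = stats.ttest_ind(immediate_rates, followup_rates)  # two-tailed by default
print("t = {:.2f}, p = {:.4f}, significant at .05: {}".format(t_stat, p_value, p_value < 0.05))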
5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
Individuals involved in data collection design and analysis:
Government Project Officer:
Kyle Peplinski, MA
Public Health Analyst
301-443-7758
Contractor:
National AHEC Organization
Mary Wainwright, MS, RN
Project Director, A-TrACC
409-772-7884
Gretchen Forsell, MPH, RD
Project Manager, A-TrACC
402-644-7256
Carol Trono, MA
Program Manager, A-TrACC
409-772-7884
Individuals responsible for collecting data:
See Attachment B
References:
1 Bullock, A.D., Belfield, C.R., Butterfield, S., Morris, Z.S., Ribbins, P.M., and Frame, J.W. (1999). Continuing education: A framework for the evaluation of continuing education short courses in dentistry. British Dental Journal, 187, 445-449.
2 Weiner, S.J., Jackson, J.L., and Garten, S. (2009). Measuring continuing medical education outcomes: A pilot study of effect size of three CME interventions at an SGIM annual meeting. J Gen Intern Med, 24(5), 626-629.
3 Knox, A.B. (2002). Evaluation for Continuing Education: A Comprehensive Guide to Success. San Francisco: Jossey-Bass.