Request for Approval under the “Generic Clearance for Improving Customer Experience (OMB Circular
A-11, Section 280 Implementation)” (OMB Control Number: 2900-0876)
TITLE OF INFORMATION COLLECTION: Telemedicine Surveys
PURPOSE
Telehealth is an effective and convenient way for patients to receive, and for clinicians to
provide, quality care management. The initiative incorporates telecommunications technology to
improve and modernize health care offered by the Veterans Health Administration (VHA). To assess
patient satisfaction with the program and identify areas for intervention or further evaluation,
the Telehealth Services Office within VHA enlisted the services of the Veterans Experience Office (VEO). The
Veteran Telehealth Survey is designed to measure Customer Experience associated with utilizing VA
electronic health services within the three major aspects, or modalities, of Telehealth: Clinical Video
Telehealth (CVT), Home Telehealth (HT), and Store and Forward (SFT). The purpose of this report is
to document the survey methodology and sampling plan of the survey. Information about quality
assurance protocols, as well as limitations of the survey methodology, is included in this report.
Once data collection is completed, the participant responses in the online survey will be weighted so
that the samples will be more representative of the overall population. Iterative proportional fitting to
create sample weights will be applied using the variables Modality/Stage, Gender, Age Group (18-39, 40-59, 60+), and District.
Once the data is collected, it is immediately available in Vsignals, the Medallia-based platform used by
the Veterans Experience Office for CX data storage and analysis. Survey weights are incorporated into
the system at the close of every weekly survey. The interface allows data users to analyze the survey
results using interactive charts and sub-populations. Survey data may also be reviewed over differing
time periods, ranging from weekly, to monthly, to quarterly estimates.
DESCRIPTION OF RESPONDENTS:
The target population of the TH survey is all Veterans having an Outpatient CVT, HT, or SFT event in the past 7 days. The
identification of Telehealth patients utilizes weekly data extracts from the Corporate Data Warehouse (CDW), which houses the
operational records of VHA. Each Telehealth event eligible for a VEO survey will be associated with one of these three modalities.
The classification of TH events into a modality is based on a combination of primary and/or secondary stop codes, as indicated by
VSSC documentation. Under each modality, three types of surveys are designed to inquire about veterans’ experience in terms of
different VA service domains.
Patients scheduling CVT appointments are derived from the general Outpatient scheduling database table from CDW. When either an
actual CVT or SFT appointment occurs, the distinction between Offsite appointments (home, mobile, or non-VA facilities) vs.
appointments taking place within a VAMC or CBOC is based on secondary stop codes. A subset of veterans in each modality and
subtype will be randomly selected to participate in the survey. However, some subtypes are sparsely populated, so these will be
selected into each weekly sample with certainty; specifically, Telehealth subtypes that comprise less than 10% of the modality
population will be selected with certainty. In total, there will be nine sets of survey questions.
TYPE OF COLLECTION: (Check one)
[ ] Customer Comment Card/Complaint Form
[ ] Usability Testing (e.g., Website or Software)
[ ] Focus Group
[X] Customer Satisfaction Survey
[ ] Small Discussion Group
[ ] Other: ______________________
CERTIFICATION:
I certify the following to be true:
1. The collection is voluntary.
2. The collection is low-burden for respondents and low-cost for the Federal Government.
3. The collection is non-controversial and does not raise issues of concern to other federal agencies.
4. Personally identifiable information (PII) is collected only to the extent necessary and is not retained.
5. Information gathered is intended to be used for general service improvement and program management purposes.
6. The collection is targeted to the solicitation of opinions from respondents who have experience with the program or may have
experience with the program in the future.
7. All or a subset of information may be released as part of A-11, Section 280 requirements on performance.gov. Additionally,
summaries of the data may be released to the public in communications to Congress, the media and other releases disseminated by
VEO, consistent with the Information Quality Act.
Name: Evan Albert, Director of Measurement and Data Analytics (Acting), Veterans Experience Office, Evan.Albert@va.gov, (202)
875-9478
To assist review, please provide answers to the following question:
Personally Identifiable Information:
1. Will this survey use individualized links, through which VA can identify particular respondents even if they do not provide their
name or other personally identifiable information on the survey? [ X ] Yes [] No
2. Is personally identifiable information (PII) collected? [ ] Yes [X] No
3. If Yes, will any information that is collected be included in records that are subject to the Privacy Act of 1974? [ ] Yes [ ] No
[N/A]
4. If Yes, has an up-to-date System of Records Notice (SORN) been published? [ ] Yes [ ] No [N/A]
Gifts or Payments:
Is an incentive (e.g., money or reimbursement of expenses, token of appreciation) provided to participants? [ ] Yes [ X] No
BURDEN HOURS
Participation
Time
Burden
(÷ 60 =)
Category of Respondent
No. of Respondents
Individuals & Households
VA Form (if applicable)
Totals
604,800
3
30,240
604,800
3
30,240
Please answer the following questions.
3
( X minutes =)
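The burden figure follows from simple arithmetic (respondents × minutes per response ÷ 60); as a quick illustrative check:

```python
# Burden-hours arithmetic behind the table above.
respondents = 604_800
minutes_each = 3
burden_hours = respondents * minutes_each // 60
# 604,800 responses x 3 minutes / 60 = 30,240 burden hours
```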
1. Are you conducting a focus group, a survey that does not employ random sampling, user testing or any data collection
method that does not employ statistical methods?
[ ] Yes
[X] No
If Yes, please answer questions 1a-1c, 2 and 3.
If No, please answer or attach supporting documentation that answers questions 2-8.
a. Please provide a description of how you plan to identify your potential group of respondents and how you will select them.
b. How will you collect the information? (Check all that apply)
[ ] Web-based or other forms of Social Media
[ ] Telephone
[ ] In-person
[ ] Mail
[X] Other- E-mail-based surveys
c. Will interviewers or facilitators be used? [ ] Yes [ X ] No
2. Please provide an estimated annual cost to the Federal government to conduct this data collection: __$13,000______
3. Please make sure that all instruments, instructions, and scripts are submitted with the request. This includes questionnaires,
interviewer manuals (if using interviewers or facilitators), all response options for questions that require respondents to select a
response from a group of options, invitations given to potential respondents, instructions for completing the data collection or
additional follow-up requests for the data collection.
-Done
4. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection
methods to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons)
in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a
whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the
collection had been conducted previously, include the actual response rate achieved during the last collection.
- Please see Statistical Sample Plan in the Appendix.
5. Describe the procedures for the collection of information, including:
a. Statistical methodology for stratification and sample selection.
b. Estimation procedure.
c. Degree of accuracy needed for the purpose described in the justification.
d. Unusual problems requiring specialized sampling procedures.
e. Any use of periodic (less frequent than annual) data collection cycles to reduce burden.
- Please see Statistical Sample Plan in the Appendix.
6. Describe methods to maximize response rates and to deal with issues of nonresponse. The accuracy and reliability of information
collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be
provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.
Please see Statistical Sample Plan in the Appendix.
7. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections
of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from
10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main
collection of information.
Please see Statistical Sample Plan in the Appendix.
8. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency
unit, contractors, grantees, or other person(s) who will actually collect or analyze the information for the agency.
Statistical Aspects:
Mark Andrews, Statistician, Veterans Experience Office, VA. (703) 483-5305
Collection and Analysis: Evan Albert, Director of Measurement and Data Analytics, Veterans Experience Office, VA (202) 875-9478
Dr. Kevin Galpin, Executive Director, Telehealth, VHA (404) 771-8794
APPENDIX- STATISTICAL SAMPLE PLAN
Service Level Measurements: Telehealth Survey
Sampling Methodology Report
Prepared by
Veteran Experience Office
Version 1 June 2018
Contents

Executive Summary
Part I – Introduction
   A. Background
   B. Basic Definitions
   C. Application to Veterans Affairs
Part II – Methodology
   A. Target Population and Frame
   B. Sample Size Determination
   C. Data Collection Methods
   D. Reporting
   E. Quality Control
   F. Sample Weighting, Coverage Bias, and Non-Response Bias
   G. Quarantine Rules
Part III – Assumptions and Limitations
   A. Post Call Surveys
   B. Availability of Email Addresses
   C. Immeasurability of Directory Assistance Calls
   D. Response Rates
   E. Unavailability of Gender for Veterans
   F. Missing Values and Non-Response
Part IV – Appendices
   Appendix 3. References
Executive Summary
Telehealth is an effective and convenient way for patients to receive, and for clinicians to provide, quality care management.
The initiative incorporates telecommunications technology to improve and modernize health care offered by the Veterans
Health Administration (VHA). To assess patient satisfaction with the program and identify areas for intervention or further
evaluation, the Telehealth Services Office within VHA enlisted the services of the Veterans Experience Office (VEO). The Veteran Telehealth Survey is designed to
measure Customer Experience associated with utilizing VA electronic health services within the three major aspects, or modalities, of
Telehealth: Clinical Video Telehealth (CVT), Home Telehealth (HT), and Store and Forward (SFT). The purpose of this report is to
document the survey methodology and sampling plan of the survey. Information about quality assurance protocols, as well as
limitations of the survey methodology, is included in this report.
The study consists of a nationally representative survey of Veterans who utilized telehealth services within the past week.
Beginning in July 2018, a weekly sample of such Veterans will be collected, in which they will be asked questions regarding their
satisfaction specifically under a telehealth modality. Survey links are disseminated to patients undergoing care at different points
(stages) in the health monitoring process, which differ by modality. The online questionnaire is brief and follows a human-centered
design methodology focusing on Trust, Emotion, Effectiveness, and Ease with the care they received. The measurement questions are
based on a five-point Likert scale (1-5) from Strongly Disagree (1) to Strongly Agree (5).
The overall sample size is determined based on a 95% confidence level and 3% margin of error with respect to monthly
estimates. This is a compromise between obtaining ongoing cross-sectional estimates of suitable precision and maintaining low
sampling rates over time. The selection process is a stratified design, controlling for geographic region (district), demographics,
modality, and stage. Once data collection is completed, the participant responses in the online survey will be weighted so that the
samples will be more representative of the overall population. Iterative proportional fitting to create sample weights will be applied
using variables: Modality/Stage, Gender, Age Group (18-39, 40-59, 60+), and District.
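The iterative proportional fitting (raking) step described above can be sketched as follows. This is an illustrative implementation with a toy one-variable margin, not the production routine; the actual weights rake over Modality/Stage, Gender, Age Group, and District simultaneously:

```python
from collections import defaultdict

def rake(rows, margins, iters=50, tol=1e-10):
    """Iterative proportional fitting: rescale each row's weight until the
    weighted totals match the population margins for every variable."""
    w = [1.0] * len(rows)
    for _ in range(iters):
        max_shift = 0.0
        for var, targets in margins.items():
            # Current weighted total per category of this variable.
            cur = defaultdict(float)
            for row, wt in zip(rows, w):
                cur[row[var]] += wt
            # Scale weights so this variable's margins match the targets.
            for i, row in enumerate(rows):
                factor = targets[row[var]] / cur[row[var]]
                w[i] *= factor
                max_shift = max(max_shift, abs(factor - 1.0))
        if max_shift < tol:  # converged: no margin needed adjustment
            break
    return w

# Toy frame: 3 female, 1 male respondent; population is an even 50/50 split.
rows = [{"gender": "F"}, {"gender": "F"}, {"gender": "F"}, {"gender": "M"}]
weights = rake(rows, {"gender": {"F": 2.0, "M": 2.0}})
# Each female response is down-weighted to 2/3; the male response is up-weighted to 2.
```

With several margins the loop alternates between variables until all weighted totals agree with their targets, which is the behavior the weekly weighting step relies on.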
Once the data is collected, it is immediately available in the Veteran Insight System. Survey weights are incorporated into the
system at the close of every weekly survey. The interface allows data users to analyze the survey results using interactive charts and
sub-populations. Survey data may also be reviewed over differing time periods, ranging from weekly, to monthly, to quarterly
estimates.
Part I – Introduction
A. Background
The Enterprise Measurement and Program Improvement team (EM&PI) is part of the Insights and Analytics (I&A)
division within the Veterans Experience Office (VEO). The EM&PI team is tasked with conducting transactional surveys of the
Veteran population to measure their satisfaction with the Department of Veterans Affairs’ (VA) numerous benefit services. Thus, their
mission is to empower Veterans by rapidly and discreetly collecting feedback on their interactions with such VA entities as NCA,
VHA, and VBA. VEO surveys generally entail probability samples which only contact minimal numbers of Veterans necessary to
obtain reliable estimates. This information is subsequently used by internal stakeholders to monitor, evaluate, and improve beneficiary
processes. Veterans are always able to decline participation, and have the ability to opt out of future invitations. A quarantine protocol
is maintained across all VEO surveys to limit the number of times a Veteran may be contacted, in order to prevent survey fatigue.
Surveys issued by EM&PI are generally brief in nature, and present a low amount of burden to Veterans. A few targeted
questions will utilize a human centered design (HCD) methodology, revolving around concepts of Trust, Ease, Effectiveness and
Emotion. Questions will focus on a specific aspect of a service process; spanning communication, applying for benefits, deliberation,
and/or receipt of benefits. Structured questions directly address the pertinent issues regarding each surveyed line of business. The
opportunity to volunteer open-ended text responses is provided within most surveys. This open text has been demonstrated to yield
a wealth of information. Machine learning tools are used for text classification, ranking by sentiment scores, and screening for
homelessness, depression, etc. Modern survey theory is used to create sample designs which are representative, statistically sound, and
in accordance with OMB guidelines on federal surveys.
VA uses a wide variety of technologies to facilitate quality healthcare for its beneficiaries. Telehealth services are a critical
aspect of modernizing the VA health care system. Telehealth (TH) uses information technology and telecommunications to increase
access to high quality services for Veterans, especially those who live in remote areas or are incapacitated. In FY 2017, over
700,000 patients received care via the three central telehealth modalities 1. Clinical Video Telehealth (CVT) is the use of real-time
interactive video conferencing to assess, treat and provide patient care remotely. Veterans may be linked to physicians from a local
clinic or even from home, for over 50 clinical applications, ranging from primary care to numerous specialties (e.g. tele-dermatology).
Home Telehealth (HT) is applied to high-risk Veterans with chronic disease requiring long-term care. Care management is
augmented through such technologies as in-home and mobile monitoring, messaging, and/or video conferencing. The goal of HT is to
reduce complications, hospitalizations, and clinical/ER visitations, so at-risk patients may remain in their own homes. Finally, Store
and Forward Telehealth (SFT) concerns the acquisition and storage of electronic patient information (e.g., images, sounds, and
video) collected at a VA clinic or medical center. The information is forwarded and retrieved by healthcare professionals at another
VA medical facility where an assessment is performed.
The Veteran Experience Office (VEO) has been commissioned by the Veteran Health Administration (VHA) to measure the
satisfaction of Telehealth recipients regarding their electronic interaction with physicians, nursing professionals, and other medical
staff. It also seeks Veteran input on the quality of the treatment they received via the three modalities listed above. VEO proposes to
conduct a brief transactional survey on Veterans who utilized the service within the past week. A subset of veterans will be
randomly selected to participate. Sampled patients will be contacted through an invitation email. A link will be enclosed so the survey
may be completed using an online interface, with customized patient information. The survey itself will consist of a handful of
questions revolving around a human-centered design, focusing on such elements as trust, emotion, effectiveness, and ease with the care
they received.
1
VA Telehealth Services Fact Sheet FY17, Office of Connected Care, VHA, VA
B. Basic Definitions
Coverage
The percentage of the population of interest that is included in the sampling frame.
Measurement Error
The difference between the response coded and the true value of the characteristic being
studied for a respondent.
Non-Response
Failure of some respondents in the sample to provide responses in the survey.
Transaction
A transaction refers to the specific time a Veteran interacts with the VA that impacts the
Veteran’s journey and their perception of VA’s effectiveness in caring for Veterans.
Response Rate
The ratio of participating persons to the number of contacted persons. This is one of the basic
indicators of survey quality.
Sample
In statistics, a data sample is a set of data collected and/or selected from a statistical
population by a defined procedure.
Sampling Error
Error in estimation due to taking a particular sample instead of measuring every unit in the
population.
Sampling Frame
A list, map, or other specification of units in the population from which a sample may be
selected.
Reliability
The consistency or dependability of a measure; often quantified via the standard error of an estimate.
C. Application to Veterans Affairs
In general, customer experience and satisfaction are usually measured at three levels: the enterprise level, the service level
patterns, and point-of-service feedback. This measurement may bring insights and value to all stakeholders at VA. Front-line VA
leaders can resolve individual feedback from Veterans and take steps to improve the customer experience; meanwhile VA executives
can receive real-time updates on systematic trends that allow them to make changes.
The goals of this measurement are as follows:
1) To collect continuous customer experience data that make or break the service experience
2) To help field staff and the national office identify areas of improvement
3) To understand emerging drivers and detractors of customer experience.
Part II – Methodology
A. Target Population and Frame
The target population of the TH survey is all Veterans having an Outpatient CVT, HT, or SFT event in the past 7 days. The
identification of Telehealth patients utilizes weekly data extracts from the Corporate Data Warehouse (CDW), which houses the
operational records of VHA. Each Telehealth event eligible for a VEO survey will be associated with one of these three modalities.
The classification of TH events into a modality is based on a combination of primary and/or secondary stop codes (see Appendix 1),
as indicated by VSSC documentation. Under each modality, three types of surveys are designed to inquire about veterans’ experience
in terms of different VA service domains (see Table 1).
Patients scheduling CVT appointments are derived from the general Outpatient scheduling database table from CDW. When
either an actual CVT or SFT appointment occurs, the distinction between Offsite appointments (home, mobile, or non-VA facilities)
vs. appointments taking place within a VAMC or CBOC is based on secondary stop codes. A subset of veterans in each modality and
subtype will be randomly selected to participate in the survey. However, some subtypes are sparsely populated, so these will be
selected into each weekly sample with certainty; specifically, Telehealth subtypes that comprise less than 10% of the modality
population will be selected with certainty. In total, there will be nine sets of survey questions.
Table 1. Survey Types under Telehealth Modalities

Telehealth Modality          Subtype 1               Subtype 2               Subtype 3
Clinical Video Telehealth    Scheduled Appointment   Visited VAMC or CBOC    Offsite Appointment
Store and Forward            Visited VAMC or CBOC    Offsite Appointment     Received HCP Feedback
Home Telehealth              New Enrollment          Patient Monitoring      Disenrollment
B. Sample Size Determination
For a given margin of error and confidence level, the sample size is calculated as follows (Lohr, 1999). For a population that is
large, the equation below is used to yield a representative sample for proportions:

   n0 = (z_{α/2}^2 · p · q) / e^2

where
• z_{α/2} = 1.96, the critical Z score value under the normal distribution when using a 95% confidence level (α = 0.05);
• p = the estimated proportion of an attribute that is present in the population, with q = 1 − p. Note that pq attains its maximum
  when p = 0.5, and this value is sometimes used for a conservative sample size (i.e., one large enough for any proportion);
• e = the desired level of precision; in the current case, the margin of error e = 0.03, or 3%. Also referred to as MOE.

For a population that is relatively small, the finite population correction is used to yield a representative sample for proportions:

   n = n0 / (1 + n0/N)

where
• n0 = the representative sample for proportions when the population is large;
• N = the population size.

The margin of error surrounding the baseline proportion is calculated as:

   MOE = z_{α/2} · sqrt( [(N − n)/(N − 1)] · [p(1 − p)/n] )

where
• z_{α/2} = 1.96, the critical Z score value under the normal distribution when using a 95% confidence level (α = 0.05);
• N = the population size;
• n = the representative sample;
• p = the estimated proportion of an attribute that is present in the population, with q = 1 − p.
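As an illustrative check of these formulas (not part of the official plan), the computation can be carried out in a few lines; the population figure used below is the approximate monthly Clinical Video population from Table 2:

```python
import math

Z = 1.96   # critical value for a 95% confidence level
p = 0.5    # conservative proportion (maximizes p * q)
e = 0.03   # desired margin of error (3%)

# Sample size for a large population.
n0 = (Z**2 * p * (1 - p)) / e**2          # ≈ 1,067

# Finite population correction, e.g. monthly CVT population N = 60,000.
N = 60_000
n = n0 / (1 + n0 / N)                     # ≈ 1,049 completes

# Margin of error achieved by a sample of size n from population N.
moe = Z * math.sqrt(((N - n) / (N - 1)) * (p * (1 - p) / n))  # ≈ 0.03
```

Rounding n up reproduces the 1,049 figure shown for Clinical Video in the "Precision at 3% MOE" column of Table 2.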
Sample sizes are calibrated to ensure monthly reports have a 3% MOE at a 95% Confidence Level at the modality level for
Veterans 18+. This represents an industry standard for reliability widely used by survey administrators (Lohr, 1999). In order to
improve measurements at the facility-level, the sample sizes are marginally increased for the CVT modality, but are still kept low
enough to prevent excessive repeated contacts. The CVT population may support an increased sampling rate (42%), but this is not the
case for SFT, which already has a sampling rate above 50% (62%). Although the Home Telehealth rate is relatively low (21%), there
is little month-to-month turnover within this modality: HT patients receive care on a more permanent basis. Therefore, it is reasonable
to maintain a lower sampling rate since there will be repeated sampling over time.
Table 2 depicts the number of unique Telehealth patients that received care in fiscal year 2017, along with the approximate
monthly populations. Preliminary analysis of the Telehealth patient population indicates that approximately 30% of such patients have
provided an email address to the VHA. This represents the frame population for the survey (see section below for information on
possible bias due to frame under-coverage). Table 2A shows the sample targets from each TH modality for the three reporting periods,
while Table 2B provides the expected number of Veterans that need to be invited to achieve the sample targets, presuming a response
rate of 20% (as observed in the VEO Outpatient Survey).
Table 2. Target Population Figures

Survey Stratum       Unique Population   Approximate Monthly   Approximate Monthly   Precision at
                     in FY 2017          Population            Email Population      3% MOE
Clinical Video       336,000             60,000                18,000                1,049
Home Telehealth      145,000             81,000                24,300                1,054
Store and Forward    306,000             27,000                8,100                 1,027

Source: VA Telehealth Services Factsheet for FY18

Table 2A. Proposed Sample Targets by Time Period
Survey Stratum       Weekly Target   Monthly Target   Quarterly Target
Clinical Video       375             1,500            4,500
Home Telehealth      250             1,000            3,000
Store and Forward    250             1,000            3,000
Total                875             3,500            10,500
Table 2B. Proposed Number of Invited Telehealth Patients, by Time Period

Survey Stratum       Weekly Contacts   Monthly Contacts   Monthly Email    Quarterly Contacts
                                                          Sampling Rate
Clinical Video       1,875             7,500              42%              22,500
Home Telehealth      1,250             5,000              21%              15,000
Store and Forward    1,250             5,000              62%              15,000
Total                4,375             17,500             N/A              52,500
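The invitation counts in Table 2B follow from the monthly targets in Table 2A and the presumed 20% response rate, with sampling rates taken against the monthly email populations in Table 2; a quick arithmetic check:

```python
response_rate = 0.20  # presumed, as observed in the VEO Outpatient Survey

# Monthly completion targets (Table 2A) and monthly email populations (Table 2).
targets = {"Clinical Video": 1_500, "Home Telehealth": 1_000, "Store and Forward": 1_000}
email_pop = {"Clinical Video": 18_000, "Home Telehealth": 24_300, "Store and Forward": 8_100}

results = {}
for stratum, target in targets.items():
    contacts = target / response_rate              # invitations needed per month
    sampling_rate = contacts / email_pop[stratum]  # share of email frame invited
    results[stratum] = (contacts, sampling_rate)
# Clinical Video: 7,500 contacts (42%); Home Telehealth: 5,000 (21%);
# Store and Forward: 5,000 (62%) — matching Table 2B.
```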
C. Stratification
As noted in the section above, stratification is employed to ensure that sufficient numbers of Veterans will be sampled within each
of the three modalities. Additionally, a second layer of stratification is used to ensure ample representation for each of the three
subtypes within each modality. Thus, there are two stratification variables from which explicit sample targets are derived: modality and
subtype. These two stratification variables are called explicit. The allocation of subtype within each modality is proportional to the
underlying population (as opposed to the email population). In the event that subtype population is less than 10% of the modality
population, then these telehealth patients are sampled with certainty.
To ensure samples are balanced with respect to the following demographic variables: Age Group, Gender, District and
VAMC/CBOC, the random selection of patients within each stratum will follow a systematic sampling design. The Veterans are
sorted according to the demographic variables, and every nth patient will be selected for survey invitation (the value of n will change
randomly with each new survey iteration). This mechanism ensures that the resulting respondent sample resembles the email population
with respect to the demographic variables. Since these stratification variables do not have explicit targets for each permutation, they
are deemed to be implicit stratification variables.
Although we do not expect differences between the email population and the general population with regard to geography, email
populations tend to skew somewhat younger and more female. Since these groups are less represented in the Veteran population, it is
not problematic for these demographics to be marginally oversampled – sample weighting calibrated to the general population will
ensure valid representation and correct for any imbalances.
Stratification Type   Variables
Explicit              Modality, Subtype
Implicit              Age Group, Gender, District, VAMC, CBOC
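The systematic selection within a single explicit stratum can be sketched as below. This is an illustrative implementation with hypothetical field names, assuming the frame has already been restricted to one modality/subtype stratum:

```python
import random

def systematic_sample(frame, target):
    """Sort the stratum frame on the implicit stratification variables, then
    take every k-th record from a random start, so the selected sample
    mirrors the frame's demographic composition."""
    # Hypothetical demographic fields used as implicit sort keys.
    ordered = sorted(frame, key=lambda v: (v["age_group"], v["gender"], v["district"]))
    k = max(1, len(ordered) // target)    # skip interval ("every k-th patient")
    start = random.randrange(k)           # random start varies with each iteration
    return ordered[start::k][:target]

# Toy stratum frame of 1,000 records.
frame = [{"age_group": i % 3, "gender": "M" if i % 5 < 3 else "F", "district": i % 4}
         for i in range(1000)]
sample = systematic_sample(frame, 100)
# The 100 selected records roughly preserve the frame's demographic mix.
```

Because the frame is sorted on the implicit variables before the every-k-th selection, each demographic group's share of the sample approximates its share of the email population, which is the property the section above relies on.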
D. Data Collection Methods
At the beginning of every measurement period, VEO data analysts will access the Corporate Data Warehouse (CDW), which
contains the governmental database for all VHA interactions. The telehealth target population will be extracted and recorded with each
new iteration. Those veterans with a valid email address will be included in the survey frame. A new random sample, according to the
stratification and quarantine protocols defined below, will be used to create an invitation file. Emails are immediately delivered to all
selected patients. Selected respondents will be contacted within 7 days of their Telehealth interaction. They will have 14 days to
complete the survey. Estimates will be accessible to data users instantly, with the final weighted results available 14 days after the
beginning of the survey.
Table 3. Survey Mode

Mode of Data Collection:   Online Survey
Recruitment Method:        Email Recruitment
Time After Transaction:    Within 7 days after Telehealth Appointment
Recruitment Period:        14 Days
Collection Days:           Friday (Reminder after 7 Days)
E. Reporting
Researchers will be able to use the Veteran Insight System (powered by Medallia) for interactive reporting and data
visualization. VA employees with a PIV card may access the system at https://va.voice.medallia.com/sso/va/. Trust, Ease,
Effectiveness, and Emotion scores can be observed for each Modality and Subtype (or Survey Type). The scores may be viewed by Age
Group, Gender, and Race/Ethnicity in various charts for different perspectives. They are also depicted within time series plots to
investigate trends. Finally, filter options are available to assess scores at varying time periods and within the context of other collected
variable information.
Recruitment is continuous (weekly), but results from several weeks may be combined into a monthly estimate for greater precision, which is the recommended reporting level. Weekly estimates are unweighted but allow analysts to review scores more quickly and within smaller time intervals; they are less reliable for small domains and should only be considered for aggregated populations. Monthly estimates will have larger sample sizes, and therefore higher reliability, set to a 3% margin of error (MOE) at the 95% confidence level (at the Modality level for Veterans 18+). Monthly estimates are also weighted for improved representation and less bias (non-response and coverage; see Section G on Sample Weighting). Quarterly estimates are the most precise but take the greatest amount of time to obtain (12 weeks of collection). Quarterly estimates are therefore the most suitable for the analysis of small populations (e.g., a single VAMC, female Veterans 18-29, etc.).
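The 3% MOE target stated above can be sanity-checked with the standard sample size formula for a proportion. The snippet below is a back-of-the-envelope calculation only, assuming the conservative p = 0.5 and an effectively infinite population.

```python
# Sample size needed for a given margin of error at 95% confidence,
# using the conservative p = 0.5 assumption for a proportion.
import math

def required_n(moe, z=1.96, p=0.5):
    """Required respondents for a given margin of error (infinite population)."""
    return math.ceil(z**2 * p * (1 - p) / moe**2)

print(required_n(0.03))  # 1068: roughly the completes needed per reporting domain
```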
F. Quality Control
To prevent errors and inconsistencies in the data and the analysis, quality control procedures will be instituted at several steps of the survey process. Records will undergo cleaning during the population file creation. The quality control steps are as follows.
1. Records will be reviewed for missing sampling and weighting variable data. When records with missing data are discovered,
they will be either excluded from the population file or put into separate strata upon discussion with subject matter experts.
2. Any duplicate records will be removed from the population file to both maintain the probabilities of selection and prevent the
double sampling of the same veteran.
3. Invalid emails will be removed.
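The three cleaning steps above can be sketched as follows. The field names (veteran_id, age_group, etc.) are illustrative assumptions, not the actual CDW schema.

```python
# Minimal sketch of cleaning steps 1-3: drop records missing sampling/weighting
# variables, remove duplicate veterans, and drop invalid emails.
import re

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
REQUIRED = ("age_group", "gender", "district", "modality")

def clean_population(records):
    seen, cleaned = set(), []
    for r in records:
        if any(r.get(v) is None for v in REQUIRED):   # step 1: missing data
            continue
        if r["veteran_id"] in seen:                   # step 2: duplicate record
            continue
        if not EMAIL_RE.match(r.get("email", "")):    # step 3: invalid email
            continue
        seen.add(r["veteran_id"])
        cleaned.append(r)
    return cleaned

base = {"age_group": "60+", "gender": "M", "district": "1", "modality": "HT"}
records = [
    dict(base, veteran_id=1, email="a@example.com"),
    dict(base, veteran_id=1, email="a@example.com"),               # duplicate
    dict(base, veteran_id=2, email="bad-email"),                   # invalid email
    dict(base, veteran_id=3, email="c@example.com", gender=None),  # missing data
]
print(len(clean_population(records)))  # only the first record survives
```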
The survey sample loading and administration processes will have quality control measures built into them.
1. The extracted sample will be reviewed for representativeness. A secondary review will be applied to the final
respondent sample.
2. The survey load process will be rigorously tested prior to the launch of the TH Survey to ensure that sampled
customers are not inadvertently dropped or sent multiple emails.
3. The email delivery process is monitored to ensure that bounce-back records will not hold up the email delivery process.
The weighting and data management quality control checks are as follows:
1. The sum of the weighted respondents will be compared to the overall population count to confirm that the records are
being properly weighted. When the sum does not match the population count, weighting classes will be collapsed to
correct this issue.
2. The unequal weighting effect will be used to identify potential issues in the weighting process. Large unequal
weighting effects indicate a problem with the weighting classes, such as a record receiving a large weight to
compensate for nonresponse or coverage bias.
G. Sample Weighting, Coverage Bias, and Non-Response Bias
A final respondent sample should closely resemble the true population in terms of demographic distributions (e.g., age
groups). One problem that arises in the survey collection process is nonresponse, defined as the failure of selected persons in the
sample to provide responses. This occurs to varying degrees in all surveys, but the resulting estimates can be distorted when some
groups are more or less prone to complete the survey. In many applications, younger people are less likely to participate than
older persons. Another problem is under-coverage, in which certain groups of interest in the population are not included in the
sampling frame at all. They cannot participate because they cannot be contacted: those without an email address will be
excluded from the sample frame. These two phenomena may cause some groups to be over- or under-represented. In such cases, when the
respondent population does not match the true population, conclusions drawn from the survey data may not be reliable and are said to
be biased.
Survey practitioners recommend the use of sample weighting to improve inference on the population. Weighting will be introduced
into the survey process as a tool that helps the respondent sample more closely represent the overall population. Weighting
adjustments are commonly applied in surveys to correct for nonresponse bias and coverage bias. Because a business rule will be
implemented requiring callers to provide an email address, the coverage bias for Survey 2 is expected to be small. In many surveys,
however, differential response rates may be observed across age groups. In the event that some age groups are overrepresented in the
final respondent sample, the weighting application will yield somewhat smaller weights for those groups. Conversely, age groups
that are underrepresented will receive larger weights. This procedure is termed non-response bias correction for a single variable.
Strictly speaking, we can never know how non-respondents would have answered, but the aforementioned
adjustment calibrates the sample to resemble the full population from the perspective of demographics. This may result in a
substantial correction in the weighted survey estimates, compared to direct estimates, in the presence of non-negligible
non-response bias.
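The single-variable correction described above can be illustrated with a small example, using assumed (not actual) population and respondent counts by age group: the weight for each group is its population share divided by its respondent share.

```python
# Single-variable non-response adjustment: underrepresented groups receive
# weights > 1, overrepresented groups receive weights < 1.
# The counts below are purely illustrative.
population = {"18-39": 3000, "40-59": 4000, "60+": 3000}
respondents = {"18-39": 150, "40-59": 400, "60+": 450}

N = sum(population.values())   # 10,000
n = sum(respondents.values())  # 1,000

weights = {g: (population[g] / N) / (respondents[g] / n) for g in population}
for g, w in sorted(weights.items()):
    print(g, round(w, 2))  # 18-39 is underrepresented (weight 2.0), 60+ over (0.67)
```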
Because the email population will have different demographics than the overall population, the initial sample will be selected
from the frame in a manner that makes the final respondent sample resemble the overall population. Stratification may also adjust for
non-response (which occurs when certain subpopulations are less prone to participate). Targets will be established for every permutation
of the following stratification variables; accordingly, population values will be collected and recorded by VEO for every data collection
period.
Stratification Variables
Modality (CVT, HT, SFT)
Subtype of Survey Modality
Gender (M, F)
Age Group (18-39, 40-59, 60+)
District
The stratification scheme above will result in a representative sample (with respect to the full population). Weighting will then be
applied so that the sample is more fully matched to the population. Sample weights will be generated for Monthly and Quarterly
estimates.
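One way to set the per-stratum targets mentioned above is proportional allocation of a fixed invitation budget across the stratification cells. The sketch below uses purely illustrative strata and counts.

```python
# Proportional allocation of an invitation budget across strata, so the sample
# mirrors the population on the stratification variables. Counts are illustrative.
population = {
    ("CVT", "M", "18-39"): 1200,
    ("CVT", "F", "18-39"): 400,
    ("HT",  "M", "60+"):   2400,
}
budget = 500

N = sum(population.values())
targets = {stratum: round(budget * count / N) for stratum, count in population.items()}
print(targets[("HT", "M", "60+")])  # largest stratum receives the largest target: 300
```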
It was reported earlier that the email population comprises 30% of the full Telehealth population. Since 85% of older
Americans utilize email (Choi & Dinitto, 2013), we can presume that most veterans choose not to share their email address with VHA
or are simply unaware of that option. It is assumed that the level of patient satisfaction is not directly related to email status
(Missing at Random). Since age and gender have been observed to be strong predictors of patient satisfaction in other VA health
surveys, the stratification and weighting methodology outlined above will adequately compensate for any bias introduced by the
incomplete population frame.
Raking, or Iterative Proportional Fitting (IPF), will be used as the method for sample weighting. IPF is a
mathematical scaling method used to adjust the sample data so that its marginal (row and column) totals
agree with constraining marginal totals from the population data (Battaglia et al., 2009; Kalton & Flores-Cervantes, 2003; Kolenikov,
2014). The response probabilities in IPF therefore depend on the row and column, not on the particular cell (Lohr, 1999). IPF is
an iterative process of recalculating the weights so that the original sample values are gradually adjusted, through repeated
calculations, to fit the marginal constraints from the population. The starting weights correspond to the inverse of the probability of
selection multiplied by a non-response adjustment factor. The final sample data is a joint probability distribution of maximum likelihood
estimates, obtained when the probabilities converge within an acceptable (pre-defined) level of tolerance. The completion of IPF
depends on the convergence of the algorithm, which requires that the cell estimates are not zero.
The mathematical definitions of IPF are indicated below (Wong, 1992):

$$p_{ij}^{(k+1)} = \frac{p_{ij}^{(k)}}{\sum_{j} p_{ij}^{(k)}} \cdot Q_i \qquad (1)$$

$$p_{ij}^{(k+2)} = \frac{p_{ij}^{(k+1)}}{\sum_{i} p_{ij}^{(k+1)}} \cdot Q_j \qquad (2)$$

Where
• $p_{ij}^{(k)}$ is the matrix element in row $i$, column $j$, at iteration $k$.
• $Q_i$ and $Q_j$ are the pre-defined row totals and column totals, respectively.

Equations (1) and (2) are employed iteratively to estimate new cell values and will theoretically stop at iteration $m$ where:

$$\sum_{j} p_{ij}^{(m)} = Q_i \quad \text{and} \quad \sum_{i} p_{ij}^{(m)} = Q_j$$
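The iterative procedure can be sketched compactly in Python, alternating the row rescaling of equation (1) with the column rescaling of equation (2) until the margins converge. The input table and margins below are illustrative only.

```python
# Compact raking (IPF) sketch: rows and columns are alternately rescaled
# until the table's margins match the target totals Q_i and Q_j.
def ipf(table, row_totals, col_totals, tol=1e-9, max_iter=1000):
    p = [row[:] for row in table]
    for _ in range(max_iter):
        for i, qi in enumerate(row_totals):              # equation (1): rows
            s = sum(p[i])
            p[i] = [v / s * qi for v in p[i]]
        for j, qj in enumerate(col_totals):              # equation (2): columns
            s = sum(p[i][j] for i in range(len(p)))
            for i in range(len(p)):
                p[i][j] = p[i][j] / s * qj
        if all(abs(sum(row) - qi) < tol for row, qi in zip(p, row_totals)):
            return p                                     # margins converged
    return p

# Illustrative respondent counts by (age group x gender), raked to known margins.
sample = [[30.0, 20.0], [25.0, 25.0]]
raked = ipf(sample, row_totals=[60.0, 40.0], col_totals=[55.0, 45.0])
print(round(sum(raked[0]), 6))  # first row now sums to its target: 60.0
```

Note that convergence requires nonzero cells, matching the condition stated above.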
As part of the weighting validation process, the weights of persons in age and gender groups are summed and verified that they
match the universe estimates (i.e., population totals). Additionally, we calculate the unequal weighting effect, or UWE (see Kish,
1992; Liu et al., 2002). This statistic is an indication of the amount of variation that may be expected due to the inclusion of weighting.
The unequal weighting effect estimates the percent increase in the variance of the final estimate due to the presence of weights and is
calculated as:
$$UWE = 1 + cv_{weights}^2 = 1 + \left(\frac{s}{\bar{w}}\right)^2$$

where
• $cv$ is the coefficient of variation of the weights $w_{ij}$.
• $s$ is the sample standard deviation of the weights.
• $\bar{w}$ is the sample mean of the weights, $\bar{w} = \frac{1}{n}\sum_{ij} w_{ij}$.
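The UWE can be computed directly from its definition; the weight vectors below are illustrative.

```python
# Unequal weighting effect: UWE = 1 + (s / w_bar)^2, the proportional
# increase in variance attributable to the weights.
import statistics

def uwe(weights):
    s = statistics.stdev(weights)     # sample standard deviation of weights
    w_bar = statistics.mean(weights)  # sample mean of weights
    return 1 + (s / w_bar) ** 2

print(round(uwe([1.0, 1.0, 1.0, 1.0]), 2))  # 1.0: equal weights, no penalty
print(round(uwe([0.5, 0.8, 1.2, 3.5]), 2))  # 1.83: unequal weights inflate variance
```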
H. Quarantine Rules
VEO seeks to limit contact with Veterans as much as possible, contacting them only as needed to achieve measurement goals. These rules
are enacted to prevent excessive recruitment attempts upon Telehealth patients. VEO also monitors Veteran participation in other
surveys to ensure Veterans do not experience survey fatigue. Finally, all VEO surveys offer options for respondents to opt out and
ensure they are no longer contacted for a specific survey.
Table 4. Quarantine Protocol

| Quarantine Rule | Description | Elapsed Time |
|---|---|---|
| Repeated Sampling for Telehealth Survey | Number of days between receiving/completing an online survey, prior to receiving an email invitation for a separate Telehealth experience | 60 Days |
| Other VEO Surveys | Number of days between receiving/completing an online survey and becoming eligible for another VEO survey | 30 Days |
| Prioritization | Prioritization is based on the observed sample sizes. | N/A |
| Opt Outs | Persons indicating their wish to opt out of either the phone or online survey will no longer be contacted. | N/A |
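The quarantine rules amount to a simple per-veteran eligibility check. The sketch below assumes hypothetical record fields for last-contact dates and opt-out status; the 60- and 30-day windows come from Table 4.

```python
# Eligibility check implementing the Table 4 quarantine rules: 60 days since the
# last Telehealth survey, 30 days since any other VEO survey, and no opt-out.
# The record fields are illustrative assumptions.
from datetime import date

def eligible(veteran, today):
    if veteran.get("opted_out"):
        return False
    last_th = veteran.get("last_telehealth_survey")
    if last_th and (today - last_th).days < 60:
        return False
    last_veo = veteran.get("last_veo_survey")
    if last_veo and (today - last_veo).days < 30:
        return False
    return True

v = {"last_telehealth_survey": date(2020, 8, 1), "last_veo_survey": None}
print(eligible(v, date(2020, 9, 18)))  # False: only 48 days since last TH survey
```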
Part III – Assumptions and Limitations
A. Repeated Surveys
There are existing Telehealth Surveys that are administered electronically to patients through means other than email. This
double-surveying may reduce the expected 20% response rate for the VEO Telehealth Survey due to survey fatigue among Telehealth patients. The
response rate will be monitored, and the number of patients contacted may be increased to maintain the final respondent targets.
B. Coverage Bias
Since the VEO Telehealth Survey is email only, there is a large population of VHA patients that cannot be reached by the survey.
Veterans who lack access to the internet or do not use email may have different levels of Trust and satisfaction with their service.
However, the majority of Veterans without an email address on file either did not have an opportunity to provide the information
or elected not to share it. It is thought that Veterans in the latter category do not harbor any tangible differences
from Veterans who do share their information. To verify this, VEO plans to execute a coverage bias study to assess the
amount of coverage bias and derive adjustment factors in the presence of non-negligible bias.
C. Other Issues
The telehealth service may be of limited use for the diagnosis and treatment of some illnesses and conditions. Veterans who have
complex diseases, such as cancer, may not choose to use telehealth to pursue medical care even if they are located in remote
areas. Telehealth users therefore do not cover Veterans with the full spectrum of diseases, and the Veteran respondent types should be
taken into consideration when interpreting the survey results and their applications.
Rating the telehealth service may also require Veterans to be familiar with, and have access to, modern technologies (e.g., apps,
mobile appointments, online video chat). Therefore, Veterans who use the telehealth services and respond to the survey may skew younger.
The demographic distribution of the survey respondents will be reviewed by VEO when receiving the survey results.
Home Telehealth is designed to provide medical care and services to high-risk Veterans with chronic disease. When such
patients receive the survey, their family members, caregivers, or nurses are likely to respond on their behalf, so feedback
from the primary source may be missing. VEO will continue to identify these responses in
the VA databases and assess their effect on the Telehealth Survey estimates.
Part IV - Appendices
Appendix 1. References
Battaglia, M. P., Hoaglin, D. C., & Frankel, M. R. (2009). Practical considerations in raking survey data. Survey Practice, 2(5).
Choi, N. G., & DiNitto, D. M. (2013). Internet use among older adults: Association with health needs, psychological capital, and social capital. Journal of Medical Internet Research, 15(5), e97.
Kalton, G., & Flores-Cervantes, I. (2003). Weighting methods. Journal of Official Statistics, 19(2), 81-97.
Kish, L. (1992). Weighting for unequal Pi. Journal of Official Statistics, 8(2), 183-200.
Kolenikov, S. (2014). Calibrating survey data using iterative proportional fitting (raking). The Stata Journal, 14(1), 22-59.
Liu, J., Iannacchione, V., & Byron, M. (2002). Decomposing design effects for stratified sampling. Proceedings of the American Statistical Association's Section on Survey Research Methods.
Lohr, S. (1999). Sampling: Design and Analysis. Boston, MA: Cengage Learning.
Wong, D. W. S. (1992). The reliability of using the iterative proportional fitting procedure. The Professional Geographer, 44(3), 340-348.