SUPPORTING STATEMENT
For
Paperwork Reduction Act Submission
Evaluation of the Department of Veterans Affairs’ Vocational Rehabilitation and Employment Program
April 28, 2009
Prepared by:
EconSys (Prime Contractor)
3141 Fairview Park Drive, Suite 700
Falls Church, VA 22042
(703) 642-5225
and
ICF International, Inc. (Subcontractor)
Submitted to:
Department of Veterans Affairs
Office of Policy and Planning
810 Vermont Avenue, NW
Washington, DC 20420
B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
The primary objective of VA's Vocational Rehabilitation and Employment Program (VR&E) is to help veterans who have service-connected disabilities become suitably employed, maintain employment, or achieve independence in daily living. The target population for this survey includes all disabled veterans who are theoretically eligible to receive VR&E services, both those who have applied for VR&E services and those who have not. Specifically, the target population includes veterans who
Have received, or will eventually receive, an honorable or other than dishonorable discharge; and
Have a VA service-connected disability rating of 10% or more.
The law generally provides for a 12-year basic period of eligibility in which services may be used. A VA staff counselor decides if a veteran has an employment handicap based upon the results of an evaluation. Entitlement to services is established if the veteran has a 20% or higher service-connected disability and an employment handicap. If the disability is 10% service-connected, then a serious employment handicap must be found to establish entitlement to vocational rehabilitation services.
EconSys and ICF will rely on several different sources of data and information to perform the analyses in this evaluation, including VA administrative files, Social Security Administration earnings data, primary survey data collection, literature reviews, and data files made available from other agencies, including the Social Security Administration, the Department of Defense, the Department of Labor, and the Department of Education. Because a major dimension of analysis in the study is participation in the VR&E program and, conversely, lack of participation, the VA BDN Compensation and Pension (C&P) Master File, the BDN Chapter 31 file, and the VR&E CWINRS files are crucial for identifying participants and non-participants and the characteristics of both groups. It is only through systematic primary data collection that we can understand the reasons for participating (or not participating) and the factors that contribute to employment outcomes following VR&E services.
As noted, a major dimension of analysis for this evaluation is the comparison of VR&E participants to non-participants. We define non-participants to include not only veterans who never applied but also those who applied and were found eligible yet did not progress to the point of an approved rehabilitation plan. We thus have two distinct groups of non-participants: veterans found eligible by VR&E who did not continue, and disabled veterans who were potentially eligible but did not apply. The applicants who were found eligible but did not proceed to develop a plan are of particular interest: VA determined that they had a service-connected disability and that they were entitled to services because of an employment handicap, yet they did not participate.
EconSys and ICF propose to analyze four cohorts of VR&E participants defined by the year in which the veteran with a Service-Connected Disability (SCD) applied for VR&E services. Each cohort is further restricted to applicants who were found to be eligible/entitled and progressed to an approved rehabilitation plan. We propose to sample from the following VR&E participant cohorts:
1992 VR&E cohort
1997 VR&E cohort
2002 VR&E cohort
2003-2008 VR&E cohort
While the cohorts are defined in terms of the year of application for VR&E services, we view a cohort as lasting approximately 12 years because veterans have 12 years of eligibility to receive services. Many veterans delay the start of their program, interrupt their program, or take several years to complete it. Also, the impact or outcome of the program in terms of employment and income will not be fully apparent until several years after completion. For this reason, we start our cohorts in 1992 and 1997. In order to analyze results for veterans of the National Guard and Reserves and those receiving Individual Unemployability disability compensation payments, separate strata will be created for those groups for the period 1992 to 2008.
In addition to the four participant cohorts, the EconSys and ICF team will analyze non-applicants and applicants who were found to be eligible but did not progress to an approved rehabilitation plan. These non-participants will be matched as closely as possible to VR&E cohorts in terms of year of release from active duty, year of eligibility for VA disability compensation, disability rating, disability condition, and Guard/Reserve status. We will have the following samples of non-participants to match against each of the four VR&E cohorts:
Non-participant veterans with SCD rating matched to 1992 VR&E cohort
Applicants who were found eligible but did not participate
Non-applicants
Non-participant veterans with SCD rating matched to 1997 VR&E cohort
Applicants who were found eligible but did not participate
Non-applicants
Non-participant veterans with SCD rating matched to 2002 VR&E cohort
Applicants who were found eligible but did not participate
Non-applicants
Non-participant veterans with SCD rating matched to the 2003-2008 VR&E cohort
Applicants who were found eligible but did not participate
Non-applicants
To study the reasons for non-participation and obtain broad-based measures of outcomes for both participants and non-participants, we propose to conduct surveys of a representative sample of each group for each of the cohorts described above. We also propose to conduct a Web-based survey of VR&E contract counselors in order to learn more about VR&E’s five service tracks and other services or processes.
Based on a CWINRS data extract that EconSys and ICF obtained from VA for the 2008 Transition Benefit Study, there are at least 524,668 veterans in the sampling frame, including 310,686 non-participants (247,367 who did not apply and 63,319 who were found to be eligible/entitled but did not progress to developing a plan) and 213,982 participants (those who developed a plan). Among participants, there are at least 22,148 Independent Living (IL) participants. These groups will be sampled separately (see Section 2 below). These numbers may change once updated information is obtained from VA; however, we expect such changes will not affect the sample design or the sample sizes in a significant way.
The collection of primary survey data from participants and non-participants will require sampling. Relevant criteria for sampling strata will include SCD rating level (e.g., 50% or less, 60-90%, and 100%), type of disability (physical versus mental condition), academic versus non-academic education/training, individual unemployability, and Independent Living (IL) status (yes or no). In addition, we will draw samples from the 1992, 1997, and 2002 cohorts who are receiving employment services, and from the 2003-2008 cohort, to adequately represent the five tracks of services.
Previous customer satisfaction surveys of VR&E program participants in 1999, 2000, and 2001-2002, conducted using mail questionnaires, achieved response rates of 57%. More recently, the 2007 Veterans Employability Research Survey (VERS), administered to 5,000 applicants to the VR&E Program, achieved an overall response rate of 29%. The VERS survey was administered by telephone to five cohorts of veterans in various stages of participation in VR&E. The cohorts included veterans who (1) applied for but were not eligible for VR&E, (2) were temporarily interrupted in the program, (3) had dropped out of the program, (4) had completed the Evaluation and Planning phases but not the Rehabilitation phase of VR&E, and (5) had successfully completed the VR&E program. Although our proposed study will use cohorts very similar to those of VERS, we believe the use of a mail questionnaire with a web option and a five-step mailing procedure can yield a response rate of 50% across veteran participants and non-participants in this evaluation.
The formula used to calculate the response rate for the survey is AAPOR Response Rate 4 (RR4):1

$$RR4 = \frac{I + P}{(I + P) + (R + NC + O) + e(UH + UO)}$$

where

RR4 = Response rate
I = Complete interview
P = Partial interview
R = Refusal and break-off
NC = Non-contact
O = Other
UH = Unknown if household/occupied housing unit (HU)
UO = Unknown, other
e = Estimated proportion of cases of unknown eligibility that are eligible
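A minimal sketch of this computation, assuming the RR4 form reconstructed above (the disposition counts in the example are hypothetical):

```python
def aapor_rr4(I, P, R, NC, O, UH, UO, e):
    """AAPOR Response Rate 4: partial interviews count as responses, and
    cases of unknown eligibility are discounted by the estimated
    eligibility rate e."""
    return (I + P) / ((I + P) + (R + NC + O) + e * (UH + UO))

# Hypothetical disposition counts for one stratum
print(round(aapor_rr4(I=220, P=30, R=90, NC=120, O=10, UH=40, UO=20, e=0.8), 3))
```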
Records of veteran participants and non-participants will be obtained from the CWINRS, BDN Chapter 31, and BDN Compensation & Pension (C&P) Master Record data files. Working with the VA Office of Policy and Planning and VBA staff, the EconSys Team will select a sampling frame of approximately 46,980 disabled veterans to participate in a survey about the VR&E program. The survey will include participants in the VR&E program as well as veterans theoretically eligible for VR&E services who have not pursued participation in the program. Additionally, a small group of 100 dependents of veterans who have participated in the VR&E program will provide information about their experiences with the program through structured telephone interviews. Names will be processed through a commercial data match processing service, ChoicePoint, to obtain current, validated contact information for veterans and adult dependents of veterans and to ensure that no decedent records remain in the sampling file. To ensure confidentiality and data security, only a minimum of personally identifiable variables will be used for data extraction. The following fields will be required to process a valid mailing address file of eligible veterans: SSN/service number, full name, and date of birth. In addition, we will request the currently available telephone number for each veteran and dependent selected for primary data collection, to be used, as needed, to remind non-respondents to complete the survey mailed to them, for survey completion (such as providing survey responses verbally to a telephone interviewer), or to interview non-respondents as part of non-response bias testing. At each stage of processing record updates, we will employ quality control procedures to ensure the security and integrity of veteran and dependent records.
We propose to construct the sampling frame from CWINRS, BDN Chapter 31, and the BDN C&P Master files to represent the target population. CWINRS is VR&E’s electronic case management and information system. According to the Office of Management and Budget (OMB), “…the reliability (consistency within the database) and performance (availability of the system to users in the field and at Central Office) of the CWINRS database is 93 and 99 percent, respectively.”2 The BDN Chapter 31 and C&P Master files will be used to fill missing information or update cases with more recent information which may not be reflected in CWINRS and to identify non-participants in VR&E.
We will divide the frame into two parts: VR&E participants and non-participants. The former includes all disabled veterans who applied for VR&E services, were found eligible/entitled to receive services, and developed a rehabilitation plan. The latter includes those who never applied for VR&E services as well as those who applied, were found eligible, but did not develop an approved plan. Among VR&E participants, we further distinguish between IL (Independent Living) participants and non-IL participants. The IL participants are those whose program goal is independent living; the non-IL participants are those who are on the other four program tracks. Those who applied but were found ineligible for VR&E will be excluded from the frame.
After the frame cases are identified, we will construct the frame variables necessary to support data collection and sample design. First, we will construct variables that will be used to contact the sampled veterans, including name, SSN, mailing address, e-mail address, and home telephone number. We will fill in missing address information using address-matching services such as those provided by ChoicePoint or similar organizations. Second, we will develop sample design variables, including variables that will be used for explicit or implicit sample stratification. Specifically, the sampling frame will contain the following design variables: year of application, program track, disability rating, type of disability, application status, and program participation status. Again, missing or outdated frame information will be replaced by the most recent information from VA. The final sample design will be based on the most recent information available at the time of actual sample selection.
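To make the frame-construction step concrete, here is a minimal sketch of merging a CWINRS extract with a BDN extract and back-filling missing fields; the file names and column names are hypothetical, for illustration only:

```python
import pandas as pd

# Hypothetical extract files and column names, for illustration only.
cwinrs = pd.read_csv("cwinrs_extract.csv", dtype=str)
bdn = pd.read_csv("bdn_c_and_p_extract.csv", dtype=str)

# Merge on SSN, keeping every CWINRS case and pulling in BDN fields.
frame = cwinrs.merge(
    bdn[["ssn", "mailing_address", "disability_rating"]],
    on="ssn", how="left", suffixes=("", "_bdn"),
)

# Fill CWINRS gaps with the BDN value where the CWINRS field is missing.
for col in ["mailing_address", "disability_rating"]:
    frame[col] = frame[col].fillna(frame[f"{col}_bdn"])
frame = frame.drop(columns=["mailing_address_bdn", "disability_rating_bdn"])
```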
We propose three sets of sampling strata: one for Non-Participants, one for Non-IL participants, and one for IL participants, as follows:
Participation Status:
Non-Participants—those who never applied for services and those who applied and were found eligible but did not develop a rehabilitation plan;
Participants, non-independent living—those who applied, were found eligible/entitled, developed an approved plan, and did not need independent living assistance; and
Participants, independent living—those who applied, were found eligible, and needed independent living assistance because of the extent of their disabilities.
Cohort:
1992 Cohort—those who applied for services in 1992
1997 Cohort—those who applied for services in 1997
2002 Cohort—those who applied for services in 2002
2003-2008 Cohort—those who applied for services from 2003 to 2008
Disability Rating:
Participants-Non Independent Living
50% or less
60%-90%
100%
Participants – Independent Living3
90% or less
100%
Non-participants
20%-50%
60%-100%
Type of Disability:
Mental
Physical
Application Status:
Yes—those who applied, were found eligible, but did not develop an approved plan
No—those who did not apply
Program Track:4
Reemployment with previous employer
Rapid employment services for new employment
Self-employment
Employment through long term services
The Independent Living (IL) participants do not have a corresponding comparison group of non-participants for several reasons:
Whether or not a veteran is placed in the IL program is a determination made by the vocational counselor in cooperation with the veteran.
The placement is based on a number of eligibility and other factors, only one of which is the disability rating of the veteran.
Given the information that the Study Team will have access to, it is not possible to predict or infer whether a veteran would have been placed in the IL program or some other program.
Given these conditions, the creation of a sample of IL non-participants is not feasible, although statistical and analytical comparisons can be made between IL participants and those non-participants with identical disability ratings.
Based on these stratification variables, we defined a total of 87 sampling strata, including 36 for non-participants, 39 for non-IL participants, and 12 for IL participants.
Number of Sampling Strata by Population Segments

Stratification of Sample for Participants and Non-Participants in VR&E and Participants in Independent Living

Participants

| | 1992 Cohort | 1997 Cohort | 2002 Cohort | 2003-2008 Cohort | 1992-2008 Cohort | Total |
|---|---|---|---|---|---|---|
| Academic | | | | | | |
| Rating: 100% | 1 | 1 | 1 | | | 3 |
| 100% Mental | | | | 1 | | 1 |
| 100% Physical | | | | 1 | | 1 |
| 50% or less Mental | 1 | 1 | 1 | 1 | | 4 |
| 50% or less Physical | 1 | 1 | 1 | 1 | | 4 |
| 60-90% Mental | 1 | 1 | 1 | 1 | | 4 |
| 60-90% Physical | 1 | 1 | 1 | 1 | | 4 |
| Academic Subtotal | 5 | 5 | 5 | 6 | | 21 |
| Non-Academic | | | | | | |
| All ratings and types | 1 | 1 | 1 | | | 3 |
| 50% or less Mental | | | | 1 | | 1 |
| 50% or less Physical | | | | 1 | | 1 |
| 60-90% Mental | | | | 1 | | 1 |
| 60-90% Physical | | | | 1 | | 1 |
| Non-Academic Subtotal | 1 | 1 | 1 | 4 | | 7 |
| Employment | | | | | | |
| All employment services | 1 | 1 | 1 | | | 3 |
| 50% or less RA Mental | | | | 1 | | 1 |
| 50% or less RA Physical | | | | 1 | | 1 |
| 60-90% RA Mental | | | | 1 | | 1 |
| 60-90% RA Physical | | | | 1 | | 1 |
| Self-Employment | | | | 1 | | 1 |
| Re-employment | | | | 1 | | 1 |
| Employment Subtotal | 1 | 1 | 1 | 6 | | 9 |
| Guard/Reserves | | | | | 1 | 1 |
| Individual Unemployability (IU) | | | | | 1 | 1 |
| 1992-2008 Cohort Subtotal | | | | | 2 | 2 |
| Total Non-IL | 7 | 7 | 7 | 16 | 2 | 39 |
| Independent Living | | | | | | |
| Rating: 100% | 1 | 1 | | | | 2 |
| 100% Mental | | | 1 | 1 | | 2 |
| 100% Physical | | | 1 | 1 | | 2 |
| Rating: 60-90% | 1 | 1 | | | | 2 |
| 90% or less Mental | | | 1 | 1 | | 2 |
| 90% or less Physical | | | 1 | 1 | | 2 |
| IL Subtotal | 2 | 2 | 4 | 4 | | 12 |
| Subtotal: Participants | 9 | 9 | 11 | 20 | 2 | 51 |

Non-Participants

| | 1992 Cohort | 1997 Cohort | 2002 Cohort | 2003-2008 Cohort | 1992-2008 Cohort | Total |
|---|---|---|---|---|---|---|
| 20-50% App Mental | 1 | 1 | 1 | 1 | | 4 |
| 20-50% App Physical | 1 | 1 | 1 | 1 | | 4 |
| 20-50% Non-App Mental | 1 | 1 | 1 | 1 | | 4 |
| 20-50% Non-App Physical | 1 | 1 | 1 | 1 | | 4 |
| 60-100% App Mental | 1 | 1 | 1 | 1 | | 4 |
| 60-100% App Physical | 1 | 1 | 1 | 1 | | 4 |
| 60-100% Non-App Mental | 1 | 1 | 1 | 1 | | 4 |
| 60-100% Non-App Physical | 1 | 1 | 1 | 1 | | 4 |
| Cohort Subtotal | 8 | 8 | 8 | 8 | | 32 |
| Guard/Reserve Non-Applicant | | | | | 1 | 1 |
| Guard/Reserve Non-Participant | | | | | 1 | 1 |
| IU Non-Applicant | | | | | 1 | 1 |
| IU Non-Participant | | | | | 1 | 1 |
| 1992-2008 Cohort Subtotal | | | | | 4 | 4 |
| Subtotal: Non-Participants | 8 | 8 | 8 | 8 | 4 | 36 |

Total Strata: 87
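To illustrate how the design variables map a frame record into one of these strata, here is a simplified sketch; the field names are hypothetical, and it omits the special Guard/Reserve and IU strata and the cohort-specific collapsing of cells shown in the table above:

```python
def stratum_key(rec):
    """Assign a frame record to a (simplified) sampling stratum."""
    if rec["group"] == "non_participant":
        rating = "20-50%" if rec["rating"] <= 50 else "60-100%"
        applied = "App" if rec["applied"] else "Non-App"
        return ("NP", rec["cohort"], rating, applied, rec["dis_type"])
    if rec["group"] == "il_participant":
        rating = "100%" if rec["rating"] == 100 else "90% or less"
        return ("IL", rec["cohort"], rating, rec["dis_type"])
    # Non-IL participant: cohort, track, rating band, disability type
    if rec["rating"] == 100:
        rating = "100%"
    elif rec["rating"] <= 50:
        rating = "50% or less"
    else:
        rating = "60-90%"
    return ("P", rec["cohort"], rec["track"], rating, rec["dis_type"])

# Example: a 2002-cohort non-applicant with a 40% physical rating
print(stratum_key({"group": "non_participant", "cohort": "2002",
                   "rating": 40, "applied": False, "dis_type": "Physical"}))
```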
We assume that most survey estimates can be expressed as a proportion $p$, such as the proportion of participants who are satisfied with the VR&E services received. The data collected will meet an overall requirement of a 90% confidence level +/- 3%. Therefore, we recommend that, for each sampling stratum, the margin of error of the proportion estimate not exceed 5% at the 90% confidence level. Under simple random sampling, the standard error of a proportion estimate is

$$se(\hat{p}) = \sqrt{\frac{\hat{p}(1-\hat{p})}{n}},$$

where $\hat{p}$ is the estimated proportion and $n$ is the number of respondents. The recommended precision requirement implies

$$1.645 \times \sqrt{\frac{\hat{p}(1-\hat{p})}{n}} \le 0.05.$$

Solving for $n$ while assuming maximum population variance (i.e., $\hat{p} = 0.5$), we need to complete 270 surveys per stratum. Across the 87 strata, we will complete a maximum of 270 × 87 = 23,490 surveys.
The total number of complete surveys is likely to be smaller than 23,490 for two reasons. First, some small strata may not have enough cases to support 270 completes even if all frame cases are selected into the sample. For example, assuming a 50% response rate, a stratum with 100 cases can generate only about 50 complete surveys even if all 100 cases are included in the sample. Second, under simple random sampling without replacement, we can reduce the required sample size by applying the finite population correction factor. This correction makes a difference in small strata where the sampling rate exceeds 10%. For example, for a stratum with 500 cases on the frame, we will only need to complete 175 surveys to meet the same precision requirement. Therefore, if there are many small cells, the total number of completes needed could be substantially smaller than 23,490. EconSys and ICF will provide a more accurate estimate of the sample size per stratum once the population distribution over the strata is available.
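The following sketch, a minimal illustration assuming the 90% confidence z-value of 1.645 and the standard finite population correction, reproduces both the 270-completes target and the 175-completes example for a 500-case stratum:

```python
import math

def completes_needed(moe=0.05, z=1.645, p=0.5):
    """Completes needed for a +/- moe margin of error, ignoring the FPC."""
    return (z ** 2) * p * (1 - p) / moe ** 2   # ~270.6; rounded to 270 in the text

def fpc_adjusted(n0, N):
    """Apply the finite population correction for a stratum of N frame cases."""
    return round(n0 / (1 + n0 / N))

print(math.ceil(completes_needed()))   # 271; the document uses 270
print(fpc_adjusted(270, 500))          # 175, as stated for a 500-case stratum
```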
The RFQ requires that the primary data collection methodology be able to "provide results that are representative of the population of veterans enrolled in the VR&E program at the 90% confidence level +/- 3%." We propose to use a simple random sampling method within each stratum (see Sample Selection section). Therefore, with 270 complete surveys, we will meet the 5% precision requirement with 90% confidence within each stratum. Across all strata, the overall precision will be much higher than the RFQ requires, meeting the 90% confidence level +/- 3%.
Suppose that the survey eventually completes $n$ surveys across all strata. Then the overall precision will be determined by the effective sample size

$$n_{eff} = \frac{n}{DEFF},$$

where DEFF is the design effect, defined as the ratio of the variance under the current design to the variance of a simple random sample of the same size. For a stratified random design, the design effect due to disproportionate allocation can be estimated by

$$DEFF = \left(\sum_h W_h k_h\right)\left(\sum_h \frac{W_h}{k_h}\right),$$

where $h$ indexes the strata, $W_h$ represents the stratum weights or the proportion of the population in each stratum, and $k_h$ represents the inverse of the sampling rate in each stratum. The design effect increases as the variation of the sampling rates across strata increases. When the sampling rates in the strata are very different, the design effect for the overall mean can be large and hence the effective sample size small. However, with the proposed sample size of 23,490 cases, the effective sample size will be large even if the design effect is substantial. For example, even if the design effect is 4, which represents substantially disproportionate sampling, the effective sample size will be 23,490/4 ≈ 5,873. An effective sample of this size can support estimates with a margin of error no greater than 1.3% at the 95% confidence level, which more than satisfies the requirement specified in the RFQ. EconSys and ICF will provide an accurate estimate of the design effect once the final sampling frame is available.
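As a sketch under stated assumptions (the stratum shares and sampling rates below are hypothetical, not the study's actual allocation), the design effect and effective sample size could be computed as follows:

```python
def design_effect(W, k):
    """Kish design effect due to disproportionate allocation:
    DEFF = (sum_h W_h * k_h) * (sum_h W_h / k_h),
    where W_h are population proportions and k_h inverse sampling rates."""
    return sum(w * kk for w, kk in zip(W, k)) * sum(w / kk for w, kk in zip(W, k))

# Hypothetical example: three strata holding 70%, 20%, and 10% of the
# population, sampled at rates 1/100, 1/20, and 1/5 (so k = 100, 20, 5).
deff = design_effect([0.7, 0.2, 0.1], [100, 20, 5])
n_eff = 23_490 / deff   # effective sample size for the full set of completes
print(round(deff, 2), round(n_eff))
```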
The initial sample size will be determined by taking into account the expected response rate. The following table shows the derivation of the initial sample size. Overall, we will start data collection with an initial sample of 46,980 veterans selected from CWINRS.
Determination of Sample Size Based on Strata and Expected Response Rate

| | Expected Response Rate | Expected Number of Completes per Stratum | Number of Strata | Total Initial Sample Size |
|---|---|---|---|---|
| Participants | 50% | 270 | 51 | 27,540 |
| Non-Participants | 50% | 270 | 36 | 19,440 |
| Total | | | 87 | 46,980 |
The sample will be selected independently within each stratum. Within each stratum, cases will be selected systematically. Prior to systematic sample selection, the stratum frame will be sorted by branch of service, gender, and age. Sorting by these variables creates an implicit stratification within the stratum, so that the stratum sample mirrors the stratum population (with respect to the sorting variables) more closely than can be achieved under simple random sampling. We will use the SAS procedure SURVEYSELECT to implement the systematic sampling routine. SAS automatically outputs selection probabilities and sampling weights, which will be used later for weighting adjustments.
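A minimal sketch of systematic selection within one stratum; the actual selection will use SAS PROC SURVEYSELECT, and the fractional-interval logic below is illustrative only:

```python
import random

def systematic_sample(frame, n):
    """Select n records from a stratum frame by systematic sampling.
    `frame` should already be sorted by the implicit stratification
    variables (branch of service, gender, age)."""
    N = len(frame)
    k = N / n                        # sampling interval (may be fractional)
    start = random.random() * k      # random start within the first interval
    return [frame[int(start + i * k)] for i in range(n)]

# Each record's selection probability is n/N, so the base weight for
# every selected case in the stratum is N/n.
```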
We will compute survey weights to support unbiased estimation from the respondent sample. The purpose of the weights is to reduce bias due to differential selection probabilities and nonresponse. To the extent possible, we will also try to improve the efficiency of the sample through poststratification weighting adjustments for frame undercoverage. We propose a three-step weighting process as follows.
Base Weight. We will start by computing a base weight for all sample members. The base weight is simply the inverse of the selection probability of each veteran under the sample design:

$$w_{1i} = \frac{1}{\pi_i},$$

where $\pi_i$ is the selection probability, which varies by stratum. The sum of the base weights is equal to the total number of cases on the frame.
Nonresponse Adjustment. The base weight will be adjusted for nonresponse through weighting class adjustments, where the weighting classes are defined as the sampling strata. The nonresponse-adjusted weight will be calculated as

$$w_{2i} = w_{1i} \times \frac{\sum_{j \in c} w_{1j}}{\sum_{j \in c} w_{1j}\,\delta_j},$$

where $c$ denotes the weighting class and $\delta_j$ is 1 for respondents and 0 otherwise. The summations are over all cases in each weighting class.
Poststratification Adjustment. Poststratification adjustment helps improve the efficiency of the sample by making the sample distribution conform to a known population distribution with respect to some important characteristics. For example, if the population distribution over branch of service is known, poststratification can ensure that the sample distribution conforms to the population distribution with respect to branch of service. In this case, each branch of service is considered a poststratum. Poststrata can also be defined by the cross of multiple variables, such as branch of service by gender. The poststratified weight will be calculated as

$$w_{3i} = w_{2i} \times \frac{N_g}{\sum_{j \in g} w_{2j}},$$

where $g$ denotes the poststratum and $N_g$ is the total population size of poststratum $g$. The poststratified weight $w_{3i}$ will be used as the final analysis weight.
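A compact sketch of this three-step weighting process; the column names are hypothetical, the production selection probabilities would come from the SAS selection output, and the sketch assumes every stratum has at least one respondent:

```python
import pandas as pd

def compute_weights(sample: pd.DataFrame, pop_counts: dict) -> pd.DataFrame:
    """Three-step weighting: base weight, nonresponse adjustment within
    sampling strata, and poststratification to known population counts.
    Expects columns: sel_prob, stratum, respondent (0/1), poststratum."""
    # Step 1: base weight = inverse of the selection probability
    sample["w1"] = 1.0 / sample["sel_prob"]
    # Step 2: weighting-class (stratum) nonresponse adjustment;
    # nonrespondents end up with zero weight
    total_w1 = sample.groupby("stratum")["w1"].transform("sum")
    resp_w1 = (sample["w1"] * sample["respondent"]).groupby(sample["stratum"]).transform("sum")
    sample["w2"] = sample["respondent"] * sample["w1"] * total_w1 / resp_w1
    # Step 3: poststratify respondent weights to known population totals
    post_w2 = sample.groupby("poststratum")["w2"].transform("sum")
    sample["w3"] = sample["w2"] * sample["poststratum"].map(pop_counts) / post_w2
    return sample
```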
Variance Estimation
EconSys and ICF will compute a standard error for each estimate derived from the sample and will include these in a table in the study's technical appendix. The standard error can be used to support hypothesis testing and the construction of confidence intervals around point estimates. Standard textbook formulas typically underestimate the standard error of estimates derived from complex designs because they assume simple random sampling. The proposed sample design features stratification and unequal selection probabilities across strata. Therefore, the final analysis weights will vary across strata (and possibly within strata as well). We propose to use SUDAAN to estimate the standard errors of point estimates. SUDAAN takes the complex design features into account when estimating standard errors.
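For intuition, here is a sketch of the textbook stratified-SRS variance that design-based packages such as SUDAAN generalize; the stratum inputs in the example are hypothetical:

```python
def stratified_se_proportion(strata):
    """Standard error of a proportion under stratified simple random sampling:
    Var(p_hat) = sum_h W_h^2 * (1 - f_h) * p_h * (1 - p_h) / (n_h - 1),
    where W_h = N_h / N and f_h = n_h / N_h.
    `strata` is a list of dicts with keys N_h, n_h, p_h."""
    N = sum(s["N_h"] for s in strata)
    var = sum(
        (s["N_h"] / N) ** 2
        * (1 - s["n_h"] / s["N_h"])
        * s["p_h"] * (1 - s["p_h"]) / (s["n_h"] - 1)
        for s in strata
    )
    return var ** 0.5

# Hypothetical two-stratum example
print(stratified_se_proportion([
    {"N_h": 4000, "n_h": 270, "p_h": 0.55},
    {"N_h": 1000, "n_h": 270, "p_h": 0.40},
]))
```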
Participants in this evaluation study will comprise three distinct groups. Veteran Participants will consist of veterans who completed or suspended their participation in the VR&E process. Similarly, Dependents of Veterans will consist of as many as 100 eligible dependents of veterans having direct experience with the VR&E program. Finally, Veteran Non-Participants will include those veterans deemed theoretically eligible for the VR&E program by VBA who did not apply (as of the date of the sampling frame) and those who applied and were found eligible but did not complete a rehabilitation plan. VR&E veteran participants and non-participants will complete a survey; adult dependents of veterans will participate in structured telephone interviews. We will seek survey completes from 23,490 disabled veterans (13,770 Veteran Participants and 9,720 Veteran Non-Participants). Separate questionnaire forms have been developed for each of these two veteran groups.
Survey topic areas for the non-participant questionnaire include the following categories:
Your Disability
VR&E Program Outreach
VR&E Program Participation (Including status)
Pre-Military Employment History
Current Employment
Allocation of Time
Physical and Mental Health
Social and Family Assistance
Personal (Non-Work) Skills and Abilities
Non-VA Services Obtained
Use of Other Non-VA Programs
The “New Post-9/11 GI Bill”
Non-VA Program Satisfaction
Financial Questions
Addendum A. Independent Living Questions
Disabled veterans comprising the veteran participants include (1) those who completed the VR&E program and are deemed rehabilitated, (2) those currently in training or whose training is temporarily interrupted, and (3) those who have dropped out of the program permanently or are otherwise determined to be ineligible. Survey topic areas for this population include the following:
About You (Demographics)
Your Disability
VR&E Program Outreach
VR&E Program Participation (Including status)
Pre-Military Employment History
Current Employment
Allocation of Time
Physical and Mental Health
Social and Family Assistance
Personal (Non-Work) Skills and Abilities
VR&E Services Obtained
Use of Other Non-VA Programs
The “New Post-9/11 GI Bill”
VR&E Program Satisfaction
Financial Questions
Addendum A. Independent Living Questions
Copies of both questionnaires and the interview protocol for dependents are attached to this Supporting Statement.
The veteran participant and veteran non-participant questionnaires will be developed for mail administration with an option for completion by web. These two modes for survey completion will be available to all veteran respondents. Sample members will be mailed a notification letter followed one week later by a paper survey, which respondents may fill out and return via a postage-paid Business Reply Envelope (BRE). Respondents will also be given the option to complete the survey via web administration. The survey cover letter will give respondents instructions for completing the survey in either mode and will contain a unique password for accessing the survey URL.
The EconSys and ICF team, in collaboration with the VA Office of Policy and Planning, will finalize the two questionnaires that will be mailed to the 46,980 sampled veterans. We will distribute the mailed surveys using a modified version of the Total Design Method (Dillman, 1981) to maximize survey response rates. Distribution of the two questionnaire types to veterans in the sample will be as follows:
Mailing 1 – A pre-notification letter.
Mailing 2 – A survey package (i.e., cover letter, survey, business reply envelope, and web survey instructions).
Mailing 3 – A reminder postcard package.
Mailing 4 – A second wave survey package mailed to non-respondents following the reminder postcard (i.e., second wave cover letter, survey, business reply envelope, and web survey instructions).
Mailing 5 – A final reminder postcard mailed after the second wave survey package.
The basic elements and procedures of the Total Design Method are to:
Minimize the burden on the respondent by designing questionnaires that are attractive in appearance and easy to complete; printing mail questionnaires in booklet format; placing personal questions at the end; creating a vertical flow of questions; and creating sections of questions based on their content.
Personalize communication with the respondent by printing letters and envelopes individually, using signatures, using first-class postage on outgoing and return envelopes; and constructing a persuasive letter.
Provide information about the survey in a cover letter to respondents, and use a pre-notification letter to inform respondents that a survey is forthcoming.
Use regular follow-up contacts.
The EconSys and ICF team will use the five-step mailing process with follow-up to administer both questionnaires to the two veteran groups to maximize overall response rates; the use of a reminder postcard for survey follow-up tends to increase response rates by 5 to 8 percentage points, and the use of both reminder postcards and a second survey mailing almost doubles the response rate. The improved response rate and reliability of the data more than offset the increased cost of this survey administration process. All survey materials will be mailed via first-class mail to each veteran in the participant and non-participant samples. The postage-paid envelope (BRE) for survey return will include the return address for the EconSys and ICF team. Copies of the mailing materials are attached to this Supporting Statement.
In addition to the use of notification letters, duplicate survey mailings, reminder letters and postcards in the Total Design Method, we will employ additional strategies to boost response rates such as:
Use of novelty in correspondence such as reminder postcards designed in eye-catching colors.
Use of an extended survey field period to afford opportunities to respond for subgroups with a propensity to respond late (such as males, younger veterans, and the full-time employed).
Use of well-designed questionnaires and the promise of confidentiality.
Use of a contact name and telephone number for inquiries.
Finally, it is worth noting that the VR&E surveys are likely to have differing degrees of salience – an important factor in inducing survey completion (and thus, response rates) – for respondents depending on such factors as age and history of interaction with C&P, VR&E, and related VBA services and benefits. Despite these challenges, the EconSys and ICF team remains confident in our ability to obtain valid and reliable data from which to answer the research questions in this evaluation.
We will assume that by the end of Mailing #3, the response rate for each group will be approximately 30%. The quantities for Mailing #4 and Mailing #5 will be adjusted accordingly. The specifications of the mailings are presented below.
Mail-Out Quantity for Participant and Non-Participant Surveys

| Mailing Steps | Participant Survey Quantity (Sample Size = 27,540) | Non-Participant Survey Quantity (Sample Size = 19,440) |
|---|---|---|
| Mailing #1: Pre-notification letter | 27,540 | 19,440 |
| Mailing #2: Notification cover letter with URL and password, survey, and BRE | 27,540 | 19,440 |
| Mailing #3: First reminder card with URL and password | 27,540 | 19,440 |
| Mailing #4: Second survey with cover letter and BRE | 19,278 | 13,608 |
| Mailing #5: Second reminder card | 19,278 | 13,608 |
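A small sketch of how these quantities follow from the design: the initial sample equals the target completes divided by the expected response rate, and Mailings #4 and #5 go only to the assumed 70% who have not responded by the end of Mailing #3:

```python
def mailing_quantities(n_strata, completes_per_stratum=270,
                       response_rate=0.50, responded_by_wave3=0.30):
    """Initial sample size and per-wave mail-out counts for one veteran group."""
    initial = int(n_strata * completes_per_stratum / response_rate)
    wave45 = int(initial * (1 - responded_by_wave3))
    return {"mailings_1_to_3": initial, "mailings_4_and_5": wave45}

print(mailing_quantities(51))   # participants: 27,540 and 19,278
print(mailing_quantities(36))   # non-participants: 19,440 and 13,608
```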
Data collection (i.e., the receipt and logging of surveys for analysis) will cease two weeks after the final reminder postcard. EconSys and ICF will closely monitor the number of surveys completed and returned via postal mail or completed via the web, tracking response rates on a weekly basis.
The paper surveys will be scanned using Optical Mark Recognition (OMR) technology. The data from web respondents will be merged with the data from the paper surveys to create one consolidated data file.
Non-response bias refers to the error expected in estimating a population characteristic based on a sample of survey data that under-represents certain types of respondents. Stated more technically, non-response bias is the difference between a survey estimate and the actual population value. The non-response bias associated with an estimate consists of two components: the amount of non-response and the difference in the estimate between respondents and non-respondents. While high response rates are always desirable in surveys, they do not guarantee low non-response bias in cases where the respondents and non-respondents are very different. Moreover, low response rates magnify the effect of any difference between respondents and non-respondents that contributes to the bias. Given the increasing use of survey data to inform assessments and performance indicators, it is crucial that we know who completes surveys.
Two types of non-response can affect the interpretation and generalizability of survey data: item non-response and unit non-response. Item non-response occurs when one or more survey items are left blank in an otherwise completed, returned questionnaire. Unit non-response is non-participation by an individual that was intended to be included in the survey sample. Unit non-response—the failure to return a questionnaire—is what is generally recognized as survey non-response bias.
Non-response follow-up (NRFU) analyses can help identify potential sources of bias and can help reassure data users, as well as the agency collecting and releasing the data, of the quality of the data collected. One approach is to conduct a follow-up survey by telephone of a sample of non-respondents to assess differential responses to key survey items. Another approach is to conduct record linkage—using demographic variables from the mailing address file to analyze whether non-respondents differ demographically from respondents. ICF will use these proven methods to examine non-response bias in the VA VR&E Survey if warranted at the conclusion of the data collection period.
Because it is not always possible to measure the actual bias due to unit non-response, we will employ strategies for reducing non-response bias by maximizing response rates across all types of respondents. In the face of a long-standing trend of declining response rates in survey research (Steeh, 1981; Smith, 1995; Bradburn, 1992; De Leeuw & Heer, 2002; Curtin & Presser, 2005), these strategies include:
Use of notification letters, duplicate survey mailings, reminder letters and postcards.
Use of novelty in correspondence such as reminder postcards designed in eye-catching colors.
Use of an extended survey field period to afford opportunities to respond for subgroups with a propensity to respond late (such as males, younger veterans, and the full-time employed).
Use of well-designed questionnaires and the promise of confidentiality.
Providing a contact name and telephone number for inquiries.
Applying these strategies to the administration of the VA Vocational Rehabilitation and Employment Program Survey will be crucial for achieving high response rates across all respondent types. Additionally, the survey is likely to have differing degrees of salience—an important factor in inducing survey completion—for respondents depending on age and history of interaction with VA-supplied services and benefits. Despite these challenges, ICF remains confident in our ability to obtain valid and reliable data from which to answer the research questions in this evaluation.
Non-response follow-up analyses can help identify potential sources of bias and can help reassure data users, as well as the agency collecting and releasing the data, of the quality of the data collected. The Study Team's approach to examining the presence of non-response bias in this survey will proceed in three steps:
Step 1 – Compare the Demographics of VR&E Evaluation Survey Respondents to the Demographics of Respondents to the 2001 National Survey of Veterans (NSV) and the 2007 Veterans Employability Research Survey. If the results from the 2008 National Survey of Veterans become available during this project, the Study Team will compare the demographics to that survey as well. One initial way to examine whether there is a non-response bias issue with the VR&E Evaluation Survey is to compare the demographics of the respondents from that survey to the demographics of the respondents from the 2001 NSV. Since the NSV captured data from a representative sample of veterans, the demographics of its respondents should look similar to the demographics of the VR&E Evaluation Survey respondents. For this analysis, we will draw comparisons on demographics including, but not limited to, age, gender, marital status, and education. This first step will provide an indication of where potential non-response bias may exist (if at all). We will follow a similar procedure using the 2007 Veterans Employability Research Survey.
Step 2 – Compare the Demographics of Respondents from the VR&E Evaluation Survey to the Demographics of Non-Respondents from the VR&E Evaluation Survey. To further examine the presence of non-response bias, we will compare the demographics of responders to those of non-responders (i.e., those who did not respond to the VR&E Evaluation Survey). The comparison between these two groups will be made on the following five variables:
War Period – It is possible that respondents may be older or younger than non-respondents. For example, veterans from earlier war periods (such as WWII) may respond at a higher rate than veterans from later war periods (such as the Persian Gulf War). As a result, we will examine non-response bias as it relates to war period. The data source for war period will be the CWINRS database.
Gender – It is possible that participants of one gender (such as males) may respond at a higher rate than the other. As a result, we will examine non-response bias as it relates to gender. The data source for gender will be the CWINRS database.
Region – It is possible that participants from a certain part of the country (i.e., region) may respond to the survey at a higher rate than those from another part of the country. As a result, we will examine non-response bias as it relates to region. The data source for region will be CWINRS, as updated during the course of the study.
Urban/Rural/Suburban – It is possible that participants from urban areas may respond at a higher or lower rate than participants from rural or suburban areas. As a result, we will examine non-response bias as it relates to urban vs. rural vs. suburban location. The data source for the urban/rural/suburban status of both respondents and non-respondents will be their addresses provided by ChoicePoint (cross-referenced with Census databases to classify urban/rural/suburban status).
Income – It is possible that participants from a certain income bracket (such as low-income earners) may respond to the survey at a higher rate than participants from other income brackets. As a result, we will examine non-response bias as it relates to income. The data source for the income of respondents will be their answer to the survey question about family income (note: respondents who skip this question will have their family income imputed using the median family income in the associated ZIP code via Census data); the data source for the family income of non-respondents will be the median family income for their associated ZIP code via Census data.
Step 3 – Compare the Types and Proportions of Disabilities of Respondents from the VR&E Evaluation Survey to Those of Non-Respondents, using information in the CWINRS database from which the samples will be drawn.
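A minimal sketch of one such respondent/non-respondent comparison; the counts below are hypothetical placeholders for the frame tallies:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical respondent vs. non-respondent counts by war period
counts = np.array([
    #  respondents, non-respondents
    [410, 190],   # earlier war periods
    [560, 440],   # Vietnam era
    [630, 770],   # Persian Gulf era and later
])
chi2, p_value, dof, expected = chi2_contingency(counts)
# A small p-value would indicate the respondent mix differs by war period,
# flagging potential non-response bias on this variable.
print(round(chi2, 1), round(p_value, 4))
```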
Representatives of EconSys and ICF International held two VA VR&E survey pretest sessions on March 27, 2009, at the Roanoke VA Regional Office. Candidates for the pretest were identified by the VR&E staff at Roanoke based on their status as participants or non-participants in VR&E. VA staff contacted the veterans and asked if they would be willing to pretest the instruments. Two groups of veterans agreed to participate. Session I was intended as a pretest of the VA VR&E Participant Survey; Session II was intended as a pretest for the Non-participant version of the Survey. The Participant Questionnaire was pre-tested with nine veteran volunteers; the Non-Participant Questionnaire was pre-tested with two veteran volunteers.
Each session was introduced with a PowerPoint presentation explaining the purpose of the survey and the purpose of the pretest. The volunteers were asked to review and sign a consent form indicating their willingness to participate in the pretest. They were then asked to review and comment on the set of mailings (letters and postcards) that would precede and accompany the questionnaire. Following this step, the volunteers were asked to read through the questionnaire and attempt to answer each question (excluding any questions they did not wish to answer), marking any questions for which they noted problems, such as missing response categories or unclear intent. The volunteers were also asked to note their start and stop times for completing the questionnaire. When the volunteers completed their surveys, they participated in a question-by-question discussion of any problems with the instrument. The volunteers were asked to raise any topics they believed were not sufficiently addressed in each questionnaire.
The instrument for the survey of contract counselors was pretested by experienced counselors. A project consultant who is an expert in vocational rehabilitation counseling identified potential participants for the pretest of the VR&E Contract Counselor Survey. The pretest participants were all experienced counselors, most with doctoral degrees. Once the participants were identified, the team sent each of them an email that included the survey as an attached Microsoft Word document along with a short list of instructions. The instructions asked each participant to take the survey and record the amount of time it took from start to finish. After recording the time, the participants were asked to review the survey questions and give the study team feedback, including comments on which questions were confusing, offered too few answer options, or required grammatical changes. Once the participants had reviewed the survey and added comments to the Word document, they emailed the results to the study team along with the time it took to complete the survey and any general comments they wished to add. Once all surveys were received, the study team calculated the average completion time, which was 24 minutes. The study team also consolidated all of the participants' comments into one document and revised the survey instrument according to the comments received. Subsequent discussions with VA staff concerning the role of contract counselors resulted in the elimination of one third of the pretest questions; the estimated completion time for the instrument was therefore reduced to 20 minutes to reflect the smaller number of questions.
Dr. George Kettner, EconSys, 703-333-2190
Mr. Ray Wilburn, EconSys, 703-738-0535
Mr. Ali Sayer, EconSys, 703-333-2193
Dr. Christopher Spera, ICF International, 703-934-3446
Dr. Michael Yang, ICF International, 703-934-3320
Dr. Ronald Szoc, ICF International, 703-934-3456
Dr. Diane Boyd, ICF International, 703-934-3721
1 This formula is taken from Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys, The American Association for Public Opinion Research (AAPOR), 2006, Lenexa, KS.
2Expectmore.gov. (2008). Detailed information on the Vocational Rehabilitation and Employment Program assessment. Retrieved July 2, 2008, from http://www.whitehouse.gov/omb/expectmore/detail/10003220.2006.html
3 Only two rating levels are needed here because 85% of disabled veterans in independent living have SCD ratings of 70% to 100%.
4 The VR&E Program has five tracks, including Independent Living Services (IL). Because we propose to sample the IL participants separately, the non-IL participants span only four tracks. Track indicators were not used prior to 2006; therefore, veterans receiving employment services in the 1992, 1997, and 2002 cohorts will be identified in one stratum for each cohort.