SUPPORTING STATEMENT – PART B
B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
Description of the Activity
The survey is administered to a stratified random sample of providers over a period of four years. Each year, the contractor responsible for sampling, analysis, and reporting assembles the sampling frame and draws the sample, then provides the sample with contact information to another contractor, which administers the survey by mail and by telephone. The responses are returned to the analysis contractor, which edits them, removes duplicate responses, and adds them to responses from previous years. That contractor calculates non-response adjusted sampling weights for the full set of respondents and attaches them to an analytic file containing responses and selected provider characteristics. For reporting purposes, geographic units are defined based on their relation to TRICARE during the reference period of the most recent survey year. The contractor then analyzes the results and prepares briefings and reports as directed by Congress. Details of this activity are described below.
The TSS Provider Surveys are conducted over four years, starting in 2012. After four years, all of the U.S. will have been surveyed. Congress directed DHA to survey each year 20 markets in which Prime is offered and 20 in which Prime is not offered, and to incorporate recommendations from beneficiary and provider representatives in the choice of survey sites. To fulfill this mandate, strata were constructed identifying all of the zip codes in each state that were designated as Prime Service Area (PSA) and non-PSA. These designations were based on the best information available in August 2012.
The population of each of the resulting state-level markets was measured as the total number of Standard-eligible beneficiaries, plus Selected Reserve. States with small non-PSA or PSA populations were combined with nearby states to avoid creating strata with small populations. The result was 43 PSA strata and 37 non-PSA strata. Samples of 20 PSA and 20 non-PSA strata for each of the four years were selected during the first year of the survey. Sampling was conducted with probability proportional to size, so that markets with large populations, using the population definition described above, may be surveyed in up to all four years. Because the PSA and non-PSA regions were formed based on the number of beneficiaries and not the number of providers, some regions with large numbers of providers were sampled at relatively low rates.
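The probability-proportional-to-size selection of strata can be illustrated with the following sketch. It is not the sampling contractor's code; the stratum names, beneficiary counts, and the use of systematic PPS selection on cumulated sizes are assumptions made for illustration only.

```python
# Illustrative sketch: selecting strata with probability proportional to size (PPS)
# via systematic selection on cumulated sizes. All names and counts are hypothetical.
import random

def pps_systematic_sample(strata, n_draws, seed=None):
    """Select n_draws strata with probability proportional to size.

    strata: list of (name, size) tuples, e.g., beneficiary counts per stratum.
    Large strata can be selected more than once (i.e., in more than one year).
    """
    rng = random.Random(seed)
    total = sum(size for _, size in strata)
    interval = total / n_draws
    start = rng.uniform(0, interval)
    hits = [start + k * interval for k in range(n_draws)]
    selections, cumulative, i = [], 0.0, 0
    for name, size in strata:
        cumulative += size
        while i < len(hits) and hits[i] < cumulative:
            selections.append(name)
            i += 1
    return selections

# Hypothetical PSA strata and beneficiary counts.
psa_strata = [("TX PSA", 120000), ("CA PSA", 90000), ("VA PSA", 45000), ("OH PSA", 30000)]
print(pps_systematic_sample(psa_strata, n_draws=2, seed=1))
```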
To fulfill the requirement that they be consulted regarding survey locations, provider and beneficiary representatives are presented with the list of states scheduled for survey in each year. From the list of areas to be surveyed in the current year, they suggest cities and towns where access should be measured, and the Health Service Areas (HSAs)1 corresponding to these cities and towns are identified. A list is created based on those recommendations and sorted in priority order. Each HSA is allocated a sample of a fixed number of providers or all providers in the HSA, whichever is smaller. HSAs are included in priority order until the remaining sample is allocated. Those HSAs not sampled in one year are considered for a future year’s list.
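The allocation rule described above can be sketched as follows. The HSA names, per-HSA quota, total oversample, and the decision to cap the last HSA at the remaining sample are assumptions for illustration, not the contractor's actual allocation procedure.

```python
# Illustrative sketch of allocating an HSA oversample in priority order. Each HSA
# receives the smaller of a fixed quota or its full provider count, until the
# reserved oversample is exhausted (capping the last HSA is an assumption).
def allocate_hsa_sample(hsas_in_priority_order, per_hsa_quota, total_oversample):
    """hsas_in_priority_order: list of (hsa_name, provider_count)."""
    allocations, remaining = {}, total_oversample
    for name, provider_count in hsas_in_priority_order:
        if remaining <= 0:
            break  # HSAs left over are considered for a future year's list
        allocation = min(per_hsa_quota, provider_count, remaining)
        allocations[name] = allocation
        remaining -= allocation
    return allocations

print(allocate_hsa_sample([("HSA A", 500), ("HSA B", 80), ("HSA C", 300)],
                          per_hsa_quota=150, total_oversample=300))
# -> {'HSA A': 150, 'HSA B': 80, 'HSA C': 70}
```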
Prior to 2012, Prime was offered throughout the South region. Therefore, that region was composed entirely of Prime Service Areas. At the time that sampling strata were developed for this survey, it was expected that the South would be divided into PSA and non-PSA regions. However, during the reference periods of the surveys fielded in 2012 and 2013, that division had not yet taken effect and Prime was still offered in the planned non-PSA regions. On October 1, 2013, new MCSC contracts went into effect, dividing the South into PSA and non-PSA regions and reducing the size of regions in the North and West where Prime is offered. However, the new non-PSA regions do not correspond exactly to those planned when the sampling strata were developed. Because of these changes in the organization of TRICARE and the resulting divergences between the characteristics of strata at the time of sampling and the time of reporting and analysis, the geographic units used for reporting differ from those used for sampling. More detail is provided following the description of sampling procedures below.
The TSS provider survey is required to include both physicians and non-physician mental health providers. Therefore, two separate questionnaires with common questions are fielded: the first for physicians and the second for non-physician mental health providers. The provider list is assembled from seven sources:
American Medical Association (AMA) master data file for physicians (Physicians-MDs and Doctors of Osteopathy-DOs) including Psychiatrists
State licensing records, through LISTS, Inc., for non-physician mental health providers
American Association of Marriage & Family Therapists through INFOCUS Marketing
National Association of Social Workers through INFOCUS Marketing
American Association of Pastoral Counselors
American Psychiatric Nurses Association
NPI: the National Plan and Provider Enumeration System (NPPES) from the Centers for Medicare and Medicaid Services (CMS), for psychiatric nurses from all PSAs and HSAs and mental health professionals (non-physicians) from Hawaii (see http://nppesdata.cms.hhs.gov/cms_NPI_files.html)
Physicians were selected from the Physician Master File if their type of practice is office-based or unclassified patient care, with a principal employer other than the Federal Government.
The following physician specialties are excluded:
Aerospace
Allergy and Immunology/Clinical Laboratory
Anatomic/Clinical Pathology
Anesthesiology
Blood Banking/Transfusion Medicine
Chemical Pathology
Clinical Biochemical Genetics
Clinical Pharmacology
Cytopathology
Epidemiology
Forensic Pathology
Forensic Psychiatry
Hematology/Pathology
Medical Management
Medical Microbiology
Medical Toxicology
Neuropathology
Pathology
Pediatric Anesthesiology
Public Health and General Preventive Medicine
Undersea Medicine
Mental Health providers are selected from sources above as follows:
Social Workers: from National Association of Social Workers through INFOCUS Marketing and NPI
Psychiatric Nurses: from the American Psychiatric Nurses Association and NPI
Psychologists: from LISTS and NPI
Marriage and Family Therapists: from the American Association of Marriage & Family Therapists through INFOCUS Marketing and NPI
Pastoral Counselors: from the American Association of Pastoral Counselors and NPI
Mental Health Counselors: from LISTS and NPI
Psychiatrists: from the AMA file
Because the Mental Health sample is derived from seven different data sources, the sources must be merged to remove duplicates and to obtain the best contact information for each provider. Contact information from the NPI is given first priority because the NPI contains office (i.e., work) addresses and phone numbers. That contact information is supplemented with contact information from the various other data sources.
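The merge-and-deduplicate step might look like the sketch below. The field names, source labels, priority ordering of the non-NPI sources, and the matching key are assumptions for illustration; only the rule that NPI contact information is preferred comes from the description above.

```python
# Illustrative sketch (assumed field names): merging provider records from multiple
# source lists, keeping one record per provider and preferring NPI contact information.
SOURCE_PRIORITY = {"NPI": 0, "AMA": 1, "LISTS": 2, "NASW": 3, "AAMFT": 4, "AAPC": 5, "APNA": 6}

def merge_provider_lists(records):
    """records: iterable of dicts with 'provider_key', 'source', 'address', 'phone'.

    'provider_key' stands in for whatever identifier (e.g., NPI number or a
    name/address match key) the deduplication actually uses.
    """
    merged = {}
    # Process higher-priority sources first so their contact data wins.
    for rec in sorted(records, key=lambda r: SOURCE_PRIORITY.get(r["source"], 99)):
        existing = merged.setdefault(rec["provider_key"], dict(rec))
        # Fill in contact fields still missing, using lower-priority sources.
        for field in ("address", "phone"):
            if not existing.get(field) and rec.get(field):
                existing[field] = rec[field]
    return list(merged.values())
```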
In each year, providers in the randomly selected PSAs and non-PSAs (including the designated HSAs if they are contained in those areas) contribute to national and regional estimates, though with a lower sampling weight as the total number sampled in a particular area increases. Stratified samples of providers are selected from the frame within the randomly selected PSAs and non-PSAs and the purposively selected HSAs. The TSS Provider Survey sample is stratified by
1. Prime Service Area (PSA),
2. non-Prime Service Area (non-PSA),
3. HSA, and
4. type of provider: physician or mental health.
Strata are a combination of type of area (PSA, non-PSA, or HSA) and type of provider. Within each physician stratum, we implicitly stratify the providers by primary care provider or specialist and select a systematic sample. Within each mental health stratum, we implicitly stratify by type of provider (social worker, psychiatric nurse, psychologist, marriage and family therapist, pastoral counselor, and mental health counselor) and select a systematic sample. Implicit stratification results in a sample that is proportional to the distribution of the types of providers. Psychiatrists are included within the mental health sample.
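The implicit stratification described above amounts to sorting the stratum frame by provider type before drawing a systematic sample, which makes the sample approximately proportional to the type mix. The sketch below is illustrative only; the field names and random-start rule are assumptions.

```python
# Illustrative sketch of implicit stratification via a systematic sample drawn from a
# frame sorted by provider type. Field names are hypothetical.
import random

def systematic_sample(frame, n, sort_key, seed=None):
    """frame: list of provider records (dicts); sort_key: field used for implicit strata."""
    rng = random.Random(seed)
    ordered = sorted(frame, key=lambda rec: rec[sort_key])
    interval = len(ordered) / n
    start = rng.uniform(0, interval)
    # Take every 'interval'-th record starting from a random start point.
    return [ordered[int(start + k * interval)] for k in range(n)]
```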
The survey is fielded to the randomly sampled providers by mail and telephone. Procedures for collecting information are described in “Procedures for Collection of Information” below.
Weighting
The analysis of survey data from complex sample designs, such as this one, requires application of weights to accomplish the following:
Compensate for variable probabilities of selection
Adjust for differential response rates
Improve the precision of the survey-based estimates through poststratification [for details, see Brick and Kalton (1996) and references cited therein]
Trim extreme weights to limit their effect on the variance of estimates
Sampling weights are equal to the reciprocal of the probability of each respondent’s selection into the sample. Survey sampling weights account both for the probability of selection in the first stage, i.e., that a geographic area is sampled, and in the second stage, i.e., that within an area, an individual provider is sampled. The HSAs selected for oversampling are parts of areas randomly selected in the first stage and therefore have the same first-stage probability of selection. However, they are oversampled in the second stage, and responses from those areas are thus assigned smaller sampling weights than others in their strata. A propensity model estimates the probability of response as a function of an array of provider characteristics from the sample frame. Sampling weights are further adjusted for nonresponse within classes formed based on the percentiles of propensity scores from the propensity model.
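A minimal sketch of these two steps, base weights and a weighting-class nonresponse adjustment built from propensity-score percentiles, is shown below. The column names, the number of classes, and the use of pandas are assumptions; this is not the analysis contractor's code.

```python
# Illustrative sketch of base weighting and nonresponse adjustment (assumed column names).
import pandas as pd

def base_weights(df):
    """df needs 'p_area' (first-stage selection prob.) and 'p_provider' (second-stage prob.)."""
    return 1.0 / (df["p_area"] * df["p_provider"])

def nonresponse_adjust(df, n_classes=10):
    """df needs 'base_wt', 'propensity' (modeled response probability), 'responded' (0/1)."""
    df = df.copy()
    # Form weighting classes from percentiles of the modeled response propensity.
    df["nr_class"] = pd.qcut(df["propensity"], n_classes, labels=False, duplicates="drop")
    adjusted = []
    for _, cell in df.groupby("nr_class"):
        # Inflate respondents' weights so the cell's weighted total is preserved.
        factor = cell["base_wt"].sum() / cell.loc[cell["responded"] == 1, "base_wt"].sum()
        cell = cell.copy()
        cell["nr_wt"] = cell["base_wt"] * factor * cell["responded"]
        adjusted.append(cell)
    return pd.concat(adjusted)
```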
Finally, the nonresponse-adjusted weights are poststratified to the frame totals so that weighted totals for specified domains equal the population totals, and some extreme weights are trimmed to reduce their excessive effect on variance inflation. Poststratification is performed based on reporting units that take into account changes in TRICARE’s administrative structure, as described below.
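The poststratification and trimming steps could be sketched as follows, continuing the previous illustration. The ratio adjustment to frame totals reflects the description above, but the trimming cutoff (a multiple of the median weight) is a hypothetical rule, not the contractor's actual trimming method.

```python
# Illustrative sketch of poststratification to frame totals and weight trimming.
# df is assumed to be a pandas DataFrame carrying the nonresponse-adjusted weight 'nr_wt'.
def poststratify(df, frame_totals, cell_col="reporting_unit", wt_col="nr_wt"):
    """frame_totals: dict mapping each reporting unit to its frame (population) count."""
    df = df.copy()
    cell_sums = df.groupby(cell_col)[wt_col].transform("sum")
    # Ratio-adjust so weighted totals equal frame totals within each reporting unit.
    df["ps_wt"] = df[wt_col] * df[cell_col].map(frame_totals) / cell_sums
    return df

def trim_weights(df, wt_col="ps_wt", cap_multiple=4.0):
    df = df.copy()
    cap = cap_multiple * df[wt_col].median()  # hypothetical trimming cutoff
    df["final_wt"] = df[wt_col].clip(upper=cap)
    return df
```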
Because of the complex sample design and the divergence between sampling and reporting units, weights are applied and the precision of estimates is measured using replicate weights. To construct replicate weights, the entire file of sampled cases is first sorted in sample-selection order, using the stratification variables in the sorting process. Next, 50 mutually exclusive and exhaustive systematic subsamples of the full sample are identified in the sorted file. A jackknife replicate is then obtained by dropping one subsample from the full sample; as each subsample is dropped in turn, 50 jackknife replicates are produced. The weighting process applied to the full sample is then applied separately to each of the jackknife replicates to produce a set of replicate weights for each record, except that the propensity-score modeling is not repeated; instead, the weighting cells formed from the full-sample propensity scores are reused in constructing the replicate weights. The resulting series of jackknife replicate weights is attached to the final data in order to support jackknife replication variance estimation.
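The structure of the delete-one-group jackknife can be illustrated with the simplified sketch below. Group assignment and reweighting are deliberately simplified (the actual procedure repeats the full weighting adjustments within each replicate); the column names and the simple scale-up factor are assumptions.

```python
# Illustrative sketch of delete-one-group jackknife replicate weights with 50 groups.
# df is assumed to be a pandas DataFrame already sorted in sample-selection order.
def jackknife_replicate_weights(df, wt_col="final_wt", n_reps=50):
    df = df.copy()
    # Assign systematic subsamples by cycling through the sorted file.
    df["jk_group"] = [i % n_reps for i in range(len(df))]
    for r in range(n_reps):
        in_rep = df["jk_group"] != r
        # Records in the dropped group get weight 0; the rest are scaled up so the
        # replicate roughly reproduces the full-sample weighted total.
        df[f"rep_wt_{r}"] = df[wt_col] * in_rep * (n_reps / (n_reps - 1))
    return df
```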
Reporting and Analysis
Because Congress directed that the survey be performed over a four-year period, and because of changes in the planned and actual locations in which Prime is offered, the strategy for weighting, reporting, and analysis must accommodate changes in stratum definitions. Therefore, in order to fulfill the requirement that the TSS include 20 regions each year in which Prime is not offered, non-PSA strata are divided into smaller geographically contiguous regions. These regions are designated reporting strata. Similarly, for the 2012 survey, the areas of the South where it was intended that Prime would not be offered in the future were reported separately from other PSA regions and designated reporting strata. In 2012, a total of 20 non-PSA and 27 PSA reporting strata were created. In 2013, reporting stratum definitions were revised so that 40 PSA and 40 non-PSA strata were constructed out of the sites surveyed in 2012 and 2013.
Following the inception of new MCSC contracts in October 2013, the areas in which the Prime benefit is offered were reduced, resulting in changes cutting across both the sampling strata and the reporting strata defined in 2013. Therefore, for analysis and reporting in 2014 and 2015, the U.S. will be divided into PSA, continuous non-PSA, and transitional non-PSA regions for reporting purposes, comprising 60 PSA and 60 non-PSA regions for 2014 and 80 of each for 2015.
Results are presented showing geographic variation, variation among specialties, variation between market types, and comparisons to benchmarks established by previous surveys. In order to calculate rates and test hypotheses that rates differ between analytic categories or from a benchmark, weighted rates are calculated using the non-response adjusted sampling weights described above, and the variance of estimates is calculated using jackknife replicate weights. Calculation of variance in the TSS requires a design-based variance estimation technique, which is available in most statistical software packages for the analysis of complex survey data. For reports and briefings prepared by the analysis contractor, SUDAAN® is used.
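For a weighted rate, the replicate-based variance takes the standard delete-a-group jackknife form, sketched below using the replicate-weight columns from the previous illustration. This mirrors the type of estimator implemented in packages such as SUDAAN, though the exact formula used in production depends on the design specification supplied to the software.

```python
# Illustrative sketch: a weighted rate and its jackknife variance from 50 replicate weights.
# df is assumed to be a pandas DataFrame with 'final_wt' and 'rep_wt_0' ... 'rep_wt_49'.
def weighted_rate(df, y_col, wt_col):
    return (df[y_col] * df[wt_col]).sum() / df[wt_col].sum()

def jackknife_variance(df, y_col, full_wt="final_wt", n_reps=50):
    theta_full = weighted_rate(df, y_col, full_wt)
    rep_estimates = [weighted_rate(df, y_col, f"rep_wt_{r}") for r in range(n_reps)]
    # Delete-a-group jackknife: v = ((G - 1) / G) * sum over replicates of (theta_r - theta)^2
    return (n_reps - 1) / n_reps * sum((t - theta_full) ** 2 for t in rep_estimates)
```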
2. Procedures for the Collection of Information
As described above, contact information for sampled providers is transmitted to the vendor responsible for fielding the survey. Each sample member is assigned an internally generated ID number, and only that ID is used when the survey is fielded. Responses are recorded, the response data are incorporated into the analysis file using the internally generated ID, and reports are prepared.
A multi-mode data collection method is used: a mailed survey with an internet option and a telephone follow-up survey. An initial mailed survey is sent to members of the target population within specified geographic areas, with a follow-up mail survey sent within a defined period after the first. The initial and follow-up surveys include a cover letter signed by a senior DHA director requesting the recipient’s participation and requesting a response by return mail, internet, or facsimile, as well as providing a toll-free number to call with any questions and a web address for taking the survey via the internet (see Attachment 6 for the mail instruments). If providers’ responses to the mailing are not obtained, their offices are contacted by telephone. The telephone survey uses a standardized Computer Assisted Telephone Interview (CATI) protocol.
Mailed surveys are sent to the provider’s stated work address only, and not the residence, to the extent the work address is different from the home address and can be discerned. Similarly, telephone follow-up is made to the work telephone number, to the extent it is different from the home telephone number and can be discerned. These surveys are designed to be answered by the billing manager or person responsible for the provider’s billing practices, to minimize the burden on the provider’s practice and to obtain data the billing expert may be most knowledgeable about. If a recipient receives multiple surveys for multiple providers in the same office or practice group, the recipient is asked to complete a separate mail survey or answer a separate scripted telephone survey for each provider.
The survey operations contractor administers the telephone survey. The vendor uses standard telephone survey research methodology in administering the telephone questionnaires to include documentation of interviewer training, valid retrievable call records, and a log of interview sessions. A computerized telephone matching service (if needed) and Directory Assistance are used to track current telephone numbers. To optimize the chances of locating respondents and enlisting cooperation, calls are made at different times of the day, on different days of the week, but calls are made only during normal business hours. Calls are not made during weekend or evening hours.
The survey is fielded only to providers with specialties reimbursed by TRICARE, and only to providers who offer care in an office-based practice. Information from the frame is not always sufficient to determine eligibility. Therefore, procedures for determining eligibility are incorporated in fielding and subsequent data processing methods, as described below.
TRICARE reimburses mental health providers of the following types:
Psychiatrists (or other physicians)
Clinical psychologists
Certified psychiatric nurse specialists
Clinical social workers
Certified marriage and family therapists
Pastoral counselors
Mental health counselors
For purposes of this survey, we elected to survey only those who may choose whether to accept TRICARE patients. Therefore, the first three questions of the mental health provider survey instrument (see the mental health and physician questionnaires) attempt to screen out providers who are not reimbursable by TRICARE or who are not able to choose.
A respondent is counted as part of the final sample if they are eligible for the survey and respond to the questionnaire.
A respondent is not considered a valid respondent to the survey if they did not answer any of the following questions: PROVIDE (Question 1), AWARE (Question 2 for physicians, Question 3 for mental health providers), and ACCEPT (Question 4 for physicians, Question 6 for mental health providers). If a respondent answers any one of these questions, they are considered a valid respondent.
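This completeness rule reduces to a simple check, sketched below; the item names follow the description above, while the representation of responses as a dictionary and the treatment of blank answers are assumptions.

```python
# Illustrative sketch of the validity rule: a case counts as a valid respondent only if
# at least one of the key items (PROVIDE, AWARE, ACCEPT) was answered.
KEY_ITEMS = ("PROVIDE", "AWARE", "ACCEPT")

def is_valid_respondent(response):
    """response: dict mapping item names to answers; None or '' means no answer."""
    return any(response.get(item) not in (None, "") for item in KEY_ITEMS)
```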
Dispositions are assigned and verbatim responses are coded by the survey administrator to facilitate analysis. The coded response data and the original responses are both returned to Mathematica, where they are reviewed and incorporated into a file for subsequent processing and analysis.
3. Maximization of Response Rates, Non-response, and Reliability
The cover letter that accompanies each mailed survey is the primary method used to encourage participation in the survey effort. Both the cover letter and the telephone script appeal to the respondent’s patriotism and include information about the purpose of the survey and a brief description of how the information will be used by TMA. For offices with multiple selected physicians and mental health providers, the billing manager recipient receives a separate survey for each selected provider and is asked to complete one survey for each. In addition, telephone interviewers are trained in interviewing techniques designed to minimize respondent refusals to participate in the survey. They ask respondents to answer separately for each provider in cases where multiple providers are being surveyed in the same office. The table below presents unweighted response rates obtained when the survey was fielded in 2012 and 2013.
Unweighted Response Percentages
Respondent Type      2012    2013
Total                40.7    41.5
Mental Health        34.6    35.2
Other Physician      48.7    48.9
4. Tests of Procedures
TMA’s six-site telephone mode pilot effort in FY04 yielded data from 25.9 percent of the target population, and the dual-mode second pilot effort of fourteen sites in FY04 resulted in a 49.5 percent response rate.
To evaluate bias due to non-response, we surveyed non-respondents and compared the characteristics and responses of these non-respondents to those of physicians and mental health providers who responded. We specifically examined characteristics including practice type (primary care or specialty), stratum, and TRICARE network affiliation status. Such a survey of non-respondents was last conducted in 2010, when the survey was conducted with the same questionnaires and the same data collection modes as the current survey.
In that study, responses chosen by those who did not respond to the initial fielding efforts differed substantially from those of respondents. However, when non-response adjusted sampling weights were applied, respondents’ and non-respondents’ response frequencies did not differ materially.
5. Statistical Consultation and Information Analysis
a. Individual(s) consulted on statistical aspects of the design.
Eric Schone, Ph.D.
Mathematica Policy Research
Phone: 202-484-4839
Amang Sukasih, Ph.D.
Mathematica Policy Research
Phone: 202-484-3286
Nancy Clusen
Mathematica Policy Research
Phone: 202-484-5263
b. Person(s) who will actually collect and analyze the collected information.
Eric Schone, Ph.D.
Mathematica Policy Research
Phone: 202-484-4839
Karen Metscher, Ph.D.
Altarum Institute
Phone: 703-217-0956
Michelle Mondro
Ipsos Public Affairs, LLC
Phone: 312-526-4582
1 Dartmouth Atlas