OMB No. 0960-0747
Background:
In early 2007, SSA obtained OMB approval for the Accelerated Benefits (AB) Demonstration Project, a multi-phase study designed to assess whether providing new SSDI disability recipients with certain benefits would stabilize or improve their health and help them return to work early. In this long-term study, we assigned new SSDI disability recipients (i.e., those who had just begun receiving benefits and who had at least 18 months remaining before they qualified for Medicare) to three groups: 1) a control group who would receive only regular SSDI benefits; 2) a treatment group who would receive immediate access to health care benefits; and 3) a treatment group who would receive health care benefits plus additional care management, employment, and benefits services and support. The study, which research contractors and health care experts are conducting for SSA, would 1) assess whether the health care and other benefits help beneficiaries improve and return to work earlier and 2) determine whether there is a difference in outcomes between the two treatment groups.
Update/Current ICR:
Having 1) assigned eligible beneficiaries to one of the three participant groups described above and 2) conducted baseline and six-month follow-up surveys with these beneficiaries, SSA is now ready to move on to the next phase of the study: a 12-month follow-up survey. This ICR is for the 12-month follow-up survey, which we plan to conduct beginning in March 2009. We will use telephone interviews for the survey, with in-person follow-up for non-responders as necessary. We will attempt to contact all 2,000 participants and expect to complete follow-up interviews with 1,600 of them (80 percent). The survey’s purpose is to explore participants’ experiences after one year in the program, which will provide initial data on the effects of the health care and “health care plus” treatments. The respondents are SSDI beneficiaries participating in this study.
See Addendum, attached, for a more detailed background history of this study.
Legal Authority:
Section 234 of the Social Security Act (the Act) directs the Commissioner of SSA to carry out experiments and demonstration projects to determine the relative advantages and disadvantages of:
Various alternative methods of treating work activity of individuals receiving SSDI, including such methods as a reduction in benefits based on earnings designed to encourage these beneficiaries to return to work;
Altering other limitations and conditions, such as lengthening the trial work period, or altering the 24-month waiting period for Medicare; and
Implementing a sliding scale benefit offset.
The Act requires that we design these demonstration projects to show that savings will accrue to the Trust Funds, or will otherwise promote or facilitate the administration of the SSDI program. Section 234 also provides that we conduct these projects in a manner that will allow SSA to evaluate the appropriateness of implementing such a program on a national scale. The AB Demonstration and planned evaluation meet these legislative and congressional mandates.
SSA will use the information it collects from this phase of the study to evaluate the efficacy of its experimental treatment procedures. Specifically, our evaluation will address the following research questions using an experimental design:
Operation. What are the important issues and challenges in helping beneficiaries enroll in the health benefit and use appropriate health care and employment services? How does this vary across local areas and for different types of beneficiaries?
Structure. What are the characteristics of the context in which beneficiaries are seeking medical and employment services?
Participation. Who agrees to participate in the study? Of those participants who are randomized into the treatment groups, who utilizes the health benefit, and what services do they get? How long do treatment group members agree to work with care managers? How does service use by the treatment groups differ from that utilized by the control group? How does participation differ for different subgroups?
Impacts. How does AB affect health, work limitations, employment and earnings, and dependence on SSA disability programs? Do these impacts differ across subgroups of the population of beneficiaries? What is the added effect of AB-Plus care management over and above the AB-Basic health benefit offer?
Benefits and Costs. How does AB affect income and payroll tax receipts, benefit outlays, and the status of the SSA trust funds? From the perspectives of beneficiaries, the government, SSA trust funds, and society, do the benefits of the intervention exceed its costs? Does the AB Health Plan generate enough use of health care services to provide eventual savings to SSA? Are the extra costs of AB-Plus offset by the extra benefits that it generates?
The 12-month follow-up survey will be instrumental in the impact analysis. We will estimate the impacts of the AB-Basic and AB-Plus health plans by comparing outcomes of treatment group participants to those in the control group. In addition to the 12-month follow-up survey, the analysis will be based on various administrative records and MIS data on use of health care services by those assigned to the AB-Basic and AB-Plus groups.
Mathematica Policy Research (MPR) is responsible for primary data collection. MDRC and MPR staff will analyze the data.
a. Twelve-Month Follow-Up Survey
We will administer the twelve-month follow-up survey as a computer-assisted telephone interview (CATI) that collects information on key outcomes that cannot be measured from other sources. The survey will be conducted 12 months following random assignment.
Enrollment and randomization into AB are occurring in two parts. During the first part, 40 percent of individuals who agreed to take part in the study were assigned to the AB-Plus group, 40 percent to the control group, and 20 percent to the AB-Basic group. Enrollment during this part continued until there were 617 individuals in AB-Plus, 625 in the control group, and 308 in the AB-Basic group. During the second part of enrollment, 20 percent of individuals who agreed to take part in the study were assigned to the AB-Basic group and 80 percent to the control group. This part of enrollment will continue until there are 400 AB-Basic group members. This differs from the enrollment and randomization procedures described in the supporting statement for the baseline and six-month follow-up surveys. Health care cost projections from the initial group of people randomly assigned suggested that the study would substantially overspend the budget set aside for health care costs. We therefore decided to reduce the size of the AB-Plus group from 800 to 600. When this decision was made in November 2008, 617 individuals had already been enrolled in AB-Plus, 625 in the control group, and 308 in AB-Basic. We also decided to keep the planned sample of 400 AB-Basic group members and the original target of 2,000 overall sample members. To meet these goals, we changed the randomization algorithm to assign 1 AB-Basic case for every 4 control group cases. The new anticipated sample size for the demonstration is 2,010 participants: 617 AB-Plus members, 400 AB-Basic members, and 993 control group members.
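For illustration only, the two-part assignment ratios described above can be sketched as follows. This is a hypothetical reconstruction of the stated ratios, not the study’s actual randomization algorithm.

```python
import random

def assign(part, rng=random):
    """Randomly assign one enrollee according to the ratios stated in the text.

    This sketch is illustrative only; it is not the algorithm the study used.
    """
    if part == 1:
        # Part 1: 40% AB-Plus, 40% control, 20% AB-Basic (a 2:2:1 ratio)
        return rng.choices(["AB-Plus", "Control", "AB-Basic"],
                           weights=[2, 2, 1])[0]
    # Part 2: 20% AB-Basic, 80% control (a 1:4 ratio); no AB-Plus assignments
    return rng.choices(["AB-Basic", "Control"], weights=[1, 4])[0]
```

Under these ratios, part 2 can never produce an AB-Plus assignment, which is why the AB-Plus group was capped at the 617 members already enrolled when the algorithm changed.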
The sample for the twelve-month follow-up survey will include all individuals who entered the study during the first part of enrollment (617 AB-Plus members, 625 control group members, and 308 AB-Basic members). It will also include the 92 individuals assigned to the AB-Basic plan in the second part of enrollment, along with 92 randomly selected control group cases from that portion of enrollment. The anticipated sample size for the 12-month survey is therefore 1,734: 617 from AB-Plus, 717 from the control group, and 400 from AB-Basic. We expect to complete interviews with 1,387 beneficiaries, for an estimated 80 percent response rate.
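As a quick arithmetic check, the sample counts stated above can be tallied directly (all figures are taken from the text):

```python
# Sample counts stated in the text, by enrollment part and group.
part1 = {"AB-Plus": 617, "Control": 625, "AB-Basic": 308}
part2 = {"AB-Basic": 92, "Control": 92}   # second-part survey subsample

total = sum(part1.values()) + sum(part2.values())          # 1,734 sample members
by_group = {g: part1.get(g, 0) + part2.get(g, 0)
            for g in ("AB-Plus", "Control", "AB-Basic")}    # 617 / 717 / 400
completes = round(total * 0.80)                             # 1,387 expected interviews
```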
Respondents will be mailed an advance letter. This letter introduces the 12-month follow-up survey, alerts the individual that an interviewer from MPR will call shortly, and encourages him or her to participate in the survey. Both the Privacy Act statement and the Paperwork Reduction Act statement will be included in the letter. A set of frequently asked questions (FAQs) will accompany the advance letter. An example of the advance letter is in Appendix B, and the FAQs are in Appendix C.
The 12-month follow-up survey, included in Appendix D, will include the topics described below. Where possible, we included questions from the baseline and “early usage” questionnaires to ensure consistent measurement.
Section A: Introduction/Sample Management. In this section, we verify that we have reached the correct sample member, determine whether a proxy or assistant is needed, and verify the language of administration and whether an interpreter is needed and available.
Section B: Physical, Functional and Mental Health Status. This section includes questions on self-reported health status, functional limitations, activity limitations, pain perception, perceived disability, and mental health. The majority of the questions in this section are from the SF-36, a multi-purpose, short-form health survey with 36 questions. It yields an eight-scale profile of functional health and well-being scores, psychometrically based physical and mental health summary measures, and a preference-based health utility index. It is a generic measure, as opposed to one that targets a specific age, disease, or treatment group. Accordingly, the SF-36 has proven useful in surveying general and specific populations, in comparing the relative burden of diseases, and in differentiating the health benefits produced by a wide range of treatments. Questions on functional limitations are identical to those asked on the baseline survey.
Section C: Health Insurance Status. In this section, we ask about public and private health insurance that sample members may have obtained since random assignment.
Section D: Healthcare Usage. This section begins with a few questions designed to assess the respondent’s level of fatigue and need for a break. In this section, we ask about the respondent’s health care usage, such as the use of general practitioners and primary care physicians, the use of specialists, including mental health specialists, the use of emergency rooms and/or clinics, the number of hospitalizations and/or institutionalizations, the use of prescription medicines, and the average annual out-of-pocket healthcare expenses.
Section E: Unmet Medical Needs. In this section, we ask for reasons the respondent did not keep doctor or specialist appointments or have recommended tests, reasons for not taking prescribed medicines and reasons for not completing prescribed home health treatments.
Section F: Employment and Earnings. This section collects information on employment, use of Vocational Rehabilitation services, job search activities, work-related education and training. Specifically, topics include health-related work limitations, orientation to work, whether employment supports were needed and/or accessed, hours worked, length of time at job, type of job, type of employment (full- or part-time), self employment, benefits and accommodations for disabled employees, wages or self-employment income, and reasons for not working.
Section G: Use of SSA and Other Employment Services. In this section, we measure familiarity with and sources of information about the Ticket to Work program, as well as enrollment in, receipt of services from, and disenrollment from an Employment Network, state Vocational Rehabilitation agency, and Work Incentives Planning and Assistance (WIPA) program. We also measure use of other employment-related services, including education and training.
Section H: Family Status, Income, Assets and Receipt of Benefits. This section asks about the total family income including any public assistance.
With the exception of some questions that are specific to the treatment groups, all sections will be asked of all respondents. Some questions within sections will be skipped according to respondent answers. At the end of the interview, we will collect current contact information in order to send the $25 incentive to the correct address.
We will use CATI to collect data for the AB surveys, administering the 12-month follow-up survey as a CATI interview with field follow-up for non-respondents. Because we expect most of the non-response at follow-up to result from non-contacts rather than refusals, we plan to send field staff out to locate hard-to-find sample members and encourage them to complete the interview by telephone. Field locators will be equipped with cell phones that the sample member can use to dial in to MPR’s survey operations center and complete the interview with a trained, professional interviewer. This approach both minimizes costs relative to in-person interviewing and reduces possible mode effects that could result from a mixed-mode methodology. The CATI instrument will incorporate standard checkpoints to assess each respondent’s level of fatigue and to provide the respondent with an opportunity to take a break, if necessary.
Telephones equipped with amplifiers will be available for use as needed to accommodate sample members who are hearing impaired. In addition, TTY and Relay technologies will be used to facilitate participation in the telephone survey. A TTY is a special device that lets people who are deaf, hard of hearing, or speech-impaired use the telephone to communicate, by allowing them to type messages back and forth to one another instead of talking and listening. A TTY is required at both ends of the conversation in order to communicate. The Telecommunications Relay Service (TRS) will be used for sample members who are deaf, hard of hearing, or speech-impaired but who do not have a TTY. With TRS, a special operator types whatever the interviewer says so that the person being called can read the interviewer’s words on his or her telephone display. He or she will type back a response, which the TRS operator will read aloud for the interviewer to hear over the phone. These methods, TTY and TRS, both increase survey administration times, but enable us to conduct interviews with sample members who, without the help of these technologies, would not be able to participate.
The nature of the information we are collecting and the manner in which we are collecting it preclude duplication. SSA does not use another collection instrument to gather similar data.
The 12-month follow-up survey does not affect small entities.
If we did not conduct the 12-month follow-up survey, we could not conduct the impact analysis that is the central purpose of this demonstration project. Since we are collecting the information only once, we cannot collect it less frequently.
There are no technical or legal obstacles to burden reduction.
There are no special circumstances that would cause SSA to conduct this information collection in a manner inconsistent with 5 CFR 1320.5.
SSA published the 60-day Federal Register Notice on September 25, 2008, at 73 FR 55584, and we did not receive any public comments. We published the 30-day Federal Register Notice on December 11, 2008, at 73 FR 75490. If we receive any comments in response to the 30-day Notice, we will forward them to OMB.
Consultation with the public: We developed the 12-month follow-up survey by first reviewing the baseline and “early usage” surveys and the SF-36. The AB surveys were developed by reviewing the National Beneficiary Survey (NBS), the National Health Interview Survey (NHIS) and its associated disability follow-back survey (NHIS-D), the Medical Expenditure Panel Survey (MEPS), and the Current Population Survey (CPS). From these sources, we identified questions that have been used successfully to capture information about program usage, health insurance, access to and utilization of health care, use of employment and other rehabilitative and supportive services, employment history, job search activities, employment and income, as well as household composition, functional status, general well-being, and demographics. In designing the 12-month survey, we collaborated with colleagues working on other SSA demonstration projects, notably the Youth Transition Demonstration (YTD) and the National Beneficiary Survey, to ensure that there are common data elements across projects. We then developed new questions for outcomes of interest for which reliable questions did not already exist. We conducted an internal timing test to estimate respondent burden.
We will offer beneficiaries $25 to complete the survey. We will mail payments to respondents upon completion of the interview. Since the majority of sample members will not be working, we believe that offering an incentive is appropriate as an expression of appreciation of their time and effort.
SSA protects and holds confidential the information it is collecting in accordance with 42 U.S.C. 1306, 20 CFR 401 and 402, 5 U.S.C. 552 (Freedom of Information Act), 5 U.S.C. 552a (Privacy Act of 1974), and OMB Circular No. A-130.
The secure handling of confidential data is important to the study team and its staff due to ethical and legal obligations. Ensuring the secure handling of confidential data is accomplished via several mechanisms, including obtaining suitability determinations for designated staff; training staff to recognize and handle sensitive data; protecting computer systems from access by staff without favorable suitability determinations; limiting access to secure data on a “need to know” basis and only for staff with favorable suitability determinations; and creating data extract files from which identifying information has been removed.
We will take several steps to assure sample members that the information they provide will be treated confidentially and used for research purposes only. Sample members will be told that they will not be identified individually (by name) in any reports. The assurances and limits of confidentiality will be made clear in all advance materials sent to participants and restated at the beginning of the interview. The Privacy Act and Paperwork Reduction statements appear on the advance letter.
The purpose of the evaluation is to test whether providing early access to health benefits, care management, and expanded access to employment supports will help new SSDI beneficiaries regain their independence and return to work. As a result, obtaining information about potentially sensitive topics, such as the health status and medical condition of sample members, is central to the intervention. The survey will not collect data that can be obtained directly from other sources (for example, information about receipt of disability benefits is best obtained directly from SSA administrative records).
The survey will include questions about the following topics that can potentially be considered sensitive:
Health insurance coverage
Health status, including disability information
Assistance needed with Activities of Daily Living (ADLs) and Instrumental Activities of Daily Living (IADLs) (for example, help or supervision needed with bathing, dressing, eating, and using the toilet).
Doctor’s visits and hospital stays
Financial status
Many of the questions were taken, without modification, from other national surveys of similar populations, such as the NBS, the NHIS, the Youth Transition Demonstration (YTD) survey, and the CPS.
There is no known cost burden to the respondents.
The estimated cost to the Federal Government for designing, administering, and analyzing the twelve-month follow-up survey data is $1,961,000. Table 2 shows these estimated costs on a year-by-year basis.
Table 2
Annualized Costs

Year | Cost
2008 (Design) | $128,000
2009 (Data Collection) | $809,000
2010 (Data Collection and Analysis) | $1,024,000
The total burden for this collection decreased by 5,612 hours (from 6,812 to 1,200 hours). The reason for this decrease is a significant reduction in the number of respondents (from 38,292 to 1,600); the respondent count in the original clearance was much higher because it included the screening of potential participants.
The AB process and implementation analysis will document recruitment strategies and operations. We will develop measures that summarize services received by the control group and the two treatment groups, allowing us to estimate the difference between those groups in terms of the types and intensity of services. We will carefully document how participants are recruited to assess how findings might be generalized to other settings and groups. We will examine characteristics of the local environment such as existing agencies, organizations, and the services they offer, as well as document the existing relationships between WIPAs, the SSA field offices, medical providers, and the vocational rehabilitation, workforce, and behavioral health agencies. Finally, we will examine how the intervention interacted with existing SSA work incentives, such as the trial work period, extended period of eligibility, Ticket To Work, and the host of other incentives available for SSDI beneficiaries wishing to return to work.
b. Impact Analysis
The impact analysis will focus primarily on the health and employment outcomes for beneficiaries, both overall and for meaningful subgroups. Our analytical approach combines the power of a random assignment design with statistical modeling.
Basic Impacts. The first tool for understanding the effects of the intervention is to compare average outcomes for the AB-Plus, AB-Basic, and control groups. This is a straightforward calculation that is generally easy to explain to policymakers and other non-technical audiences.
Conditional Outcomes. Some outcomes can be measured for only a subset of beneficiaries. For example, hourly wage rates can be measured only for workers. For such outcomes, the experimental framework can be preserved by analyzing distributions. For example, we could compare the proportion of people who worked and earned above a certain wage amount rather than average hourly wages among workers.
More Sophisticated Methods. Most random assignment studies attempt to increase the precision of estimated basic impacts by adjusting for baseline characteristics. In estimating these effects, we can use a linear regression framework or a more complex set of methods, depending on the nature of the dependent variable and the type of issues being addressed: logistic regressions for binary outcomes (e.g., whether or not someone works); Poisson regressions for count outcomes (e.g., months of employment); quantile regressions to examine the distribution of continuous outcomes, such as benefit payment amounts, earnings, and income; and duration (hazard) models for outcomes that depend on the timing of an event, such as the conditional probability of entering employment over several periods.
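The regression-adjustment idea can be sketched as follows. The data, covariate, and effect size here are simulated and entirely hypothetical (not study data); the point is only that the coefficient on the treatment indicator, after controlling for a baseline characteristic, serves as the adjusted impact estimate.

```python
import numpy as np

# Simulate a simple experiment: a binary treatment indicator, one baseline
# covariate, and an outcome with a true treatment effect of 0.5 (all values
# are illustrative assumptions, not figures from the demonstration).
rng = np.random.default_rng(42)
n = 500
treat = rng.integers(0, 2, n)         # 1 = treatment group, 0 = control
baseline = rng.normal(50, 10, n)      # e.g., a baseline health score
outcome = 0.5 * treat + 0.3 * baseline + rng.normal(0, 1, n)

# OLS of the outcome on an intercept, the treatment indicator, and the
# baseline covariate; beta[1] is the regression-adjusted impact estimate.
X = np.column_stack([np.ones(n), treat, baseline])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
impact = beta[1]   # should recover something close to the true effect, 0.5
```

Adjusting for the baseline covariate soaks up outcome variance unrelated to treatment, tightening the standard error of the impact estimate relative to a simple difference in means.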
The impact analysis findings will be presented in a series of reports designed for a broad audience of policymakers, program planners and managers, and researchers. For the interim and final reports, estimates of impacts for the full sample and for policy-relevant subgroups will be presented. Drawing on findings from the implementation/process analysis, we will describe differences in the target populations, project services, project implementation, and community context that may explain differences in impacts. These reports will present both a global analysis of the effectiveness of the projects and then a targeted analysis that addresses more specific questions such as which program model works best and for whom. The scope of project-specific “letter reports” will be more limited. These reports will be based on administrative data and will focus more narrowly on the presentation and interpretation of the impact estimates themselves and less on the context. Subgroup estimates will be presented and the data and analytical methodology will be fully described.
Table 3 provides the schedule for the project.
TABLE 3
PROJECT SCHEDULE

Activity | Beginning Date | End Date
Pretest Baseline Survey | March 2007 | April 2007
Baseline Survey (Phase 1) | October 2007 | November 2007
Baseline Survey (Phase 2) | March 2008 | January 2009
Implementation Site Visits | September 2009 | December 2009
Early Use Survey | September 2008 | February 2009
Twelve-Month Follow-Up | March 2009 | May 2010
Policy Brief | | September 2008
Project Assessment | | February 2009
Interim Findings | | October 2009
Final Report | | January 2011
c. Cost-Utility Analysis
To ensure that the cost-utility findings are as helpful as possible to SSA, we will present the information in a way that has proven useful for communicating this type of information to the SSA Office of the Actuary and to OMB. First, we will summarize all of the information that is based directly on data collected during the demonstration period. Second, we will present the size of future effects (if any) that would be required for the interventions to generate benefits that exceed costs. We will then assess the plausibility of observing the impacts needed for the program to break even. The latter steps in this process are important because of the limited observation period during which we will be able to observe costs and benefits. While most of the costs of the program will be incurred upfront (i.e., during the observation period), the benefits are likely to accrue for years in the future. It is possible that the net value of the program during the observation period is negative, but, with carry-over effects, the net value may become positive in the future. By presenting these components, the actuaries will be able to see the net value generated during the observation period, and then use the more speculative analysis of possible future benefits and costs to draw conclusions about whether the interventions will ultimately pay for themselves. This approach differs from the more common presentation, which provides a single bottom-line benchmark estimate incorporating both directly observed evidence and the best available evidence of benefits and costs that occur after the observation period. In addition to using this general presentation format, we will work with the actuaries during the demonstration to ensure that the other assumptions used in the analysis (such as the discount rate, correction for inflation, and projections about potential productivity growth) are consistent with the ones they are using to assess other potential SSA initiatives.
The analysis will address the level of certainty with which SSA and other policymakers can view the findings. The final benefit-cost numbers will be based on a range of estimates and assumptions, each of which is inherently associated with some level of uncertainty. The starting point for the analysis of uncertainty is to conduct a series of sensitivity tests that indicate how the bottom line would be affected by changes in specific underlying estimates or valuation assumptions. In this way, we will give SSA a series of plausible estimates instead of a single, inherently imprecise, estimate. We will also analyze uncertainty by using probabilistic sensitivity analysis, which allows us to simultaneously assess the uncertainty in all values by using Monte Carlo simulation, which, in turn, will tell us how the bottom line would be affected when all underlying values move within a specified probability distribution. We prefer to use both methods because the series of specific sensitivity tests is typically easier for policy makers to understand, while the more complex method indicates the extent to which possible correlations between the values would make such sensitivity tests misleading.
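The probabilistic sensitivity idea can be sketched as follows. The distributions and dollar values below are entirely hypothetical assumptions; the point is only to show how Monte Carlo draws over the joint distribution of uncertain inputs yield a distribution of bottom-line results rather than a single estimate.

```python
import random

random.seed(0)

def net_benefit():
    # Draw each uncertain input from an assumed distribution
    # (illustrative means and standard deviations, not study values).
    benefit_per_person = random.gauss(3000, 800)
    cost_per_person = random.gauss(2500, 400)
    return benefit_per_person - cost_per_person

# Simulate the bottom line many times to characterize its distribution.
draws = [net_benefit() for _ in range(10_000)]
prob_positive = sum(d > 0 for d in draws) / len(draws)
# prob_positive estimates how often benefits exceed costs when all inputs
# vary jointly, complementing one-at-a-time sensitivity tests that move a
# single underlying value while holding the others fixed.
```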
We will display the OMB expiration date on all survey materials sent to respondents, including the advance letter and Frequently Asked Questions.
SSA is not requesting an exception to the certification requirements at 5 CFR 1320.9 and related provisions at 5 CFR 1320.8(b)(3).
File Title: SUPPORTING STATEMENT FOR ACCELERATED BENEFITS DEMONSTRATION PROJECT, PHASE II
Author: Jessica Silvani
File Created/Modified: December 30, 2008