ADULT PROTECTIVE SERVICES OUTCOMES STUDY
Supporting Statement for Paperwork Reduction Act - Part A
Information Collection Request
The U.S. Department of Health and Human Services (HHS), Administration for Community Living (ACL) is seeking OMB approval to collect data using new information collection tools that examine whether and how Adult Protective Services (APS) programs make a difference in the lives of APS clients.
APS programs are provided by state and local governments nationwide and serve older adults and adults with disabilities in need of assistance due to maltreatment, which can include: physical, emotional, and sexual abuse; financial exploitation; neglect; and self-neglect. APS is an important avenue through which maltreatment is reported to law enforcement or other agencies. Additionally, APS programs are often the gateway for adults who experience maltreatment to access additional community, social, health, behavioral health, and legal services to maintain independence in the settings in which they prefer to live.
APS programs work closely with clients and a wide variety of allied professionals to maximize safety and independence, while respecting each client’s right to self-determination. APS programs provide a range of services to the clients they serve, including:
investigating reports of adult maltreatment;
case planning, monitoring, evaluating, and other casework; and
providing, arranging for, or facilitating the provision of medical, social service, economic, legal, housing, law enforcement, or other protective, emergency, or support services.
The purpose of this new information collection (“the study”) is to examine whether and how APS programs make a difference in the lives of APS clients with regard to their self-determination, safety, well-being, and satisfaction with services. The design of the study was informed by key foundational activities, including a review of the literature and existing datasets, the development of a logic model and a theoretical framework, and guidance from ACL content experts and a technical expert panel (TEP), consisting of researchers, practitioners, and program leaders within APS.
The study will be conducted with three target populations: (1) APS clients, (2) APS caseworkers, and (3) APS leaders. APS leaders will consist of APS state and APS county leaders. Data collection with these three target populations will be conducted using five new data collection instruments:
Brief, anonymous APS client questionnaire – to be completed by APS clients
De-identified client data form – to be completed by APS caseworkers
Semi-structured in-person interview guide – to be used with APS clients
Semi-structured in-person focus group guide – to be used with APS caseworkers
Semi-structured interview guide – to be used with APS leaders
1. Circumstances Making the Collection of Information Necessary
At this time, there is no single set of rules and regulations that APS programs must follow, and there is limited research examining the impact of APS programs on client outcomes. Thus, building the evidence base for APS programs and practices, identifying and promoting the use of evidence-based and promising practices, and developing guiding standards are key needs of the field and reflect ACL's ongoing commitment to APS.
The proposed new information collection is an important component in building the evidence base for how APS programs and practices improve client outcomes and in supporting the ongoing evaluation of APS programs. Specifically, the study will help examine:
(1) what changes clients report as a result of receiving APS services;
(2) how satisfied clients are with the APS services they receive;
(3) to what extent clients report APS helps them achieve their goals;
(4) to what extent clients report APS supports their right to self-determination;
(5) to what extent APS programs affect client safety (risk of maltreatment);
(6) how APS programs intervene to reduce client risk of maltreatment;
(7) what factors help or hinder APS efforts to reduce risk of maltreatment;
(8) to what extent APS programs affect client well-being (e.g., quality of life, financial, physical health, etc.);
(9) how APS programs intervene to improve client well-being; and
(10) what factors help or hinder APS efforts to improve client well-being.
Legal support for the study stems primarily from the Elder Justice Act of 2009 (“the Act”). The Act requires the Secretary of HHS to ensure that HHS “conducts research related to the provision of adult protective services” (Section 2042(a)(1)(D)). The following weblink provides the full text of this provision: https://www.ssa.gov/OP_Home/ssact/title20/2042.htm
Additionally, the Act established the Elder Justice Coordinating Council (EJCC) to coordinate activities related to elder abuse, neglect, and exploitation across the federal government. In 2014, the EJCC adopted a set of eight recommendations to increase federal involvement in addressing elder abuse, neglect, and exploitation. Recommendation four states that the federal government should “establish a coordinated research agenda across federal agencies to identify best practices for prevention of and intervention in elder abuse and elder financial exploitation.” The following weblink provides the full text of the recommendations: https://acl.gov/sites/default/files/programs/2016-09/Eight_Recommendations_for_Increased_Federal_Involvement.pdf
2. Purpose and Use of the Information Collection
The new information collection will be used by ACL to understand whether and how APS programs make a difference in the lives of APS clients with regard to their self-determination, safety, well-being, and satisfaction with services. The findings from the study will help inform federal decision-making around funding priorities and other support for APS programs. Additionally, the study findings will be broadly disseminated to the public, and may be used for a variety of other purposes. For example, state and county APS program leaders may use the information to inform APS program planning, policy-making, or resource allocation decisions. Additionally, researchers may use the information to inform research efforts to further advance the evidence base for APS programs and practices.
3. Use of Improved Information Technology and Burden Reduction
Among the five data collection instruments to be used in the study, the client questionnaire and the client data form will be completed by written/typed response.
The client questionnaire will be completed by APS clients (or their proxy) using pencil and paper and returned by the client/proxy via mail using a prepaid envelope. This method will help protect the privacy of APS clients and will allow them to submit the questionnaire anonymously. It will also allow clients with limited access to technology (e.g., computer, internet connection) to participate.
In order to limit burden on clients, the client questionnaire does not collect basic demographic information or additional details about APS clients. However, this information is important for analysis, since responses to the client questionnaire are expected to vary by key client characteristics. We will capture this additional information about clients through a separate client data form to be completed by the APS caseworker. The client data form will be linked to the client questionnaire using a pre-populated, eight-digit form number.
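To illustrate, the minimal sketch below shows one way the linkage step could be implemented once both instruments are keyed in, assuming each dataset has been exported to CSV. The file names and the "form_number" column are hypothetical placeholders, not the actual instrument fields.

```python
import pandas as pd

# Hypothetical file and column names -- placeholders, not the actual
# instrument fields. "form_number" holds the pre-populated eight-digit ID.
questionnaires = pd.read_csv("client_questionnaires.csv", dtype={"form_number": str})
client_data = pd.read_csv("client_data_forms.csv", dtype={"form_number": str})

# Drop malformed form numbers before merging (na=False treats blanks as invalid).
valid = questionnaires["form_number"].str.fullmatch(r"\d{8}", na=False)
questionnaires = questionnaires[valid]

# One-to-one merge: each questionnaire pairs with at most one client data form.
linked = questionnaires.merge(
    client_data, on="form_number", how="inner", validate="one_to_one"
)
print(f"Linked {len(linked)} questionnaire/client-data-form pairs")
```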
In order to limit burden on caseworkers, the client data form is brief and collects information readily available from client case records. Additionally, caseworkers will submit the responses for the client data form using Survey Monkey, an online platform for data collection. If this submission method does not work well for participating caseworkers, we will work with them to choose an alternative approach to submitting data for the client data form (e.g., mail, scan and e-mail, phone).
The three semi-structured interview and focus group guides will be administered at a time and location convenient for participants. For example, individual in-person client interviews may be conducted in the client’s home, if they choose. Also, interviews with state APS leaders may be conducted via phone.
4. Efforts to Identify Duplication and Use of Similar Information
There are currently no other governmental or non-governmental, multi-state efforts to systematically collect feedback from APS clients or to measure the impact of APS on their lives.
To avoid duplication and use of similar information, we first conducted a review of the literature and existing datasets. In our review of the APS literature, we found few existing studies of APS client outcomes. Among the 17 studies that we identified, most had one or more characteristics of a new or underdeveloped body of literature. In particular, most of the articles used small samples selected from small geographic areas; relied on case record review methods using state/local administrative data sources; and/or used simple, descriptive statistical approaches to address research questions. These limitations may be explained in part by the decentralized nature of APS and the wide variation in APS programs across states and counties. They may also be explained by a lack of data collection about APS in national surveys and surveillance systems.
To assess the availability of national data relevant to this study, we identified a total of 17 relevant datasets containing information on adult maltreatment and/or APS. Although many national-level, federal datasets collect information related to adult maltreatment, the National Adult Maltreatment Reporting System (NAMRS)1 represents the only dataset that captures information on both adult maltreatment and APS.
NAMRS is the national reporting system for state APS data. It is divided into three components: (1) the Agency Component, submitted by all states about their APS policies and practices; (2) the Case Component, submitted by states with case-level tracking systems that capture detailed information on clients, services, and perpetrators; and (3) the Key Indicators Component, an alternative to the Case Component, submitted by states that can provide aggregated state-level data on key APS statistics but do not have the case-level tracking systems necessary to submit Case Component data. NAMRS is currently in its third federal fiscal year of data collection (FFY16-FFY18 available). In FFY 2017, 55 of the 56 states and territories submitted NAMRS data: 26 states submitted Agency and Case Component data, 21 states submitted Agency and Key Indicators Component data, and 8 states submitted Agency Component data only; one state did not participate. In FFY 2018, the number of states that submitted Agency and Case Component data increased to 31.
NAMRS has several key limitations. Given that it is only in the third year of data collection, states are still becoming accustomed to submitting data to NAMRS. No states currently report all of the data fields included in the three components of NAMRS. Additionally, NAMRS is a voluntary reporting system. States have the option to submit as much or as little data as they choose. This means the quality and completeness of NAMRS data vary greatly across states and across years. NAMRS does not collect feedback from APS clients with regard to their self-determination, safety, well-being, and satisfaction with services. Nevertheless, NAMRS has value as a source of population data for studying APS if these limitations are properly addressed in the study design.
Based on these findings from our literature and dataset review, we determined that existing information already available cannot be used or modified to fully achieve the purpose of the study. Therefore, we developed this study design to use existing information, including secondary data analysis of NAMRS, but also to collect new information where data gaps exist.
5. Impact upon Small Businesses or Other Small Entities
APS programs in 27 counties will be asked to participate in this study on a voluntary basis. Participating counties will vary considerably in population. State APS program leaders will facilitate recruitment by providing the appropriate contacts at the APS county programs, making virtual introductions between those contacts and the study team, and responding to county questions or needs related to applicable state approval processes for participating in the study. The study team will provide each participating APS program with extensive administrative support and training to ease the burden of participation. For example, the study team will provide regular check-in meetings; a live and recorded webinar training covering all aspects of the information collection; pre-populated data collection forms and pre-paid, addressed envelopes; and a one-page quick reference sheet with tips, instructions, and contact information for questions. The study team will also provide a written guide (“APS Evaluation Site Visits: A Guide for State and County Leaders”) that includes all the information and materials counties will need to help plan and carry out a site visit.
6. Consequences of Collecting the Information Less Frequently
This is a one-time information collection activity. If the collection is not conducted, the study will not be able to achieve its intended purpose (as specified by ACL), address existing gaps in knowledge of APS client outcomes, or further APS research as directed by the Elder Justice Act of 2009 and the EJCC.
7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5
No special circumstances apply to this collection of information.
Report Information More Often than Quarterly. There are no circumstances that could result in the data needing to be collected more frequently than quarterly. This is a one-time information collection effort.
Requiring Response in Less than 30 Days. There are no circumstances that could result in participants needing to respond to the written questionnaire in less than 30 days.
Requiring Respondents to Submit More than One Original and Two Copies. No respondents will need to provide copies of their responses.
Requiring Respondents to Retain Records for More Than Three Years. No respondents will need to retain any records.
In Connection with a Statistical Survey. These data are not collected as part of a statistical survey.
Use of a Statistical Data Classification that Has Not Been Approved by OMB. This information collection will be approved by OMB before commencing.
Pledge of Confidentiality. This information collection will not include any pledge of confidentiality that is not supported by authority established in statute, regulation, or policy. The information collection will be performed following an approved informed consent process.
Requiring Respondents to Submit Trade Secrets or Other Confidential Information. This study will not involve collecting any data related to trade secrets. No identifying data will be collected.
8. Comments in Response to the Federal Register Notice
We gathered input on this proposed information collection effort from three main sources outside of ACL: (1) public comments submitted through the 60-day Federal Register notice, (2) pilot testing, and (3) a Technical Expert Panel (TEP).
A 60-day Federal Register Notice was published at 84 FR 43137 on August 20, 2019. A 30-day Federal Register Notice was published at 84 FR 66426 on December 4, 2019.
Three individuals provided written responses to the 60-day Federal Register notice containing the original proposed data collection instruments.
Additionally, we conducted pilot testing of all data collection instruments with a total of 16 individuals across three states: three state or county APS leaders, eight APS caseworkers, and five APS clients. The purpose of the pilot test was to collect feedback on the clarity of the items and instructions, the appropriateness of the administration procedures, and the accuracy of the burden estimates before full implementation of the tools. We limited the number of pilot test respondents and varied the specific questions asked of each respondent type so that the pilot testing would not require approval under a separate information collection request. APS state leaders served as the point of contact for coordinating pilot testing in their state, including recruiting respondents, collecting and de-identifying their feedback, and submitting it to the study team. To ensure the feedback was entirely anonymous, the study team did not interact directly with APS county leaders, caseworkers, or clients.
Lastly, we shared the data collection instruments with the TEP, comprising 10 experts in the APS field, including federal experts from other agencies, researchers, practitioners, and program leaders within APS. The TEP helped shape the study design and provided detailed review of and feedback on the data collection instruments.
Across the three sources of input, we received a total of 147 separate comments. These comments tended to fall into one of three categories:
78 separate comments were made about one or more specific items in a data collection instrument.
36 separate comments were made about the study design or procedures for administering the data collection instrument.
33 separate, general comments were made in support of the study and/or the data collection instruments.
The table below provides a cross tabulation of all comments received by comment type and comment source, with cell values expressed as counts and percentages of the 147 total comments.
| | TEP | Pilot Testing | Public Comment | TOTAL |
|---|---|---|---|---|
| Instrument Feedback | 26 (18%) | 50 (34%) | 2 (1%) | 78 (53%) |
| Design and Process Feedback | 2 (1%) | 33 (22%) | 1 (1%) | 36 (24%) |
| General (Positive) Comments | 1 (1%) | 30 (20%) | 2 (1%) | 33 (22%) |
| TOTAL | 29 (20%) | 113 (77%) | 5 (3%) | 147 (100%) |
Below, we provide a summary of the comments by type and list all changes that we propose to the data collection instruments and study procedures as a result of the comments:
Summary response to comments about changes to specific item(s): These comments included suggested edits to wording (e.g., grammar/spelling errors, plain language, reducing length, combining items), adding details to focus the question (e.g., specifying timeframes, adding example answers to probes in case respondents need clarification), and adding new items or probes (e.g., how clients qualify for APS, caseworker safety, legal interventions).
Changes to APS Caseworker Focus Group Guide
Introduction and Case Initiation, Item B, Probe 3: Revise to read: “How do you build relationships with clients?”
Changes to APS Client Interview Guide
Introduction and Case Initiation, Item C: Add the following text to the "Instructions to Interviewers": "If the client reports multiple instances of using APS services, allow the client to consider/compare all of their experiences in their responses to questions." Also, move the current Item F to occur after current Item B. Move current Item C to become Probe 2 under current Item F.
Self-Determination, Item D, Probe 1: Add the following text to the end of the current probe: "If not, how could the caseworker have done a better job of involving you in planning and decision-making?"
Satisfaction, Item C: Revise to read: "Are you satisfied with the services that [name of APS program] referred you to?"
Self-Determination, Item E: Revise the text to read: "If so, which one(s)?"
Changes to APS Leader Interview Guide
Information Sheet: Remove the following sentence: "Participants will be advised not to share anything they heard from other participants with other staff in their programs."
Collaboration/Partnership, Item A: Revise to be: "...(2) memorandum of understanding/memorandum of agreement…".
Policies, Practices, and Procedures, Item A, Probe 2: Revise to be: "Have there been any significant changes to [State/County] APS policies, practices, and procedures in recent years?"
Self-Determination: Add the following lead-in description: "Now we'd like to talk a little about the role the client plays in planning and making decisions around the help and services they are offered by [name of APS program]."
Changes to Client Data Form
Add new item: "How did the client qualify to receive APS services (check all that apply)?" with check boxes for two response options: 1) On the basis of old age; 2) On the basis of disability/vulnerability/etc.
Level of Client Engagement: Revise the item to read: "Level of Client Engagement with APS:"
Level of Client Engagement: Create table (similar to the item for type of maltreatment) or other revised formatting to capture level of client engagement with two separate aspects of APS: (a) the investigation, (b) services. No revisions to the response options.
Number all items.
Maximize the font and spacing, while keeping the items all on a single page.
Changes to Client Questionnaire
Instructions, second paragraph, first sentence: Delete the word “from”.
Item 2: Revise the item to read as follows: "When I first met the worker from [name of APS Program], I thought I needed their help."
Item 4: Revise to read: "The worker respected my wishes."
Item 9: Replace the single line with a large empty box that fits the available space.
Maximize the font and spacing, while keeping the items all on a single page.
Changes to all data collection instruments
Revise all instances of "elder maltreatment" in all study tools to be "adult maltreatment" and provide the definition/description given in the APS Guidelines.
Summary response to comments about the study design or procedures: These comments included suggested modifications to time estimates for administering data collection instruments (e.g., more time), amount of incentives (e.g., larger amount), and method of administration (e.g., shift from by phone to in-person). They also included clarification questions (e.g., client inclusion criteria for participation in the study, determining if the client or proxy should participate, choosing a suitable proxy).
Changes to APS Caseworker Focus Group Guide
Increase the amount of the incentive from $30 to $40 per caseworker for participating in a focus group.
Summary response to general, supportive comments: These comments provided agreement with the wording of items and procedures for administering the data collection instruments. They also provided support for ACL in conducting the study and cited benefits to the field of APS.
No proposed changes as a result of these comments.
For comment-level details, including our responses to comments and proposed changes to the data collection instruments, please see Appendix A – APS Client Outcomes Study Feedback Tracking Sheet.
9. Explanation of Any Payment or Gift to Respondents
Incentives will be offered at several levels of participation in the study. The county APS program will be offered a one-time, administrative stipend for their participation. APS clients will be offered a gift card for participating in a client interview. APS caseworkers will also be offered a gift card for participating in a caseworker focus group. No other incentives will be offered as part of the study. Incentives will not be provided where they are prohibited, for example, where caseworker union rules prohibit this kind of compensation.
Each participating county APS program will be offered a one-time, $1,000 administrative stipend for its participation in the study. The stipend is intended as an acknowledgement and thank-you to APS leaders and caseworkers within each participating county for their time and contributions to the study. Each county may use the funds however they see fit. The administrative stipend will be offered to participating counties only. No financial incentives will be offered to states where participating counties are located.
Older adults and adults with disabilities can be difficult to reach for research purposes. Some studies have found that survey response rates tend to decline as age increases. Factors that can contribute to low response rates among these populations include low income; limited English proficiency or low literacy; and communication challenges requiring support from a proxy/caregiver. Clients will be incentivized to participate in a client interview with a $20 gift card (a retailer-specific gift card selected in consultation with local program staff about what would be most desirable). The amount was chosen as appropriate compensation for the time required to participate (i.e., 45 minutes), without being so large that the client feels compelled to participate for the financial benefit. Additionally, clients may have concerns about participating in an interview about their experiences with APS given the sensitive nature of adult maltreatment. Although the client interview guide includes questions that ask only about experiences with APS, and not about the client’s maltreatment, this incentive may help ease potential concerns the client may have.
Similarly, caseworkers will be incentivized to participate with a $40 gift card (either cash card or a retailer-specific gift card selected in consultation with the local program staff about what would be most desirable). The incentive was chosen as an appropriate amount for the time required to participate (i.e., 90 minutes) and acknowledges that caseworkers should be compensated for sharing their professional opinions, while participating in a professional capacity.
The caseworker incentive is greater than the client incentive since caseworker focus groups are longer than client interviews.
10. Assurance of Confidentiality Provided to Respondents
Personally identifiable information (PII) is not being collected as part of the study, and we make no pledge about the confidentiality of the data. However, the data collection instruments include an informed consent process.
Given that the client questionnaire is brief and completed anonymously, the informed consent process for this instrument is built into the instructions. These instructions inform the respondent that by submitting the form, they are giving consent for their answers to become a part of the study. Again, the information collected from these respondents does not include PII.
Prior to beginning an interview or focus group, the study team moderator fully reviews the informed consent form with each participant. This form explains the purpose/background of the study, procedures and duration, voluntary participation, risks and benefits, incentives, privacy, and contact information for any questions/concerns. The form concludes with a certificate of consent where participants acknowledge that they fully understand the terms of participating and agree to take part in the study.
The study team will protect the privacy of all participants. Participants will not be identified in any reports and nothing they say will be personally attributed to them or their APS programs.
All data collected as part of the study will be stored on New Editions Consulting, Inc.’s secured drive and accessed only by members of the study team through password-protected New Editions computers. The secured drive is hosted by a third-party data center that leverages an Information Technology Infrastructure Library (ITIL)-based control environment validated for compliance with the Health Insurance Portability and Accountability Act (HIPAA), the Payment Card Industry Data Security Standard (PCI DSS), and Service Organization Controls (SOC, formerly SAS 70) frameworks. The data centers are also fully compliant with U.S. Department of Health and Human Services Office for Civil Rights (OCR) and Payment Card Industry (PCI) audit protocols. The folder on the secured drive where the data will be stored will be restricted so that only study team members have permission to access its content. Similarly, data submitted through Survey Monkey will be accessible to the study team only through a password-protected New Editions user account. We will migrate the data from Survey Monkey to the New Editions secured drive on a weekly basis.
Although the study instruments will not collect PII, we will retain paper consent forms for interview and focus group participants. These forms constitute human subjects records and will be stored at New Editions offices in locked file cabinets accessible only to key members of the study team. All consent forms will be maintained as long as the IRB requires (e.g., three years after closure of the project). Following the mandatory retention period, we will shred all consent forms.
11. Justification for Sensitive Questions
Some questions in the client questionnaire and client interview guide may be sensitive for respondents. APS clients typically interact with APS because of suspected or confirmed abuse, neglect, or other maltreatment. In order to determine the impact of the APS program on client outcomes, including personal outcomes like self-determination, safety, well-being, and satisfaction with services, clients will be asked about their experiences with APS and how the program has affected their lives. The questionnaire and interview may therefore bring to mind negative experiences. To lessen respondents’ discomfort, the instruments are carefully designed to ask only about client experiences with APS, not about their maltreatment. No questions will be asked about what happened in their lives to prompt APS intervention. Respondents will be informed, and reminded as needed, that their participation is entirely voluntary and that they may withdraw at any time. Additionally, for interviews, we will arrange for a caseworker to be available in person or by phone to provide assistance if a client becomes distressed while participating.
12. Estimates of Annualized Burden Hours and Costs
The annual burden estimates are shown below. The estimates for the client questionnaire and client data form are based on plans to invite 6,000 clients to complete the questionnaire.
| Respondent/Data Collection Activity | Respondent Type | No. of Respondents | Responses per Respondent | Hours per Response | Annual Burden Hours |
|---|---|---|---|---|---|
| Client Questionnaire | Individual | 6,000 | 1 | 0.167 | 1,002 |
| Client Data Form | Local Government | *6,000 | 1 | 0.167 | 1,002 |
| Client Interview | Individual | 24 | 1 | 0.75 | 18 |
| Caseworker Focus Group | Local Government | 84 | 1 | 1.5 | 126 |
| Leaders Interview | State and Local Government | 16 | 1 | 1 | 16 |
| TOTAL | | 12,124 | | 3.58 | 2,164 |
*It is likely that one APS caseworker will submit client forms for more than one client. This will depend on which clients respond to the questionnaire. While this number is likely to be smaller than 6,000, we are using 6,000 as the maximum possible number of caseworkers to participate in completing client forms.
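As a minimal sketch, the burden figures above can be reproduced directly from the table's own values; nothing outside the table is assumed.

```python
# Restating the burden arithmetic from the table above:
# annual burden hours = respondents x responses per respondent x hours per response.
rows = {
    "Client Questionnaire":   (6_000, 1, 0.167),
    "Client Data Form":       (6_000, 1, 0.167),
    "Client Interview":       (24,    1, 0.75),
    "Caseworker Focus Group": (84,    1, 1.5),
    "Leaders Interview":      (16,    1, 1.0),
}
total = 0.0
for activity, (respondents, responses, hours) in rows.items():
    burden = respondents * responses * hours
    total += burden
    print(f"{activity}: {burden:,.0f} hours")
print(f"TOTAL: {total:,.0f} hours")  # 2,164 hours
```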
The annualized cost estimates associated with respondents’ participation are shown below. These estimates are based on an average hourly salary of $7.25 for APS clients, an estimated average across a group that will include many retirees, some part-time workers, and some full-time workers across a variety of fields. For APS caseworkers and APS program leaders, we are using the average hourly rates for Social Workers and Social and Community Service Managers, respectively. Across all respondents, the total estimated burden is 2,164 hours, or $36,721.64.
| Data Collection Activity (Respondent Type) | No. of Resp. | Responses per Resp. | Hours per Response | Total Hour Burden | Hourly Wage Cost | Total Hour Cost |
|---|---|---|---|---|---|---|
| Client Questionnaire (APS clients) | 6,000 | 1 | 0.167 | 1,002 | $7.25 (federal minimum wage) | $7,264.50 |
| Client Data Form (APS caseworkers) | *6,000 | 1 | 0.167 | 1,002 | $25.51 (BLS occupational data, Social Workers2) | $25,561.02 |
| Client Interview (APS clients) | 24 | 1 | 0.75 | 18 | $7.25 (federal minimum wage) | $130.50 |
| Caseworker Focus Group (APS caseworkers) | 84 | 1 | 1.5 | 126 | $25.51 (BLS occupational data, Social Workers) | $3,214.26 |
| Leader Interview (APS leaders) | 16 | 1 | 1 | 16 | $34.46 (BLS occupational data, Social and Community Service Managers3) | $551.36 |
| TOTAL | 12,124 | | | 2,164 | | $36,721.64 |
*It is likely that one APS caseworker will submit client forms for more than one client. This will depend on which clients respond to the questionnaire. While this number is likely to be smaller than 6,000, we are using 6,000 as the maximum possible number of caseworkers to participate in completing client forms.
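Similarly, the cost column follows directly from the burden hours and hourly wages in the table; a minimal sketch using only the table's own figures:

```python
# Restating the cost arithmetic from the table above:
# total hour cost = total hour burden x hourly wage.
costs = {
    "Client Questionnaire":   (1_002, 7.25),
    "Client Data Form":       (1_002, 25.51),
    "Client Interview":       (18,    7.25),
    "Caseworker Focus Group": (126,   25.51),
    "Leader Interview":       (16,    34.46),
}
total = 0.0
for activity, (hours, wage) in costs.items():
    cost = hours * wage
    total += cost
    print(f"{activity}: ${cost:,.2f}")
print(f"TOTAL: ${total:,.2f}")  # $36,721.64
```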
13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers
The collection of this information does not involve any capital or start-up costs, nor any operation and maintenance costs. APS leaders and caseworkers in each county APS program enrolled in the study will participate in a training prior to starting data collection. They will also help coordinate site visit logistics and recruit participants for focus groups and interviews. The size of APS programs will vary by county; for the purpose of these estimates, we assume an average of five APS caseworkers per county. The total estimated cost burden for these activities is $17,352.06. Details of the cost estimate are shown in the table below.
| Activity (Respondent Type) | No. of Staff | Hours per Staff | Total Hour Burden | Hourly Wage Cost | Total Hour Cost |
|---|---|---|---|---|---|
| Data Collection Training (APS leaders) | 27 | 2 | 54 | $34.46 (BLS occupational data, Social and Community Service Managers) | $1,860.84 |
| Data Collection Training (APS caseworkers) | 135 | 2 | 270 | $25.51 (BLS occupational data, Social Workers) | $6,887.70 |
| Site Visit Logistics and Recruitment (APS leaders) | 12 | 6 | 72 | $34.46 (BLS occupational data, Social and Community Service Managers) | $2,481.12 |
| Site Visit Logistics and Recruitment (APS caseworkers) | 60 | 4 | 240 | $25.51 (BLS occupational data, Social Workers) | $6,122.40 |
| TOTAL | 234 | | 636 | | $17,352.06 |
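As with the Section 12 tables, the arithmetic above can be restated in a minimal sketch drawing only on the table's own figures:

```python
# Restating the arithmetic from the table above:
# total hour burden = staff x hours per staff; cost = hours x hourly wage.
activities = {
    "Training (leaders)":      (27,  2, 34.46),
    "Training (caseworkers)":  (135, 2, 25.51),
    "Logistics (leaders)":     (12,  6, 34.46),
    "Logistics (caseworkers)": (60,  4, 25.51),
}
total_cost = 0.0
for activity, (staff, hours_each, wage) in activities.items():
    hours = staff * hours_each
    cost = hours * wage
    total_cost += cost
    print(f"{activity}: {hours} hours, ${cost:,.2f}")
print(f"TOTAL: ${total_cost:,.2f}")  # $17,352.06
```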
14. Annualized Cost to the Federal Government
The estimated annual cost to the federal government for the three-year study period is shown in the table below.
| Type | Year 1 | Year 2 | Year 3 | Total Cost |
|---|---|---|---|---|
| Respondents | $0 | $27,036.85 | $27,036.85 | $54,073.70 |
| ACL Staff | $9,452.00 | $9,452.00 | $9,452.00 | $28,356.00 |
| Contractor Staff | $278,000.00 | $400,000.00 | $250,000.00 | $928,000.00 |
| TOTAL | $287,452.00 | $436,488.85 | $286,488.85 | $1,010,429.70 |
The above estimated costs are based on the following:
Total estimated respondent burden cost of $54,073.70 calculated in Section 12 and Section 13 above, spread evenly across year two and year three of the study when data collection will occur.
ACL oversight at a 10 percent full-time equivalent level for one staff member earning a Grade 12, Step 5 annual salary for the locality pay area of Washington-Baltimore-Arlington, DC-MD-VA-WV-PA.
The use of contractor staff to design the collection of information and to compile, process, and analyze the primary data to be collected; this estimate does not include contractor costs for other components of the overall study (e.g., secondary data analysis of NAMRS).
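As a consistency check on the table above, each year's total is the sum of the respondent, ACL staff, and contractor rows; the minimal sketch below uses only the figures already shown:

```python
# Each year's total is the sum of the respondent, ACL staff, and
# contractor rows; figures are taken directly from the table above.
years = {
    "Year 1": (0.00,      9_452.00, 278_000.00),
    "Year 2": (27_036.85, 9_452.00, 400_000.00),
    "Year 3": (27_036.85, 9_452.00, 250_000.00),
}
grand_total = 0.0
for year, components in years.items():
    subtotal = sum(components)
    grand_total += subtotal
    print(f"{year}: ${subtotal:,.2f}")
print(f"TOTAL: ${grand_total:,.2f}")  # $1,010,429.70
```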
15. Explanation for Program Changes or Adjustments
This is a new information collection; there is a program change increase of 2,164 annual burden hours. There are no changes or adjustments that materially affect the collection of information or alter the burden estimates from what was described in the 60-day Federal Register notice.
16. Plans for Tabulation and Publication
The study will follow a stepped, data analysis plan that includes a range of statistical techniques of varying complexity. We will use univariate statistics (e.g., counts, percentages) to summarize the data, bivariate statistics (e.g., correlation, tests of group differences) to examine the relationship between variables and differences between subgroups of interest, and multivariate statistics (e.g., mixed effects multilevel regression models) to examine independent associations between variables and the outcomes of interest. See Supporting Statement Part B for a full explanation of the statistical techniques that will be used.
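For illustration only, the sketch below shows what the multivariate step might look like in Python using statsmodels, with a random intercept for county to reflect clients nested within county APS programs. The file name and variable names ("satisfaction" as outcome, "age" and "engagement" as covariates, "county" as the grouping variable) are hypothetical placeholders, not the actual study measures; Supporting Statement Part B remains the authoritative description of the analysis.

```python
# Hypothetical column names -- placeholders, not the actual study measures.
import pandas as pd
import statsmodels.formula.api as smf

linked = pd.read_csv("linked_records.csv")

# Mixed-effects model with a random intercept for county, reflecting
# clients nested within county APS programs.
model = smf.mixedlm("satisfaction ~ age + engagement",
                    data=linked, groups=linked["county"])
result = model.fit()
print(result.summary())
```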
The reporting and dissemination strategy will likely include several approaches to sharing final results for the APS client outcomes study. Each of the approaches described below is designed to inform various audiences in the field of APS about the study through tailored communication channels and formats. The target audiences for the reporting and dissemination strategy include APS researchers; practitioners; federal, state, and county authorities; and policy/decision-makers.
Final Report. The final report for the APS client outcomes study will provide a comprehensive description of the study, including the following sections: introduction, background, conceptual framework, methods, sample, data, analysis, results, discussion, and implications sections. The final report will be shared externally at ACL’s discretion (e.g., posted to the ACL.gov website for public access).
Manuscript for Journal Publication. We may adapt the final report into a manuscript for submission to an academic journal that publishes on topics related to APS. Journals that we may submit to include the Journal of Elder Abuse and Neglect, The Gerontologist, Journal of Interpersonal Violence, Journal of Gerontological Social Work, and the Journal of Aging and Health. This method of dissemination would add the study to the formal literature on APS and reach an interested research audience.
Research Briefs. We may also adapt the final report into a short research brief. This research brief could follow a similar format and style to ACL research briefs on the performance of Older Americans Act programs (https://acl.gov/programs/performance-older-americans-act-programs). This method of dissemination would add the study to the gray literature on APS and reach a broad range of APS audiences.
Conference/Webinar Presentations. We may adapt the final report into an oral presentation with slide deck to be delivered via in-person conference and/or webinar. These methods of dissemination would provide an opportunity for the study team to directly discuss questions, feedback, and other input about the APS client outcomes study with a broad range of APS audiences. Webinar presentation(s) could follow a similar format and style to the ACL webinar on the Nutrition Services Program Outcomes Evaluation (https://acl.gov/news-and-events/announcements/webinar-1030-nutrition-services-program-outcomes-evaluation).
The priority choice for conference presentation would be the Annual NAPSA Conference, but the presentations could also be tailored to other audiences that intersect with APS programs, including conferences on abuse and neglect, services to older adults, and public health, among others.
Infographics. Finally, we may adapt the findings into infographics to visually represent data, key findings, and implications. Using this format would enable us to present the information clearly and make it more appealing visually for key stakeholders, including the public, legislators, and program leaders.
17. Project Time Schedule
We will perform the major study activities described below according to the given timeframes. It is important to note that some activities will overlap, and that start and end points may shift based on the status of other steps.
| Project Activities | Timeline |
|---|---|
| OMB/PRA Application, Review, and Approval | June 2019 – February 2020 |
| Analysis of Quantitative Data4 | October 2019 – January 2021 |
| APS Program Recruitment and Enrollment | February 2020 – April 2020 |
| Data Collection (Client Questionnaire, Client Data Form, Site Visits) | April 2020 – November 2020 |
| Analysis of Qualitative Data | May 2020 – January 2021 |
| Reporting and Dissemination5 | November 2020 – September 2021 |
18. Reason(s) not to Display Expiration Date for OMB Approval
The expiration date for OMB approval will be displayed on all data collection instruments.
1 NAMRS is the national reporting system for state APS data. It is a voluntary surveillance system, but 55 out of 56 states and territories provide some level of data to NAMRS. The dataset is relatively new, with three fiscal years of data collected (FFY16-FFY18). NAMRS was developed with funding and support from ACL.
4 Secondary analysis of NAMRS does not require OMB/PRA approval and will begin immediately. Quantitative analysis of primary data collected using the client questionnaire and client data form will not begin until the end of primary data collection in November 2020.
5 Reporting and dissemination includes submitting a journal manuscript for publication (expected July 2021), presenting findings at one or more professional conferences (expected August 2021), and completing the final report and other products (expected September 2021).