Administration for Community Living
Fidelity Evaluation of ACL’s Evidence-Based Programs
OMB Supporting Statement – Part A
November 2021
A.1. Circumstances that Make the Collection of Information Necessary
A.2. Purpose and Use of Information Collection
A.3. Use of Improved Information Technology and Burden Reduction
A.4. Efforts to Identify Duplication
A.5. Impact on Small Businesses or Other Small Entities
A.6. Consequences if Information Not Collected or Collected Less Frequently
A.7. Consistency with Guidelines of 5 CFR 1320(d)
A.8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency
A.9. Explanation of any Payment or Gift to Respondents
A.10. Assurance of Confidentiality Provided to Respondents
A.11. Justification for Sensitive Questions
A.12. Estimates of Annualized Hour Burden and Costs
A.13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers
A.14. Estimates of Annualized Costs to the Federal Government
A.15. Explanation for Program Changes or Adjustments
A.16. Plans for Tabulation and Publication and Project Time Schedule
A.17. Exception for Display of Expiration Date
The Administration for Community Living (ACL), within the U.S. Department of Health and Human Services (DHHS), requests approval from the Office of Management and Budget (OMB) for data collection to support the execution of the Fidelity Evaluation of ACL’s Evidence-Based Programs. ACL’s mission is to maximize the independence, well-being, and health of older adults and people with disabilities across their lifespan. ACL provides mandatory funding to 50 states and 6 U.S. territories (the District of Columbia, Puerto Rico, American Samoa, Guam, the Northern Mariana Islands, and the Virgin Islands) under Title III-D of the Older Americans Act (OAA) (i.e., mandatory grantees), and also awards grants on a competitive basis to applicants who seek to promote the well-being of target populations through disease prevention and health promotion (i.e., discretionary grantees).1 Since 2012, appropriations law has mandated that Title III-D funding be limited to evidence-based programs.2
The authorizing legislation for this data collection is found in Title II of the OAA, which requires ACL to “measure and evaluate the impact of all programs authorized by this Act…”3
ACL has worked with stakeholders to develop processes for evaluating the strength of the evidence supporting specific programs, as well as a variety of resources to educate prospective grantees about the importance of maintaining fidelity to program models.4 ACL provides technical assistance to support evidence-based programming, which includes access to a variety of tools for planning fidelity. ACL, however, has not previously studied how and how well grantees use these and other resources to ensure that programs are implemented with fidelity. The primary goals of ACL’s Fidelity Evaluation are to understand:
The process ACL staff use when making awards (i.e., discretionary grants) or otherwise verifying that grantees (e.g., mandatory grantees) are using funds for evidence-based programming;
How grantees select the evidence-based programs that they implement or, in the case of Chronic Disease Self-Management Education (CDSME) grantees, how they determine the grants for which they will apply;
How grantees implement those programs and verify that they are being implemented with fidelity to their original models, including the identification of intentional and unintentional adaptations;
The strengths and weaknesses of the current award and implementation process; and
How and for what reasons grantees and subgrantees work with the developers of the evidence-based programs, including how grantees work with program developers to identify acceptable adaptations that preserve the evidence-based integrity of the programs.
To understand these issues, data collection will be required from mandatory and discretionary grantees, and from the subgrantees that actually implement the programs. For mandatory grants, selection of programs and supervision of fidelity are handled primarily by the state units on aging (SUAs) for the 50 states and 6 U.S. territories. For discretionary grants, selection of programs and supervision of fidelity are handled by the organizations that apply for grants to implement specific programs. These applicants include a variety of organizations, such as local non-profits, Area Agencies on Aging, academic partners, and interest groups, some of which implement programs directly and others of which subcontract implementation to other organizations. A flexible, multi-layered approach is needed to obtain the information required to understand the community’s approach to supervising and maintaining fidelity. Two complementary surveys have been designed as the most parsimonious means of obtaining these data, which are not currently being collected in any other form.
The data collection tools developed pursuant to this contract will enable ACL to collect information from mandatory and discretionary grantees to better understand the procedures they use to ensure that grant-funded evidence-based programs are being implemented with fidelity to the program models. Specifically, the information will be used to answer the research questions provided below in Exhibit 1.
Exhibit 1: Key Research Questions
ACL intends for this new, one-time data collection effort to inform steps it can take to aid grantees and improve processes for the selection, implementation, and fidelity monitoring of evidence-based programs.
Surveys will be used to gather data from grantee program administrators and implementation organization program directors. The surveys (one for grantees and one for implementation organizations) will be administered electronically over the internet to minimize the burden on respondents. All respondent groups will be emailed a survey invitation with instructions on web survey access, including a link to the appropriate survey and login information unique to the respondent. The unique login will allow tracking of survey completions to inform follow-up efforts. The web-based format will permit respondents to complete the survey in a single session or to save their responses and resume work at a later time. The web-based surveys will incorporate skip patterns so that sections not relevant to a specific respondent’s experiences are bypassed automatically. Respondents who have not yet completed the appropriate survey will be sent reminders via email to maximize the survey response rate.
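To illustrate how unique login links and completion tracking could support the reminder process described above, a minimal sketch follows. It is purely illustrative and hypothetical: this statement does not name a survey platform, and the URL, file names, and field names below are placeholders rather than part of the planned collection.

```python
# Hypothetical illustration of generating unique survey links and tracking
# completion status to drive email reminders. The URL, file layout, and field
# names are placeholders; the actual survey platform is not specified here.
import csv
import secrets

BASE_URL = "https://example.org/fidelity-survey"  # placeholder survey URL


def build_invitations(contacts_csv: str, invitations_csv: str) -> None:
    """Assign each respondent a unique token so completions can be tracked."""
    with open(contacts_csv, newline="") as src, open(invitations_csv, "w", newline="") as dst:
        reader = csv.DictReader(src)  # expects an "email" column (hypothetical)
        writer = csv.DictWriter(dst, fieldnames=["email", "survey_link", "completed"])
        writer.writeheader()
        for row in reader:
            token = secrets.token_urlsafe(16)  # unique login credential
            writer.writerow(
                {
                    "email": row["email"],
                    "survey_link": f"{BASE_URL}?token={token}",
                    "completed": "no",
                }
            )


def reminder_list(invitations_csv: str) -> list[str]:
    """Return email addresses of respondents who have not yet completed the survey."""
    with open(invitations_csv, newline="") as f:
        return [row["email"] for row in csv.DictReader(f) if row["completed"] == "no"]
```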
The information sought as part of this study is unique and has not been collected elsewhere in any format that could be adapted to address the research objectives of the evaluation. Based on our review of existing data sources, no survey or other mode of data collection has captured the needed information on the processes used by grantees and their contracted implementation organizations to ensure fidelity in their evidence-based programming. Any existing information that helps address the research questions will, however, be used wherever possible.
The target populations for this effort, particularly the discretionary grantees and the organizations that implement programming on their behalf, are likely to include small businesses/non-profit entities. ACL is committed to minimizing the reporting burden for all respondents to the extent possible. The surveys will be administered electronically over the internet, with no special knowledge or software required aside from an up-to-date web browser. Both surveys are designed to be easily understandable by mandatory and discretionary grantee administrators and program implementers. To reduce the burden of completing the surveys, simple response scales (e.g., yes/no or 5-point scales) have been selected for most questions. No questions in the surveys require arithmetic calculations or ask for exact numbers. Open-text responses have been held to a minimum; only one item expects an open-text response. All information requested should be readily available in the survey respondents’ business records or within their direct knowledge. The grantee survey should take no more than 40 minutes to complete under normal circumstances, while the implementation organization survey should take no more than 35 minutes to complete. Additionally, the surveys have been reviewed by subject matter experts to ensure that all components are necessary to the research purpose.
If the data collection is not conducted, ACL will not obtain the direct information necessary to understand and improve current processes for maintaining fidelity to evidence-based program models in the programs it funds. Data for this Fidelity Evaluation will be collected only once, and there are no technical or legal obstacles to conducting the surveys in the manner planned to minimize burden.
This data collection request is fully consistent with the guidelines in 5 CFR 1320.8(d). No special circumstances apply to this collection of information.
In accordance with the Paperwork Reduction Act (PRA) of 1995, the notice required in 5 CFR 1320.8(d) has been published in the Federal Register announcing ACL’s intention to request an OMB review of data collection activities. This notice was published on July 12, 2021, in volume 86, number 13720, on pages 36558-36559 and provided a 60-day period for public comment. ACL did not receive any comments during the 60-day public comment period for the Fidelity Evaluation PRA notice.
In accordance with the Paperwork Reduction Act (PRA) of 1995, a 30-day notice announcing ACL’s intention to request an OMB review of data collection activities was published in the Federal Register on November 16, 2021, in volume 86, number 24923, on page 63401.
The surveys were developed by ACL and its contractor, Health Services Advisory Group, Inc. (HSAG). External input and feedback on the surveys were sought from Kathleen Zuke and Jennifer Tripken, National Council on Aging (NCOA) Center for Healthy Aging, and Lesley Steinman, University of Washington, Member of the Board of Directors, Evidence Based Leadership Collaborative.
No respondent group will be offered any payments or gifts for their participation in the study.
No data will be sought from individuals participating in programs, and the survey questions do not request any data that could be personally identifiable. Although no assurance of confidentiality is provided to survey respondents, ACL is committed to protecting their privacy as part of this data collection activity.
Respondents will be informed of the purposes for which the information is collected in a survey invitation email, which will inform them that their participation is strictly voluntary. The invitation email will provide a unique link and login information for each respondent to complete the web-based survey.
Special steps will be taken to ensure that data collected via the web surveys are secure. Access to the survey is only allowed with valid login information (i.e., username and password).
ACL will publish aggregate statistics summarizing the survey responses in a report. Individual respondents will not be identified in any report, publication, or presentation of this study or its results.
No questions of a sensitive nature will be asked in the surveys. Questions are restricted to grantees’ understanding of and experiences with evidence-based programs funded by ACL under the OAA.
The data collection tools/survey instruments discussed here will be fielded one time only to the appropriate grantee audience. Burdens, therefore, will be one-time only, not annual. Because the number of respondents is limited, the survey is relatively brief, and information requests will be limited to records and resources respondents are expected to have readily at hand as part of their business operations, the burden will be small. Below are estimates of the expected time to complete the survey.
Grantee Survey: SUAs for all 50 states and 6 U.S. territories and all 47 distinct discretionary grant recipients will be invited to complete a survey estimated to take at most 40 minutes.5 The survey includes 71 items, most of which are single-response or select-all-that-apply items. The survey includes skip patterns based on organizational characteristics and responses to gateway questions, meaning that many, if not most, respondents will complete fewer than 71 items. Because the time estimate was based on the time it would take to complete all items, it is likely an overestimate. No calculations or precise numeric responses are required. One item requests an open-text response that is not expected to exceed three sentences in length.
Implementation Organization Survey: We intend to sample 4 implementation organizations for each of the 56 SUAs and 47 discretionary grantees, so the theoretical maximum number of implementation organizations to be invited is 412. The actual number of implementation organizations that will be invited to participate in the survey may be smaller, since these organizations are subgrantees or contractors to the grantees, and sampling will depend on the cooperation of the grantees.
See Part B of this statement for details on the sampling methodology. The implementation organization survey is based on the grantee survey but is shorter at 65 items. Estimated completion time is at most 35 minutes.
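For clarity, the sample sizes cited above follow directly from the counts already given (56 SUAs, 47 distinct discretionary grantees, and up to 4 implementation organizations sampled per grantee):

\[
\begin{aligned}
\text{Grantee survey invitations} &= 56 + 47 = 103 \\
\text{Implementation organization invitations (maximum)} &= 4 \times 103 = 412
\end{aligned}
\]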
Exhibit 2 presents estimates of the total one-time reporting burden for respondents and Exhibit 3 presents estimates of total one-time reporting cost.
Exhibit 2: Estimated Reporting Burden for Respondents
| Respondent/Data Collection Activity | Number of Respondents | Responses Per Respondent | Hours Per Response | One-Time Burden (Hours) |
| Grantee: Program selection process and survey | 103 | 1 | 2.00 | 206 |
| Implementation Organization Survey | 412 | 1 | 0.58 | 239 |
| Total | 515 | -- | 0.86 | 445 |
Exhibit 3: Estimated Burden Cost
| Type of Respondent | Number of Respondents | Total Burden Hours | Average Hourly Wage Rate* | Total Cost |
| Grantee: Program selection process and survey | 103 | 206 | $25.09 | $5,168.54 |
| Implementation Organization Survey | 412 | 239 | $25.09 | $5,996.51 |
| Total | 515 | 445 | -- | $11,165.05 |
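The totals in Exhibits 2 and 3 are the product of the per-response burden hours and the hourly wage rate shown above:

\[
\begin{aligned}
\text{Grantee burden} &= 103 \times 2.00 = 206 \text{ hours}, & 206 \times \$25.09 &= \$5{,}168.54 \\
\text{Implementation burden} &= 412 \times 0.58 \approx 239 \text{ hours}, & 239 \times \$25.09 &= \$5{,}996.51 \\
\text{Total} &= 445 \text{ hours}, & \$5{,}168.54 + \$5{,}996.51 &= \$11{,}165.05
\end{aligned}
\]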
There are no annualized capital/startup or ongoing operation and maintenance costs involved in collecting the information. Other than the time required to complete the survey, which is estimated to be at most 40 minutes for the grantee survey and 35 minutes for the implementation organization survey, respondents incur no direct monetary costs.
The total estimated cost to the Federal Government for the Fidelity Evaluation of ACL’s Evidence-Based Programs data collection activities is $397,722.15 over a period of two years (September 2020 to September 2022). This is the Federal contract amount awarded to HSAG for the data collection and analysis activities associated with this submission. There are no additional costs to the Federal Government for Federal staff wages and benefits.
This is a new information collection request; it represents a program change increase of 445 annual burden hours.
A final Fidelity Evaluation report will be created to document the information collected, and provide clear, actionable recommendations for ensuring the effective use of evidence-based programming. Recommendations may address what ACL, its grantees, and subgrantees can do to improve the selection, implementation, and monitoring of evidence-based programming, and may include a tool for use by ACL and its grantees to assess and monitor fidelity after the contract ends. In addition to any such tools, the final report will include the following sections:
Executive Summary: The executive summary will be written in a manner that makes it useful as a stand-alone document for individuals who do not have time to review the entire report. It will highlight the objectives, key findings, and the implications of these findings on the implementation of evidence-based programs.
Methodology: This section will describe the methods used for developing, implementing, and analyzing the surveys.
Key Issues and Findings: This section will discuss findings for each of the key research questions.
Recommendations: Conclusions will include recommendations or suggestions for job aids, as well as for any identified future research and policy initiatives.
Analysis will begin shortly after the final data are collected, which is projected to be in January 2022. The contractor will analyze the data using thematic analysis for qualitative data and basic frequencies and cross-tabulations for quantitative data. Simple statistical tests (t-tests and chi-square tests) may be used to identify significant relationships between grantee types and their approaches or practices related to fidelity.
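To illustrate the kind of quantitative analysis described above, the following minimal sketch computes frequencies, a cross-tabulation with a chi-square test, and a t-test comparing grantee types. The contractor’s actual tooling is not specified in this statement, and the file and column names below are hypothetical placeholders rather than fields from the instruments.

```python
# Minimal, hypothetical sketch of the planned quantitative analysis:
# frequencies, cross-tabulations, and simple significance tests.
import pandas as pd
from scipy import stats

# Hypothetical export of survey responses; column names are placeholders.
responses = pd.read_csv("survey_responses.csv")

# Basic frequencies for a categorical item.
print(responses["uses_fidelity_checklist"].value_counts(normalize=True))

# Cross-tabulation of grantee type (mandatory vs. discretionary) against a
# practice item, with a chi-square test of independence.
crosstab = pd.crosstab(responses["grantee_type"], responses["uses_fidelity_checklist"])
chi2, p_value, dof, expected = stats.chi2_contingency(crosstab)
print(f"Chi-square = {chi2:.2f}, p = {p_value:.3f}")

# Welch's t-test comparing a continuous measure between the two grantee types.
mandatory = responses.loc[responses["grantee_type"] == "mandatory", "completion_minutes"]
discretionary = responses.loc[responses["grantee_type"] == "discretionary", "completion_minutes"]
t_stat, p_value = stats.ttest_ind(mandatory, discretionary, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```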
Exhibit 4: Timetable for Data Collection and Publication
| Activity | Estimated Completion Date |
| Develop Instruments for Data Collection | |
| Anticipated month during which survey instruments will be completed** | September 2021 |
| Obtain OMB approval for data collection, estimated | December 2021 |
| Implement Data Collection | |
| Anticipated month during which data collection will be completed | January 2022* |
| Conduct Data Analysis | |
| Conduct analysis of data collected via the survey instruments | March 2022* |
| Develop Report | |
| Produce the final report on the fidelity evaluation | September 2022 |
* Dates are contingent on the date of OMB approval.
** The survey instruments will be finalized after the public comment period.
All data collection materials will display the OMB expiration date.
ACL certifies that the collection of information encompassed by this request complies with 5 CFR 1320.9 and the related provisions of 5 CFR 1320.8(b)(3).
1 Section 361 (a) of the Older Americans Act (OAA) of 1965, as amended, states “The Assistant Secretary shall carry out a program for making grants to States under State plans approved under section 307 to provide evidence-based disease prevention and health promotion services and information at multipurpose senior centers, at congregate meal sites, through home delivered meals programs, or at other appropriate sites.”
2 The FY 2012 Congressional appropriations law specified that “Evidence-based health promotion programs, as defined in the Older Americans Act (Title I section 102 (14)(D)) includes programs related to the prevention and mitigation of the effects of chronic disease…, alcohol and substance abuse reduction, smoking cessation, weight loss and control, stress management, falls prevention, physical activity, and improved nutrition.”
3 Title II of the OAA requires ACL to “measure and evaluate the impact of all programs authorized by this Act, their effectiveness in achieving stated goals in general, and in relation to their cost, their impact on related programs, their effectiveness in targeting for services under this Act unserved older individuals with greatest economic need (including low-income minority individuals and older individuals residing in rural areas), and unserved older individuals with greatest social need (including low-income minority individuals and older individuals residing in rural areas), and their structure and mechanisms for delivery of services . . . .”
4 ACL’s Aging and Disability Evidence-Based Programs and Practices (ADEPP) Guide to Reviewing Evidence-Based Programs.
5 Discretionary grantees may hold two or more grant awards but will only be invited to respond once. SUAs are eligible for discretionary awards. In two instances, the SUA contact is also a discretionary grant contact, and those SUAs will receive only one survey each. The count of 47 distinct discretionary grantees disregards the number of awards held and excludes the two aforementioned SUAs.