Attachment 4e1. APPLETREE APR Guidance Document
CDC-RFA-TS-23-2001: ATSDR’s Partnership to Promote Local Efforts to Reduce Environmental Exposure (APPLETREE)
Annual Performance Report (APR) Guidance Document
Companion to ATSDR APR Template
Purpose of this Guidance
The APPLETREE Annual Performance Report (APR) is designed to:
Quantitatively and qualitatively assess each Cooperative Agreement Partner based on performance measures and evaluation projects applicable to all states, as well as those identified in states’ Evaluation and Performance Measurement Plans (EPMPs). Individual assessments will be used to:
Evaluate the impact the Cooperative Agreement Partner has on public health in its state.
Assess Cooperative Agreement Partner performance against the Annual Plan of Work.
Monitor Cooperative Agreement Partner progress in meeting objectives as listed in the Notice of Funding Opportunity.
Evaluate the APPLETREE program as a whole.
Create a better connection between the EPMP and the APR.
Streamline reporting to reduce the number of APRs, increase quality and efficiency in reporting, and increase the utility of the APR for Cooperative Agreement Partners and ATSDR.
General Directions
Each Cooperative Agreement Partner should complete its Continuation Application annually per CDC instructions, which it will receive in the fall of each year. CDC requires grantees to submit their Annual Performance Report (APR) no later than 120 days prior to the end of the budget period; CDC will provide the exact deadline, reporting period, and instructions as that deadline approaches. The APR requires a packet of contents, including a project narrative on budget period progress. This document and the accompanying template serve as guidance for completing the project narrative. You must report on all performance measures and evaluation results outlined in your EPMP, describe challenges, and update your work plan for each budget period. The following pages provide more detailed directions on each section of the template.
Keep an eye out for this symbol. There are specific, supplemental reporting requirements for Component 1, Strategy B (Choose Safe Places for Early Care and Education (CSPECE)). To avoid duplication in your APR, we’ve pointed out areas where you may consider writing “See CSPECE Reporting Supplement” rather than reporting that information in two places.
BEGINNING OF INSTRUCTIONS FOR TEMPLATE
Executive Summary
Directions: Provide a high-level summary of the report. This may include:
Purpose/goal of your program
Brief summary of activities
Summary of project period performance, accomplishments, evaluation findings
Summary of lessons learned and next steps
Performance Measures
Directions:
Tables 1-3 have been provided in the APR Template to report performance measures.
List performance measures. A list of NOFO performance measures is already provided in the accompanying template’s table. Update measures, if needed.
For each performance measure, provide quantitative and/or qualitative results for each performance period. At a minimum, report each performance measure as an aggregate quantitative result for the performance period, when applicable (the total of all units completed, as opposed to site-specific results). Some performance measures are more qualitative in nature, such as the completion of an activity, so a quantitative result may not apply. In addition, you can report any other supporting results that reflect your performance for the given measure during the performance period (e.g., any related sub-measure(s); narrative describing overall performance and achievements; narrative describing performance on significant sites; barriers and/or facilitators to meeting targets).
If no data exist, indicate “no data exist” along with an explanation (e.g., a description of the barriers to implementing the activity, or a note that the outcome is longer term and has not yet occurred in the reporting period).
Wherever possible, contextualize the data provided with the target for the reporting period, and justify cases in which no target exists (a target is a goal for your state’s performance). Targets can be copied and pasted from your EPMP for easy transfer.
States can also provide their own state-specific performance measures not outlined by ATSDR, as long as they reflect state performance. A table has been provided for this purpose in the accompanying template.
Component 1, Strategy B (CSPECE) measures do not need to be duplicated and can be attached as appendices in the supplemental reporting. The template table already references the Excel spreadsheet to simplify reporting and avoid duplication.
Evaluation
Directions:
Present results for evaluation projects as specified in your state’s EPMP. It may be helpful to individually copy and paste each evaluation project table from your EPMP, as shown below.
Update your EPMP to reflect what actually occurred during the report period. For instance, if you had to change your data collection method from what was originally planned, indicate the data collection method that actually occurred during the reporting period.
Although your EPMP may include CSPECE, the preference is to put CSPECE information in your CSPECE Qualitative Narrative Reporting Form (there is an evaluation section in the CSPECE narrative questions; this is the best place to put CSPECE evaluation results).
Table 4. Evaluation Design and Data Collection:
Component: Click here to enter text
Strategy: Click here to enter text
Activity (What activity is in the logic model?): Click here to enter text
Describe the activity in less than 200 words: Click here to enter text
Evaluation Question (What do we want to know?):
Question: Click to enter text
Select one: ☐ Process question ☐ Outcome question
Indicator (How can we measure the answer?): Click to enter text
Data Source(s) (Where do we get the data?): Click to enter text
Data Collection & Analysis Methods (How do we get the data?):
Select at least one: ☐ Data already exist ☐ Collecting new data
If collecting new data: ☐ Survey ☐ Interview or focus group ☐ Observation ☐ Other_____________
When will data be collected: ☐ Baseline/follow-up format ☐ Retrospective post ☐ Pre/mid/post ☐ Other, specify:______
How are you analyzing data: Click to enter text
Add additional rows as necessary.
For each evaluation question/outcome/activity your state specified in its EPMP, we suggest completing the following statements:
Description of significant results for the reporting period. Provide results for the measures/indicators specified in the table. Aggregate results are helpful for understanding overall performance for the reporting period. Additionally, any site-, project-, or initiative-specific results considered significant can be included here. This should not be a restatement of every single site-specific entry provided throughout the year in SIA, HEAT, or TA; we already have this information. We want you to present what you consider the best reflection of your state’s performance and the results that best answer each evaluation question/outcome/activity in your EPMP. As noted in the EPMP, performance measures may contribute to your evaluation projects, so you are welcome to reference results you have already cited earlier in your APR rather than rewriting them.
Changes to Evaluation Plan: Provide an explanation for any changes in the measure, data collection and analysis methods, or data source. If there were no changes, indicate “no change.”
Barriers during reporting period. Describe any significant programmatic or evaluation barriers that made it challenging to complete activities or evaluate your program.
Facilitators during reporting period. Describe any significant programmatic or evaluation facilitators that helped you complete activities or evaluate your program.
Conclusions. This section goes a step beyond providing results and gives an interpretation of those results. Explain what your results mean with the goals of your program and the limitations of the evaluation in mind. Think about the takeaway(s) from your results to highlight: the key successes and key areas for improvement. Some questions to consider when interpreting your findings are listed at the end of this guidance. Highlight takeaways/lessons learned and action steps to apply those lessons learned.
*Copy Table 4 and the bullets above to report on each component/strategy as necessary.
(Optional) Successes:
Include any additional successes in completing activities outlined in the work plan that reflect your state’s annual performance and are not already captured through performance measure or evaluation reporting.
Include success stories for the reporting period that best highlight the impact of your program. One success story per quarter must be uploaded to SharePoint; copying success stories into your APR is optional. Success stories can convey personal, community, and far-reaching impacts of your program in story form, providing additional context to supplement other results and round out your performance report.
CSPECE success stories should be included in your CSPECE Qualitative Narrative Reporting Form.
Work Plan
Awardees must provide an updated work plan each budget period to reflect any changes in project period outcomes, activities, timeline, etc. Work plans may be included in the text or in an appendix depending on the most suitable format.
Challenges: Describe any challenges you have not yet mentioned that may affect your program’s ability to achieve annual and project period outcomes, conduct performance measures, or complete activities in the work plan.
CDC Program Support to Awardee:
Describe how CDC could help your program overcome challenges to achieving annual and project period outcomes, conducting performance measures, and completing activities outlined in the work plan.
Appendix: Additional Questions for Conclusion on Outcomes/Progress
Below are some example questions to think about when answering the prompt “Conclusion on outcomes/progress.” You do not need to answer all of these questions. These are simply to help you brainstorm and think about how to interpret your results.
What are your key conclusions and justifications from these results?
Did you achieve your goals/outcomes? Think about why or why not.
Are there any connections, links, trends, patterns?
Did some sites demonstrate more “success” than others? What might have contributed to this?
What were the high performing areas or successes? Why did these successes occur? What could have potentially contributed to the success?
Is there a promising best practice to scale up?
What were the areas for program improvement? What are some possible reasons for lower scores?
What could be modified to improve performance next year?
Are there any alternative explanations for the results?
Are your results as expected? If not, why do you think that was?
What are the limitations in the data that need to be addressed?
Do you plan to update anything on your EPMP based on these findings? For example, are there any additional measures to add or measures to modify based on the present results?
What are your recommendations based on these findings? What specific recommendations do you have for next steps, based on results?