Instructions for the Final Descriptive Evaluation Report for Healthy Marriage and Responsible Fatherhood Grantees
The Administration for Children and Families (ACF), Office of Family Assistance (OFA) requires all Healthy Marriage and Responsible Fatherhood (HMRF) grantees with local descriptive evaluations funded by OFA to submit a final evaluation report. The purpose of this document is to provide guidance on structuring your final report so it is comprehensive and accessible. Many of the report sections draw directly from the descriptive analysis plan. When feasible, the instructions below reference the relevant sections in the analysis plan template. You can use text from your plan, updated as appropriate, to simplify your report writing. Below, we provide an annotated outline that contains guidance for each section. This outline describes: (1) the purpose of the section and what should be discussed, (2) things to keep in mind when writing the section, (3) existing documents that may be potential sources for the section, and (4) whether non-text elements (tables or figures) could be useful.
You should use the accompanying template (FaMLE Descriptive Report Template.docx) to write your report; doing so will make it easier for ACF and your Evaluation Team Technical Assistance (ETTA) liaison to review the reports. Alternatively, you can provide the specified information in a separate document, as long as that document includes all sections of the template. The template outlines the report and gives you space to fill in each section. A separate file (FaMLE Descriptive Report Table Shells.docx) provides required and optional table shells for you to paste into the outline file as you write the report. Using these shells will help you finish the report faster and ensure you include the key details readers need to understand your findings.
Here are some additional suggestions for your final descriptive report:
Organize the final report so it is approximately 30 to 40 pages long, double-spaced (or if you prefer, 15 to 20 pages single-spaced), excluding tables, figures, references, and appendices. Put additional technical details in the appendices.
The report should be written for all audiences. Write as if the audience has not been involved in the grant and knows nothing about the intervention or the evaluation. The report should give the reader enough detail to understand the intervention and its evaluation, and it should be free of project- or intervention-specific jargon and abbreviations.
Reach out to your ETTA liaison with questions about the report guidance or your approach as you begin to work on your report. They are available and eager to address your questions quickly. Getting questions resolved early in the process will simplify the review process at the end of the grant period.
Please submit a report that you believe is ready for review by your ETTA liaison and OFA. Ideally, it will have been edited and read by at least a few other people before you submit it; this should minimize the number of editorial comments your family program specialist (FPS) and ETTA liaison will need to provide. Their goal is to focus on content and technical details rather than presentation.
Use caution with causal language: Please note that the findings presented in this report are not evidence of a causal relationship between the intervention (or program) and outcomes. For example, please do not make statements such as, “The intervention led to/increased/decreased an outcome.”
Please email your final report to your FPS and ETTA liaison whenever it is ready, but no later than [due date]. For consistency, please use this common naming convention when submitting your report: [short version of Grantee name] Descriptive Report_[MM-DD-YYYY report draft date]. Please send a Word version of the document, not a PDF. Your FPS and ETTA liaison will review the final report, provide comments and suggested edits, and return it to you for revisions. Your final report must be approved by your FPS by the end of your grant.
The report’s cover page should include the title of the report, grantee name, date, all authors, and author affiliation(s).
On page ii, specify the recommended citation for your report, list any acknowledgements, and disclose any conflicts of interest—financial or otherwise. For an example of how to identify a conflict of interest, please see the International Committee of Medical Journal Editors. Please note: if the evaluation team is not completely independent from the intervention team (that is, if they are not different organizations with completely separate leadership and oversight), this is a conflict of interest that must be documented.
Finally, include the attribution to ACF:
This publication was prepared under Grant Number [Insert Grant Number] from the Office of Family Assistance (OFA) within the Administration for Children and Families (ACF), U.S. Department of Health & Human Services (HHS). The views expressed in this report are those of the authors and do not necessarily represent the policies of HHS, ACF, or OFA.
Purpose. Provide a high-level summary of the final report.
Instructions and reminders. In no more than one page, summarize (1) the objectives and key services offered by the program; (2) the main focus of the descriptive study; (3) the number of sites that participated in the study; (4) the number of clients enrolled and the number with complete data for the outcomes study (if applicable); (5) key types of data and size of samples for the implementation study (if applicable); and (6) key conclusions, lessons learned from this work, and notable limitations.
Potential sources. Descriptive analysis plan
Non-text elements. None
Purpose. Orient the reader to the study.
Instructions and reminders. In this section, (1) explain the motivation for this intervention and why responsible fatherhood and/or marriage and relationship education are important for the local community under study; (2) briefly describe the program being studied; (3) explain the motivation for conducting this descriptive evaluation; and (4) if applicable, briefly summarize previous research describing aspects or effects of the intervention, and explain how the current study compares to earlier research and adds to the knowledge base about these kinds of interventions (you will add details on your intervention later). The reader should understand why the intervention focused on the population under study and the motivation for selecting the chosen intervention.
At the end of the introduction, please add a paragraph outlining the structure for the remainder of the report.
Potential sources. Grant application
Non-text elements. None
Purpose. Summarize the intervention being studied.
Instructions and reminders. This section should describe the intervention condition as it was intended to operate (that is, what intervention participants were supposed to receive). You can draw from the description already detailed in your descriptive evaluation analysis plan. Discuss (1) intervention components; (2) intervention content; (3) intended implementation and staffing, including the location or setting, duration, dosage, and the education and training of staff; and (4) the population the intervention focuses on.
Section II will discuss the intervention services participants actually received in this study.
Optional: If you had a graphical representation of your logic model in the descriptive analysis plan, please include it in the appendix (see Appendix A in the table shells document) of your report, and briefly discuss it here.
Potential sources. Section A of your descriptive analysis plan
Non-text elements. It is often useful to summarize key information about the intervention in a table (with, for example, rows for the focus population, intervention components, intervention content, planned mode of delivery, and so on).
Please refer to Tables I.1 and I.2 in the table shells document to summarize intervention components in a table. These are the same as Tables 1 and 2 in your descriptive analysis plan.
The rest of this document gives you one option for organizing the study methods, data, and findings. If it makes more sense to structure the report differently for your study, please discuss this with your ETTA liaison. For example, if you conducted only an outcomes study or only a process/implementation analysis, you would omit whichever of Section II (process/implementation) or Section III (outcomes) does not apply. As another example, if the analysis methods were notably different from one research question to another, one possibility is to organize the report by research question (present a research question, discuss the data and analysis approach used to answer it, discuss the findings, and then move to the next research question).
Introduce the process/implementation study and summarize the structure of this section (that is, give the reader a road map). If you also did an outcomes study, state here that the findings discussed in this section will provide important context for the outcomes study findings in Section III.
Purpose. Outline the main hypotheses of your process/implementation analysis.
Instructions and reminders. This section should briefly describe what you wanted to know about the implementation of the intervention (for example, aspects of engagement, dosage, or fidelity to the model). In other words, this section should present the research questions you addressed in your process/implementation analysis (the same questions outlined in your descriptive analysis plan).
Potential sources. Descriptive analysis plan, Section C.1 and Table 5
Non-text elements. You can include Table 5 from your analysis plan to list the research questions.
Purpose. Describe the study eligibility criteria and the sample used to conduct the process/implementation study. You might need to describe the different samples you used for different research questions (if applicable).
Instructions and reminders. Describe how members of the focus population became part of the process/implementation study, including the eligibility criteria, purposeful sampling, and consent process for study enrollment. Include descriptive statistics (demographic characteristics and other key details) of your process/implementation study sample. (Summarize characteristics in Table II.1).
Indicate that details of how the outcomes study sample was formed (if applicable) will be in Section III.B.1. If you are conducting an outcomes study, describe whether and how sample formation for the process/implementation study differed from how the sample was formed in the outcomes study. For example, did you sample a subset of individuals included in the outcomes study? Did you include everyone enrolled in the outcomes study who completed a baseline survey?
Potential sources. Descriptive analysis plan, Section C.2
Non-text elements. Summarize characteristics of your process/implementation study sample(s) in Table II.1
2. Data Collection
Purpose. Document the data sources that were used to conduct the process/implementation analysis. You might need to describe different data sources that were used for different research questions.
Instructions and reminders. Describe the data collected for the process/implementation study. Discuss the data sources for each aspect of process or implementation being examined. What data were collected? Who was responsible for collecting the data? This information can come directly from your descriptive analysis plan.
Potential sources. Descriptive analysis plan, Section C.3 and Table 6
Non-text elements. If you collected data from a variety of sources, a table can help organize what you present in this section. See Table II.2 in the table shells document (Table 6 in your descriptive analysis plan).
Purpose. Describe how the process/implementation outcomes were operationalized using various data elements.
Instructions and reminders. Describe how each process/implementation measure was constructed. Use a table to link the description of measures to the research questions.
Potential sources. Descriptive analysis plan, Section C.4 and Table 7
Non-text elements. To support the discussion above, summarize the measures and constructs corresponding to each research question in a table. See Table II.3 in the table shells document (Table 7 in your descriptive analysis plan).
Purpose. Describe the intervention as it was actually implemented and, if applicable, provide context for the outcomes study findings in Section III.
Instructions and reminders. This section is divided into subsections organized by research question. First, in an introductory paragraph at the beginning of Section C, briefly state the key findings or takeaway(s) across all of the process/implementation analyses.
Next, discuss the findings for each research question in turn. For each research question, first state the key finding(s) in the text box (Key findings) at the top of each subsection. The findings should be reported concisely and grounded in numeric results. (For example, “The intervention was implemented with fidelity, and achieved its goals for attendance. Staff delivered 95 percent of all intended intervention sessions, and 82 percent of the sample attended most intervention sessions.”) This section should also briefly describe the methods used to analyze the data. Details about the methods should be in Appendix B.
Use this section to tell a story about the intervention’s process/implementation that provides both context for the outcomes study (if applicable) and the key lessons learned from implementation.
Potential sources. Descriptive analysis plan, Section C.4
Non-text elements. If tables would help to summarize the findings, please add them.
Introduce the outcomes study and summarize the structure of this section (provide a road map).
Purpose. Articulate the main hypotheses examined in the outcomes study.
Instructions and reminders. This section should present the research question(s) for the outcomes study from your descriptive analysis plan. These questions typically focus on how participating in the intervention is associated with healthy relationship/marriage or responsible fatherhood outcomes. Each question should focus on a specific outcome and time point to help connect the outcome(s) and time point(s) to the intervention’s theory of change.
Potential sources. Descriptive analysis plan, Section B.1
Non-text elements. None
1. Sample Formation
Purpose. Describe the study’s eligibility criteria and the sample you used to conduct the outcomes study.
Instructions and reminders. Describe how members of the focus population became part of the outcomes study, including the eligibility criteria, purposeful sampling, and consent process for study enrollment.
This content can be pulled directly from your descriptive analysis plan.
Potential sources. Descriptive analysis plan, Section B.2
Non-text elements. None
Purpose. Indicate how and where you obtained data on outcomes of interest (as well as key explanatory variables).
Instructions and reminders. Describe the data collection you conducted, including sources, timing, mode of administration, and overall process. This information can be pulled from your descriptive analysis plan. You can include copies of your data collection instruments in Appendix F.
Report details about data cleaning in Appendix C.
Potential sources. Descriptive analysis plan, Section B.3 and Table 3
Non-text elements. A table can succinctly summarize the features of the data collection process.
See Table III.1 in the table shells document as an example. (This is the same as Table 3 in your descriptive analysis plan.)
Briefly introduce this section, indicating you will describe (1) the construction of the analytic sample you used for the outcomes analysis, (2) the outcome measures, and (3) the characteristics of the analytic sample.
Purpose. Describe the flow of participants into the analytic sample and how you operationalized outcomes of interest, and give the sample’s characteristics.
Instructions and reminders.
Clearly state how many clients (for example, individuals or couples) were in your analytic sample for each research question, and how the analytic sample was created. How did you define the sample? What data were required for an individual to be part of the sample? How did you handle missing baseline data? Discuss the approaches you used to minimize sample attrition, and describe why any attrition occurred. Document the overall attrition rate from the baseline sample (the study participants who completed the baseline data collection) to the final analytic sample used for the outcomes study; a brief worked example of this calculation follows these instructions.
Describe the outcomes examined for each research question, and briefly explain how you constructed each outcome measure. If you constructed a measure from multiple items, please document the source items and explain how the items were combined to create an outcome that was analyzed.
Describe the baseline characteristics of the analytic sample, and indicate which characteristics are included as covariates in the analysis.
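To illustrate the attrition calculation described above, here is a minimal sketch using hypothetical sample sizes (the counts and variable names are placeholders, not values from any actual study); the same arithmetic can be done by hand or in any statistical package.

```python
# Hypothetical sample sizes; replace with the counts from your own study.
baseline_n = 500   # participants who completed baseline data collection
analytic_n = 410   # participants in the final analytic sample

# Overall attrition rate: the share of the baseline sample not in the analytic sample.
attrition_rate = (baseline_n - analytic_n) / baseline_n
print(f"Overall attrition rate: {attrition_rate:.1%}")  # prints 18.0%
```

Report this rate, along with the sample sizes at each stage, in Table III.2 as described under Non-text elements below.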
Potential sources. Descriptive analysis plan, Sections B.5 and B.6 and Table 4.
Non-text elements.
Summarize the number of individuals (and couples, if applicable) from baseline through the analytic sample using Table III.2 in the table shells document.
Please add rows or columns as appropriate. For example, if you had two follow-up surveys, include rows for “Completed post-program survey (timing)” and “Completed follow-up survey (timing).” If your study did not have couples, delete the last column.
“Timing” refers to the amount of time that passed since baseline or enrollment into the intervention.
The last row of sample sizes should be your final analytic sample and match the sample size in Table III.3, described next.
In the last row of the table, include the overall attrition rate (depending on your study, this could be from baseline to follow-up or enrollment to follow-up).
Summarize baseline sample characteristics using Table III.3 in the table shells document. Examples are included for you in italics. In the appendix, summarize how the analytic sample differs from the sample that does not have follow-up data (Table D.1 in the table shells document, Appendix D). Briefly summarize key differences in this report section. Be sure to complete this analysis for each analytic sample if your research questions focus on different follow-up periods and samples.
Summarize the outcome measures in a table. Refer to Table III.4 in the table shells document. This is the same as Table 4 in your analysis plan, with some new information (see guidelines below). The template includes examples for you in italics.
In the “Outcome measure” column, please include the name you will use for the outcome throughout the report, in both text and tables.
In the “Description of outcome measure” column, please describe the outcome and detail any items you used to construct it. If an outcome is a composite of multiple items (for example, a scaled measure that is the average of five survey items), please report its Cronbach’s alpha (a measure of internal consistency). See the example in the table. If the outcome is a published measure or scale, please provide the name of the measure. Note whether you used mean imputation for cases missing 20 percent or fewer items; one way to compute the alpha and the imputation is sketched after this list.
In the “Source” column, document the source for each measure. If all measures are from the same source, please delete this column and add the source as a note at the bottom of the table.
In the “Timing of measure” column, please indicate the amount of time that has passed since baseline or enrollment into the intervention.
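If it is helpful, the following sketch shows one way to compute Cronbach’s alpha for a multi-item scale and to apply mean imputation for respondents missing 20 percent or fewer of the items. It is written in Python with pandas, and the file name and item names are hypothetical placeholders; the same calculations can be carried out in any statistical package.

```python
import pandas as pd

# Hypothetical file and item names; substitute your own data source and variables.
df = pd.read_csv("followup_survey.csv")
items = ["item1", "item2", "item3", "item4", "item5"]
item_data = df[items]

def cronbach_alpha(scale_items: pd.DataFrame) -> float:
    """Cronbach's alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    complete = scale_items.dropna()  # respondents who answered every item
    k = complete.shape[1]
    item_variances = complete.var(axis=0, ddof=1).sum()
    total_variance = complete.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Mean imputation for respondents missing 20 percent or fewer of the items:
# fill each missing item with that respondent's mean of the items they did answer.
share_missing = item_data.isna().mean(axis=1)
respondent_means = item_data.mean(axis=1)
filled = item_data.apply(lambda col: col.fillna(respondent_means))

eligible = share_missing <= 0.20
imputed = item_data.copy()
imputed.loc[eligible] = filled.loc[eligible]

# Scale score is the item average; it stays missing for respondents missing too many items.
df["scale_score"] = imputed.mean(axis=1).where(eligible)
print(f"Cronbach's alpha: {cronbach_alpha(item_data):.2f}")
```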
Purpose. Present the results for the outcomes study, and describe the approach used to arrive at the results.
Instructions and reminders. This section, like the section that presents the process/implementation findings, is divided into subsections organized by research question. First, in an introductory paragraph at the beginning of Section C, please briefly summarize the key findings from the outcomes study.
Next, discuss the findings for each research question in turn. For each research question, first state the key finding(s) in the text box, Key findings, at the top of each subsection.
For each research question, present the results in one or more tables, then discuss the findings in the text. You can structure the tables and discussion as you see fit. Make sure each finding aligns with a given research question. Briefly elaborate on the findings and patterns of findings in this section, but save the broader discussion for the conclusion.
Please present the findings in a metric (for example, a percentage point difference) that is easy for readers to interpret. For example, if the method was logistic regression, do not present results as odds ratios in the body of the report; instead, transform them into something that will make sense to a lay reader, such as a predicted probability or an adjusted mean.
Next, describe how you analyzed the outcome measures. For example, explain whether you conducted a correlational analysis of the association between process/implementation variables and outcome measures, or a pre-post analysis that compared indicators at baseline and follow-up. This discussion of methods should summarize the following (a brief illustration follows the list):
The model specification, including the covariates in the model. Note whether covariates differed across the models used to answer research questions. It might help to refer to Table III.3 (characteristics of participants in the outcomes study at study enrollment).
Criteria used to assess statistical significance
If applicable, information on sample weights and other items related to the study design
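As one illustration of these points, the sketch below fits a logistic regression with the statsmodels Python library and reports an average predicted probability rather than an odds ratio. The file, outcome, and covariate names are hypothetical placeholders, and no sample weights are applied; adapt the idea to your own model specification, covariates, weights, and statistical package.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analytic file and variable names; substitute your own.
analytic = pd.read_csv("analytic_sample.csv")

# Logistic regression of a binary follow-up outcome on the baseline measure and covariates.
model = smf.logit(
    "employed_followup ~ employed_baseline + age + C(education)",
    data=analytic,
).fit()
print(model.summary())  # coefficients are on the log-odds scale

# For the body of the report, translate the model into an average predicted probability,
# which is easier for lay readers to interpret than an odds ratio.
analytic["predicted_prob"] = model.predict(analytic)
print(f"Average predicted probability of the outcome at follow-up: "
      f"{analytic['predicted_prob'].mean():.1%}")
```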
Potential sources. Descriptive analysis plan, Section B.5
Non-text elements. Please summarize results in tables or figures. You may choose to present findings from multiple research questions in the same table or figure, or separately. Tables III.5 and III.6 in the table shells document are examples of tables you can use if appropriate for your study. Additional findings can be placed in Appendix C.
Purpose. Summarize and discuss the key findings of the study, and discuss lessons learned, limitations, and conclusions.
Instructions and reminders. Discuss the main findings of the study, and, if applicable, weave the outcomes and process/implementation findings into a coherent story about observed outcomes and intervention implementation, taking into account the results from the attrition analyses. Discuss important lessons learned that are consistent with study findings or that could help others replicate the intervention or serve the same population. Discuss limitations of the study. Discuss next steps for intervention and research. Draw conclusions.
Potential sources. Earlier sections of this report
Non-text elements. None
Purpose. Provide the full reference for any work cited in the report.
Instructions and reminders. Please use the American Psychological Association (APA) style guide for citing works in the report. This section should include the full reference for any work cited.
Potential sources. None
Non-text elements. None
Based on our guidance for the report sections, the report may include the following appendices. (Note: it may not be necessary to include appendices for all of these items; appendices are not included in the page count guidance.)
Logic model (or theory of change) for the intervention
Details of the process/implementation analysis, organized by research question. For questions that involve data coding or complex analysis, discuss the details in this appendix.
Methods used to clean and prepare data for the outcomes study (including detailed descriptions of how missing and inconsistent data were handled)
Attrition analyses and tables
Details of the outcomes analysis, organized by research question. For questions that involve data coding or complex analysis, discuss the details in this appendix.
Data collection instruments