According to the Paperwork Reduction Act of 1995, an agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a valid OMB control number. The valid OMB control number for this collection is 0970-0356; this number is valid through 06/30/2021. Public reporting burden for this collection of information is estimated to average 8 hours, including the time for reviewing instructions, gathering and maintaining the data needed, reviewing the collection of information, and revising it. This collection of information is voluntary for individuals, but the information is required from grantees. It will be used to gather information about the Healthy Marriage and Relationship Education grantees with proposed local evaluations. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to Robert Wood at rwood@mathematica-mpr.com.
The Administration for Children and Families (ACF) Office of Family Assistance (OFA) requires all Healthy Marriage and Relationship Education (HMRE) grantees with local evaluations funded by OFA to submit a local evaluation plan. The purpose of this document is to provide instructions on how to structure your evaluation plan so it is comprehensive and demonstrates a strong research design. Below, we provide an annotated outline that contains guidance for each section. This outline describes: (1) the purpose of the section and what should be discussed, (2) things to keep in mind when writing the section, (3) considerations for programs serving different populations, and (4) whether non-text elements (tables or figures) could be useful.
You can write the report directly in the accompanying template (Local Evaluation Plan Template.docx), which will facilitate review by ACF and your Evaluation Technical Assistance Partner (ETAP). Alternatively, you can provide the specified information in a separate document, making sure to include all sections of the template in that document. The template gives you an outline of the evaluation plan with sections for you to fill in. The plan should be about 15 to 20 pages long and single-spaced, excluding appendices.
Here are more suggestions for your evaluation plan:
Reach out to the local evaluation TA help desk at xxx@mathematica-mpr.com with questions about the plan guidance or the template as you begin to work on your plan. Getting questions resolved early will simplify the planning process.
Please submit a plan that you believe is ready for review by your ETAP and OFA Federal Program Specialist (FPS). Ideally, it will have been edited and read by a few different people before you submit it, to minimize the number of editorial comments your ETAP and FPS will need to provide. Their goal is to focus on content and technical details rather than presentation.
Please email your evaluation plan to your FPS and ETAP when it is ready, but no later than [DATE]. For consistency, please use this common naming convention when submitting your plan: LclEvPlan_[Grantee Name]_[MM.YYYY], where MM.YYYY is the plan draft date. Please send a Word version of the document, not a PDF. To proceed with implementing your evaluation, OFA must approve your evaluation plan by the end of the planning period.
The plan’s cover page should include the intervention name, grantee organization, and authors and their affiliations.
Purpose. Describe how the evaluation will expand and build on the existing evidence base.
Instructions and Reminders. Describe previous literature or existing research that informs the stated research question(s) and how the evaluation will expand the evidence base. Explain why the research question(s) are of specific interest to the program and/or community. Provide full references for any work cited in the References section. Note whether this is the first study of this program or the first study of the program with a particular population.
Population-Specific Considerations. Existing research should speak to the evidence base for the population(s) you plan to serve.
Non-Text Elements. None.
Purpose. Articulate the research question(s) that the evaluation intends to answer about the impacts of the intervention on participant outcomes.
Instructions and Reminders. State the research question(s) that the evaluation intends to answer, specifying the inputs (such as program components, program supports, implementation features) and the outcomes (for example, participant outcomes) that will be examined to answer the research question(s). Each question should focus on a specific outcome and time point to help connect the outcome(s) and time point(s) to the intervention’s theory of change.
Population-Specific Considerations. None.
Non-Text Elements. None.
Purpose. Link research question(s) to program logic model.
Instructions and Reminders. Clearly demonstrate how the research question(s) (and the related implementation features and/or participant outcomes) link to the proposed logic model and the theory of change for the program.
Population-Specific Considerations. None.
Non-Text Elements. It may be helpful to provide a graphical representation of the program logic model in an appendix and/or a table summarizing program elements (for example, with rows for target population, program components, program content, planned mode of delivery, and so on). Your program logic model can be included as Appendix A.
Purpose. Articulate hypothesized result(s).
Instructions and Reminders. For each specific research question, state the hypothesized result(s), and briefly describe why these results are expected.
Population-Specific Considerations. None.
Non-Text Elements. Consider using a table to organize the research questions alongside the hypothesized results.
Purpose. Summarize and give a justification for how the research design will answer the research question(s).
Instructions and Reminders. Describe the research design proposed to answer the research question(s). State whether the proposed evaluation is a descriptive or impact evaluation, and justify why the proposed research design is best suited to answer the research question(s).
For descriptive studies, please describe how the study group(s) for the local evaluation will be selected or formed, and describe the programming for which the sample(s) will be eligible. For example, if only a subset of participants served will be included in the local evaluation, describe how those participants will be identified from a larger pool. If you plan to describe changes over time in outcomes or processes for more than one study sample (for example, for single adults and married couples), explain who is in each group, how they differ, and what changes you expect to see.
For impact studies, briefly summarize how the intervention/treatment group and comparison/control group(s) will be formed or selected, and describe the programming for which each will be eligible and how the groups will differ. Please provide details of study group formation below in Section II.B.
If the evaluation will collect multiple waves of data, describe the timing of these waves. When describing follow-up periods, specify whether the follow-up period will be post-baseline or post-program completion. If you do not plan to collect data beyond program exit, include a justification or rationale for why a longer-term follow-up data point is not feasible or not needed for the evaluation. Describe how respondents will be tracked over time for later data collection.
Grantees are required to collect, store, and report data on standardized performance measures at program entry and exit in a management information system designed specifically for this purpose: the Information, Family Outcomes, Reporting, and Management (nFORM) system. Separate from this evaluation plan, ACF requires grantees to develop and submit a comprehensive data collection plan for collecting the performance measures data in nFORM. However, most grantees will also use some nFORM performance measures for their local evaluation. If the study uses nFORM data, such as outcome data or enrollment and service data, please include a brief summary of the data collection plan for the nFORM data in this section. The specifics of the data collection methods and measures for the local evaluation (both nFORM data and local surveys) should be discussed under Section III.A below.
For process or implementation studies, describe specific framework(s) or approach(es) that will be used (for example, implementation science frameworks). Note: All impact studies must propose a related implementation or process study.
Population-Specific Considerations. None.
Non-Text Elements. It may be useful to summarize information about the follow-up periods in a table (for example, with rows for research question, data source, baseline period, first follow-up period, second follow-up period, and so on). For descriptive studies, consider including a flow chart depicting study enrollment through services and program exit. Include elements such as timing of consent, data collection, and length of services.
Purpose. For impact studies, describe how study groups will be formed.
Instructions and Reminders. Complete Section II.B only if the evaluation is an impact study. If the evaluation is descriptive, please note the evaluation has no comparisons and continue to Section II.C “Sample.”
If the research design includes the comparison of two or more groups (for example, an intervention group and a comparison group), describe how the groups will be formed. The control/comparison group and the treatment/program group should be assigned using a systematic approach appropriate to the research design.
Random assignment to develop study groups. If groups will be constructed by random assignment, describe how, when, and by whom random assignment will be done. Describe how random assignment will be monitored to prevent crossover of those assigned to specific study groups. Describe methods to monitor the comparability of the study groups.
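To make this concrete, the following is a minimal sketch of blocked (stratified) random assignment in Python. It assumes a hypothetical applicant list with one row per consenting applicant and a column identifying the randomization block (for example, school); all names are illustrative, not a required implementation.

    import numpy as np
    import pandas as pd

    def assign_within_blocks(applicants: pd.DataFrame, block_col: str, seed: int = 12345) -> pd.DataFrame:
        """Randomly assign about half of each block to treatment and the rest to control."""
        rng = np.random.default_rng(seed)  # fixed seed keeps the assignment reproducible and auditable
        out = applicants.copy()
        out["group"] = "control"
        for _, labels in out.groupby(block_col).groups.items():
            shuffled = rng.permutation(np.asarray(labels))  # shuffle row labels within the block
            n_treat = (len(shuffled) + 1) // 2              # treat about half; any extra unit goes to treatment
            out.loc[shuffled[:n_treat], "group"] = "treatment"
        return out

    # Example with hypothetical data:
    # applicants = pd.DataFrame({"applicant_id": range(8), "school": ["A"] * 4 + ["B"] * 4})
    # assigned = assign_within_blocks(applicants, block_col="school")

Recording the seed and the block definitions supports later checks on crossover and on the comparability of the study groups.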
Matching to develop study groups. If a comparison group(s) will be constructed using an approach other than random assignment, such as statistical matching, describe how and when the program and comparison groups will be formed. Detail the steps that will be taken to increase the likelihood that participants in the intervention/treatment and control/comparison groups are similar, and specify the metrics on which similarity will be assessed. Describe methods to monitor the initial comparability of the research groups, and include justification for the belief that the proposed design is the most rigorous possible for addressing the research question(s) of interest. A sketch of one common matching approach follows.
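As a hedged illustration of the matching case, the sketch below pairs each program participant with the nearest comparison candidate on an estimated propensity score. The DataFrames, the covariate list, and one-to-one matching with replacement are assumptions made for this example, not a prescribed design.

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    def match_comparison_group(treated: pd.DataFrame, pool: pd.DataFrame, covs: list) -> pd.DataFrame:
        """Select one comparison case from `pool` for each treated case (with replacement)."""
        X = pd.concat([treated[covs], pool[covs]], ignore_index=True)
        y = np.r_[np.ones(len(treated)), np.zeros(len(pool))]  # 1 = program group, 0 = candidate pool
        pscores = LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X)[:, 1]
        nn = NearestNeighbors(n_neighbors=1).fit(pscores[len(treated):].reshape(-1, 1))
        _, match_pos = nn.kneighbors(pscores[:len(treated)].reshape(-1, 1))
        return pool.iloc[match_pos.ravel()]

After matching, comparing baseline means of the two groups on each covariate is one way to document initial comparability.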
Other method(s). If another type of evaluation research design is proposed, such as a regression discontinuity, single case, or other (non-matching) quasi-experimental designs, include an adequate description of the approach. Include justification that the proposed design is the most rigorous possible design for addressing the research question(s) of interest.
Population-Specific Considerations. If groups will be constructed by random assignment, consider the appropriate unit and method of randomization for your intended population. For example, if your program is serving youth in school, describe applicable stratification or blocking by school and/or classroom. If you are working with youth in schools, describe whether and how relationships with school districts will be formed and maintained for the purpose of the evaluation.
Non-Text Elements. Consider including a flow chart depicting enrollment through services and program exit. Include elements such as timing of consent, data collection, and length of services. If the study is a randomized controlled trial (RCT), include the timing of random assignment.
Purpose. Describe the target population and intended sample, if applicable.
Instructions and Reminders. Describe the target population(s) and explicitly state whether the population(s) in the evaluation differs from those who will be served broadly by the grant. Describe how the target population will be identified. Explicitly state the unit of analysis (for example, married couple, unmarried couple, individual).
If an impact evaluation is proposed, state the intended sample size (overall and by year), estimated attrition, and the anticipated size of the analytic sample (for both intervention/treatment and comparison/control groups). Provide power analyses (that is, a calculation of minimum detectable effect sizes) demonstrating that the proposed sample sizes will be large enough to detect effects of the expected magnitude on the outcomes of interest; a sketch of one such calculation appears below. Include intended sample sizes and power analyses for any subgroups of central interest to the evaluation. Refer to previous studies of similar interventions for estimates of the expected effect sizes for the targeted outcomes to inform the calculation of minimum detectable effect sizes. Note: If an impact evaluation is not proposed, this issue does not need to be addressed.
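For example, a minimum detectable effect size (MDES) for a two-group, individual-level design can be computed with standard power routines. The sketch below uses Python's statsmodels package; the enrollment and retention figures are placeholders, not recommendations.

    from statsmodels.stats.power import TTestIndPower

    enrolled_per_group = 300    # planned enrollment in each study group (placeholder)
    expected_retention = 0.80   # assumed share completing the follow-up survey (placeholder)
    analytic_n = int(enrolled_per_group * expected_retention)

    mdes = TTestIndPower().solve_power(
        effect_size=None,   # solve for the detectable effect, in standard deviation units
        nobs1=analytic_n,   # analytic sample in the treatment group
        ratio=1.0,          # comparison group of equal size
        alpha=0.05,         # two-sided significance level
        power=0.80,         # conventional 80 percent power
    )
    print(f"MDES with {analytic_n} per group: {mdes:.3f} SD")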
Detail the methods that will be used to recruit a large enough sample and to ensure that sample members enroll and participate in the program. Describe who will be responsible for recruiting the evaluation sample, and specify whether the same staff will recruit for both the intervention and comparison groups. Describe any incentives to be offered for participating in or completing the program and/or data collection.
Population-Specific Considerations. None.
Non-Text Elements. Consider including tables with the results of the calculation of minimum detectable effect sizes (power analyses) and sample sizes by year.
Purpose. Describe the main roles of the study team.
Instructions and Reminders. Clearly define the roles of lead staff for the evaluation, especially the principal investigator and/or research project director. Articulate the experience, skills, and knowledge of the staff (including whether they have conducted similar studies in this field), as well as their ability to coordinate and support planning, implementation, and analysis related to a comprehensive evaluation plan.
Include curriculum vitae for the principal investigator/research project director and up to four additional staff to be involved in the local evaluation in Appendix B.
Population-Specific Considerations. None.
Non-Text Elements. None.
Purpose. Summarize how the grantee and local evaluator will work together throughout the evaluation while maintaining the independence of the evaluation.
Instructions and Reminders. Describe how the grantee and local evaluator worked together to identify the research question(s) and research design to ensure their feasibility and relevance. Describe how the grantee and local evaluator will continue to work together throughout the evaluation to proactively address unforeseen challenges as they arise and ensure the rigor and relevance of the evaluation and its findings. Describe how the grantee and local evaluator will coordinate dissemination efforts. Describe how these processes will occur while maintaining the independence of the evaluation.
Population-Specific Considerations. None.
Non-Text Elements. None.
Purpose. Document the data sources and methods that will be used for the evaluation.
Instructions and Reminders. Describe the data sources for each research question being examined. Describe the data collection methods, including who will collect the data (for example, program staff or evaluation staff), timing, mode of administration, and overall process. Provide details on differences between intervention and comparison conditions in who will collect the data, timing, mode, and processes used for data collection, as appropriate.
Constructs and measures/data collection instruments. Clearly articulate the constructs of interest, the measures used to evaluate those constructs, and the specific data collection instruments. Provide any information on the reliability and validity of the data collection instruments. If measures and data collection instruments will be determined during evaluation planning, describe the process for selecting the measures and instruments, including any pre-testing of data collection instruments.
Consent. Describe how and when program applicants will be informed of the study and given the option of agreeing (i.e., consenting) or declining to participate in the study. Impact evaluations using random assignment should consider collecting consent and then baseline data before randomizing participants.
Methods of data collection. Describe how data for the local evaluation will be collected, including required performance measure data in nFORM if those data will be used for the evaluation (see Section II.A above for more information on performance measures). Include a table detailing which data collection measures/instruments will be collected by which persons, and at what point in the programming or at what follow-up point. Describe any incentives to be offered to participants for completing surveys or other data collection efforts. Please keep in mind that the evaluator, not program staff, should handle all data collection for the evaluation to the extent possible.
Ensuring and monitoring high quality data collection. Describe plans for training data collectors and for updating or retraining data collectors about procedures. Detail plans to regularly review data that have been collected to assess and swiftly address problems.
Tracking participants and reducing attrition. If participants will complete post-program and/or follow-up surveys, describe plans for keeping track of participants in order to conduct follow-up surveys with as many participants as possible. Outline a plan for monitoring both overall and differential attrition. Note: If no post-program or follow-up surveys are proposed, this issue does not need to be addressed.
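As one illustration, overall and differential attrition can be monitored with a simple calculation such as the Python sketch below, which assumes a hypothetical sample file with a study group indicator and a boolean follow-up response flag; the column names are illustrative.

    import pandas as pd

    def attrition_report(sample: pd.DataFrame) -> pd.Series:
        """Overall and differential attrition from a `group` column and a boolean `responded` flag."""
        by_group = 1 - sample.groupby("group")["responded"].mean()  # attrition rate per study group
        return pd.Series({
            "overall": 1 - sample["responded"].mean(),
            "treatment": by_group["treatment"],
            "comparison": by_group["comparison"],
            "differential": abs(by_group["treatment"] - by_group["comparison"]),
        })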
Population-Specific Considerations. If the study includes youth under the age of 18, describe how and when their parent(s) or guardian(s) will be informed of the study and given the option of consenting to or declining their child's participation in the study, and describe how and when youth will provide assent to agree or decline to participate in the evaluation. Note: If youth under the age of 18 will not be involved in the evaluation, the issue of assent does not need to be addressed.
Non-Text Elements. Include a table detailing which data collection measures/instruments will be collected by which persons, and at what point in the programming or at what follow-up point. Include a table that lists the outcome measures and the psychometric properties, such as reliability and validity, of the measures.
Purpose. Describe the planned approach to data analysis.
Instructions and Reminders. Briefly describe the planned approach for data analysis. If an impact analysis is proposed, name the key dependent and independent variables, and describe any methods to minimize Type I error (i.e., finding positive impacts by chance) such as limiting the number of impacts to be analyzed and/or multiple comparison correction. Describe proposed approach(es) for addressing missing data.
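To make the multiple comparison correction concrete, the following minimal sketch applies the Benjamini-Hochberg procedure available in Python's statsmodels package; the p-values are placeholders standing in for one test per pre-specified outcome.

    from statsmodels.stats.multitest import multipletests

    pvalues = [0.010, 0.030, 0.045, 0.200]  # placeholders: one test per confirmatory outcome
    reject, p_adjusted, _, _ = multipletests(pvalues, alpha=0.05, method="fdr_bh")
    for p, p_adj, sig in zip(pvalues, p_adjusted, reject):
        print(f"raw p = {p:.3f}, adjusted p = {p_adj:.3f}, significant: {sig}")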
Briefly explain how each variable will be operationalized and constructed. If a measure will be constructed from multiple items, please document the source items and explain how the items will be combined to create an outcome for analysis.
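As an illustration of constructing an outcome from multiple items, the sketch below averages hypothetical survey items into a single scale score and reports Cronbach's alpha as a check on internal consistency; the item names are invented for this example.

    import pandas as pd

    def cronbach_alpha(items: pd.DataFrame) -> float:
        """Cronbach's alpha for a DataFrame with one column per item."""
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1).sum()  # sum of the individual item variances
        total_variance = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
        return (k / (k - 1)) * (1 - item_variances / total_variance)

    # Example: average three hypothetical relationship-quality items into one outcome.
    # df["rel_quality"] = df[["rq_item1", "rq_item2", "rq_item3"]].mean(axis=1)
    # alpha = cronbach_alpha(df[["rq_item1", "rq_item2", "rq_item3"]])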
Population-Specific Considerations. None.
Non-Text Elements. None.
Purpose. Articulate how the study will ensure the privacy of study participants.
Instructions and Reminders. Specify how the methods for data collection, storage, and transfer (for example, transfer of performance data to the federal government) will ensure privacy for study participants.
Population-Specific Considerations. None.
Non-Text Elements. None.
Purpose. Describe how the study will obtain Institutional Review Board (IRB) approval.
Instructions and Reminders. Name the specific IRB to which you expect to apply. Include a description of the process for protection of human subjects and IRB review and approval of the proposed program and evaluation plans.
During or after the planning period, grantees or their local evaluators will be required to obtain a Federal-Wide Assurance, per guidance from the federal Office for Human Research Protections (for more information, see https://www.hhs.gov/ohrp/register-irbs-and-obtain-fwas/fwas/fwa-protection-of-human-subject/index.html), and submit their research projects to an IRB (per 45 CFR 46.118). Please include the Federal-Wide Assurance for relevant institutions as Appendix C, or include a description and/or timeline for obtaining a Federal-Wide Assurance in this section.
Population-Specific Considerations. For programs serving youth (in school or out of school), include plan for obtaining parental consent and youth assent for those participants younger than 18 years old. For programs serving youth in school settings, include plan for obtaining research review clearance from the school district, if applicable. For programs serving justice-involved populations, include plan for obtaining any necessary human subjects review from the responsible organization.
Non-Text Elements. None.
Purpose. Describe how study data will be stored.
Instructions and Reminders. Describe the database the data will be entered into (i.e., nFORM and/or other databases), including both performance measure data and any additional local evaluation data. Describe the process for data entry (i.e., who will enter the data into the database).
Data reporting and transfer. Indicate the ability to produce reports (for example, for OFA) and to export individual-level data (with all of the above variables) to Excel or a comma-separated format.
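For instance, if evaluation data are held in a Python DataFrame, individual-level exports to comma-separated and Excel formats are one-line operations; the file names here are illustrative.

    import pandas as pd

    records = pd.DataFrame()  # placeholder for the evaluation's individual-level data
    records.to_csv("local_eval_export.csv", index=False)     # comma-separated export
    records.to_excel("local_eval_export.xlsx", index=False)  # Excel export (requires openpyxl)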
Ability to link. Indicate the ability to maintain individual identifying information to facilitate linking to data from other sources (for example, administrative data systems such as unemployment insurance).
Current security and confidentiality standards. Indicate the ability to encrypt data in transit (for example, access through an HTTPS connection); encrypt data at rest (that is, when not in transit); maintain a data backup and recovery plan; require all users to have logins and passwords to access the data they are authorized to view; and keep current antivirus software installed to detect and address malware, such as viruses and worms.
Population-Specific Considerations. None.
Non-Text Elements. None.
Purpose. Describe how data will be archived and transferred.
Instructions and Reminders. Describe a data archiving plan that establishes procedures and parameters for all aspects of data/information collection (for example, informed consent, data maintenance, de-identifying data, etc.) necessary to support archiving the information and data collected through the evaluation. Describe how the collection methods for all types of proposed data collection will support the archiving and transfer of each type, and how consent form language accurately represents the plans to store data for sharing and/or transferring to other researchers. Describe methods of data storage that will support archiving and/or transferring data, and explain how construction and documentation of the data and analysis files will support data archiving and/or transferring.
Population-Specific Considerations. None.
Non-Text Elements. None.
Purpose. Describe how study findings will be disseminated.
Instructions and Reminders. Briefly describe the planned dissemination efforts associated with the local evaluation, including any dissemination that will occur while the evaluation is ongoing (rather than after the evaluation is completed), and any plans for study registration with an appropriate registry (for example, clinicaltrials.gov, socialscienceregistry.org, osf.io, etc.).
Population-Specific Considerations. None.
Non-Text Elements. None.
Purpose. Provide the full reference for any work cited in the report.
Instructions and Reminders. Please use the American Psychological Association (APA) style guide for citing works in the report. This section should include the full reference for any work cited.
Population-Specific Considerations. None.
Non-Text Elements. None.
The plan may include the following appendices. (Note: It may not be necessary to include appendices for all of these items.)
Update and include the logic model or theory of change for the program that was submitted as part of the grant application. A logic model is a diagram that presents the conceptual framework for a proposed project and explains the links among program elements. Logic models must target the identified objectives and goals of the grant program. A logic model may include connections between the following items: inputs (e.g., additional resources, organizational profile); target population (e.g., the individuals to be served); activities, mechanisms, processes (e.g., evidence-based practices, key intervention and evaluation components); outputs (i.e., the immediate and direct results of program activities); outcomes (i.e., the expected short- and long-term results the project is designed to achieve, typically described as changes in people or systems); and goals of the project (e.g., overarching objectives).
Include curriculum vitae for the principal investigator/research project director and up to four additional staff to be involved in the local evaluation.
Include a Federal-Wide Assurance for the grantee or evaluator’s institution (see Section V.B for instructions). Through the Federal-Wide Assurance, an institution commits to HHS that it will comply with the requirements in the HHS Protection of Human Subjects regulations at 45 CFR part 46. See https://www.hhs.gov/ohrp/register-irbs-and-obtain-fwas/fwas/assurance-process-faq/index.html for more information.