Alternative Supporting Statement for Information Collections Designed for Research, Public Health Surveillance, and Program Evaluation Purposes
Sexual Risk Avoidance Education (SRAE) National Evaluation Overarching Generic
OMB Information Collection Request
New Umbrella Generic
Supporting Statement
Part B
May 2025
Submitted By:
Office of Planning, Research, and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
3rd Floor, Mary E. Switzer Building
330 C Street, SW
Washington, D.C. 20201
Project Officer:
Calonie Gray
The information collected under this umbrella generic clearance is intended to inform ACF programming by building the evidence on strategies that have the potential to improve the delivery and/or quality of Sexual Risk Avoidance Education (SRAE) programming. As the study team identifies strategies ready for evaluation, rapid, responsive work will be undertaken to allow any learnings to be disseminated back to SRAE grant recipients within their grant award periods.
The data collections under this umbrella generic clearance (generic information collections, or GenICs) will consist of mixed-methods studies using innovative learning methods, which may include qualitative and quantitative descriptive methods as well as rigorous impact evaluations of promising implementation strategies.
The population of interest is youth served by SRAE programs and the staff who deliver those programs, and the information collected in this study is intended to inform ACF’s understanding of how best to serve them. Information collected through GenICs for impact studies is intended to produce estimates of the impact of program services or components in the chosen sites. Information collected through GenICs for implementation or descriptive studies is intended to present an internally valid description of the youth served or of the implementation of a strategy in the chosen sites. Information collected through the study is not intended to support statistical generalization to other sites or service populations. GenICs submitted under this umbrella will include details on the intended generalizability of the results based on the study design.
The information collected under this umbrella clearance will draw from several study designs and methods (i.e., random assignment, quasi-experimental, observational, descriptive, etc.). Each GenIC will be aligned with the most rigorous and efficient options based on the study aims, type of information collected (quantitative and/or qualitative), and feasibility based on SRAE program implementation setting (e.g., multiple classrooms vs. one classroom, juvenile justice facility vs. traditional school, etc.) and timeline (e.g., week-long vs. year-long programming). As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of highly influential scientific information.
Initial GenIC Request
We have identified one innovative strategy that we would like to learn more about immediately upon approval of this overarching generic clearance, before planning a more in-depth study that would be reflected in a future GenIC. In rural New Mexico, an SRAE grant recipient will serve youth participating in a 2025 summer youth employment program. This mechanism provides a means to deliver their SRAE program – NativeSTAND – during the summer months when youth are at a greater risk of engaging in unhealthy behaviors. The grant recipient will compress the 27 lessons of NativeSTAND into a concentrated three days offered during the summer employment program. We propose conducting a proof-of-concept study to assess the promise of this efficient approach to delivering programming. This proof-of-concept study will address two questions: (1) Do youth receive the majority of the intended content? and (2) Are youth outcomes improving as expected after receiving NativeSTAND in a compressed format?
Target sites for all evaluation activities under this umbrella generic consist of SRAE grant recipients and their partner organizations, which may include local health departments, school districts, and community-serving organizations. Target respondents within the selected sites include but are not limited to:
Administrators and program staff from SRAE programs, as well as directors and staff from their partner agencies.
Front-line facilitators working directly with youth.
Middle and high school-age youth participants in SRAE programs (including potential participants assigned to comparison groups).
Sites will be selected to participate in evaluation activities based on their ability to meet criteria for the impact or descriptive studies, including organizational capacity to support an evaluation and alignment of program operations with the proposed research objectives (to be detailed in the GenICs). To identify potentially eligible sites, the study team will review various extant data elements for each grant recipient, including grant application materials, annual reports, performance measures data (OMB Control No. 0970-0536), and data on program plans and implementation experiences collected under the first phase of the SRAE National Evaluation (OMB Control No. 0970-0530 and 0970-0596). The study team may also gather information directly from SRAE federal project officers.
In studies covered under this overarching generic, obtaining probability-based samples to reach the desired subpopulations of interest for study activities would be cost-prohibitive and is not needed to achieve study goals. As such, ACF does not, at this time, anticipate undertaking a statistically sophisticated strategy for respondent selection. A description of the plans for selecting respondents will be provided to OMB as part of each individual GenIC request. Purposeful, targeted sampling through specific programs and other non-probability sampling designs will be used to develop a pool of potential respondents. The limitations associated with purposive or other sampling methods will be described in each GenIC submission and will be clearly stated in any publications produced for this project.
Initial GenIC Request
For the proof-of-concept study on the compressed program delivery approach, respondents will be youth participating in the summer employment program where the SRAE programming will be delivered. The majority of participants are high school-age youth. We will coordinate with the grant recipient to collect parental consent for youth to participate in the study. A hard copy parental consent form for the study will be included with the summer youth employment consent materials; however, youth may consent to the program without consenting to the study (Appendix A1). We will also consider offering electronic consent, completed online, and verbal consent if we experience challenges in getting consent forms returned. The consent form will include information about the various data collection activities, including surveys and collection of program attendance data. It will inform youth that participation is voluntary and that they can refuse to participate without any negative consequences. They will be informed that they can decline to answer any survey questions they do not wish to answer and that all information they provide will be kept private to the extent allowed by law.
All youth with consent to participate in the study who complete the pre-survey will be part of the proof-of-concept study. After completing the pre-survey, youth will be randomly assigned to receive NativeSTAND or an alternative program of the same length and informed of their assignment. Random assignment provides the most rigorous test, on an efficient timeline, of whether the compressed approach is a promising strategy.
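As an illustration only, 1:1 individual-level random assignment of this kind can be carried out reproducibly with a seeded shuffle. The ID values, seed, and group labels below are hypothetical, not the study's actual procedure:

```python
import random

def randomize(youth_ids, seed=20250601):
    """Randomly assign consented youth who completed the pre-survey
    to NativeSTAND or the alternative program (1:1 allocation).
    A fixed seed makes the assignment reproducible and auditable."""
    rng = random.Random(seed)
    ids = list(youth_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {
        "NativeSTAND": sorted(ids[:half]),
        "alternative": sorted(ids[half:]),
    }

# Hypothetical study IDs for six consented youth
groups = randomize(["Y001", "Y002", "Y003", "Y004", "Y005", "Y006"])
```

Because the seed is recorded, the assignment can be reproduced exactly if it is ever questioned, which supports the observation and oversight role described for study staff.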
Youth outcome data will be collected through the youth survey administered at three points: baseline (pre-programming), program exit (post-survey), and six months after programming. The baseline and program exit surveys will be administered via hard copy in person, in a group setting at each site. The six-month follow-up survey will be collected via a web-based survey. We will contact youth via email and text, providing them with a link to the six-month follow-up survey. We will follow up with nonrespondents by phone, email, text, and mail (Appendix C1: Proof-of-concept survey outreach materials). We will also consider a field effort for follow-up data collection, if needed.
Data collection activities may include:
Surveys
Semi-structured interviews (in-person, telephone, or web-based)
Focus groups
Administrative data
Direct observation
Document analysis
For the collection of primary data through surveys, semi-structured interviews, and focus groups, the study team will draft data collection instruments and consent forms. Example instruments are provided in the attached Instruments 1-9.
The study team reviewed multiple OMB-approved instruments to identify potential questions for the youth survey, including the Strengthening Relationship Education and Marriage Services Youth Survey (OMB No. 0970-0481) and the Components Evaluation of REAL Essentials Youth Survey (OMB No. 0990-0480). Sources for items in the youth survey are included in Appendix B. For the interview and focus group topic guides, the study team reviewed OMB-approved protocols from the Performance Measures and Adult Preparation Subjects study (OMB No. 0970-0356) and surveys developed for the formative co-regulation evaluation on the SRAE National Evaluation (OMB No. 0970-0531).
The study team will use pretesting to review and strengthen instruments for some efforts. The study team will pretest instruments with respondents similar to those who will be participating in the study. Pretest respondents will be asked to complete the draft survey and participate in a follow-up discussion about how the questions were worded, whether any questions or response choices were confusing, what respondents thought about when answering the question, and suggestions for improving the instruments. If these pretesting activities are subject to PRA, ACF will submit a request for OMB review and approval prior to commencing work.
Initial GenIC Request
For the proof-of-concept study youth survey, we reviewed the logic model for NativeSTAND, instruments used in previous studies of NativeSTAND, and variables that have been identified as predictors of delayed sexual behavior. We selected items in Instrument 1 that aligned with NativeSTAND and are appropriate for high school-age youth to create Instrument 1a. Because youth in the proof-of-concept study will primarily live in Tribal communities, Instrument 1a adds a section on Native Identity (Instrument 1a, Section C) and includes some minor updates to wording and response options.
ACF is contracting with Mathematica for this data collection, and Mathematica will oversee all data collection and ensure quality control measures are in place and followed for GenIC requests submitted under this umbrella package. To ensure efficient and standardized data collection processes for individual studies, Mathematica study staff will participate in study-specific data collection training. Individual GenIC requests will include details about study-specific data collection and quality control procedures, including recruitment protocols, mode of data collection, and how data collection activities are monitored for quality and consistency.
Initial GenIC Request
Mathematica staff will train site staff to administer the youth survey and to conduct random assignment on site. For the initial survey administration, trained Mathematica staff will administer the survey to youth and then conduct random assignment while site staff observe. For subsequent survey administrations and random assignment, Mathematica staff will be on site while site staff administer the surveys and conduct random assignment. Mathematica staff will observe and provide feedback as needed. Mathematica staff will provide additional training and oversight, as needed.
Expected response rates will vary for individual information collection requests. Information about expected response rates will be provided with each GenIC request.
In general, reminder calls, emails, and text messages will be used to maximize response rates for interviews and focus groups; reminder phone calls, letters, and/or emails are some methods that will be used to maximize response rates in web-based surveys completed outside of regular programming time. Each GenIC request will provide specific information about methods to maximize response rates and address nonresponse.
For data collection from selected sites, the study team will work closely with administrators and staff to develop recruitment strategies for participant and program staff focus groups and interviews, particularly to ensure respondents reflect a mix of experiences. To further increase the likelihood of participation, the study team will also offer tokens of appreciation to participants in focus groups, interviews, and/or surveys, as discussed in Supporting Statement Part A.
Initial GenIC Request
For the proof-of-concept study, we expect the response rate for the pre-survey to be 100 percent, as completing the pre-survey is a condition of enrollment in the study. We anticipate a 90 percent response rate for the immediate post-survey, which accounts for youth attrition during the youth program. For the six-month follow-up survey, we are targeting a 75 percent response rate, which will provide an adequate number of responses to help ensure quality data and that respondents are representative of the sample population. The proof-of-concept study will allow us to assess the feasibility of achieving a 75 percent response rate at six months post-program with this population. We will assess feasibility by tracking the response rate achieved using a range of data collection and contact methods (including contact by email, text, and mail, and potentially a field effort). We will also track the length of time needed to achieve the response rate, to assess how the rate increases over time and as different data collection and contact methods are used. If achieving a 75 percent response rate in the proof-of-concept study proves challenging, we will reassess the target response rates and follow-up procedures and describe these in a future GenIC for an in-depth study.
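Tracking how the cumulative response rate grows over time and across contact methods can be sketched as follows; the dates, sample size, and method labels are hypothetical, not study data:

```python
from datetime import date

# Hypothetical completion log for the six-month follow-up survey:
# (completion date, contact method that preceded completion)
completions = [
    (date(2026, 1, 5), "email"),
    (date(2026, 1, 9), "text"),
    (date(2026, 1, 20), "mail"),
    (date(2026, 2, 2), "phone"),
]
sample_size = 5  # hypothetical number of youth fielded

def response_rate_as_of(log, n, cutoff):
    """Cumulative response rate among the fielded sample as of a cutoff date."""
    completed = sum(1 for d, _ in log if d <= cutoff)
    return completed / n

# Rate achieved by the end of the first month of follow-up
rate = response_rate_as_of(completions, sample_size, date(2026, 1, 31))
```

Logging each completion with its date and preceding contact method supports both feasibility questions named above: how the rate rises over time and which methods contribute most.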
We will examine non-response as appropriate for the design of each study; we will describe these plans in each GenIC request. Respondent demographics will be documented and reported in written materials associated with the data collection.
Initial GenIC Request
For the proof-of-concept study, the team will take steps to understand the nature of any non-response and to account for the threat it may pose to the validity of the study’s impact estimates. Using data from the pre-survey, we will first test for statistically significant differences in demographic and pre-survey outcome variables between follow-up survey completers, by whether youth received NativeSTAND or the alternative program. We will then control for these differences using covariates when estimating program impacts. At the six-month follow-up, we will examine response rates to detect any differences in attrition between the youth who receive each program and adjust accordingly.
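A minimal sketch of a covariate-adjusted impact estimate of this kind, using a hand-rolled ordinary least squares solver on hypothetical data (the variable names and values are illustrative, not the study's actual measures):

```python
from typing import List

def ols(y: List[float], X: List[List[float]]) -> List[float]:
    """Solve the normal equations (X'X)b = X'y by Gauss-Jordan elimination."""
    k = len(X[0])
    xtx = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    A = [row[:] + [b] for row, b in zip(xtx, xty)]  # augmented matrix
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]             # partial pivoting
        for r in range(k):
            if r != col:
                f = A[r][col] / A[col][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[col])]
    return [A[i][k] / A[i][i] for i in range(k)]

# Hypothetical data: follow-up outcome, treatment indicator, baseline score
follow_up = [4.0, 5.0, 6.0, 3.0, 4.0, 5.0]
treated   = [1,   1,   1,   0,   0,   0]
baseline  = [3.0, 4.0, 5.0, 3.0, 4.0, 5.0]

# Design matrix: intercept, treatment, baseline covariate.
# Including the pre-survey score as a covariate adjusts for any
# baseline differences between the groups among follow-up completers.
X = [[1.0, t, b] for t, b in zip(treated, baseline)]
coefs = ols(follow_up, X)
impact = coefs[1]  # estimated treatment effect
```

In practice an analysis like this would use a statistical package with standard errors; the point of the sketch is only the structure of the model, in which the treatment coefficient is the impact estimate after baseline adjustment.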
For descriptive studies and impact evaluations using qualitative and quantitative methods, the selected analytic approaches will provide internally valid answers to research questions on implementation, processes, and effectiveness. Impact evaluations could involve randomized controlled trials, quasi-experimental designs, and single-case designs; estimates will primarily be produced through linear regression. Information will be provided with each GenIC request, as appropriate. If, based on the statistical methods selected, a public comment period is appropriate, ACF will publish a notice in the Federal Register soliciting comments over a 30-day period. ACF will coordinate with OMB in these instances.
Initial GenIC Request
For the proof-of-concept study, we will analyze the change in youth outcomes from pre-program to immediately after the end of programming and then six months after the program ends. We will also analyze attendance data to examine how much of the intended programming youth received.
No personally identifiable information (PII) will be shared outside of the study team. Survey data and qualitative data, including typed notes and audio recordings, will be stored on Mathematica’s secure network, which is accessible only to the study team, and disposed of per ACF’s record retention protocols. All data will be saved under ID codes, rather than respondents’ names. All transcripts from interviews and focus groups will be de-identified to remove PII. Any information linking ID codes and respondents’ PII will be saved on Mathematica’s secure restricted drive and will be password protected.
The study team will use several methods to protect the integrity of collected data while preparing and analyzing them. Data collected through interviews and focus groups will be transcribed and coded by multiple individuals. The study team will run basic quality assurance checks on all data. For example, the study team will conduct initial analyses that include basic quality assurance checks. These will involve assessing for consistency across variables that should be consistent, examining outlier values, addressing any needed modeling assumptions, and identifying incomplete or missing data. The study team will develop all appropriate codebooks and data logs documenting decisions made. More information will be provided with each GenIC request.
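The basic quality assurance checks described above (missing values, out-of-range entries) might look like the following sketch; the field names, records, and acceptable range are hypothetical:

```python
# Hypothetical survey records keyed by illustrative field names
records = [
    {"id": "Y001", "age": 16, "pre_score": 3.5, "post_score": 4.0},
    {"id": "Y002", "age": 15, "pre_score": None, "post_score": 3.0},  # missing item
    {"id": "Y003", "age": 99, "pre_score": 4.5, "post_score": 4.5},  # implausible age
]

def qa_checks(rows, age_range=(11, 19)):
    """Flag missing survey values and out-of-range entries for analyst review."""
    issues = []
    lo, hi = age_range
    for r in rows:
        for field in ("pre_score", "post_score"):
            if r[field] is None:
                issues.append((r["id"], f"missing {field}"))
        if not lo <= r["age"] <= hi:
            issues.append((r["id"], "age out of expected range"))
    return issues

flags = qa_checks(records)
```

Checks like these would run on receipt of each data file, with flagged records logged in the data logs and codebooks the study team maintains.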
Initial GenIC Request
The pre- and post-surveys will be collected via hard copy. All completed survey materials will be shipped to Mathematica via FedEx. Mathematica and site staff will be trained on secure data handling procedures. All hard copy materials will be securely stored at all times, and all information containing PII will be kept separate from materials containing survey data. For example, completed consent forms will be shipped separately from completed surveys. Surveys will be receipted at Mathematica and double entered into the Viking data entry system. Data for the six-month follow-up survey will be collected via a web-based survey. Study participants will be sent a unique URL or QR code to access and complete the survey.
The study team will develop an analysis plan for each study under this umbrella clearance.
The quantitative analytic approach will depend on the study design selected and the type of data collected. The study team anticipates that most analyses will involve ordinary least squares regression, as with the proof-of-concept study.
When applicable, each plan will also include analysis of qualitative data collected through interviews, focus groups, and program documents. (We do not plan to collect any qualitative data for the proof-of-concept study.) Specific analysis plans will describe how different types of qualitative data will be prepared for analysis. In general, the study team will employ a thematic approach to analyzing qualitative data. Both deductive and inductive coding strategies will be used to code the interview and focus group data. Deductive codes, aligned with the study's research questions, will be predefined. Inductive coding will be applied to identify themes and categories that emerge organically from the data, which may or may not directly align with the research questions. To ensure the reliability of the coding, the assignment of codes to text will be regularly reviewed and adjusted as needed. After coding is complete, the codes will be analyzed both within and across cases to identify key themes for each case and to draw comparisons and contrasts between cases, ultimately leading to the identification of overarching themes. More information will be provided with each GenIC request.
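The within- and cross-case step of that thematic analysis can be illustrated with a small sketch; the case names and code labels below are hypothetical:

```python
from collections import Counter

# Hypothetical coded excerpts: (case, code) pairs produced by applying
# predefined deductive codes and emergent inductive codes to transcripts
coded_excerpts = [
    ("Site A", "facilitator_rapport"),
    ("Site A", "scheduling_barriers"),
    ("Site A", "facilitator_rapport"),
    ("Site B", "scheduling_barriers"),
    ("Site B", "youth_engagement"),
    ("Site B", "scheduling_barriers"),
]

# Within-case analysis: code frequencies for each case
within_case = {}
for case, code in coded_excerpts:
    within_case.setdefault(case, Counter())[code] += 1

# Cross-case analysis: codes applied in every case are candidate
# overarching themes for comparison and contrast across cases
all_cases = {c for c, _ in coded_excerpts}
all_codes = {code for _, code in coded_excerpts}
overarching = sorted(
    code for code in all_codes
    if all(within_case[case][code] > 0 for case in all_cases)
)
```

In practice this tallying would be done in qualitative analysis software, but the logic is the same: frequencies within each case first, then a comparison of which codes recur across cases.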
Under this umbrella generic, information collected is meant to inform ACF activities and may be incorporated into documents or presentations that are made public through conference presentations, websites, or social media.
For example, information collected may be shared through technical assistance (TA) plans, webinars, presentations, infographics, issue briefs/reports, evaluation specific reports, or other documents relevant to federal leadership and staff, grant recipients, local implementing agencies, researchers, and/or training/TA providers.
In sharing findings, the study team will describe the study methods and their limitations, both regarding generalizability and regarding use as a basis for policy. Any planned uses of information from this IC, including publication or other sharing, will be described and submitted for approval in each individual GenIC.
Initial GenIC Request
Information collected through the proof-of-concept study will help assess whether youth outcomes improve for youth receiving the NativeSTAND program in the compressed format and whether youth received the intended amount of programming. The information collected will also inform whether a more in-depth study is feasible. Results will be summarized in a memo that includes a recommendation on whether to pursue a subsequent in-depth study.
Table B.2 lists the federal and contract staff responsible for the study, including their affiliation and email address.
Table B.2. Staff responsible for study
Name | Affiliation | Email address
Calonie Gray | Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services |
MeGan Hill | Family and Youth Services Bureau, Administration for Children and Families, U.S. Department of Health and Human Services |
Tia Brown | Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services |
Nakia Martin-Wright | Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services |
Heather Zaveri | Mathematica |
Melissa Thomas | Mathematica |
Jennifer Walzer | Mathematica |
Instrument 1: Youth survey
Instrument 1a: Youth survey for proof-of-concept study
Instrument 2: Administrator, staff, and partner topic guide
Instrument 3: Youth topic guide
Instrument 4: Youth exit ticket
Instrument 5: Facilitator exit ticket
Instrument 6: Analysis plan for impact evaluations
Instrument 7: Analysis plan for descriptive evaluations
Instrument 8: Report template for impact evaluations
Instrument 9: Report template for descriptive evaluations
Appendix A: Example consent and assent forms
Appendix A1: Proof-of-concept study consent and assent forms
Appendix B. Instrument 1 Item Source List
Appendix C1: Proof-of-concept survey outreach materials
File Type: application/vnd.openxmlformats-officedocument.wordprocessingml.document
Author: Susan Zief
File Created: 2025-06-04