
Alternative Supporting Statement Instructions for Information Collections Designed for

Research, Public Health Surveillance, and Program Evaluation Purposes



Sexual Risk Avoidance Education National Evaluation: Formative Evaluation for the Program Components Impact Study



Formative Data Collections for ACF Research

0970-0356






Supporting Statement

Part A

August 2022


Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officers:

Calonie Gray

Megan Hill

Tia Brown




Part A




Executive Summary


  • Type of Request: This Information Collection Request is for a generic information collection under the umbrella generic, Formative Data Collections for ACF Research (0970-0356).


  • Progress to Date: The formative evaluation phase for the Program Components Impact Study (CIS) is part of the Sexual Risk Avoidance Education National Evaluation (SRAENE). It builds upon the findings from the CIS Proof of Concept Pilot Phase, which took place from January to May 2022 (approved under the umbrella generic for Formative Data Collections for ACF Program Support; OMB #: 0970-0531).


  • Description of Request: The purpose of SRAENE is to provide information on the design and implementation of Sexual Risk Avoidance Education (SRAE) programs, the effectiveness of program component refinements, and the ways grant recipients can use data and evidence to improve SRAE programming. SRAENE involves three main parts: (1) the Nationwide Descriptive Study of SRAE program implementation and youth outcomes, (2) the CIS, and (3) data capacity building and local evaluation support for SRAE grant recipients. This request is specific to the CIS, which seeks to refine and test improvements to one or more program components to ultimately improve youth outcomes. It builds upon the CIS Proof of Concept Pilot Phase, which tested promising program delivery strategies for facilitators of SRAE programming. Through the Proof of Concept Pilot Phase, we learned that one specific facilitation approach – using co-regulation strategies in the classroom – was promising but required further information about its implementation before a larger-scale impact evaluation could be conducted. As a result, we now propose to collect data on implementation and participation from up to nine sites that will implement the co-regulation strategies. Data collection will occur with facilitators and youth program participants to understand use of the strategy.


Data collected under this generic information collection request will be used to inform key areas outlined in ACF's co-regulation learning agenda,[1] to inform a potential future large-scale impact study of the strategies' effectiveness, and to inform the provision of technical assistance and related resources. We do not intend for this information to be used as the principal basis for public policy decisions.

A1. Necessity for Collection

Study Background

In February 2018, as part of the federal government’s efforts to support youth in making healthy decisions about their relationships and behaviors, Congress reauthorized Title V, Section 510 of the Social Security Act to fund the Sexual Risk Avoidance Education (SRAE) grant program. SRAE grants fund programs that teach adolescents to refrain from sexual activity. The Family and Youth Services Bureau (FYSB), Administration for Children and Families (ACF) of the U.S. Department of Health and Human Services (HHS), administers the program. SRAE programs also provide education on personal responsibility, self-regulation, goal setting, healthy relationships, a focus on the future, and preventing drug and alcohol use. The reauthorization included a requirement to evaluate the SRAE grant program. ACF awarded a contract for the SRAE National Evaluation (SRAENE) in 2018.

This request pertains to one of the required SRAENE activities—the Program Components Impact Study (CIS), which seeks to refine and test improvements to one or more components of programs to ultimately improve youth outcomes. Program components can include any part of a program, including curriculum content, supplementary activities, delivery, facilitation, setting, and dosage. The SRAENE is focusing on improving the program component of facilitation to improve youth outcomes. Data collection for the CIS Proof of Concept Pilot was approved under the umbrella generic for Formative Data Collections for ACF Program Support (OMB #: 0970-0531; November 26, 2021). For the Proof of Concept Pilot, the SRAENE team and ACF identified and pilot tested two promising facilitation strategies: (1) a set of co-regulation strategies to support facilitators with building youth's self-regulation skills (referred to as co-regulation)[2] and (2) an approach that teaches facilitators to assess youth attitudes and beliefs on constructs associated with the delay of sexual initiation (referred to as facilitator foundations). Although both strategies have been used by youth-serving programs, their use had been very limited among SRAE programs prior to the Proof of Concept Pilot Study.

The Proof of Concept Pilot took place in five sites from January to May 2022 with a goal of determining the feasibility of training facilitators to use the selected facilitation strategies. The Proof of Concept Pilot study used a rapid-cycle approach and focused on three objectives: (1) field test training of facilitators on each strategy and observe early implementation, (2) identify the ongoing support facilitators need to feel equipped to successfully implement the strategies, and (3) assess the feasibility of replicating the training and use of the strategies with fidelity on a larger scale.

Based on the results of the Proof of Concept Pilot, ACF concluded that one of the strategies tested – the co-regulation strategy – is promising but would benefit from further information on implementation before a larger-scale impact evaluation is conducted. As such, we propose to collect data on implementation and participation from up to nine sites that will implement the co-regulation strategies over a longer period than was possible during the Proof of Concept Pilot Study. This proposed data collection will (1) gather evidence to inform the direction of a future large-scale evaluation of the effectiveness of using co-regulation strategies, (2) inform guidance and products to support successful replication of co-regulation strategies by additional programs if the approach is rigorously evaluated, and (3) help further ACF's co-regulation learning agenda. This generic information collection request seeks approval to conduct data collection to support this formative evaluation phase. There are no legal or administrative requirements that necessitate this collection. ACF is undertaking the collection at the discretion of the agency.

A2. Purpose

Purpose and Use

The goal of the SRAENE CIS and this proposed data collection is to inform the development of ACF research, specifically research related to integrating co-regulation strategies into SRAE grant program improvement efforts. This formative evaluation phase of the study will contribute new knowledge and build preliminary evidence to address several key questions outlined in ACF's co-regulation learning agenda,[3] such as: What is the feasibility of implementing co-regulation strategies on a larger scale? What practice-based supports are needed? How are co-regulation strategies implemented in different contexts and among different subpopulations? This proposed formative evaluation will also examine the relationship between improved facilitation and youth outcomes, which is important for understanding whether improving any one component of a program – in this case, facilitation – can bolster youth outcomes.

This formative evaluation study seeks to document the implementation of the co-regulation strategies at a larger scale than the Proof of Concept Pilot Study and to assess associations between the strategies and proximal youth outcomes. The information collected through this study will be used to inform future research on the effectiveness of using co-regulation strategies on youth well-being and other program impacts. This specific request relates to the collection of qualitative and survey data that will be used to describe how facilitators integrate the co-regulation strategies into their classroom instruction, which ACF could use to inform planning for a possible future effectiveness evaluation of the co-regulation strategies. Analysis of other extant data not associated with this information collection request will be used to examine the associations between the use of co-regulation strategies and youth proximal outcomes. The findings from this formative evaluation will also be used to inform and support the development of technical assistance (TA) resources, such as guidance documents and videos, that can be made available to support the further replication that will be necessary for a future rigorous study of the strategies.

This proposed information collection meets the following goals of ACF’s generic clearance for formative data collections for research and evaluation (0970-0356):

  • inform the development of ACF research,

  • maintain a research agenda that is rigorous and relevant, and

  • inform the provision of technical assistance.

The information collected is meant to contribute to the body of knowledge on ACF programs and specifically to ACF’s learning agenda on co-regulation. It is not intended to be used as the principal basis for a decision by a federal decision-maker and is not expected to meet the threshold of influential or highly influential scientific information.


Research Questions or Tests

This formative evaluation phase will address two primary research questions, each with two secondary questions. As noted above, these questions reflect the information needs for ACF’s co-regulation learning agenda and begin to explore whether integrating co-regulation strategies can improve youth learning through improved facilitation.


  1. Does the use of the co-regulation facilitation strategy appear to support improvement in facilitation?

    a. What are facilitators' reactions to the training?

    b. How well do facilitators implement the co-regulation strategy? What are the successes and challenges associated with implementation?


  2. Does the use of the co-regulation facilitation strategy appear to support improvement in youth proximal outcomes?

    a. How do youth respond to SRAE programs when facilitators use the co-regulation strategy?

    b. Does implementation of the co-regulation strategy suggest change in proximal outcomes for youth, as measured by performance measures for participating study sites?

Study Design

To conduct the CIS formative evaluation, the team will recruit up to nine sites to receive training on the SRAE co-regulation strategies. For more information about the sites to be recruited, see Supporting Statement B, Section B2. Data will be collected from all facilitators in each site (estimated at four facilitators per site), and from youth in up to two focus groups per site. We estimate the data collection period will take 32 weeks, accounting for training the facilitators and collecting data in fall 2022 and spring 2023.

Program facilitators working in the nine selected sites will be asked to complete several brief surveys: a facilitator pre-training survey (Instrument 1) one week before the start of the facilitator training; a facilitator post-training survey (Instrument 2) immediately following the last training session; and two follow-up facilitator surveys (Instrument 3), one at the end of the fall semester (December 2022) and one at the end of the spring semester (May 2023). The pre- and post-training surveys capture the facilitators' understanding of and confidence in using the co-regulation strategy before and after training, collect information on other trainings and experiences the facilitators have had previously, ask about their experiences with the training, and capture their perceptions of their own self-regulation skills. The follow-up surveys ask about the facilitators' perceptions of their own self-regulation skills and their confidence in using the co-regulation strategy.

Once during fall 2022 and again in spring 2023, facilitators will be asked to participate in a one-hour, in-person interview (Instrument 4. Facilitator Interview Protocol) to discuss their perceptions of how the strategies are working, areas where they need more training or support, their ability and comfort in integrating the strategies into the curriculum, and their comfort and confidence in using the strategies in the classroom. Facilitators will also be asked to complete a daily log (Instrument 5. Facilitator Implementation Log) in which they will indicate what strategies they used during facilitation that day and the successes and challenges of doing so.

The study team also plans to conduct up to two focus groups with youth per site. We plan to include up to eight youth per group, drawn from the classes in which facilitators are implementing the co-regulation strategies. The youth focus groups will gather youths' reactions to the content, setting, and delivery of the program and their relationship with the facilitator(s) (Instrument 6. Youth Focus Group Protocol). Youth participating in focus groups must have parental consent and must themselves assent to participate (Appendix B. Youth Focus Group Consent and Assent Forms).

The focus groups will take place in person at the program sites. The in-person format is important for ensuring that focus group participation does not vary demographically due to differences in students' digital technologies and online access.[4,5] Conducting the groups in person avoids the potential exclusion of youth who lack personal devices such as laptop computers or tablets with cameras, who have limited data plans, or whose internet bandwidth and service may be unreliable, as is common in rural settings. Focus group engagement and interaction, which are important for procuring verbal data from youth participants, will also be more effective in person than remotely. A recent survey shows socioeconomic differences in students' self-reported engagement with remote learning,[6] and data from the current SRAENE Grantee COVID-19 Interviews (approved November 1, 2021, under the umbrella generic for Formative Data Collections for ACF Program Support; OMB #: 0970-0531) consistently showed SRAE program grant directors reporting that remote learning did not provide the same level of youth engagement in programming as in-person delivery. If necessary, a virtual video conference option will be available should local COVID-19 safety measures require it. Supporting Statement Part B, Section B2 further describes the study's methods, design, and sample.

Table A.1 summarizes the data collection by instrument, including the participants, content, purpose, and the mode and duration of each collection. To understand program implementation experiences, we are focusing data collection on the facilitators who interact directly with youth and on the youth themselves.

Table A.1. Study design summary

Instruments 1, 2, and 3. Facilitator Pre-training, Post-training, and Follow-up Surveys

Respondents: Program facilitators

Content: Facilitators' understanding of the strategy tested and perceived self-efficacy for implementing the strategy

Purpose: To capture facilitator knowledge of content and understanding of the strategy before and after training, and to explore perceptions of and experiences with training and efficacy in using the strategies over time

Mode: Web

Duration: 25 minutes total


Instrument 4. Facilitator Interview Protocol

Respondents: Program facilitators

Content: Feedback on training, guidance, and materials; use of strategy; youth responsiveness; effectiveness of strategy; and suggestions for improvement

Purpose: To determine how the strategy is being used and facilitators' perceptions of and experiences with using the strategy

Mode: In person

Duration: 60 minutes


Instrument 5. Facilitator Implementation Log

Respondents: Program facilitators

Content: Adherence and adaptations to program delivery

Purpose: To determine the strategies facilitators implement, their reactions to implementing the strategy, and their perceptions of how youth reacted to the strategy

Mode: Web

Duration: Daily input during selected weeks


Instrument 6. Youth Focus Group Protocol

Respondents: Youth program participants

Content: Experiences with the program and overall satisfaction

Purpose: To gauge youth participants' perceptions of the program's climate, use of the strategies, their interactions with and perceptions of the facilitators, and their engagement in the program

Mode: In person

Duration: 60 minutes




Other Data Sources and Uses of Information

The study team will also use existing attendance data and data from the program entry and exit surveys completed by youth, which the sites are required to administer and report to ACF as performance measures.[7] The study team will use these additional measures to examine the associations among the strategies, youth's program experiences, and youth's proximal outcomes.





A3. Use of Information Technology to Reduce Burden

The study team plans to use information technology wherever possible. The surveys and facilitator log will be available via the web and can be completed using a tablet, smartphone, desktop computer, or laptop. Programming the surveys for web-based administration allows data cleaning specifications, such as prompts for missing or illogical responses, to be embedded into the survey. Survey participants can quickly make corrections as they respond, reducing the need for data cleaning follow-up contacts after the survey is submitted.

The study team plans to conduct youth focus groups in person but is prepared to conduct them remotely via videoconference if necessary (for example, because of COVID-19 restrictions).

A4. Use of Existing Data: Efforts to reduce duplication, minimize burden, and increase utility and government efficiency

None of the instruments ask for information that can be reliably obtained through other sources.

A5. Impact on Small Businesses

The programs participating in the study will be small, non-profit organizations. The SRAENE team will request information required only for the intended use. The burden for respondents will be minimized by restricting the interview and survey length to the required minimum, conducting interviews at times convenient for the respondents, and not requiring additional record-keeping on the part of the programs.

A6. Consequences of Less Frequent Collection

This is a one-time data collection.

A7. Now subsumed under 2(b) above and 10 (below)



A8. Consultation

Federal Register Notice and Comments

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published two notices in the Federal Register announcing the agency's intention to request an OMB review of the overarching generic clearance for formative information collection. The first notice was published on November 3, 2020 (Volume 85, Number 213, page 69627) and provided a sixty-day period for public comment. The second notice was published on January 11, 2021 (Volume 86, Number 6, page 1978) and provided a thirty-day period for public comment. ACF did not receive any substantive comments.

Consultation with Experts Outside of the Study

Several experts in SRAE programming and research provided consultation to the study team and ACF throughout the CIS Proof of Concept Pilot Study. Feedback received clarified the purpose and usefulness of this next proposed phase of the study.

A9. Tokens of Appreciation

No tokens of appreciation for study participants are planned for this data collection.

A10. Privacy: Procedures to protect privacy of information, while maximizing data sharing

Personally Identifiable Information

This data collection effort will collect personally identifiable information (PII) from facilitators (names, work email addresses, and telephone numbers) and program participants (names, email addresses, and telephone numbers) to obtain consent to participate in data collection activities and arrange data collection (including scheduling and sending invitations/reminders for data collections).

Information will not be maintained in a paper or electronic system from which data are actually or directly retrieved by an individual’s personal identifier.

Assurances of Privacy

All study participants will be informed of the planned uses of the data, that their participation is voluntary, and that the study team will keep their information private to the extent permitted by law. The study team will discuss issues of privacy during training sessions with staff who work on the project. The contractor, Mathematica, requires that staff complete online security awareness training when they are hired and participate in annual refresher training thereafter. Training topics include the security policies and procedures outlined in the Mathematica Corporate Security Manual. All records containing data will be transferred using a secure file transfer protocol site because the files may contain PII, such as facilitator names or, in the case of the youth focus groups, youth and parent names. As specified in the contract, Mathematica will protect respondents' privacy to the extent permitted by law and will comply with all federal and departmental regulations for private information. In addition, the study leaders at Mathematica will conduct project-specific trainings for all staff who work on the study to communicate expectations for privacy, informed consent, and data security procedures.

Parent consent and youth assent forms (Appendix B. Youth Focus Group Consent and Assent Forms) inform parents and youth that the youth are invited to participate in focus groups, that the information requested from them is for program improvement purposes only, and that their identities will not be disclosed to anyone outside the study team. With participants' permission, the focus groups will be recorded. Participants will be assured that the recordings will be retained only until they are transcribed and that the transcription summaries will not reveal their identities. All participants (and their parents or legal guardians) must read and acknowledge the consent and assent forms before participating in the data collection. The study will be reviewed by Mathematica's institutional review board (IRB), the Health Media Lab. Outreach and data collection will not begin until IRB approval has been received.

Data Security and Monitoring

As specified in the contract, the contractor shall protect respondents’ privacy to the extent permitted by law and will comply with all federal and departmental regulations for private information. The contractor has developed a Data Security Plan that assesses all protections of respondents’ PII. The contractor will ensure that all of its employees, subcontractors (at all tiers), and employees of each subcontractor who perform work under this contract and subcontract receive training on data privacy issues and comply with the above requirements. All Mathematica staff must sign an agreement to (1) maintain the privacy of any information from individuals, businesses, organizations, or families participating in any projects conducted by Mathematica; (2) complete online security awareness training when they are hired; and (3) participate in a refresher training annually.

As specified in the evaluator's contract, the contractor will use encryption compliant with the Federal Information Processing Standard (Security Requirements for Cryptographic Modules, as amended) to protect all sensitive information during storage and transmission. The contractor will securely generate and manage encryption keys to prevent unauthorized decryption of information, in accordance with the Federal Information Processing Standard. The contractor will incorporate this standard into its property management and control system and establish a procedure to account for all laptop and desktop computers and other mobile devices and portable media that store or process sensitive information. The contractor will secure any data stored electronically in accordance with the most current National Institute of Standards and Technology requirements and other applicable federal and departmental regulations. In addition, the contractor's data safety and monitoring plan includes strategies for minimizing, to the extent possible, the inclusion of sensitive information on paper records and for protecting any paper records, field notes, or other documents that contain sensitive information through secure storage and limits on access.

No information will be given to anyone outside the SRAENE study team and ACF. All PII, typed notes, and audio recordings of interviews and focus groups will be stored in restricted, encrypted folders on Mathematica’s network, which is accessible only to the study team.

A11. Sensitive Information[8]

There are no sensitive questions as part of the facilitator surveys, facilitator interviews, or facilitator logs (Instruments 1, 2, 3, 4, and 5).

The youth focus group protocol (Instrument 6) includes questions about youth impressions of the SRAE programs, their overall reactions to the facilitation strategies used, and their relationships with staff and peers, which some program participants might consider sensitive. However, these questions are essential for understanding how facilitators are using the co-regulation strategies. The study team will obtain active parental consent and youth assent in all sites and will inform potential study participants of the purpose of the data collection, stressing that participants may refuse to answer any question. Additionally, the protocol and all related materials, such as the consent form, are currently under review by Mathematica's IRB.

A12. Burden

Explanation of Burden Estimates

In Table A.2 we summarize the estimated reporting burden and costs for each instrument. The survey estimates include time for respondents to review the instructions, complete and review their responses, and transmit their responses. The focus group time includes time for participants to review the instructions and participate in an in-person or virtual focus group. The study team expects the total annual burden to be 410 hours for all the instruments in this information collection request. Figures are estimated as follows:

  1. Facilitator Pre-Training Survey. The survey will be administered to all program facilitators sampled to participate in the study (N = 36) just prior to starting the co-regulation training. Because facilitators will take the survey just before the start of their training, we anticipate a 100 percent participation rate for a total of 36 completed surveys. The facilitator pre-training survey is estimated to take 10 minutes to complete via the web.

  2. Facilitator Post-Training Survey. The survey will be administered to all program facilitators in the study (N = 36) immediately following the last session of the co-regulation training. Because this survey will be administered during the training session, we anticipate a 100 percent participation rate for a total of 36 completed surveys. The facilitator post-training survey is estimated to take 5 minutes to complete via the web.

  3. Facilitator Follow-up Survey. The survey will be administered to all program facilitators in the study (N = 36) at the end of the fall semester (December 2022) and at the end of the spring semester (May 2023). We estimate that all program facilitators trained on the co-regulation strategies will complete the survey at both time points, for a total of 72 completed surveys (36 per administration). Each facilitator follow-up survey is estimated to take 5 minutes to complete via the web.

  4. Facilitator Interview. The interview will be completed with all program facilitators in the study (N = 36). The interview is estimated to take 60 minutes and will occur in person (unless local COVID regulations or other restrictions during that time require virtual administration). There will be two rounds of interviews – one round in fall 2022 and another round in spring 2023. We estimate that all program facilitators trained on the co-regulation strategies will participate in an interview.

  5. Facilitator Implementation Log. The implementation logs will be web-based, and facilitators will complete them daily during the weeks they are delivering programming. All facilitators in the study (N = 36) will complete implementation logs 5 times per week during 16 selected weeks across two semesters, for 80 logs per facilitator. Each log is estimated to take 3 minutes to complete.

  6. Youth Focus Group. Youth recipients of SRAE programming who have parental consent and provide assent will be selected to participate in focus groups lasting no more than 60 minutes per group. The study team will conduct two focus groups at each of the 9 sites. Each focus group will have no more than 10 students participating, for a total of 180 students (9 sites × 2 focus groups per site × 10 students per group = 180 students).

Estimated Annualized Cost to Respondents

The study team expects the total annual cost to be $8,025 for all instruments in the current information collection request. The Occupational Employment Statistics (2021)[9] from the Bureau of Labor Statistics have been used to estimate the average hourly wage for the participants of this study and to derive total annual costs. For each instrument listed in Table A.2, the study team calculated the total annual cost by multiplying the annual burden hours by the average hourly wage, as follows (a worked example appears after the list):

  • The mean hourly wage for educational instruction and library workers (Occupational Code 25-9099) of $23.36[10] was used for the program facilitators, who complete the facilitator pre-training, post-training, and follow-up surveys, the facilitator interviews, and the facilitator implementation logs.

  • The average hourly wage for high school–age youth was estimated at $14.68. This hourly wage was based on median weekly earnings of $587 for youth ages 16 to 19 who work a 40-hour workweek.[11]
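As an illustration of how the burden and cost figures in Table A.2 are derived, consider Instrument 5, the Facilitator Implementation Log:

36 facilitators × 80 log entries per facilitator × 0.05 hours per entry = 144 burden hours

144 burden hours × $23.36 per hour ≈ $3,364 in annual respondent cost

Summing the corresponding figures across all six instruments yields the totals of approximately 410 burden hours and $8,025 in annual respondent costs shown in Table A.2.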


Table A.2. Total burden requested under this information collection

Instrument 1. Facilitator Pre-training Survey: 36 participants; 1 response per participant; 0.16 hours per response; 5.76 total/annual burden hours; $23.36 average hourly wage; $135 total annual participant cost

Instrument 2. Facilitator Post-training Survey: 36 participants; 1 response per participant; 0.08 hours per response; 2.88 total/annual burden hours; $23.36 average hourly wage; $67 total annual participant cost

Instrument 3. Facilitator Follow-up Survey: 36 participants; 2 responses per participant; 0.08 hours per response; 5.76 total/annual burden hours; $23.36 average hourly wage; $135 total annual participant cost

Instrument 4. Facilitator Interview Protocol: 36 participants; 2 responses per participant; 1 hour per response; 72 total/annual burden hours; $23.36 average hourly wage; $1,682 total annual participant cost

Instrument 5. Facilitator Implementation Log: 36 participants; 80 responses per participant; 0.05 hours per response; 144 total/annual burden hours; $23.36 average hourly wage; $3,364 total annual participant cost

Instrument 6. Youth Focus Group Protocol: 180 participants; 1 response per participant; 1 hour per response; 180 total/annual burden hours; $14.68 average hourly wage; $2,642 total annual participant cost

Estimated total annual burden: 410 hours; estimated total annual participant cost: $8,025

Note: Participant and response counts are totals over the request period.

A13. Costs

There are no additional costs to respondents.

A14. Estimated Annualized Costs to the Federal Government

The estimated total cost to the federal government for this study is $1,221,637 (Table A.3). This includes the costs of collecting and processing the data, conducting analysis, and preparing reports.

Table A.3. Estimated total cost by category

Fieldwork: $934,184

Analysis and reporting: $287,453

Total/annual costs over the request period: $1,221,637



A15. Reasons for changes in burden

This is for an individual information collection under the umbrella formative generic clearance for ACF research (0970-0356).

A16. Timeline

Table A.4 contains the timeline for data collection, analysis, and reporting activities for the CIS. The study team expects to collect data from fall 2022 through spring 2023, followed by analysis in spring and summer 2023 and reporting in summer 2023.

Table A.4. Schedule for CIS formative data collection and reporting

Data collection:

Facilitator Pre-Training Survey: Late summer 2022

Facilitator Post-Training Survey: Late summer 2022

Facilitator Follow-up Survey: Late fall 2022 and late spring 2023

Facilitator Interviews: Fall 2022 and spring 2023

Facilitator Implementation Logs: Fall 2022 and spring 2023

Youth focus groups: Spring 2023

Data analysis: Spring and summer 2023

Reporting: Summer 2023

Note: Timing is after obtaining OMB approval.


A17. Exceptions

No exceptions are necessary for this information collection.

Attachments

Appendices

Appendix A: Study Notification and Reminder Materials

Appendix B: Youth Focus Group Consent and Assent Forms

Appendix C: SRAENE CIS Surveys Crosswalk and Research Questions

Instruments

Instrument 1. Facilitator Pre-training Survey

Instrument 2. Facilitator Post-training Survey

Instrument 3. Facilitator Follow-up Survey

Instrument 4. Facilitator Interview Protocol

Instrument 5. Facilitator Implementation Log

Instrument 6. Youth Focus Group Protocol

[1] https://www.acf.hhs.gov/opre/blog/2022/03/co-regulation-connection-human-services-developing-learning-agenda

[2] A prior formative evaluation completed as part of the Self-Regulation Training Approaches and Resources to Improve Staff Capacity for Implementing Healthy Marriage Programs for Youth (SARHM, OMB Control Number 0970-0355) guided the development of the co-regulation strategies and suggested they can be integrated into youth-serving programs, such as SRAE.

[3] https://www.acf.hhs.gov/opre/blog/2022/03/co-regulation-connection-human-services-developing-learning-agenda

[4] Domina, T., Renzulli, L., Murray, B., Garza, A.N., and Perez, L. (2021). "Predicting Successful Engagement with Online Learning during COVID-19." Socius, 7, 1–15.

[5] Vigdor, J.L., Ladd, H.F., and Martinez, E. (2014). "Scaling the digital divide: Home computer technology and student achievement." Economic Inquiry, 52(3), 1103–1119.

[6] Barnum, M. and Bryan, C. (2020). "America's great remote-learning experiment: What surveys of teachers and parents tell us about how it went." Chalkbeat.

[7] As approved under OMB #0970-0536, expiration date 11/30/2022. An extension request for this information collection is currently underway.

[8] Examples of sensitive topics include (but are not limited to): Social Security number; sexual behavior and attitudes; illegal, anti-social, self-incriminating, and demeaning behavior; critical appraisals of other individuals with whom respondents have close relationships (e.g., family, pupil-teacher, employee-supervisor); mental and psychological problems potentially embarrassing to respondents; religion and indicators of religion; community activities that indicate political affiliation and attitudes; legally recognized privileged and analogous relationships, such as those of lawyers, physicians, and ministers; records describing how an individual exercises rights guaranteed by the First Amendment; receipt of economic assistance from the government (e.g., unemployment benefits, WIC, or SNAP); and immigration/citizenship status.

[9] U.S. Bureau of Labor Statistics. "May 2021 National Occupational Employment and Wage Estimates." Available at https://www.bls.gov/oes/current/oes_nat.htm.


