Grantee Leadership Interview Initial

Cross-site Study Data for Improving Implementation Evaluation among Office of Adolescent Health (OAH) Teen Pregnancy Prevention (TPP) Grantees to inform National Implementations (IMAGIN)

Instrument 1A_9.24.19_Clean

OMB: 0990-0469



OMB# 0990-XXXX

Expiration Date: XX/XX/XXXX

INSTRUMENT 1A

GRANTEE PROGRAM LEADERSHIP STAFF INTERVIEW TOPIC GUIDE (INITIAL)

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number for this information collection is 0990-XXXX. The time required to complete this information collection is estimated to average 1 hour 30 minutes per respondent, including the time to review instructions, search existing data resources, gather the data needed, and complete and review the information collection. If you have comments concerning the accuracy of the time estimate(s) or suggestions for improving this form, please write to: U.S. Department of Health & Human Services, OS/OCIO/PRA, 200 Independence Ave., S.W., Suite 336-E, Washington D.C. 20201, Attention: PRA Reports Clearance Officer.


INSTRUMENT 1A. TOPIC GUIDE FOR INITIAL INTERVIEWS WITH GRANTEE LEADERSHIP

INTRODUCTION

Thank you for agreeing to meet with us. I am from Mathematica Policy Research. I’m part of an independent research team that is studying the implementation of the programs funded by the Office of Population Affairs (OPA) Teen Pregnancy Prevention (TPP) grant in [2018/2019].

The purpose of our discussion today is to learn more about your experiences planning and implementing [insert program name] in Phase I of your [2018/2019] OPA grant. Your point of view is valuable. The interview should last about [90 minutes or 60 minutes], and we will take notes during our conversation so we can accurately represent your experience and views in our reporting. We would also like to record this discussion to make sure our notes are accurate, if that is okay with you.

Your responses will be kept private, and the notes and recording from this discussion will not be shared with anyone beyond the research team. The recording will be erased once we have finalized our notes. We will combine most information from this conversation with information from other discussions we conduct.

We will report most information based on these discussions in the aggregate. We may use quotes to illustrate findings, but if we do, we will not report any information that will allow a quote to be identified with you.

Please keep in mind:

There are no right or wrong answers to these questions. We just want to learn about your experience and perspective.

Your participation in this conversation is completely voluntary. You don’t have to answer any questions you don’t want to answer during our discussion today.

Do you have any questions for us before we get started?

PRE-DISCUSSION QUESTIONS

I want to emphasize again that there are no right or wrong answers to our questions. By voluntarily agreeing to participate in this study, you are agreeing to answer these questions with responses that are true for you.

Do you understand the purpose of our conversation today?

Do you have any questions before we begin?

PROJECT CONTACT INFORMATION

If you have further questions about this project or if you have a research-related problem, you may contact the project director, Dr. Jean Knab, at (609) 945-3367 or JKnab@mathematica-mpr.com.


TOPICS FOR DISCUSSION

[Note to interviewer: Topics and questions will be tailored to each grantee’s individual context and stage of readiness, based on program materials, TA calls, and the quarterly reports. Some topics may only be discussed in the first interview with grantee leaders and can be dropped from the follow-up discussion, as needed. These topics are denoted by “(initial interview).”]

A. Program readiness: Program design and readiness for implementation (initial interview)

1. Theory of change

  • Description of the program's theory of change

  • Description of the program's philosophy, values, and principles

  • Stage of intervention development at the beginning of grant

  • Target outcomes of program

  • Process for selecting or developing the program(s)

  • Target population(s) the program is intended for

  • Implementation setting(s)

2. Core components

  • Essential elements or core components of the program

  • Protective factors that the program targets, and strategies to address them

  • How well elements of optimal health, risk avoidance, and/or risk reduction are incorporated into the intervention

  • Program structure (frequency, duration, number of sessions, etc.)

3. Standardized program operations

  • Description of program materials (e.g., facilitator manual, student workbooks, handouts)

  • Defined fidelity and quality benchmarks

  • Defined staff requirements and training processes

B. Organizational readiness: Preparation and planning for implementation (initial interview)

1. Enabling organizational context

  • Organization's history of working in target communities

  • Process for assessing community needs and the demand for program and/or services

  • Whether and how input from youth, facilitators, and community stakeholders was incorporated into program planning

  • Key challenges or barriers to getting community support or buy-in

  • Other similar programs and services available in the community

  • Leadership support and buy-in to the need for the program (e.g., staff involvement in decision making, or presence of program "champions" and who they are)

  • How the intervention fits within the current organization structure and mission

  • Extent of available program resources and funding at the time of implementation

  • Extent that a shared vision was used to guide intervention planning among organization or agency staff

  • Process for identifying evidence-based or evidence-informed practices

  • Use and utility of the SMARTool for program selection (the Tool to Assess the Characteristics of Effective Sex and STD/HIV Education Programs)

  • Factors that helped or hindered intervention planning

  • Frequency and type of planning meetings

  • How involved staff at various levels were in developing, selecting, and/or planning the intervention

2. Infrastructure and implementation supports

  • Description of partnerships utilized during Year 1 (e.g., key partners, process for involvement)

  • Procedures for recruiting and hiring staff

  • Staff qualifications and credentials

  • Description of the current staffing structure and capacity

  • Timeline for recruitment and hiring

  • Descriptions and benchmarks of what staff are expected to do for optimal implementation (i.e., staff performance)

  • Extent to which implementation procedures and training are operationalized for replication

  • Plans for initial (and refresher) staff training for optimal implementation (e.g., timeline, content, and delivery of training)

  • Plans for individual and group supervision

  • Technical assistance and other supports available to staff during implementation

  • Plans for recruiting and engaging youth (e.g., inclusion/exclusion criteria, setting, methods of outreach)

C. Organizational readiness: Preparing for program evaluation

1. Continuous quality improvement (CQI)

  • How extensively CQI approaches were utilized in developing and refining programs

  • Process for incorporating feedback from community stakeholders, youth, and families into program improvement during Phase I

  • Description of data that were used for CQI

  • How the collected data were used for program improvement

2. Formative evaluation

  • Description of the measures and data used for the formative evaluation

  • Description of how data were used to influence planning in Phase I

  • Advantages of having the Phase I funding to prepare for a summative evaluation

  • Disadvantages of having the Phase I funding

  • Accomplishments due to the Phase I funding that would not have been feasible otherwise

  • Plans for scale-up, if any

3. Summative evaluation

  • Plan for the summative evaluation (e.g., timeline, target outcomes, plans for data collection and analysis)

  • Factors that were helpful or challenging in designing a rigorous evaluation plan

  • Extent of community commitment to rigorous evaluation

  • Inclusion of an economic evaluation

4. Process evaluation

  • Plans for collecting process data: types of data included, and who will collect them

  • Plans to collect data on coaching or technical assistance to assess staff needs and effectiveness

  • Fidelity measurement overview (e.g., measure, analysis of data, using data for program improvement)

D. Promising evidence: Early implementation experience

1. Implementation supports

  • Overview of staff recruitment and hiring

  • Staff training overview and experiences

  • Changes in staff recruitment, hiring, or training since program start

  • Lessons learned from staff training

  • Successes and challenges of supervision structure
2. Community need and demand

  • Main unmet needs in the target communities

  • Community perceptions about and desire for programs focused on preventing teen pregnancy

  • Efforts to engage and request input from local stakeholders and beneficiaries on the demand for the program

  • Response from youth and families during Phase I (i.e., descriptive or quantitative feedback from end users on program fit, challenges, successes)

3. CQI

  • Staff experiences with the CQI process

  • Data collected for CQI and how they were used

  • Adjustments or changes to the original CQI plan

  • How program was improved as a result of CQI

4. Fidelity

  • Early data on fidelity measures and how these were used

  • Data on program quality and how they were used

  • Adjustments made to program or implementation infrastructure based on fidelity data

5. Participant outcomes

  • Indicators of youth engagement and participation, and how these changed over time

  • Reasons for changes in response from youth and families over time

E. Lessons learned

  • Overall lessons related to program and organizational readiness for implementation and evaluation

  • Successes, challenges, and lessons based on the phased grant structure


