Supporting Justification for OMB Clearance of Evaluation of Pregnancy Prevention Approaches (OMB Control #0990-xxxx)
Part B: Statistical Methods for Implementation Data Collection
October 2010
B1. Respondent Universe and Sampling Methods
B2. Procedures for Collection of Information
B3. Methods to Maximize Response Rates and Deal with Nonresponse
B4. Tests of Procedures or Methods to be Undertaken
B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
B1. Respondent Universe and Sampling Methods

This ICR specifically requests clearance to collect information in the following ways:
Interviews with program staff, and with community members where programs are implemented, using the Master Topic Guide (see Attachment E);
Focus groups with front-line staff, using a Discussion Guide (see Attachment F);
Focus groups with youths in both the program and control conditions, using a Discussion Guide (see Attachment G); and
Interviews with program staff and community members in the control condition, when appropriate, using a topic guide (see Attachment H).
In the PPA evaluation, OAH and ACF will identify eight study sites that will implement different pregnancy prevention approaches. In approximately six of these sites, the programs to be tested are based in high schools or middle schools; in the other sites, the programs to be tested will be operated in community-based organizations (CBOs). All eight sites will be included in the implementation study. The eight sites will be recruited purposefully, rather than selected randomly. They are thus intended to serve as tests of a range of program types and will not be statistically representative of a larger universe of pregnancy prevention programs.
Within each site, implementation study data will be collected from staff and community members whose varying roles and responsibilities make them knowledgeable about the origins and operations of their program and the challenges it has encountered. Focus groups will also be held with 8 to 12 youths per group who have varying levels of involvement in the program (or, in some sites, are in the control group) and who agree to participate in a focus group discussion. The youths will not be randomly selected for the focus groups.
B2. Procedures for Collection of Information

Which topics will be relevant and important, and who is best positioned to provide information on each aspect of program implementation, will vary from site to site. For that reason, we established an overall topic guide (Attachment E) to organize data collection and documentation of each site's implementation. The master topic guide identifies the information that will be gathered to document the program plans (background, program design and theory of change, and program context); describe program implementation (funding, infrastructure, staffing, training and technical assistance, outreach and recruitment, enrollment, and key program features); assess implementation fidelity and quality (youth participation and engagement, fidelity benchmarks, quality indicators, and implementation challenges and successes); and describe the control condition (e.g., differences in program experiences).
These topics will be explored through six main data sources: program documents; site documents and records; evaluation team notes from the site selection and monitoring process; interviews with key informants during site visits and telephone discussions (including focus groups with participants and front-line staff); observation of program activities; and the baseline and follow-up surveys of the evaluation sample (addressed in other OMB submissions). Although the general topic guide will be tailored to the circumstances and design of each site's program, we can project in general terms how we will use these data sources to explore the major topics of interest (see Attachment A).
The most intensive data collection for the implementation study will take place during two (or, in some cases, three) visits to each evaluation site, each lasting two or three days. The first visit will occur early in the period of program operations. The second visit will be conducted during the subsequent year, with the exact timing depending on the length of the program and the schedule of its activities. In some sites a third visit may be useful, but in most cases we expect to conduct two visits.
Two-person teams led by the site study leader will conduct each visit. The site study leader will be a senior project member who can communicate clearly, organize work effectively, think analytically, and remain objective and professional. To promote objectivity, the senior staff members who led recruitment of the site and developed the site agreement will not be eligible to serve as the site study leader, although the site study leader will consult them for background information. The two-person approach will increase the effectiveness of probing during interviews and the accuracy of the information obtained. It also builds in flexibility to accommodate site schedules, allowing site visitors to split up and cover different interviews if the need arises.
Preparation for site visits will involve customizing the plan and protocols to each site, in two steps. The first step is to prepare a site-specific logic model or “program framework.” Beginning with a general template, the site study leader will fill in what is known at the outset of data collection about the logic of a site’s program: the planned inputs, contextual factors and external influences, and program vision, as well as the intermediate and longer-term outcomes the program seeks to affect. This process will highlight gaps in our understanding of what the program developer and the site leaders believe are the processes for affecting youth behavior and the factors that will affect the program’s success. It will also provide a basis for identifying implementation fidelity benchmarks.
The second step in customizing, even before a site’s program is implemented, is creating a preliminary program profile and a preliminary control condition profile. The site study leader will review existing documents available from the program developer and program site leaders, as well as notes from discussions during the site selection and readiness assessment process, to gather as much information as possible on the topics listed in the topic guide. These documents might include implementation plans, grant applications, program budgets or justifications, communications with PPA evaluation staff, staffing plans, and materials used to communicate about the program. The site study leader will use this information to create the beginnings of a site/program profile and a control condition profile. The entries in these profiles, and the gaps in the partially completed profiles, will focus our attention on what needs to be investigated or confirmed in further data collection. The site study leader will use these profiles to plan site visits, so that individual and small-group interviews focus on information that cannot be obtained from other sources.
The site study leader will then create customized discussion guides to ensure that we collect the needed information efficiently and consistently from the most appropriate respondents. The site-specific plan will include a customized topic guide, which may elaborate on or provide “local language” versions of topic definitions and may eliminate some topics as not relevant or already thoroughly explored. The plan will identify which information will be collected from which sources, the key respondents who should be interviewed, and other sources that should be tapped. Implementation study leaders will review the site visit plans and customized discussion guides for each site to help ensure consistency across sites and to facilitate inclusion of topics that address cross-site issues emerging from early visits.
On-site data collection will be done in five ways. We will conduct interviews with key personnel, group discussions with front-line staff, focus group discussions with participants, observation of program activities, and discussions with personnel at control group locations (in sites with cluster random assignment). Collecting data from diverse respondents who may have different information or perspectives will allow us to triangulate information and gain a more complete understanding of program implementation.
Interviews. During each visit, site visitors will conduct individual and small-group interviews with people with the following roles or perspectives:
Program leadership at the site (staff with major responsibility for implementing and delivering the program)
Representative of the sponsoring organization (school district, nonprofit organization, public agency)
Key school or community-based organization representatives (depending on site locations)
Program partners, including funders and other parties involved in delivering service components
Community members knowledgeable about related services for adolescents
The interviews will be conducted with tailored protocols based on the master topic guide, customized by site study leaders. Site visitors will request copies of any documents identified in these interviews that might provide additional information about relevant topics. In addition, site visitors will work with other evaluation team members to make or facilitate requests for program records related to the participation of evaluation sample youths.
Group Discussions with Front-Line Staff. The individuals who lead activities and provide services to youths have an important role in the program. They have a unique perspective on the training and support they received for carrying out their responsibilities, the implementation of some key program features, and the strengths and needs of the youths with whom they work. Discussing these topics directly with front-line staff will ensure that our understanding of each program is informed by the experiences of those responsible for implementing key activities.
We will invite as many front-line staff (and others who conduct program activities) as practical to participate in these discussions. The site visitors will work with program leaders to arrange the discussions at a convenient time and location. These discussions will be guided by a protocol (Attachment F), which will be customized in advance of the site visit.
Focus Groups with Participants. Another perspective crucial for understanding program implementation is that of the youths who participated in the program. We will convene a group of participating youths and talk with them about their decisions to participate in specific program activities, their opinions about the activities in which they participated, the aspects of the program that they liked or would change, and their participation in other similar programs.
Site visitors will work with program staff to identify and recruit about 12 program participants per focus group, from multiple program locations if feasible or from just one or two if locations are too dispersed. Focus groups will be conducted using a general guide (Attachment G) which, like other protocols, will be tailored to each site to reflect proper program nomenclature and the program design. Food will be provided.
Program Observation. Observing some program activities can help deepen site visitors’ understanding of information obtained in interviews and group discussions and provide illustrations of the way the program works. In visits after the first, site visitors will observe typical program activities with youth and record descriptive information. The site visitor will record information about the setting, staffing, participants, topics covered and messages conveyed, and engagement of staff and participants in the activity.
To the extent feasible, site visits will be scheduled so that site visitors can observe program activities in several program locations, selected in consultation with program leaders to represent a range of activities, settings, and participant characteristics. For example, if the program and evaluation are being conducted in multiple schools within a school district, site visitors will arrange to observe activities in schools that illustrate the variation in program staffing, school characteristics, and youth characteristics. Information from the observations will not be used to rate the program or to generate outcome or mediating variables; it will be used to help site visitors understand how the program works and to illustrate information in the program/site profile or evaluation report.
Discussions about the Counterfactual. Site visits will clarify the counterfactual services available to control group youth. Two scenarios are possible. If the site evaluation design involves random assignment of schools or other cluster units, our field staff will conduct interviews with key personnel at those schools or other clusters, using the protocols in Attachments F and H; field staff may also conduct focus groups with youth. If the site design involves random assignment of individuals, we will explore the range of services available to control group youth by interviewing relevant lead staff at the organizations viewed as the major sources of alternative services, and possibly by conducting focus groups with youth. We will identify those organizations through our contacts with site staff and through our own independent web-based research about the site. Those interviews will be guided by the outline in Attachment H.
B3. Methods to Maximize Response Rates and Deal with Nonresponse

Site visits will be planned well in advance so that all identified respondents can participate in individual or group interviews, as appropriate. We anticipate that refusals to participate and absences will be rare.
B4. Tests of Procedures or Methods to be Undertaken

No pretest of the implementation study protocols has been conducted.
B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The PPA implementation study site visits will be conducted by OAH and ACF’s contracting organization, Mathematica Policy Research, and its subcontractors, Child Trends and Twin Peaks Partners, LLC. Individuals whom OAH and ACF consulted on the collection and/or analysis of the implementation data include the contractor staff listed below, as well as members of the project Technical Work Group (TWG), who attended a TWG meeting in spring 2010 to review and provide input on the overall project design.
Alan Hershey
Mathematica Policy Research, Inc.
P.O. Box 2391
Princeton, NJ 08543
(609) 275-2384
Ellen Kisker
Twin Peaks Partners, LLC
7639 Crestview Drive
Longmont, CO 80504
(303) 834-8364
Alicia Meckstroth
Mathematica Policy Research, Inc.
P.O. Box 2391
Princeton, NJ 08543
(614) 505-1401
Rachel Shapiro
Mathematica Policy Research, Inc.
P.O. Box 2391
Princeton, NJ 08543
(609) 936-279-6384
Jennifer Manlove, Karen Walker, and Kristine Andrews
Child Trends
4301 Connecticut Ave. NW
Washington, DC 20008-2333
(202) 362-5580
TECHNICAL WORK GROUP MEMBERS
James Jaccard, Ph.D.
Professor of Psychology
Department of Psychology
Florida International University
945 Roderigo Ave.
Coral Gables, FL 33134
(305) 348-0274
Meredith Kelsey
Abt Associates
55 Wheeler St.
Cambridge, MA 02138
Christine Markham
The University of Texas School of Public Health
P.O. Box 20186
Houston, TX 77225
(713) 500-9646
Pat Paluzzi
President
Healthy Teen Network
1501 Saint Paul St., Suite 124
Baltimore, MD 21202
(410) 685-0410
Susan Philliber
Philliber and Associates
16 Main St.
Accord, NY 12404
(845) 626-2126
Michael Resnick
Division of Adolescent Health and Medicine
University of Minnesota
717 Delaware St. SE, Suite 370
Minneapolis, MN 55414-2959
(612) 624-9111
Jeffrey Smith, Ph.D.
Professor of Economics and Faculty Associate, Survey Research Center, Institute for Social Research
Department of Economics
University of Michigan
238 Lorch Hall
611 Tappan St.
Ann Arbor, MI 48109-1220
(734) 764-5359
Don Winstead
Deputy Secretary
Florida Department of Children and Families
1317 Winewood Blvd.
Building 1, Room 202
Tallahassee, FL 32399-0700
(850) 921-8533
Inquiries regarding statistical aspects of the study design should be directed to:
Seth Chamberlain
Office of Planning, Research, and Evaluation
Administration for Children & Families
U.S. Department of Health and Human Services
370 L’Enfant Promenade, SW
Washington, DC 20447
(202) 260-2242
Mr. Chamberlain is the project officer and has overseen the design of the implementation study protocols.