Supporting Justification for OMB Clearance of Evaluation of Pregnancy Prevention Approaches
Part B: Statistical Methods for First Follow-up Data Collection
CONTENTS
B1. Respondent Universe and Sampling Methods
B2. Procedures for Collection of Information
B3. Methods to Maximize Response Rates and Deal with Nonresponse
B4. Tests of Procedures or Methods to be Undertaken
B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
ATTACHMENTS:
Attachment A: Question by Question Source List and Crosswalk between
PPA Baseline and First Follow-up Surveys
Attachment B: Question Justification
Attachment C: Sources Referenced
Attachment D: Entities Consulted
Attachment E: 60 Day Federal Register Notice
Attachment F: Pretest Report
The Administration for Children & Families (ACF) of the U.S. Department of Health and Human Services (HHS) is assisting the HHS Office of Adolescent Health (OAH) in conducting the Evaluation of Adolescent Pregnancy Prevention Approaches (PPA), an eight-year demonstration designed to study the effectiveness of promising policy-relevant strategies to reduce teen pregnancy. The study was designed to include up to eight evaluation sites, and at this point it appears that there will be seven sites:
- One site – Chicago Public Schools, implementing the HealthTeacher curriculum – has been recruited, and a baseline survey has been implemented.
- Six federally-funded grantees have been recruited.
Approval for outreach discussions with stakeholders, experts in the field, and program developers was received on November 24, 2008 (OMB Control No. 0970-0360). Approval for the baseline survey data collection and the collection of youth participant records was received on July 26, 2010 (OMB Control No. 0970-0360). Emergency clearance for site-specific variants of the baseline survey questionnaire was received on August 22, 2011 (OMB Control No. 0970-0360).
We now seek OMB approval for the first follow-up data collection, and for two tailored site-specific follow-up questionnaires. As with the baseline survey effort, a large group of federal staff has collaborated to modify a previously drafted PPA follow-up instrument into a “concordance follow-up instrument” suitable for all HHS pregnancy prevention evaluations, including but not limited to PPA. HHS is seeking to maximize consistency across evaluations of federal pregnancy prevention grant programs. In 2010 and 2011, ACF and OAH, in coordination with other HHS offices overseeing pregnancy prevention evaluation, collaborated to consider revisions to the previously drafted PPA instrument.
As in the case of baseline data collection, site-specific variation in follow-up data collection instruments is planned, because of the differences among the seven PPA sites. As PPA sites were recruited, we found that variations in their target populations and program models make it essential to tailor data collection, at both baseline and follow-up, to analytical priorities in each site. Developing those site-specific instruments, and follow-up schedules, involves working closely with six of the sites that are federal pregnancy prevention grantees, and with the local evaluators they have engaged as a condition of their grants. That process has been completed site by site, and the result determines when the first follow-up is to be conducted in each site, and thus determines for which sites approval of follow-up instruments is most urgent.
This submission presents follow-up questionnaires and estimated burden for the two sites scheduled for follow-up data collection in the nearest future. In the Chicago Public Schools (CPS) site, baseline data collection was conducted in fall 2010, and the first follow-up is to be conducted in fall 2011, as part of a test of the HealthTeacher curriculum for seventh-graders. CPS is not a federal grantee, and the standard PPA follow-up instrument can be used; in this case, therefore, the tailored follow-up questionnaire is also the “concordance” questionnaire that has been defined as a foundation for all PPA sites and for use in other federal pregnancy prevention evaluations. The second site involves a federal grantee—the Oklahoma Institute for Child Advocacy (OICA), which is testing the effect of Power Through Choices 2010 on youth residing in foster care group homes. OICA will enroll the first of its sample cohorts in early fall 2011 and deliver a ten-session program. The program will be followed by an “immediate posttest” and then by a follow-up six months after program completion, both using the same instrument. For both of these PPA sites, early approval of follow-up questionnaires is essential to maintain the schedule of data collection. As development of site-specific follow-up questionnaires for the remaining PPA sites is completed, they will be submitted to OMB along with the estimated burden. In all cases, the site-specific questionnaires represent the full extent of survey data collection for the PPA impact analysis. All of the research priorities relevant to the impact analysis have been addressed in these questionnaires, and local evaluators will not be administering any other instruments for the PPA impact analysis.
In the PPA evaluation, HHS has identified seven study sites that will implement different pregnancy prevention approaches. In three of these sites, the programs to be tested will be school-based—operated in high schools or middle schools. In the other sites, the programs to be tested will be operated in community-based organizations (CBOs). The study will use a sample of approximately 9,000 teens across all sites. In each site, youth will be assigned to a treatment group that receives the program of interest or to a control group that does not. In five sites, to ensure that the behavior of control group youth is not affected, or “contaminated,” by interaction with treatment group youth, random assignment will generally be done at the cluster level (that is, the school or CBO). In the other two sites, random assignment will be done at the individual level, because risks of contamination are low. In the two sites whose follow-up questionnaires are submitted now for approval, random assignment is by cluster: in Chicago, middle schools have been randomly assigned, and for the Oklahoma grantee, foster care group homes will be randomly assigned. A total of 2,680 youth will be enrolled in the sample across these two sites.
A baseline survey will be conducted with both the program and control groups before the youth in the program group are exposed to the pregnancy prevention programs. The first follow-up surveys (the purpose of this OMB submission) will be conducted in most sites, pursuant to guidance from the study's Technical Work Group (TWG), no sooner than 3-6 months after the end of the scheduled program intervention for each sample member. The final follow-up survey (for which approval will be sought in a later submission) will be conducted with participating youth no later than 18-24 months after the scheduled end of the program. The exact timing of the follow-up surveys has been determined in each site, taking into account the length of the program, the age of the target population, and the priority outcomes of interest. In the Oklahoma site, there will be three follow-ups: immediately following program completion, and at six and twelve months after completion. Wherever possible, the self-administered survey will be administered in groups; when necessary to increase response rates, this method will be augmented with web surveys and telephone follow-up.
B1. Respondent Universe and Sampling Methods

The universe of potential respondents will vary across study sites, depending on the type of program in place at each site. Hence, we first describe the possible types of program structures and the corresponding study design.
Of the seven sites in the evaluation, five will involve random assignment at the cluster level (schools or other groupings), and two will involve random assignment at the individual level. Random assignment will occur at the time of sample enrollment (after the baseline survey). At follow-up, we plan to target all youth who were randomly assigned at baseline to the program or control group. In schools whose student populations appreciably exceed the target sample size, however, we may subsample students for follow-up after the baseline survey.
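To make the cluster-level design concrete, the following minimal Python sketch randomly assigns whole clusters (for example, schools or group homes) to the treatment or control condition, so that every youth in a cluster shares one condition. The cluster names and seed are hypothetical illustrations, not the study's actual assignment program.

```python
import random

def assign_clusters(cluster_ids, seed=2011):
    """Randomly split whole clusters (schools, group homes) into treatment
    and control groups, so all youth in a cluster share one condition."""
    rng = random.Random(seed)  # fixed seed makes the assignment reproducible
    shuffled = list(cluster_ids)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"treatment": shuffled[:half], "control": shuffled[half:]}

# Hypothetical example: ten middle schools in a school-based site.
schools = [f"school_{i:02d}" for i in range(1, 11)]
groups = assign_clusters(schools)
print("Treatment:", groups["treatment"])
print("Control:", groups["control"])
```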
Table B.1 summarizes our sample size estimates for all seven evaluation sites (burden estimates for the two sites whose instruments are submitted here are presented in Part A12). Based on our plans to include five sites with cluster random assignment and two with individual-level random assignment, we expect the total sample size will be approximately 9,000.
Table B.1. Expected Sample Sizes

Type of Program | Number of Sites | Average Sample Size | Total Sample Size by Program Type
Required in-school | 3 | 1,600 | 4,800
Community-based | 2 | 1,100 | 2,200
Clinic/service-based (individual) | 2 | 1,000 | 2,000
Total | 7 | | 9,000
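As a quick check on the table's internal arithmetic, the subtotals and total can be reproduced directly from the per-type averages. The sketch below uses only figures taken from Table B.1 itself:

```python
# Reproduce the Table B.1 subtotals and total from the per-type averages.
site_types = {
    "Required in-school": (3, 1_600),
    "Community-based": (2, 1_100),
    "Clinic/service-based (individual)": (2, 1_000),
}

total = 0
for program_type, (n_sites, avg_sample) in site_types.items():
    subtotal = n_sites * avg_sample
    print(f"{program_type}: {n_sites} sites x {avg_sample:,} youth = {subtotal:,}")
    total += subtotal

print(f"Total expected sample across 7 sites: {total:,}")  # 9,000
```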
We expect to achieve a response rate of 85 percent on the first follow-up survey, and 80 percent on the second follow-up survey. (In Oklahoma, we are projecting 85 percent completion for both the immediate posttest and the six-month follow-up.) These rates are comparable to the response rates achieved on the study of Title V abstinence education programs conducted by Mathematica Policy Research.1 Reasons for projecting these response rates are explained in section B3.
B2. Procedures for Collection of Information

HHS will collect information in the follow-up surveys on youth behaviors from approximately 9,000 youth across seven sites (see Table B.1 for the distribution). Whenever possible, assignment to the treatment group (receiving one of the approaches to reducing teen pregnancy) or the control group (not receiving such treatment) takes place at the site, school, or classroom level in order to minimize contamination between control and treatment group youth. When there are more youth at a site than anticipated, youth may be subsampled at baseline, or in some cases the youth completing the baseline survey may be randomly sampled for follow-up if their numbers substantially exceed sample requirements.
We will attempt to collect follow-up data for any sample member with consent, regardless of whether they completed the baseline survey. Active parental consent will be obtained for each participant prior to administration of the baseline survey. In addition, participant assent is obtained prior to the administration of each of the surveys. In three sites, parental consent is not required for all participants. In Oklahoma, some of the youth are under the legal guardianship of the state foster care system, so a caseworker, lawyer, or other identified legal representative will provide consent for these youth to participate. In OhioHealth, some of the youth are 18 or older; parental consent is not required for these participants, so we will obtain active consent directly from these sample members. In CHLA, the IRB has determined that parental consent is not required, so active consent will be collected from the CHLA sample members. Parental consent was received for 73 percent of eligible youth in Chicago, and 93 percent of consented youth responded to the baseline survey. (At the time of this writing – August 2011 – the remaining sites had not enrolled sample, so we are unable to determine consent and baseline rates for those sites.)
The general plan for follow-up data collection is to conduct two follow-up surveys in each site. In most sites, a first follow-up will be administered no earlier than 3-6 months after the end of program participation for the program group, and the second follow-up no later than 18-24 months after the end of the program, as recommended by the PPA Technical Work Group. The exact timing for each site (see table below) takes into account the age of the sample population, the length of the intervention, and the period over which detectable impacts on the key priority outcomes could be expected to emerge.
In two sites, an additional early follow-up has been scheduled. In the Oklahoma (OICA) site, an immediate posttest will allow analysis of immediate effects on knowledge and attitudes, using the progression of three follow-up data points to model the role of intermediate outcomes in long-term impacts. In the OhioHealth site, where intervention effects on the short-term contraceptive practices of teen mothers after the birth of their child are an important goal, the plan includes a follow-up six months after enrollment, while the program sample is still active in the program. The addition of this early follow-up should have no effect on the quality of data collected at later follow-ups. The interval between the early follow-up and the next is six months or more; that interval, and even shorter ones, are commonly used in teen pregnancy prevention studies with multiple follow-up surveys. We will work in each of these sites to ensure that the same procedures are used in the early follow-up as in later ones, and to maintain respondents' commitment to sustained participation in the study.
PPA EVALUATION SITES: TARGET POPULATIONS AND FOLLOW-UP SCHEDULES

Site (Grantee) | Target Population/Enrollment Point | Length of Intervention (elapsed time) | Timing of Early Follow-up (from end of program) | Timing of Final Follow-up (from end of program)
Chicago Public Schools (CPS) | 7th grade students/start of 7th grade | 16 weeks (fall 2010–spring 2011) | 5-6 months | 13-14 months
Engender Health | 14-16 year old participants in summer youth employment program/start of program | 5 days | 6 months | 18 months
Princeton Center for Leadership Training (PCLT) | 9th grade students/start of 9th grade | 5-16 weeks (depending on school schedule) | 6-7 months | 18-19 months
OhioHealth Research and Innovation Institute | Pregnant/parenting females 15-19 years old/recruited after delivery or during prenatal care | 18 months | Early FU during program (6 months after enrollment); FU at end of intervention (18 months after enrollment) | 12 months (30 months after enrollment)
Oklahoma Institute for Child Advocacy (OICA) | Youth in foster care group homes 15-19 years old/resident at time of study recruitment | 10 weeks | At program completion; 6 months | 12 months
Children’s Hospital Los Angeles (CHLA) | Teen mothers less than 20, with child less than 6 months/recruited through clinics and other programs | 12 weeks | 9 months | 21 months
Live the Life (LtL) | 7th grade students/start of 7th grade | 2 school years (8-day dose each year) | 3-5 months (spring 8th grade) | 15-17 months (spring 9th grade)
In all sites, to varying degrees, some sample members will need to be located for follow-up. Sample members in school-based sites will, at a minimum, have changed classrooms since baseline, and some will have changed schools. In other sites, sample members may have moved. Prior to the follow-up survey data collections, the contractor will work with each site to locate sample members in their new classrooms or schools, or to obtain any available updates to contact information. Additionally, information will be collected at various points throughout the study through emails, phone calls, and postcards asking sample members to provide updated contact information. Cases that are particularly difficult to find will be sent to the contractor’s locating staff.
Where the program enrolls students and is delivered in schools, the first follow-up will begin with group administration. Contractor staff will work with sites to determine a date and exact venues for conducting group survey administration. Contractor staff will arrive at the site on the survey day, with two staff members assigned to monitor each survey room. In the survey room(s), contractor staff will use the survey roster to take attendance, determine whether any youth are missing, and exclude any youth not on the survey roster. Any sample members who have moved out of the area will be given the option of completing the follow-up survey via the web or over the telephone. After obtaining youth assent, contractor staff will hand out pre-identified survey packets to the youth whose names are on the packets. Each packet will consist of the PPA paper-and-pencil interview (PAPI) questionnaire and a sealable, blank survey return envelope. The questionnaire and outside envelope will have a label with a unique ID number (no personally identifying information will appear on the questionnaire or return envelope). All youth will complete Questionnaire Part A, which asks for background information and concludes with a single screening question about sexual experience. Youth with sexual experience will complete Part B1, and those without will complete Part B2. Upon completion, youth will place questionnaire Parts A, B1, and B2 (both the used and the unused sections) in the return envelope, seal it, and return it to a contractor staff member. Staff will send the completed questionnaires to the contractor’s office, where the questionnaires will be receipted and checked for completeness and scannability. All questionnaires that pass the check will be sent to a scanning vendor to be scanned, and all scanned data will be electronically transmitted to the contractor.
Telephone and web-based administration of the follow-up survey will also be used. In sites with group administration, sample members who do not complete the survey in an initial group session or a make-up session will be given the option to complete it by phone or web. In other sites, telephone and web will be the primary modes of data collection, because the sample will be dispersed and assembling groups will not be feasible. In both situations, contractor staff will contact sample members and provide a PIN/password for web completion or interview them by telephone using the PAPI instrument. After such completions, the same receipting and scanning processes as for PAPI completions will take place. Web instruments will be prepared after OMB approval of the basic hard-copy questionnaires, and will be provided to OMB.
In one site (CHLA), follow-up will be conducted in person with individual respondents. Data collectors will provide laptop computers equipped with audio for respondent self-administration (ACASI).
Our current projections of completion rates by mode for the first follow-up are as shown in the following table. For completeness, the table includes the two sites for which first follow-up instruments are submitted now and the other sites for which instruments will be submitted later.
As with any survey that uses different modes of administration, the answers by some respondents to certain questions may differ depending on the mode in which they complete the survey. Any such “mode effects” should not affect the validity of the impact estimates, because we anticipate that equal proportions of treatment and control group members will complete the survey in each mode. As a means of assuring that this is the case, we will follow identical plans for administering the different modes of the survey to the two experimental groups, including using identical methods for locating respondents and, more generally, for maximizing survey response rates (discussed below in Section B3); a sketch of a simple balance check appears after the table below.
PROJECTED DISTRIBUTION OF FIRST FOLLOW-UP COMPLETIONS BY MODE, BY PPA SITE

Site (Grantee) | Site Type | Target Population/Enrollment Point | Projected Mode of Completion, FU1
Chicago Public Schools (CPS) | School-Based | 7th grade students/start of 7th grade | 95% group, 0% web, 5% phone
Engender Health | CBO-Based (Individual) | 14-16 year old participants in summer youth employment program/start of program | 0% group, 80% web, 20% phone
Princeton Center for Leadership Training (PCLT) | School-Based | 9th grade students/start of 9th grade | 90% group, 5% web, 5% phone
OhioHealth Research and Innovation Institute | Clinic-Based | Pregnant/parenting females 15-19 years old/recruited after delivery or during prenatal care | 0% group, 0% web, 100% phone
Oklahoma Institute for Child Advocacy (OICA) | CBO (Group-Based) | Youth in foster care group homes 15-19 years old/resident at time of study recruitment | 50% group, 25% web, 25% phone
Children’s Hospital Los Angeles (CHLA) | Clinic-Based | Teen mothers less than 20, with child less than 6 months/recruited through clinics and other programs | 100% in-person (ACASI)
Live the Life (LtL) | School-Based | 7th grade students/start of 7th grade | 90% group, 5% web, 5% phone
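To illustrate the kind of balance check described above, the minimal sketch below compares the distribution of completion modes across the treatment and control groups. The completion counts are hypothetical placeholders, and the use of a chi-square test is an illustrative assumption; the study's actual analysis plan may specify a different test.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts of first follow-up completions by mode
# (rows: treatment, control; columns: group, web, phone).
mode_counts = np.array([
    [820, 45, 40],  # treatment group
    [810, 50, 42],  # control group
])

chi2, p_value, dof, expected = chi2_contingency(mode_counts)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")
# A large p-value is consistent with equal mode distributions across the
# two experimental groups, i.e., no evidence of differential mode use.
```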
B3. Methods to Maximize Response Rates and Deal with Nonresponse

We expect a response rate of 85 percent or better on the first follow-up surveys (including the first and second data collections in the two sites with three follow-ups). We can expect to achieve this completion rate for several reasons. Survey administration will occur at most six months after the program end date.2 This timing will ensure that our contact data are quite current, minimizing location problems. In some sites, surveys can be administered to most youth in the location where the baseline survey was conducted and where the program took place (for example, the school). In addition, we expect that obtaining sites' willing assistance will be very important to maximizing the response rate; we will invest significant effort in gaining their cooperation from the beginning of the study, minimizing burden on sites and assuring privacy and confidentiality to the youth participants. Sites will be given detailed information about the surveys, how they will be administered and on what schedule, what involvement and time will be required of school staff, and how data will be used and protected. Bringing sites into the process while minimizing burden will help assure site support of the PPA data collection. We do not anticipate differential response rates across sites. Moreover, by applying identical methods for maximizing the response rates of the treatment and control groups, we anticipate no differences in the rates within sites between the two experimental groups.
Prior to survey administration in the school-based sites, we will work closely with our school contacts to locate respondents in their new classrooms. We will ask schools to post reminders and make announcements prior to and on the day of the survey administration to maximize attendance. On the day of the survey administration, contractor staff will take attendance prior to beginning administration and immediately follow up with the school contact regarding any unexpected absentees. As previously noted, sample members who have transferred schools or moved out of the area will be tracked and given the option to complete the survey over the web or by telephone.
In sites where group-based administration is not possible, an advance letter will be sent to sample members, notifying them of the data collection and providing them with the information necessary to complete the survey on the web or over the phone. Additional email and telephone prompts will be conducted as needed.
Additionally, incentives will be provided to respondents to encourage participation in the survey. Participants completing the first follow-up surveys in a group setting will receive a $10.00 gift card. Group make-up sessions will be offered to capture any initial non-respondents. Those youth who do not complete the survey in a group setting (in any site) will be given the option to complete the follow-up survey via telephone or web; these respondents will receive a $25.00 gift card. A higher incentive is offered to these respondents because completion outside of the group administration requires greater initiative and cooperation on the part of the respondent, as well as additional time outside of the school day.
Despite our expectation that the non-response rate will be low in each site, we will nevertheless take steps both to understand the nature of any non-response and to account for the threat it may pose to the validity of the study's impact estimates. Using data from the baseline survey, we will first test for statistically significant differences across all of the demographic and baseline outcome variables. We will then control for any such differences by using baseline data as covariates (see Section A.16). In addition, to the extent that non-response is higher than anticipated (above 20 percent), we will correct for differences between respondents and nonrespondents by constructing sample weights so that the weighted respondent sample mirrors the characteristics of the full sample. These weights will be used in each of the models used to estimate the program effects (described in Section A.16).
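As an illustration of the weighting approach, the sketch below constructs inverse-propensity nonresponse weights from baseline data. The DataFrame layout, covariate names, and choice of a logistic response-propensity model are hypothetical assumptions made for illustration; the study's actual weighting specification may differ.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def nonresponse_weights(df, covariates, responded_col="responded"):
    """Inverse-propensity nonresponse weights: model the probability of
    responding to the follow-up from baseline covariates, then weight each
    respondent by 1 / predicted response propensity so that the weighted
    respondent sample mirrors the full randomized sample."""
    X = df[covariates].to_numpy()
    y = df[responded_col].to_numpy()
    model = LogisticRegression(max_iter=1000).fit(X, y)
    propensity = model.predict_proba(X)[:, 1]
    weights = np.where(y == 1, 1.0 / propensity, 0.0)
    # Normalize so the weights sum to the full sample size.
    weights *= len(df) / weights.sum()
    return pd.Series(weights, index=df.index, name="nr_weight")

# Hypothetical usage with baseline covariates such as age, sex, and a
# baseline measure of the outcome:
# df["nr_weight"] = nonresponse_weights(df, ["age", "female", "baseline_outcome"])
```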
B4. Tests of Procedures or Methods to be Undertaken
We conducted pretests of the first follow-up instrument. We recruited pretest participants, and study staff talked directly with all interested teens to explain the pretest and the need to obtain parental consent prior to their participation. Letters explaining the study and the purpose of the pretest were sent to parents, along with the active parental consent form. Those with parental consent were invited to participate in one of several pretest administrations, during which small groups of four or five teens completed the self-administered questionnaire in a group setting and then completed a one-hour one-on-one debriefing with a researcher. The pretest sample included youth ages 12-16 from both high and low socioeconomic backgrounds, some of whom were receiving social support services from a community organization.
The administration of the pretest mirrored as closely as possible what will happen during the actual study. The survey administration began with a brief description of the study, an explanation of the purpose of the pretest, and a clear reassurance to respondents of confidentiality. Student assent was then obtained for each respondent, and staff distributed surveys, explaining that each person was to complete Part A, but that they were to complete only one Part B and then put all three Parts (complete and not complete) in the blank return envelope. As will be done in the study, no distinction was made between the two Part Bs – it was simply noted that respondents were not to complete both Part B1 and Part B2 and were to follow the instructions carefully about which Part B to complete. To the extent possible, respondents were seated as if in a classroom, with at least one seat between each person.
Once they completed the survey, pretest respondents attended a one-on-one debriefing session with a same-sex staff member, where they were asked about questions or terms that may have been unclear or unknown; their thoughts on the survey and how comfortable they would feel responding in class; what they thought of when answering particular questions; and how they arrived at their answers to particular questions.
Attachment F is a copy of the Pretest Report detailing recommendations for changes to the instrument based on the feedback from pretest respondents.
This pretest was conducted with a draft of what is now the “concordance” instrument, and site-specific variants of that instrument have been developed. However, all items included in these site instruments have been tested, used already in the PPA baseline context, or derived from other surveys. Of immediate relevance is the testing done on the two instruments submitted now. The instrument for the Chicago Public Schools is the concordance instrument. In Oklahoma, the local evaluator conducted a pilot of their program in summer 2011. As part of the pilot, they conducted a pretest of their draft follow-up instrument; all local evaluator questions subsequently incorporated into the PPA follow-up instrument were included in that pretest. Pretest participants were recruited from eight foster care homes across four sites (California, Illinois, Maryland, and Oklahoma). Sites were asked to select homes that were representative of homes in their system, but would not be ideal for participation in the PPA study. Surveys were administered to youth ranging in age from 13 to 18 (mean age of 15.7), from various racial and ethnic backgrounds. As will be done in the study in Oklahoma, the survey was read aloud to participants. The evaluator reported no issues with sensitivity, comprehension, or wording of any of the questions.
Items incorporated in the other site-specific follow-up instruments based on local evaluators’ draft instruments have been similarly tested. All grantees are required to conduct a pilot that includes their instrumentation. As a result, items taken from local evaluators have been tested under the terms of the grantees’ federal funding.
B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The PPA first follow-up survey will be administered by HHS’s contracting organization, Mathematica Policy Research. The same contractor will analyze the data, with support from evaluation colleagues at Child Trends. Individuals whom HHS consulted on the collection and/or analysis of the follow-up data include those listed below.
Alan Hershey Mathematica Policy Research, Inc. P.O. Box 2393 Princeton, NJ 08543 (609) 275-2384
Christopher Trenholm Mathematica Policy Research, Inc. P.O. Box 2393 Princeton, NJ 08543 (609) 936-2796
Brian Goesling Mathematica Policy Research, Inc. P.O. Box 2393 Princeton, NJ 08543 (609) 945-3355
Kristin Moore Child Trends 4301 Connecticut Ave. NW Washington, DC 20008-2333 (202) 362-5580
Melissa Thomas Mathematica Policy Research, Inc. P.O. Box 2393 Princeton, NJ 08543 (609) 275-2231
Jennifer Manlove Child Trends 4301 Connecticut Ave. NW Washington, DC 20008-2333 (202) 362-5580
Silvie Colman Mathematica Policy Research, Inc. P.O. Box 2393 Princeton, NJ 08543 (609) 750-4094
We have also consulted with:
Amy Margolis HHS Office of Adolescent Health U.S. Department of Health and Human Services 1101 Wootton Parkway Rockville, MD 20852 (240) 453-2820
Stan Koutstaal Family and Youth Services Bureau (prior to FY11) U.S. Department of Health and Human Services 370 L’Enfant Promenade, SW Washington, DC 20447 (202) 401-5457
Lisa Trivits Office of the HHS Assistant Secretary for Planning and Evaluation (ASPE) U.S. Department of Health and Human Services 370 L’Enfant Promenade, SW Washington, DC 20447 (202) 205-5750
Inquiries regarding statistical aspects of the study design should be directed to the project officer:
Amy Farb Office of Adolescent Health 1101 Wootton Parkway Suite 700 Rockville, MD 20852 (240) 453-2836
1 Trenholm, Christopher, Barbara Devaney, Kenneth Fortson, Lisa Quay, Justin Wheeler, and Melissa Clark. “Impacts of Four Title V, Section 510 Abstinence Education Programs.” Final report submitted to the U.S. Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation. Princeton, NJ: Mathematica Policy Research, 2007.
2 The estimated response rate is based on the number of consented individuals and is not conditional on completion of the baseline survey.