Evaluation of the Food and Drug Administration’s Tobacco Public Education Campaign
0910-0753
A. Justification
On June 22, 2009, the Food and Drug Administration (FDA) was granted new authority to regulate the manufacture, marketing, and distribution of tobacco products and to educate the public about the dangers of tobacco use. Under the Family Smoking Prevention and Tobacco Control Act (Tobacco Control Act) (P.L. 111-31) (Attachment 1), FDA is responsible for protecting the public health and reducing tobacco use among minors. Section 1003(d)(2)(D) of the Federal Food, Drug, and Cosmetic Act (21 U.S.C. 393(d)(2)(D)) and Sections 2, 3, 105, 201, 204, 904, and 908 of the Tobacco Control Act support the development and implementation of FDA public education campaigns related to tobacco use. Accordingly, FDA will implement multi-strategy, youth-targeted public education campaigns to reduce the public health burden of tobacco; these campaigns will consist of general market paid media campaigns, geo-targeted campaigns to reach specific target audiences, community outreach activities, and a comprehensive social media effort.
Tobacco use is the leading preventable cause of disease, disability, and death in the United States. More than 440,000 deaths are caused by tobacco use each year in the United States (USDHHS, 2010). Each day, more than 3,600 youth in the United States try their first cigarette, and an estimated 900 youth become daily smokers (NSDUH, 2011). The FDA Center for Tobacco Products (CTP) was created to carry out the authorities granted under the 2009 Tobacco Control Act to educate the public about the dangers of tobacco use and serve as a public health resource for tobacco and health information.
Through CTP, FDA researches, develops, and distributes information about tobacco and health to the public, professionals, various branches of government, and other interested groups nationwide using a wide array of formats and media channels. CTP collaborates closely with the Centers for Disease Control and Prevention’s (CDC) Office on Smoking and Health (OSH), which has experience implementing and evaluating national anti-tobacco media campaigns. FDA is implementing evidence-based youth tobacco prevention campaigns that rely on paid media advertising highlighting the negative health consequences of tobacco use. The objective of this evaluation is to measure the effectiveness of CTP public education campaigns designed to reduce tobacco use among general market youth aged 12 to 17. FDA’s general market youth prevention campaigns will focus on reducing tobacco use in the following audience segments: (1) youth who have not tried FDA-regulated tobacco products (non-triers), (2) youth who are intermittent users of FDA-regulated tobacco products (experimenters), and (3) youth in rural areas who are susceptible to or use smokeless tobacco products. The goal of the proposed information collection is to evaluate the effectiveness of these efforts in affecting specific cognitive and behavioral outcomes related to tobacco use that are targeted by the campaigns.
This study is designed to measure awareness of and exposure to FDA’s youth tobacco prevention campaigns among youth in targeted areas of the U.S. and to assess their impact on outcome variables of interest. The primary outcome study will rely on in-person data collection and Web surveys self-administered on personal computers. The first data collection for both the non-trier and experimenter campaign and the rural smokeless campaign consists of a baseline survey of youth and their parent/guardian. Youth in the study are invited to complete follow-up surveys at 8-month intervals following baseline data collection. The follow-up surveys will be conducted largely online (75%), with the remainder (25%) conducted in person. This design will facilitate analysis of relationships between individuals’ exposure to the campaigns and pre-post changes in outcomes of interest. This longitudinal design allows us to calculate baseline-to-follow-up changes in campaign-targeted outcomes for each study participant. We hypothesize that if the campaigns are effective, the baseline-to-follow-up changes in outcomes should be larger among individuals exposed to the campaigns more frequently (i.e., dose-response effects). Eligible youth will be aged 11 to 16 at baseline and 13 to 19 by the end of data collection, allowing us to follow the same youth over time and understand tobacco initiation, prevalence, and cessation for the campaigns’ target audience of youth aged 12 to 17.
In addition to the outcome evaluation surveys, we will complete a series of Web-based media tracking surveys to better understand awareness of and receptivity to campaign materials among youth subpopulation groups of interest (e.g., gender, age, geographic area). Research studies have demonstrated that receptivity to advertisements is causally antecedent to actual ad effectiveness (e.g., Duke et al., 2015; Davis et al., 2013; Davis, Uhrig, et al., 2011; Dillard, Shen, & Vail, 2007; Dillard, Weber, & Vail, 2007). Surveys will be conducted periodically throughout the evaluation period. The proposed surveys will provide indicators of the campaigns’ reach and resonance with specific youth subpopulations of interest. A new sample for the tracking study is necessary because surveying the outcome evaluation cohort more frequently could introduce unintended bias into their responses (i.e., panel conditioning).
The outcome baseline survey includes measures of tobacco-related beliefs, attitudes, intentions, and behaviors. The outcome follow-up surveys will include measures of audience awareness of and exposure to the campaign advertisements as well as the aforementioned outcome variables of interest. The baseline and follow-up questionnaires are presented in Attachments 2_E1, 2_E2a, 2_E2b, and 2R. The rationale for use of these specific measures is in Attachment 2a. The tracking survey will assess awareness of the campaigns and receptivity to campaign messages throughout the campaign; similar measures of beliefs, attitudes, intentions, and behavior are also included in the tracking survey in order to examine awareness across subgroups and to assess comparability with the representative outcome survey. As part of the outcome evaluation study, a baseline survey is being conducted with the parent or legal guardian of each youth baseline survey participant to collect data on household characteristics and media use (Attachments 3_E2a, 3_E2b, 3_E2c, 3_E2d, and 3_E2e). Tracking survey data will not be used to make statistical inferences about the U.S. population of youth. The media tracking parent permission is Attachment 4_E2a1, the media tracking youth assent is Attachment 4_E2a2, the media tracking screener is Attachment 4_E2a3, and the media tracking instrument is Attachment 4_E2b. Further rationale for conducting media tracking can be found in Attachment 4.
The requested data collection is an evaluation designed to closely assess the planned media dose of FDA campaign advertisements across the U.S. The evaluation will rely on a pre-post design that leverages natural and created variation in exposure to campaign messages across media markets. This approach reflects the highest standard of evidence for causal relationships between health marketing campaigns and behavior change: the demonstration of changes in behavioral outcomes of interest by media dose (e.g., Farrelly et al., 2005, 2009, 2012). The effect of the campaigns on tobacco-related outcomes will be examined using two types of campaign exposure measures: market-level media dose and self-reported campaign exposure at the individual level.
Exogenous market-level doses of media will be measured with advertising targeted rating points (TRPs). TRPs are based on Nielsen ratings for the television programs or other media platforms on which campaign ads air. The primary hypothesis of this approach is that individuals who reside in media markets that receive higher doses of campaign media will exhibit an increased likelihood of behavior change, such as decreased intention to use tobacco. This hypothesis is testable with the use of market-level campaign TRP data in combination with individual-level survey data on outcomes of interest and generally requires two conditions to be met: (1) reasonable randomness in the media delivery at the market level and (2) a sufficient amount of variation in TRPs to identify statistical relationships between the individual-level survey data and market-level TRPs. However, campaign media are not delivered in random doses across U.S. media markets. This non-randomness in media delivery can potentially obscure campaign effects or lead to spurious effects if the media delivery is based on market characteristics that are also correlated with outcomes of interest, such as smoking susceptibility. Moreover, the use of TRPs for determining the impact of a campaign can be hindered by a lack of market-to-market variation in media dose. While variation may increase as campaign ads are aired across the U.S. over time, we do not know a priori whether sufficient variation in media delivery across markets will exist and can be used to test hypotheses based on TRPs.
A second measure of campaign exposure, self-reported exposure, may be used to examine campaign effects given the limitations of market-level exposure measures. Self-reported recall of campaign ads will be measured at the individual level. The primary hypothesis of this approach is that individuals who self-report greater frequency of exposure to campaign advertisements will exhibit an increased likelihood of behavior change. This approach may result in greater overall variation in exposure and potentially increased statistical power to identify associations between campaign advertisements and key outcomes of interest. The primary limitation of this approach is that self-reported measures of exposure are subject to “selective attention” bias, whereby smokers who are more willing to quit may also be more attentive to campaign messages and thus more likely to indicate exposure. Because this can obscure the direction of causality for campaign effects, we will account statistically for preexisting selective attention. In summary, the specific and frequent measurement of both market-level and individual-level campaign exposure requested as part of this data collection effort is necessary to accurately evaluate campaign exposure and potential impact while mitigating the limitations of either approach in isolation.
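To illustrate how these two exposure measures could be combined in a single dose-response analysis, the sketch below shows one possible model specification. It is a schematic illustration only, written with hypothetical file and variable names (e.g., cum_trps, susceptible_fu); it is not the study’s actual analysis code.

import pandas as pd
import statsmodels.formula.api as smf

# Individual-level survey records: one row per youth, with a media-market
# identifier ("dma"), baseline and follow-up outcome measures, self-reported
# frequency of ad exposure, and demographic covariates. (Hypothetical files.)
survey = pd.read_csv("survey_waves.csv")
trps = pd.read_csv("market_trps.csv")  # columns: dma, cum_trps (cumulative TRPs)

# Attach the market-level media dose to each respondent's record.
analytic = survey.merge(trps, on="dma", how="left")

# Dose-response model: follow-up susceptibility to tobacco use as a function of
# market-level TRPs and self-reported exposure, controlling for the baseline
# measure of the outcome and characteristics that may confound exposure.
model = smf.logit(
    "susceptible_fu ~ cum_trps + self_report_freq + susceptible_bl"
    " + age + female + C(race_eth)",
    data=analytic,
).fit()
print(model.summary())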
The information obtained from the proposed data collection activities is collected from individuals or households and will be used to inform FDA, policy makers in the United States, prevention practitioners, and researchers about the extent of youth’s exposure to the campaigns’ messages and the extent to which exposure to these messages is associated with changes in target outcomes. While not exhaustive, the list below illustrates a range of purposes and uses for the proposed information collection:
Provide critical data on the reach of the campaigns among youth in the United States, particularly with estimates of the proportion of the population that was exposed to the campaigns.
Understand the influence of the campaigns on beliefs, attitudes, intentions, and behaviors around tobacco use.
Inform FDA, policy makers, and other stakeholders on the impact of the campaigns overall.
Inform the public about the impact of the campaigns.
Inform future programs that may be designed for similar purposes.
To achieve these goals, data collection will consist of a baseline interview and several follow-up interviews with selected parents and youth. The in-person baseline household data collection for parents and youth will occur over a 3-month period, with the majority of data collection occurring in the first 2 months. Longitudinal follow-up surveys will occur at 8-month intervals following the baseline data collection. The follow-up surveys will be conducted largely in person (approximately 70%), with the remainder conducted via a Web-based survey (approximately 30%). Eligible youth will be aged 11 to 16 at baseline and 13 to 19 by the end of data collection. This design allows the same youth to be followed over time and provides the data needed to address the study’s goals. In addition, a series of media tracking surveys with a convenience sample of U.S. youth will be conducted during the period of the longitudinal data collection. Media tracking surveys will be conducted by RTI International using a convenience sample purchased from the digital data collection company Lightspeed (formerly Global Market Insite, Inc.). Lightspeed will provide youth respondents for unique, cross-sectional surveys, manage the data collection, and store data on Lightspeed’s secure server until it is delivered to RTI.
The baseline non-trier and experimenter campaign surveys include youth aged 11 to 16 at baseline in 75 U.S. markets. We expect an 80% response rate. The baseline rural smokeless campaign surveys include male youth aged 11 to 16 at baseline in 30 rural media markets. We expect an 80% response rate. The cross-sectional media tracking surveys will include youth aged 13 to 17 in the United States.
This outcome study will rely on in-person surveys for baseline data collection and in-person and Web surveys for follow-up waves. The proposed approach of in-person recruitment and online surveys provides a number of methodological advantages, including increased accuracy in measurement of key variables of interest, sample characteristics that are representative of the population of interest, and reduced burden on study participants. First, this methodology permits the instrument designer to incorporate questionnaire routings that would be overly complex or not possible with alternative methods. The laptop computer that will be used to collect youth data can be programmed to implement complex skip patterns and fill specific wordings based on the respondent’s previous answers. Interviewer and respondent errors caused by faulty implementation of skip instructions are virtually eliminated. Second, this methodology increases the consistency of the data. The computer can be programmed to identify inconsistent responses and attempt to resolve them through respondent prompts. This approach reduces the need for most manual and machine editing, thus saving time and money. In addition, it is likely that respondent-resolved inconsistencies will result in data that are more accurate than when inconsistencies are resolved using editing rules. FDA estimates that 100% of the respondents will use electronic means to fulfill the agency’s requirement or request.
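For illustration only, the following sketch (with hypothetical item names, not items from the actual instruments) shows the kind of programmed routing and consistency checking described above.

def route_smoking_items(ever_tried_cigarette):
    """Return the follow-on items a respondent should see (illustrative skip pattern)."""
    if not ever_tried_cigarette:
        # Non-triers skip the current-use items and go directly to susceptibility items.
        return ["susceptibility_1", "susceptibility_2"]
    return ["age_first_tried", "days_smoked_past_30",
            "susceptibility_1", "susceptibility_2"]

def consistency_prompt(ever_tried_cigarette, days_smoked_past_30):
    """Return a prompt asking the respondent to resolve an inconsistent pair of answers."""
    if not ever_tried_cigarette and days_smoked_past_30 > 0:
        return ("You reported never trying a cigarette but also smoking on "
                f"{days_smoked_past_30} of the past 30 days. Please review your answers.")
    return None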
The self-administered technology for the survey permits greater expediency with respect to data processing and analysis (e.g., a number of back-end processing steps, including coding and data entry). Data are transmitted electronically at the end of the day, rather than by mail. These efficiencies save time due to the speed of data transmission, as well as receipt in a format suitable for analysis. Finally, as noted above, this technology permits respondents to complete the interview in privacy. Providing the respondent with a methodology that improves privacy makes reporting of potentially embarrassing or stigmatizing behaviors (e.g., tobacco use) less threatening and enhances response validity and response rates.
Interviewers also use hand-held tablets to conduct household screening interviews and collect adult data. For the ExPECTT study, the tablet will also be used to administer the parental permission text for youth under age 18. The primary advantage of this computer-assisted methodology is improved accuracy in selecting the correct household member for an interview. The computer automatically selects the correct household member based on the demographic variables entered, thus substantially reducing the probability of human error. The hand-held computers also provide the benefits of complex case management tools, the ability to generate ID codes for youth respondents (used to link adult and youth data), and quick, secure electronic transfer of data.
In designing the proposed data collection activities, we have taken several steps to ensure that this effort does not duplicate ongoing efforts and that no existing data sets would address the proposed study questions. We have carefully reviewed existing data sets to determine whether any of them are sufficiently similar or could be modified to address FDA’s need for information on the effectiveness of the campaign with respect to reducing youth tobacco initiation, and whether they could be used to examine our research questions. Data sources we examined for this purpose include data collected as part of ongoing national surveillance systems, such as the National Youth Tobacco Survey (NYTS), the Youth Risk Behavior Surveillance System (YRBSS), and the Population Assessment of Tobacco and Health (PATH), and data collected to evaluate other national tobacco-focused media campaigns, such as CDC’s Tips From Former Smokers campaign. We concluded that these data sources do not include all of the measures needed to evaluate the campaigns and that they are not conducted frequently enough to capture outcomes of interest.
Although the NYTS and YRBSS measure youth smoking initiation and use, including thoughts about dependence and quitting, they do not include measures needed for us to assess short- and mid-term campaign outcomes, such as campaign and ad awareness, reactions to campaign advertising (which are predictive of subsequent behavior), and baseline and follow-up levels of agreement with campaign-related beliefs. PATH includes three items related to The Real Cost campaign (referred to in this document as the non-trier and experimenter campaign): a brand awareness measure and two measures of advertising awareness that use advertising taglines as a prompt. Response options for each item are yes, no, and don’t know. This set of measures, and the data it will produce, is insufficient for the rigorous evaluation we plan for the campaign. We collect data on campaign awareness in order to estimate the proportion of the audience that has been exposed to the campaign, but we also collect data on frequency of exposure so we can analyze outcome data by exposure levels. This is a potentially important element in our overall assessment of campaign effectiveness. We also measure receptivity to campaign advertising, which provides information about how the various creative elements of the campaign are functioning. This permits FDA to make adjustments to the media that can potentially increase the impact of the campaign. It is also critical to measure a number of tobacco-related beliefs, some of which are related to campaign messaging and some of which are not. This enables us to determine whether youths’ tobacco-related beliefs are changing in general, as a result of societal trends or other health promotion efforts, or whether belief change is specific to beliefs targeted by campaign advertising. Each of these measures helps us to ascertain whether observed changes (or some proportion of observed change) in tobacco-related cognitions and behaviors are the result of the campaign. The PATH measures cannot provide us with critical data on ad awareness, receptivity, or beliefs. Furthermore, consistent with the literature on health behavior change, we gather data from respondents every 8 months.
If we collected data less frequently, such as on the annual schedule of the NYTS, YRBSS, or PATH, we would likely miss many short- and mid-term indicators of campaign effectiveness, which would make it more challenging to relate later behavioral outcomes to the campaign. In other words, we would not have the opportunity to build a reasonable case that the campaign was responsible for observed change by demonstrating antecedents to behavior change.
Data collection for the Tips from Former Smokers (Tips) campaign is not suitable for evaluation of The Real Cost campaign because it is adult focused. Evaluation of a youth-focused campaign requires a large number of variables not present in an adult survey, including variables that measure susceptibility and initiation, the home environment including relationship with parents or guardians and influential siblings, and the school and social/peer environment. The survey also lacks information about the specific beliefs that are being targeted by The Real Cost campaign advertising.
This is an ongoing data collection. To date, we have conducted a baseline survey and three follow-up surveys for the non-trier and experimenter campaign and a baseline survey only for the rural smokeless campaign. Baseline data collection for both campaigns was necessary to document pre-campaign susceptibility to tobacco and tobacco behavior, the level of agreement with beliefs that would be targeted by the campaign, and variables that might mediate or moderate campaign effects, such as demographics and the home and social environment. For the non-trier and experimenter campaign, at first follow-up we measured campaign and ad awareness to assess campaign adherence to CDC-recommended levels of exposure, ad receptivity to assess the likely effectiveness of individual advertisements, and change in tobacco-related cognitions, with a specific focus on change in beliefs targeted by the campaign. The second and third data collections were designed to answer these questions and to begin to measure changes indicative of subsequent behavior change, such as changes in susceptibility to smoking and intention to smoke. The third follow-up began to examine campaign-related changes in tobacco use susceptibility and behavior. The fourth follow-up data collection will continue to assess changes in tobacco use susceptibility and behavior over time.
The non-trier and experimenter campaign evaluation is a longitudinal study that has been ongoing for more than 2 years; more than 25% of the sample will age out of the campaign’s target audience in the coming months, and younger members of the campaign audience are increasingly under-represented. In order to properly evaluate the campaign going forward, it is necessary to develop a second cohort of youth; if we do not begin to collect data from a new cohort, CTP cannot assess the campaign’s effect on U.S. youth aged 13 and 14, who represent one third of the campaign’s key target age range. Second cohort data collections will follow the pattern described above, allowing for the fact that the campaign is ongoing. For example, we recognize that at baseline youth will already have been exposed to campaign advertising. The evaluation will continue to monitor beliefs targeted by new campaign advertising with the goal of documenting whether and to what degree changes in beliefs and other cognitive and behavioral outcomes are attributable to the campaign, rather than to societal trends or other health promotion efforts.
Respondents in this study will be members of the general public, specific subpopulations or specific professions, not business entities. No impact on small businesses or other small entities is anticipated.
Respondents to this collection of information will answer on an occasional basis. While there are no legal obstacles to reduce burden, any lack of information needed to evaluate the Tobacco Public Education Campaign may impede the federal government’s efforts to improve public health. Without the information collection requested for this evaluation study, it would be difficult to determine the value or impact of the campaigns on the lives of the people they are intended to serve. Failure to collect these data could reduce effective use of FDA’s program resources to benefit youth in the United States. Careful consideration has been given to how frequently the campaigns’ intended audience should be surveyed for evaluation purposes. We believe that the proposed longitudinal survey and tracking survey will provide sufficient data to evaluate the campaigns effectively.
There are no special circumstances for this collection of information that require the data collection to be conducted in a manner inconsistent with 5 CFR 1320.5(d)(2). The data collection activities fully comply with the guidelines in 5 CFR 1320.5.
In accordance with 5 CFR 1320.8(d), FDA published a 60-day notice for public comment in the Federal Register on February 19, 2016 (81 FR 8511). FDA received two public comments that were not related to the information collection. The first comment, submitted by a private citizen, objected to the government spending money on campaigns like this one; it did not go into detail or offer alternatives for conducting this study. The second comment, from an advocacy group dedicated to reducing tobacco use, is a letter of support for FDA to gain approval to conduct this study. Neither comment contained information related to the Paperwork Reduction Act, and therefore both are beyond the scope of this collection.
The following individuals inside the agency have been consulted on the design of the campaign evaluation plan, audience questionnaire development, or intra-agency coordination of information collection efforts:
April Brubach
Office of Health Communication & Education
Center for Tobacco Products
Food and Drug Administration
9200 Corporate Boulevard
Rockville, MD 20850
Phone: 301-796-9214
E-mail: April.Brubach@fda.hhs.gov
Gem Benoza
Office of Health Communication & Education
Center for Tobacco Products
Food and Drug Administration
9200 Corporate Boulevard
Rockville, MD 20850
Phone: 240-402-0088
E-mail: Maria.Benoza@fda.hhs.gov
David Portnoy
Office of Science
Center for Tobacco Products
Food and Drug Administration
9200 Corporate Boulevard
Rockville, MD 20850
Phone: 301-796-9298
E-mail: David.Portnoy@fda.hhs.gov
Janine Delahanty
Office of Health Communication & Education
Center for Tobacco Products
Food and Drug Administration
10903 New Hampshire Avenue
Silver Spring, MD 20993
Phone: 240-402-9705
E-mail: Janine.Delahanty@fda.hhs.gov
Matthew Walker
Office of Health Communication & Education
Center for Tobacco Products
Food and Drug Administration
10903 New Hampshire Avenue
Silver Spring, MD 20993
Phone: 240-402-3824
E-mail: Matthew.Walker@fda.hhs.gov
Alexandria Smith
Office of Health Communication & Education
Center for Tobacco Products
Food and Drug Administration
10903 New Hampshire Avenue
Silver Spring, MD 20993
Phone: 240-402-2192
E-mail: Alexandria.Smith@fda.hhs.gov
The following individuals outside the agency have been consulted on questionnaire development. FDA CTP has also participated in meetings with CDC OSH throughout 2013 to provide updates on CTP campaign activities. Additionally, input on the design of this study has been solicited and received from FDA, including participation by FDA in meetings with OMB.
Michelle O’Hegarty
Centers for Disease Control and Prevention
4770 Buford Highway NE, Mailstop F79
Atlanta, GA 30341
Phone: 770-488-5582
E-mail: mohegarty@cdc.gov
Matthew Farrelly
RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709
Phone: 919-541-6852
E-mail: mcf@rti.org
Jennifer Duke
RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709
Phone: 919-485-2269
E-mail: jduke@rti.org
Jane Allen
RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709
Phone: 919-597-5115
E-mail: Janeallen@rti.org
Pamela Rao
Akira Technologies, Inc.
1747 Pennsylvania Ave NW Suite 600
Washington, DC 20002
Phone: (202) 517-7187
Email: prao@akira-tech.com
Xiaoquan Zhao
Department of Communication
George Mason University
Robinson Hall A, Room 307B
4400 University Drive, 3D6
Fairfax, VA 22030
Phone: 703-993-4008
E-mail: xzhao3@gmu.edu
Nathaniel Taylor
RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709
Phone: 919-316-3523
Email: ntaylor@rti.org
For respondents to the media tracking survey, Lightspeed will provide non-monetary “LifePoints” or “MySurvey Points” as part of the maintenance strategies for its panels; participants can trade these points for material items with Lightspeed partner vendors (e.g., Amazon.com and Starbucks) or for cash. LifePoints and MySurvey Points are valued at approximately $10 per survey.
Households that receive the mail screener inviting them to participate in the male rural smokeless study will all receive a nominal incentive of a $2 bill to encourage participation in this brief survey. A meta-analysis of studies examining the use of incentives in mail surveys showed that pre-paid incentives and promised incentives increase participation in mail surveys by 19% and 8%, respectively, compared to no incentives (Church, 1993). More recent studies confirm these findings (e.g., Montaquila et al., 2013; Brick et al., 2012; Beebe et al., 2005).
Parents or legal guardians of participants in the outcome evaluation study will not receive incentives. However, youth participants in the outcome evaluation surveys will receive incentives. Youth participants will be offered a $20 incentive for completion of the baseline survey. At follow-up, respondents will be offered a $25 incentive to complete the survey online during an early release period of approximately three weeks; a $20 incentive will be offered to respondents who complete the survey after this period, whether online or in person. Studies suggest that this incentive approach will increase response rates and reduce costs. We estimate that the baseline survey will take 30 to 45 minutes to complete and that the follow-up survey will take 45 minutes. The incentives are intended to recognize the time burden placed on participants, encourage their cooperation, and convey appreciation for contributing to this important study; they are similar to incentives offered for most surveys of this type. Numerous empirical studies have shown that incentives can significantly increase response rates in cross-sectional surveys and reduce attrition in longitudinal surveys (e.g., Abreu & Winters, 1999; Castiglioni, Pforr, & Krieger, 2008; Jäckle & Lynn, 2008; Shettle & Mooney, 1999; Singer, 2002). The decision to use incentives for this study is based on the need to ensure high retention from baseline to follow-up in order to retain the necessary analytic power of the longitudinal sample.
A more detailed justification for the use of incentives is provided in Attachment 5. The use of modest incentives is expected to enhance survey response rates without biasing responses. A smaller incentive would not appear sufficiently attractive to participants. We also believe that the incentives will result in higher data validity as participants will become more engaged in the survey process. This will also enhance overall response to the baseline and follow-up surveys. The use of incentives will help ensure that baseline data collection is completed in a timely manner and potentially reduce the number of follow-up visits needed to contact nonrespondents. The specific amount of the proposed incentive is based on several previous projects conducted by RTI, including the National Survey of Child and Adolescent Well-Being, which found that use of similar incentives increased response rates among youth, particularly for retention in longitudinal studies (see Exhibit 1).
Exhibit 1. Incentive Type and Amount
Type of Incentive | Participant | Amount/Value | Total Amount for Completing All Waves
Youth Media Tracking incentive | Youth selected through Lightspeed’s adult panels (not longitudinal panel members) | A nonmonetary incentive valued at approximately $10 | A nonmonetary incentive valued at approximately $10
Household mail screener incentive | An adult household member | $2/household | $2
Youth Baseline Questionnaire incentive | All longitudinal panel members | $20/survey | $20
Youth Follow-up Questionnaire incentive, Early Release Period: online completion during the initial three weeks of data collection | All longitudinal panel members (up to 4 follow-up waves) | $25/survey | $100
Youth Follow-up Questionnaire incentive, online or in-person completion after Early Release Period expires | All longitudinal panel members (up to 4 follow-up waves) | $20/survey | $80
Both FDA’s Research Involving Human Subjects Committee (RIHSC) and RTI’s Institutional Review Board (IRB) will review and approve the protocols and consent forms for the outcome evaluation survey and media tracking survey prior to any respondent contact (Attachments 6_E1a, 6_E1b, 6_E1c, 6_E1d, 6_E2a, 6_E2b, 6_E2c, 6_E2d, 6_R1 and 6_R2). Both RIHSC’s and IRB’s primary concern is protecting respondents’ rights, one of which is maintaining the privacy of respondent information to the fullest extent of the law.
Concern for privacy and protection of respondents’ rights will play a central part in the implementation of the outcome evaluation study and will receive the utmost emphasis. Interviewers will be thoroughly educated in methods for maximizing a respondent’s understanding of the government’s commitment to privacy to the fullest extent of the law. Several procedures ensure that respondents’ rights are protected. First, the interviewer introduces himself or herself and the study to potential adult respondents using the Introduction and Informed Consent Scripts (Attachments 7_E2, 6_E1b, 6_E1d, 6_E2b, 6_E2d, 6_R1, and 6_R2), reading the scripted text aloud to each adult respondent. As part of the process for obtaining informed consent, respondents are given a Study Description (Attachment 8_E2), which includes information on their rights as study participants. Specifically, the Study Description states that respondents’ answers will be used only by authorized personnel for statistical purposes and cannot be used for any other purpose. Parental consent is obtained from the youth’s parent or guardian; subsequently, youth assent is requested. Although full names of youth and adult respondents and contact information will be collected for all respondents, signed consent and assent are waived in this study.
After obtaining informed consent, interviewers make every attempt to secure an interview setting in the respondent’s home that is as private as possible. In addition, the interview process, by design, includes techniques to afford privacy for the respondent. The self-administered portion of the interview maximizes privacy by giving control of the interview directly to the respondent. This allows the respondent to read the questions directly from the computer screen and then key his or her own responses into the computer via the keyboard.
Each day they work, interviewers electronically transmit all completed screening and interview data to RTI’s servers via secure encrypted data transmission. On the data files, respondents are distinguished only by a unique identifier assigned to screenings and interviews. These identifiers will not be linked with names and will be used to link adult and youth data prior to analysis.
Security for respondents of the Web-based media tracking surveys will be assured in a number of ways: (1) Lightspeed will invite youth panel participants to complete the survey through an invitation to their parents asking for their consent to have their child share his or her opinions, in full compliance with COPPA’s revised standards; each respondent will remain completely anonymous and will be known only by a unique alphanumeric identifier; (2) participants will log onto Lightspeed’s secure server using a link provided by Lightspeed and this unique identifier, with the result that no information about the respondent’s identity will be downloaded to or housed on Lightspeed’s or RTI’s server; (3) respondents will be provided with information about the privacy of their data to the fullest extent of the law before they encounter the first survey item; (4) respondents will be required to provide their assent to freely participate before they encounter the first survey item; (5) respondents will have the option to decline to respond to any item in the survey for any reason; and (6) Lightspeed will deliver non-monetary compensation. All those who handle or analyze data will be required to adhere to the standard data security policies of RTI.
To ensure data security, all RTI project staff are required to adhere to strict standards and to sign a nondisclosure agreement as a condition of employment on this project. RTI maintains restricted access to all data preparation areas (i.e., receipt and coding). All data files on multi-user systems will be under the control of a database manager, with access limited to project staff on a “need-to-know” basis only. A detailed description of privacy safeguards is provided with this submission (Attachment 9_E2). No respondent identifiers will be contained in reports to FDA, and results will only be presented in aggregate form.
Implementation of data security systems and processes will occur as part of the survey data collection. Data security provisions will involve the following:
All data collection activities will be conducted in full compliance with FDA regulations to maintain the privacy of data obtained from respondents and to protect the rights and welfare of human research subjects as contained in their regulations. Respondents will receive information about privacy protections as part of the informed consent process.
All data collectors will be trained on privacy procedures and be prepared to describe them in full detail, if necessary, or to answer any related questions raised by respondents. Training will include procedures for safeguarding sample member information in the field, including securing hardcopy case materials and laptops in the field, while traveling, and in respondent homes, and protecting the identity of sample members.
All project employees will sign a privacy agreement that emphasizes the importance of respondent privacy and describes their obligations.
Access to the file linking respondent identifiers and item data with their contact information will be limited to project staff who have signed the privacy agreement.
All field staff laptops and tablet computers will be equipped with encryption software so that only the user or RTI administrators can access any data on the hard drive even if the hard drive is removed and linked to another computer.
Laptops will use the Microsoft Windows operating system and require a valid login ID and password to access any applications or data.
All data transferred to RTI servers from field staff laptops will be encrypted and transferred via a secure (SSL) broadband connection or optionally a secure telephone (land) line. Similarly, all data entered via the Web-based survey system will be encrypted as the responses will be on a Web site with an SSL certificate applied. Data will be passed through a firewall at RTI and then collected and stored on a protected network share on the RTI Network. Only authorized RTI project staff members will have access to the data on the secure network share.
Following receipt from the field, Personally Identifiable Information (PII) will be stored only on RTI password-protected, secured servers. Only authorized project members will have access to PII for research sample members. For the outcome evaluation, if a Web survey respondent breaks off partway through the survey and cannot log back in, a field interviewer will contact the respondent to conduct the interview in person or provide the assistance line phone number or the Field Director’s phone number so the respondent can get help re-entering the Web survey. For the media tracking survey, respondents will be given a unique alphanumeric identifier and will log onto Lightspeed’s secure server using a link provided by Lightspeed and this unique identifier, with the result that no information about the respondent’s identity will be downloaded to or housed on RTI’s server.
All respondents will be assured that the information they provide will be maintained in a secure manner and will be used only for the purpose of this research (see Attachment 7_E2). Respondents will be assured that their answers will not be shared with family members and that their names will not be reported with responses provided. Respondents will be told that the information obtained from all of the surveys will be combined into a summary report so that details of individual questionnaires cannot be linked to a specific participant.
Respondents will participate on a voluntary basis. The voluntary nature of the information collection is described in the introductory section of the Screener and Consent Process (Attachments 6_E1a, 6_E1b, 6_E1c, 6_E1d, 6_E2a, 6_E2b, 6_E2c, 6_E2d, 6_R1, and 6_R2 and 7_E2) and the initial lead letter (Attachments 10_E1, 10_E2a, 10_E2b, and 10_R).
The majority of questions asked will not be of a sensitive nature. There will be no requests for a respondent’s Social Security Number (SSN). However, it will be necessary to ask some questions that may be considered to be of a sensitive nature in order to assess specific health behaviors, such as cigarette smoking. These questions are essential to the objectives of this information collection. Questions concerning lifestyle (e.g., smoking, current smoking behavior, attempts to quit smoking) and some demographic information, such as race, ethnicity, and income, could be considered sensitive, but not highly sensitive. To address any concerns about inadvertent disclosure of sensitive information, respondents will be fully informed of the applicable privacy safeguards. The informed consent protocol (see Attachments 6_E1a, 6_E1b, 6_E1c, 6_E1d, 6_E2a, 6_E2b, 6_E2c, 6_E2d, 6_R1, and 6_R2) will apprise respondents that these topics will be covered during the survey. This study includes a number of procedures and methodological characteristics that will minimize potential negative reactions to these types of questions, including the following:
Respondents will be informed that they need not answer any question that makes them feel uncomfortable or that they simply do not wish to answer.
Web surveys are entirely self-administered and maximize respondent privacy without the need to verbalize responses.
Participants will be provided with a specific toll-free phone number (linking directly to the RTI IRB Office) to call in case they have a question or concern about the sensitive issue.
Finally, as with all information collected, these data will be presented with all identifiers removed.
In this section we provide an overview of the status of the data collection since approval of the initial OMB package in October 2013 and of the proposed new information collection.
Evaluation of the General Market Youth Tobacco Prevention Campaign
To date, the baseline and three follow-up surveys have been conducted. A baseline survey was also conducted with the parent or legal guardian of each youth, to collect data on household characteristics and media use. Because the cohort aged over the study period, data have been collected from youth aged 11 to 18. Information has been collected about youth awareness of and exposure to campaign advertisements and about youth knowledge, attitudes, and beliefs related to tobacco use. In addition, the surveys have measured tobacco use susceptibility and current use. Information has been collected on demographic variables including age, sex, race/ethnicity, grade level, and primary language.
Evaluation of the Rural Male Youth Smokeless Tobacco Campaign
Baseline data collection for the rural male youth smokeless component of the evaluation study began in January 2016. The four follow-up surveys will begin in September 2016, May 2017, January 2018, and September 2018. The Rural Male Youth Smokeless Campaign component of the evaluation differs from the General Market Campaign component in that only males in the age range will be considered eligible.
Media Tracking Survey
The media tracking survey consists of assessments of youth aged 13 to 17 that have been conducted periodically during the campaign period. The tracking survey assesses awareness of the campaign and receptivity to campaign messages. These data provide critical evaluation feedback to the campaigns and are conducted with sufficient frequency to match the cyclical patterns of media advertising and variation in exposure to allow for mid-campaign refinements.
In support of the provisions of the Tobacco Control Act that require FDA to protect the public health and to reduce tobacco use by minors, FDA requests OMB approval to collect additional information for the purpose of extending the evaluation of FDA’s general market youth tobacco prevention campaign. Specifically, FDA requests approval to conduct a fourth follow-up survey with youth who are part of the first longitudinal cohort and who participated in the baseline through third follow-up surveys. Data from this cohort are being used to evaluate the non-trier and experimenter campaign. We estimate that a total of 6,666 (annualized) follow-up surveys will be completed at 0.75 hours per survey, for a total of 5,000 annualized burden hours. Baseline data collection for this cohort, approved for 2,288 participants (1,144 burden hours at 30 minutes per survey), is complete.
FDA also requests approval to develop and survey a second longitudinal cohort, which will consist of an entirely new sample of youth aged 11 to 16 at baseline. We expect 2,667 youth (annualized) to complete the baseline survey for the new cohort at 45 minutes per survey, resulting in a total of 2,000 burden hours for youth. Three follow-up surveys are planned for this cohort. We expect a total of 6,270 participants to complete follow-up surveys, for a total of 4,703 annualized burden hours.
Development of both cohorts will involve screening a total of 30,880 individuals in the general population at 10 minutes per screening, for a total of 5,250 annualized burden hours.
Parents of the youth who complete baseline surveys will also complete surveys. We expect a total of 6,009 parent surveys for both cohorts to be completed at 10 minutes per survey, for a total of 1,022 annualized burden hours.
FDA also requests approval to extend the media tracking component of the evaluation. This is a cross-sectional survey and thus necessitates brief screening prior to data collection. We expect 60,000 annualized participants to complete the screener at two minutes per screener, for a total of 1,800 annualized burden hours. We expect the screening process to yield a total of 6,000 surveys at 30 minutes per survey, for a total of 3,000 annualized burden hours.
FDA also requests approval to extend the time period of the rural male smokeless component of the outcome evaluation and to add a fourth follow-up round of data collection. Previously approved burden for the rural smokeless component includes 656 annualized participants (328 annualized burden hours at 30 minutes per questionnaire) for the baseline questionnaire and 1,281 annualized participants (961 annualized burden hours at 0.75 hours per questionnaire) for the follow-up questionnaires. Due to high retention rates, the estimated number of questionnaires to be completed for Follow-up 3 has been increased. The revised number of participants (annualized) for Follow-up 1 through Follow-up 4 is 1,954 (1,466 annualized burden hours).
Table 1.--Estimated Annual Reporting Burden¹
Type of Respondent | Activity | No. of Respondents | No. of Responses per Respondent | Total Annual Responses | Average Burden per Response | Total Hours
General population | Screener and Consent Process (Youth and Parent) | 30,880 | 1 | 30,880 | 0.17 | 5,250
Parent of youth baseline survey participants | Parent Baseline Questionnaire | 6,009 | 1 | 6,009 | 0.17 | 1,022
Youth aged 11 to 19 (Experimenters & Non-Triers) | Youth Baseline Questionnaire (Experimenters & Non-Triers) | 2,288 | 1 | 2,288 | 0.50 | 1,144
Youth aged 11 to 19 (Experimenters & Non-Triers) | Youth 1st, 2nd, 3rd, 4th Follow-up Questionnaire (Experimenters & Non-Triers) | 6,666 | 1 | 6,666 | 0.75 | 5,000
Youth aged 13 to 17 | Media Tracking Screener | 60,000 | 1 | 60,000 | 0.03 | 1,800
Youth aged 13 to 17 | Media Tracking Questionnaires 1st, 2nd, and 3rd | 6,000 | 1 | 6,000 | 0.50 | 3,000
Male youth aged 11 to 18 in U.S. rural markets (Male Rural Smokeless) | Youth Baseline Questionnaire (Male Rural Smokeless) | 656 | 1 | 656 | 0.50 | 328
Male youth aged 11 to 18 in U.S. rural markets (Male Rural Smokeless) | Youth 1st, 2nd, 3rd, 4th Follow-up Questionnaire (Male Rural Smokeless) | 1,954 | 1 | 1,954 | 0.75 | 1,466
Cohort 2--Youth Aged 11 to 18 | Cohort 2--Youth Baseline Questionnaire | 2,667 | 1 | 2,667 | 0.75 | 2,000
Cohort 2--Youth Aged 11 to 18 | Cohort 2--Youth 1st, 2nd, 3rd Follow-Up Questionnaire | 6,270 | 1 | 6,270 | 0.75 | 4,703
Total | | | | 123,390 | | 25,713
¹ One-time burden; actual burden hours have been divided by 3 to avoid double counting in the ROCIS system.
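For example, each row of Table 1 multiplies the total annual responses by the average burden per response (30,880 screenings × 0.17 hours ≈ 5,250 hours; 6,666 follow-up questionnaires × 0.75 hours ≈ 5,000 hours), and the final row sums the individual rows to 123,390 total annual responses and 25,713 total annualized burden hours.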
Respondents participate on a purely voluntary basis and, therefore, are subject to no direct costs other than the time to participate. There are also no start-up or maintenance costs. RTI has conducted many smoking-related surveys of similar length among youth. We have examined diagnostic data from each of these prior surveys and estimate that data collection for this study will take, on average, from 10 minutes per respondent (for screening) to 45 minutes per respondent (for the follow-up surveys). According to the U.S. Department of Labor (DOL) Bureau of Labor Statistics, the national average hourly wage as of April 2016 was $25.53. Thus, assuming an average hourly wage of $25.53, the estimated one-year annualized cost to participants will be $656,453. The estimated value of respondents’ time for participating in the information collection is summarized in Exhibit 3.
Exhibit 3. Estimated Annual Cost
Type of Respondent | Activity | Annual Burden Hours | Hourly Wage Rate | Total Cost
General population | Screener and Consent Process (Youth and Parent) | 5,250 | $25.53 | $134,032.50
Parent of youth baseline survey participants | Parent Baseline Questionnaire | 1,022 | $25.53 | $26,091.66
Youth aged 11 to 19 (Experimenters & Non-Triers) | Youth Baseline Questionnaire (Experimenters & Non-Triers) | 1,144 | $25.53 | $29,206.32
Youth aged 11 to 19 (Experimenters & Non-Triers) | Youth 1st, 2nd, 3rd, 4th Follow-up Questionnaire (Experimenters & Non-Triers) | 5,000 | $25.53 | $127,650.00
Youth aged 13 to 17 | Media Tracking Screener | 1,800 | $25.53 | $45,954.00
Youth aged 13 to 17 | Media Tracking Questionnaires 1st, 2nd, and 3rd | 3,000 | $25.53 | $76,590.00
Male youth aged 11 to 18 in U.S. rural markets (Male Rural Smokeless) | Youth Baseline Questionnaire (Male Rural Smokeless) | 328 | $25.53 | $8,373.84
Male youth aged 11 to 18 in U.S. rural markets (Male Rural Smokeless) | Youth 1st, 2nd, 3rd, 4th Follow-up Questionnaire (Male Rural Smokeless) | 1,466 | $25.53 | $37,426.98
Cohort 2--Youth Aged 11 to 18 | Cohort 2--Youth Baseline Questionnaire | 2,000 | $25.53 | $51,060.00
Cohort 2--Youth Aged 11 to 18 | Cohort 2--Youth 1st, 2nd, 3rd Follow-Up Questionnaire | 4,703 | $25.53 | $120,067.59
Totals | | 25,713 | | $656,452.89
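Each row of Exhibit 3 is the product of annual burden hours and the hourly wage rate (for example, 5,250 hours × $25.53 = $134,032.50), and the total cost equals the total annualized burden hours multiplied by the wage rate: 25,713 hours × $25.53 = $656,452.89, rounded to $656,453 in the text above.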
There are no capital, start-up, operating, or maintenance costs associated with this information collection.
This information collection is funded through a contract with RTI. The total estimated cost attributable to this data collection is $12,637,812 (Exhibit 4). There are additional contract-funded activities occurring before and after this data collection, including project planning and data analysis. Other activities outside this data collection include coordination with FDA and its media contractor, evaluation plan development, instrument development, RTI IRB review, reporting, progress reporting, and project management. This information collection will occur from 2013 through 2019.
Exhibit 4. Itemized Cost to the Federal Government
Government Personnel | Time Commitment | Average Annual Salary | Total
GS-13 | 15% | $89,003 | $13,350
GS-13 | 25% | $94,969 | $23,742
GS-15 | 5% | $123,758 | $6,188
Total Salary Costs | | | $43,280
Contract Cost | | | $12,594,532
Total | | | $12,637,812
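The personnel figures in Exhibit 4 are each the product of the average annual salary and the time commitment (for example, $89,003 × 15% ≈ $13,350), and the total cost to the federal government is the sum of salary costs and the contract cost: $43,280 + $12,594,532 = $12,637,812.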
FDA requests approval to: (1) conduct a fourth follow-up survey with youth who are part of the first longitudinal cohort and who participated in the baseline through third follow-up surveys; (2) develop and survey a second longitudinal cohort consisting of an entirely new sample of youth aged 11 to 16 at baseline; (3) extend the media tracking component of the evaluation; and (4) extend the time period of the rural male smokeless component of the outcome evaluation.
The rationale for conducting a fourth follow-up survey of youth who are part of the first longitudinal cohort, whose data are used to evaluate the non-trier and experimenter campaign, is that it increases the likelihood of detecting behavioral outcomes associated with the campaign. While there is no set time period during which effective campaigns can expect to document behavior change, studies show that campaign-related change in youth smoking behavior is often observable at around two years. However, our third follow-up with the first cohort takes place less than two years after campaign launch. Therefore, an additional data collection eight months later, when the campaign has been on the air for two and a half years, may well enable us to capture campaign effects that are not observable at the third follow-up. Because behavior change is the standard for campaign effectiveness, and because we have over 4,000 youth who have been willing to complete the baseline and three follow-up surveys, it would be a lost opportunity if we did not conduct a fourth follow-up.
Based on earlier response rates, we estimate that 1,607 members of the first cohort will participate in the fourth follow-up survey, for a total of 6,666 annualized participants (including 5,059 previously approved). At 0.75 hours per survey, this adds 1,205 annualized burden hours to the 3,794 previously-approved hours for a total of 5,000 annualized burden hours.
A second longitudinal cohort, including a baseline and three follow-up surveys, is necessary to evaluate the ongoing campaign. This data collection will determine whether the campaign is reaching and resonating with youth who are just now aging into the campaign audience and whether it is influencing their tobacco-related cognitions, intentions, and behaviors. It is important to continue to evaluate the campaign in an ongoing way because campaign messages, media delivery approaches, and youth culture may change and interact differently over time. We know from the experience of the National Youth Anti-Drug Media Campaign that it is not safe to assume that a well-intended campaign will function as expected; evaluation is critical to document proper campaign functioning.
Development of the second cohort will involve screening 17,467 new individuals in the general population for a total of 30,880 screening participants, including 13,413 previously approved (all annualized). At 10 minutes per screening, this adds 2,970 burden hours to the previously-approved 2,280 hours for a total of 5,250 annualized burden hours. We expect 2,667 youth to complete the Cohort 2 baseline survey (2,000 annualized burden hours at 45 minutes per survey) and 6,270 youth to complete up to three follow-up surveys (4,703 annualized burden hours at 45 minutes per survey).
We request an extension of the media tracking survey, which complements the longitudinal survey used to evaluate the non-trier and experimenter campaign, because it provides early data on whether campaign advertisements are reaching and resonating with youth. Using the media tracking survey we can determine what proportion of youth in the target audience are seeing campaign advertising and what they think of it. We can assess the relative persuasiveness of advertisements and feed that information back to media planners so they can prioritize those ads that appear to have the greatest potential for subsequent belief and behavior change. In this way we use media tracking data to both assess the implementation of the campaign and to make mid-course adjustments that have the potential to increase campaign impact.
We expect 20,000 new participants to complete the screener, for a total of 60,000 participants, including 40,000 previously approved (all annualized). At two minutes per screener, this adds 600 burden hours to the previously approved 1,200 hours, for a total of 1,800 annualized burden hours. We expect the screening process to yield 2,000 new participants, for a total of 6,000 annualized surveys (including 4,000 previously approved). At 30 minutes per survey, this adds 1,000 burden hours to the previously approved 2,000, for a total of 3,000 annualized burden hours.
We request an extension of the time period of the rural male smokeless component of the outcome evaluation due to a delay in the timing of the campaign launch. Because the timing of the evaluation must be integrally tied to the campaign itself, this delay was unavoidable. This extension will permit us to evaluate the now ongoing campaign.
Annualized burden hours increased from 12,612 as previously approved to 25,713, and the average hourly wage has risen to $25.53 as of April 2016. Assuming $25.53 per hour as shown in Exhibit 3, this increases the estimated annual cost to $656,453.
Data from this information collection will be used to estimate awareness of and exposure to the campaigns among youth. These estimates will take the form of self-reported ad recognition and recall that assess basic exposure as well as frequency of ad exposure. These estimates will also be calculated separately for each specific campaign advertisement.
Data from this information collection will also be used to examine statistical associations between exposure to the campaigns and pre-post changes in specific outcomes of interest. This will be accomplished with the use of multivariate models that estimate follow-up measures of each relevant outcome as a function of prior self-reported exposure to the campaign, controlling for baseline measures of each outcome as well as baseline individual characteristics that may confound the relationship between campaign exposure and changes in outcomes. The primary outcomes of interest among youth will be awareness of the campaigns as well as tobacco-related beliefs, attitudes, intentions and behaviors. We hypothesize that there should be larger changes in outcomes among individuals who are exposed to the campaigns more frequently (i.e., dose-response effects).
We will also utilize measures of market-level campaign intensity, which will be constructed with available data on campaign gross rating points (GRPs) for each market covered by this survey. These data provide an overall measure of the reach and frequency of televised programming (in this case, campaign ads) within any given media market. These data will be merged with the survey data to provide an additional measure of campaign exposure among study participants. This will allow us to analyze the relationship between the market-level delivery of the campaigns and actual levels of awareness in each sample that is collected. This will also facilitate further analyses of the relationship between exogenous market-level measures of campaign dose and changes in the aforementioned outcome variables of interest.
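In schematic terms (using generic notation rather than the final model specification), the multivariate models described above take a form such as:

Y_im(follow-up) = b0 + b1 × SelfReportedExposure_i + b2 × GRP_m + b3 × Y_im(baseline) + b'X_i + e_im,

where Y_im is an outcome of interest for youth i in media market m, X_i is a vector of baseline individual characteristics, and e_im is an error term. A positive and statistically significant b1 or b2 would be consistent with the hypothesized dose-response relationship.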
The reporting and dissemination mechanism will consist of three primary components: (1) summary statistics (in the form of PowerPoint presentations and other briefings) on individual awareness of and reactions to the campaign, (2) a comprehensive evaluation report summarizing findings from this information collection, and (3) at least three peer-reviewed journal articles that document the relationships between campaign exposure and changes in the aforementioned outcomes of interest. The key events and reports to be prepared are listed in Exhibit 5.
Baseline information collection must be completed before the launch of the campaign. OMB approval is requested as soon as possible.
Exhibit 5. Schedule of Project Activities
Project Activity | Date | Status
First Cohort, Baseline data collection: experimenter and non-trier youth | November 2013 to March 2014 | Complete
First Cohort, First Follow-up data collection: experimenter and non-trier youth | August through October 2014 | Complete
First Cohort, Second Follow-up data collection: experimenter and non-trier youth | April through July 2015 | Complete
First Cohort, Third Follow-up data collection: experimenter and non-trier youth | January through March 2016 | Complete
First Cohort, Fourth Follow-up data collection: experimenter and non-trier youth | August through October 2016 | Complete
Second Cohort, Baseline data collection: experimenter and non-trier youth | May 2018 through August 2018 | Pending
Second Cohort, First Follow-up data collection: experimenter and non-trier youth | January 2019 through March 2019 | Pending
Second Cohort, Second Follow-up data collection: experimenter and non-trier youth | September 2019 through November 2019 | Pending
Second Cohort, Third Follow-up data collection: experimenter and non-trier youth | May 2020 through July 2020 | Pending
Baseline data collection: rural smokeless with male youth | January 2016 through April 2016 | Complete
First Follow-up data collection: rural smokeless with male youth | September 2016 through December 2016 | Complete
Second Follow-up data collection: rural smokeless with male youth | May 2017 through August 2017 | Complete
Third Follow-up data collection: rural smokeless with male youth | January 2018 through March 2018 | Complete
Fourth Follow-up data collection: rural smokeless with male youth | September 2018 through December 2018 | Pending
Preparation of analytic data file | Approximately 2-4 weeks after completion of data collection |
Data analysis | Approximately 5-12 weeks after completion of each analytic data file |
Report writing and dissemination | Approximately 12-16 weeks after completion of each analytic data file |
Not applicable. All data collection instruments will display the expiration date for OMB approval of the information collection.
Not applicable. There are no exceptions to the certification statement.