
U.S. Food and Drug Administration

Evaluation of the Food and Drug Administration’s General Market Youth Tobacco Prevention Campaigns


OMB Control Number 0910-0753

B. Statistical Methods

  1. Respondent Universe and Sampling Methods


The Cohort 2 evaluation is based on a probability sample and consists of a longitudinal survey of approximately 6,000 youth assessed at baseline and five follow-up waves for the national (non-trier and experimenter) campaign. This longitudinal design allows us to calculate baseline-to-follow-up changes in campaign-targeted outcomes for each study participant. We hypothesize that if the campaign is effective, the baseline-to-follow-up changes in outcomes will be larger among individuals exposed to the campaign more frequently (i.e., dose-response effects). Eligible youth are aged 11 to 16 at baseline and 15 to 21 by the end of data collection. For the Cohort 2 evaluation, age is the only screening criterion. The survey is being conducted by RTI International.
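
For illustration, the dose-response hypothesis can be expressed as a change-score model of the following general form (the notation is illustrative only and does not represent the study's final analysis specification):

    \Delta Y_i = Y_{i,\text{follow-up}} - Y_{i,\text{baseline}}
               = \beta_0 + \beta_1\,\text{Dose}_i + \boldsymbol{\gamma}'\mathbf{X}_i + \varepsilon_i

where Dose_i is respondent i's frequency of exposure to campaign materials and X_i is a vector of covariates. Under the hypothesis above, a campaign effect would appear as an estimate of beta_1 in the expected direction.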

For the Cohort 2 sample, we began by selecting a sample of 100 Primary Sampling Units (PSUs) with probability proportional to the number of 11- to 17-year-olds. Our PSUs are Public Use Microdata Areas (PUMAs). PUMAs are created for the dissemination of Census public use microdata from the American Community Survey but can also serve as PSU clusters (McMichael & Chen, 2015). The area frame of PUMAs covers the entire lower 48 states plus the District of Columbia. Our Secondary Sampling Units (SSUs) are postal carrier routes; each carrier route is the cluster of addresses a mail carrier delivers to in one day. We selected between 400 and 500 SSUs with probability proportional to the number of 11- to 17-year-olds. For the third and final stage, we selected addresses from the Computerized Delivery Sequence (CDS) file leased from Compact Information Systems (CIS). We selected a roughly equal number of addresses from each SSU (approximately 85 to 106 per SSU). Exhibit 1 details our response assumptions for Cohort 2.
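
One common way to implement probability-proportional-to-size (PPS) selection is systematic sampling with a random start. The sketch below illustrates that general approach for a first-stage selection of PUMAs; it is an illustrative example only, not the contractor's production sampling code, and the unit identifiers and size measures are hypothetical.

    # Illustrative sketch only: systematic selection with probability
    # proportional to size (PPS).  Inputs (unit labels and counts of
    # 11- to 17-year-olds) are hypothetical.
    import random

    def pps_systematic_sample(units, sizes, n):
        """Select n units with probability proportional to the given sizes."""
        total = float(sum(sizes))
        interval = total / n                  # sampling interval on the size scale
        start = random.uniform(0, interval)   # random start within the first interval
        picks, cum, k = [], 0.0, 0
        for unit, size in zip(units, sizes):
            cum += size
            # Take every selection point that falls within this unit's size range.
            while k < n and start + k * interval <= cum:
                picks.append(unit)
                k += 1
        return picks

    # Example: select 100 PUMAs proportional to their counts of 11- to 17-year-olds.
    # selected_pumas = pps_systematic_sample(puma_ids, youth_counts, n=100)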

Exhibit 1. Addresses and the Associated Assumptions to Yield the Needed Number of Completes

Activity                                  National Sample Cohort 2 (All Youth)
Selected addresses                        42,510
Correctly geocoded housing units          NA
Occupied housing units                    36,134 (85%)
Screened households                       27,100 (75%)
Eligible households                       11,111 (41%)
Eligible persons                          11,111 (100%)
Baseline completes                        6,000 (72%)
Wave 2 (1st follow-up) completes          4,800 (80%)
Wave 3 (2nd follow-up) completes          3,840 (80%)
Wave 4 (3rd follow-up) completes          3,456 (90%)
Wave 5 (4th follow-up) completes          3,110 (90%)
Wave 6 (5th follow-up) completes          2,799 (90%)

Note: The 50% response rate at the first time point is the product of the person completion rate and the household screening rate (72% × 70%).
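
As a simple check on the follow-up rows of Exhibit 1, the assumed wave-to-wave retention rates chain multiplicatively from the baseline target (a minimal illustrative calculation, not project code):

    # Minimal illustrative calculation: the follow-up completes in Exhibit 1
    # follow from 6,000 baseline completes and the assumed wave-to-wave
    # retention rates (80% at waves 2-3, 90% at waves 4-6).
    baseline_completes = 6000
    retention_rates = [0.80, 0.80, 0.90, 0.90, 0.90]  # waves 2 through 6

    completes = baseline_completes
    for wave, rate in enumerate(retention_rates, start=2):
        completes = round(completes * rate)
        print(f"Wave {wave}: {completes:,} completes")
    # Wave 2: 4,800; Wave 3: 3,840; Wave 4: 3,456; Wave 5: 3,110; Wave 6: 2,799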

  2. Procedures for the Collection of Information

B.2.2 Outcome Evaluation Follow-Up Data Collection Waves


Data collection at baseline and the first four follow-up waves has been completed for Cohort 2. Therefore, this section describes data collection procedures for the final follow-up survey. This design will produce data for the same youth over a 4-year period. This study design will provide a more accurate and thorough understanding of tobacco initiation, prevalence, and cessation among the campaign’s target audience of youth aged 12 to 17. Eligible youth were aged 11 to 16 at the baseline survey and will be 15 to 21 at the final survey wave. Because the cohort ages over this period, the data collected throughout the study will reflect information from youth aged 11 to 21. When it is not safe to collect data in person, such as during the COVID-19 pandemic, we will collect data via a Web-based survey only.


Panel maintenance letters will be sent out in advance of follow-up data collections to update contact information to the degree possible (Attachments 15 and 15b). Lead letters with Web-survey log-in credentials will be sent to parents of youth under 18 years of age and to respondents 18 and older to invite them to participate in the follow-up by Web. These letters will inform parents and youth about the study’s purpose and background, explain the survey procedures, and provide information to the respondent on participating via the Web. The letters will provide the Web address for the online version of the survey and the user ID and password each sample member will need to access the survey. Parents of respondents who provided an e-mail address in the baseline survey will also receive an e-mail invitation for youth to complete each follow-up survey via the Web. Parents of respondents who provided a telephone number and agreed to receive text messages will also receive reminder texts for youth to complete the fifth follow-up via the Web. The follow-up lead letter and the text for the follow-up e-mail invitations are shown in Attachments 10, 10b, 18, 18b, 19, 19b, 20, 20b, 21, 21b, 22, 22b, 23, 23b, 24, and 24b. Participation via the Web will provide flexibility and convenience to participants.


For each follow-up wave of data collection, the parent’s name and mailing address for all active cases are sent through batch tracing with LexisNexis to determine whether there is a new mailing address. If a mailed letter or postcard is returned indicating that the parents of the respondent no longer live at the address, that case will be sent to interactive tracing through the project control system. Interactive tracing is conducted by one of RTI’s tracing specialists, who reviews the case contact information in the control system and accesses resources and databases to search for additional or more current contact information to locate the parent. Once the parent is located, the contact information is shared with the interviewer through an update made within the project control system so that the interviewer can attempt to complete the associated case(s).


The youth surveys include the same set of items at baseline and follow-up, with the exception of items regarding each campaign and its materials (e.g., television ads, print materials), which will vary over the course of the campaigns. Minor revisions to surveys may be necessary given the media development process and the possibility of changes in campaign implementation, but every effort has been made to minimize the possibility of instrument changes. The youth survey instrument includes measures of demographics; tobacco use behavior; intentions to use tobacco; self-efficacy; cessation intentions; cessation behaviors; tobacco-related attitudes, beliefs, and risk perceptions; social norms; media use and awareness; environmental questions; and measures of awareness of and exposure to the campaign materials (see Attachment 2). There is no parent survey at follow-up.


  3. Methods to Maximize Response Rates and Deal with Nonresponse

The ability to obtain the cooperation of potential respondents in the baseline survey and maintain their participation across all survey waves will be important to the success of this study.


At follow-up waves one and two, youth respondents were offered a $25 incentive to complete the survey online during an early release period that ran for approximately three weeks, and a $20 incentive to complete the survey after the early release period. For waves three and four, we increased the incentives to $30 for completion during the early release period and $25 for completion after the early release period. We plan to offer the same incentives for follow-up five: a $30 incentive for survey completion during the early release period and a $25 incentive for survey completion after the early release period. Studies suggest that this incentive approach can increase response rates and reduce costs and nonresponse. In addition, the study will use procedures designed to maximize respondent participation. E-mail reminders and text messages will be sent to encourage participants to complete the survey via the Web.
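
For reference, the incentive schedule described above can be summarized in a simple lookup structure (an illustrative summary only; the dollar amounts are those stated in the text):

    # Incentive amounts (U.S. dollars) by follow-up wave and release period,
    # as described above.  Wave 5 amounts are the planned values.
    INCENTIVES = {
        # follow-up wave: (early release period, after early release period)
        1: (25, 20),
        2: (25, 20),
        3: (30, 25),
        4: (30, 25),
        5: (30, 25),
    }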





  4. Test of Procedures or Methods to be Undertaken

Prior to launching the baseline survey, we fielded an eight-case pretest of the survey instrument. This pretest survey was identical to the instrument being used in the Cohort 2 evaluation and approved by OMB, with the exception of a few additional questions to assess the overall clarity of instrument questions and respondents’ opinions on aspects of the survey that were unclear. The purpose of the pretest was twofold: (1) to assess technical aspects and functionality of the survey instrument and (2) to identify areas of the survey that were unclear or difficult to understand. We reviewed diagnostic data on average time of survey completion, survey completion patterns (e.g., are there any concentrations of missing data?), and other aspects related to the proper functioning of the survey. We also examined data on pretest measures used to assess the clarity of item wording and ease of understanding.
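
The sketch below shows the general kind of pretest diagnostics described above, computed with pandas; the file name and column names are hypothetical and do not correspond to the actual pretest data files.

    # Hypothetical sketch of pretest diagnostics: average completion time and
    # per-item missing-data rates.  File and column names are assumptions.
    import pandas as pd

    pretest = pd.read_csv("pretest_responses.csv")

    # Average time to complete the survey, assuming start/end timestamps.
    start = pd.to_datetime(pretest["start_time"])
    end = pd.to_datetime(pretest["end_time"])
    print("Mean completion time (minutes):",
          (end - start).dt.total_seconds().mean() / 60)

    # Share of missing responses for each survey item, to flag any
    # concentrations of missing data.
    item_cols = [c for c in pretest.columns if c.startswith("q")]
    missing_rates = pretest[item_cols].isna().mean().sort_values(ascending=False)
    print(missing_rates.head(10))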


In addition to the pretest survey, RTI conducted rigorous internal testing of the online survey instrument prior to its fielding in the first follow-up. Evaluators reviewed the online test version of the instrument to verify that skip patterns functioned properly, that delivery of campaign media materials worked properly, and that all survey questions were worded correctly and in accordance with the instrument approved by OMB.
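
An automated check of a single skip pattern of the kind verified during internal testing might look like the following sketch; the item names are hypothetical and are not the instrument’s actual variable names.

    # Hypothetical skip-pattern check: respondents who report never having
    # used a product should have no data in items gated on ever-use.
    import pandas as pd

    test_data = pd.read_csv("instrument_test_output.csv")
    never_users = test_data["ever_used_cigarettes"] == "No"
    gated_item = test_data.loc[never_users, "cigarettes_past_30_days"]
    assert gated_item.isna().all(), "Skip pattern failed for never-users"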


  5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The following individuals inside the agency have been consulted on the design and statistical aspects of this information collection as well as plans for data analysis:


Erin O'Brien

Office of Health Communication & Education

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Ave

Silver Spring, MD 20993

Phone: 240-402-2760

E-mail: erin.obrien@fda.hhs.gov


Lindsay Pitzer

Office of Health Communication & Education

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Avenue

Silver Spring, MD 20993

Phone: 240-620-9526

E-mail: lindsay.pitzer@fda.hhs.gov

Morgane Bennett

Office of Health Communication & Education

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Ave

Silver Spring, MD 20993

Phone: 240-750-59961

E-mail: Morgane.Bennett@fda.hhs.gov


Debra Mekos

Office of Health Communication & Education

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Ave

Silver Spring, MD 20993

Phone: 301-796-8754

E-mail: Debra.Mekos@fda.hhs.gov


The following individuals outside the agency have been consulted on the questionnaire development, statistical aspects of the design, and plans for data analysis:


Xiaoquan Zhao

Department of Communication

George Mason University

Robinson Hall A, Room 307B

4400 University Drive, 3D6

Fairfax, VA 22030

Phone: 703-993-4008

E-mail: xzhao3@gmu.edu


The following individuals will conduct data collection and analysis:


Matthew Farrelly

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-541-6852

E-mail: mcf@rti.org


Anna MacMonegle

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-990-8427

E-mail: amacmonegle@rti.org

Jennifer Duke

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-485-2269

E-mail: jduke@rti.org


Jane Allen

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-597-5115

E-mail: Janeallen@rti.org


Nathaniel Taylor

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-316-3523

Email: ntaylor@rti.org



James Nonnemaker

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-541-7064

E-mail: jnonnemaker@rti.org

References


Abreu, D. A., & Winters, F. (1999). Using monetary incentives to reduce attrition in the survey of income and program participation. Proceedings of the Survey Research Methods Section of the American Statistical Association.

Castiglioni, L., Pforr, K., & Krieger, U. (2008). The effect of incentives on response rates and panel attrition: Results of a controlled experiment. Survey Research Methods, 2(3), 151–158.

Centers for Disease Control and Prevention. (2012). Youth Risk Behavior Surveillance–United States, 2011. Morbidity and Mortality Weekly Report, 61(4), 1–162.

Davis, K. C., Nonnemaker, J., Duke, J., & Farrelly, M. C. (2013). Perceived effectiveness of cessation advertisements: The importance of audience reactions and practical implications for media campaign planning. Health Communication, 28(5), 461–472. doi:10.1080/10410236.2012.696535

Davis, K. C., Uhrig, J., Bann, C., Rupert, D., & Fraze, J. (2011). Exploring African American women’s perceptions of a social marketing campaign to promote HIV testing. Social Marketing Quarterly, 17(3), 39–60.

Dillard, J. P., Shen, L., & Vail, R. G. (2007). Does perceived message effectiveness cause persuasion or vice versa? Seventeen consistent answers. Human Communication Research, 33, 467–488.

Dillard, J. P., Weber, K. M., & Vail, R. G. (2007). The relationship between the perceived and actual effectiveness of persuasive messages: A meta-analysis with implications for formative campaign research. Journal of Communication, 57, 613–631.

Farrelly, M. C., Davis, K. C., Haviland, M. L., Messeri, P., & Healton, C. G. (2005). Evidence of a dose-response relationship between “truth” antismoking ads and youth smoking prevalence. American Journal of Public Health, 95(3), 425–431. doi: 10.2105/AJPH.2004.049692

Jäckle, A., & Lynn, P. (2008). Respondent incentives in a multi-mode panel survey: Cumulative effects on nonresponse and bias. Survey Methodology, 34(1), 105–117.

Janega, J. B., Murray, D. M., Varnell, S. P., Blitstein, J. L., Birnbaum, A. S., & Lytle, L. A. (2004). Assessing the most powerful analysis method for schools intervention studies with alcohol, tobacco, and other drug outcomes. Addictive Behaviors, 29(3), 595–606.

McMichael, J., & Chen, P. (2015). Using census public use microdata areas (PUMAs) as primary sampling units in area probability household surveys. In JSM Proceedings, Survey Research Methods Section, pp. 2281–2288. Alexandria: American Statistical Association.

Murray, D. M., & Blitstein, J. L. (2003). Methods to reduce the impact of intraclass correlation in group-randomized trials. Evaluation Review, 27(1), 79–103.

Murray, D. M., & Short, B. J. (1997). Intraclass correlation among measures related to tobacco-smoking by adolescents: Estimates, correlates, and applications in intervention studies. Addictive Behaviors, 22(1), 1–12.

Shettle, C., & Mooney, G. (1999). Monetary incentives in U.S. government surveys. Journal of Official Statistics, 15, 231–250.

Singer, E. (2002). The use of incentives to reduce nonresponse in household surveys. In R. M. Groves, D. A. Dillman, J. L. Eltinge, & R. J. A. Little (Eds.), Survey Nonresponse (pp. 163–177). New York, NY: Wiley.

Snyder, L. B., Hamilton, M. A., Mitchell, E. W., Kiwanuka-Tondo, J., Fleming-Milici, F., & Proctor, D. (2004). A meta-analysis of the effect of mediated health communication campaigns on behavior change in the United States. Journal of Health Communication, 9, 71–96.

Substance Abuse and Mental Health Services Administration (SAMHSA). (2012). Results from the 2011 National Survey on Drug Use and Health: Summary of national findings. NSDUH Series H-44, HHS Publication No. (SMA) 12-4713. Rockville, MD: Substance Abuse and Mental Health Services Administration.

U.S. Department of Health and Human Services (USDHHS). (2006). The health consequences of involuntary exposure to tobacco smoke: A report of the Surgeon General. Atlanta, GA: U.S. Department of Health and Human Services, Centers for Disease Control and Prevention, Coordinating Center for Health Promotion, National Center for Chronic Disease Prevention and Health Promotion, Office on Smoking and Health.

Wakefield, M. A., Spittal, M. J., Yong, H-H., Durkin, S. J., & Borland, R. (2011). Effects of mass media campaign exposure intensity and durability on quit attempts in a population-based cohort study. Health Education Research, 26(6), 988–997.



