NEXT Generation Health Study - NICHD

OMB: 0925-0610

Supporting Statement B for

NEXT Generation Health Study – NICHD

September 15, 2009



Project Officer:


Dr. Ronald J. Iannotti

Prevention Research Branch

Division of Epidemiology, Statistics, and Prevention Research

National Institute of Child Health and Human Development

Building 6100, 7B05

9000 Rockville Pike

Bethesda, Maryland 20892-7510

Telephone: (301) 435-6951

Fax: (301) 402-2084

E-mail: ri25j@nih.gov


Table of Contents




B. Collections of Information Employing Statistical Methods

B.1. Respondent Universe and Sampling Methods

B.2. Procedures for the Collection of Information

B.3. Methods to Maximize Response Rates and Deal with Non-Response

B.4. Test of Procedures or Methods to be Undertaken

B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data



List of Attachments



Attachment 1: Data Collection Instruments – Student Survey

Attachment 2: Data Collection Instruments – Administrator Survey

Attachment 8: Data Collection Instruments – In-home Parent and Student Surveys

Attachment 9: Data Collection Instruments – Anthropometric/Biomedical Forms

Attachment 10: Data Collection Instruments – Physical Activity Diary

Attachment 11: Height and Weight Protocol

Attachment 12: Waist Circumference Protocol

Attachment 13: Saliva Collection Protocol

Attachment 14: Blood Collection Protocol

Attachment 15: Blood Pressure Protocol

Attachment 16: Recruitment Materials

B. Collections of Information Employing Statistical Methods


B.1. Respondent Universe and Sampling Methods

A primary goal of NEXT is to examine the prevalence and determinants of selected health behaviors and health status measures in a longitudinal study of a nationally representative probability sample of students in grade 10 from public and private schools. The design provides estimates of population percentages with a margin of error of plus or minus 3 percentage points at the 95% confidence level. An oversample of minority children (African American and Hispanic) in grade 10 is also included in order to improve the validity of sub-group analyses and to better study health disparities. Toward these ends, we construct a sampling frame that is current, complete, and accurate with respect to the information needed for selection and stratification. The population of interest includes all 10th-grade students in public, private, and parochial schools in the 50 states and the District of Columbia. The sampling frame for the construction of primary sampling units (PSUs) and the selection of public schools is the list of school districts supplied by Quality Educational Data, Inc. (QED); the QED lists of school districts and schools were also used for both the 2005-2006 and 2009-2010 cross-sectional surveys. QED maintains a continuously updated list of every school district in the U.S., as well as a current list of K-12 schools by state with contact information covering 100% of public, private, and Catholic schools. These lists include comprehensive data on enrollment by grade and race/ethnicity in addition to address and contact information. In using them to select the sample and contact selected schools, we encountered very few problems such as missing schools or misclassification by grade, and the lists require very little additional work to build sampling frames for the selection of primary sampling units and schools. Hence we chose the lists supplied by QED for NEXT.

The list of school districts was used to construct PSUs, which were formed by grouping school districts within each Census division; a total of 1,302 PSUs were created for selecting public schools. For example, a primary sampling unit may contain all school districts within a county or two adjacent counties, and some PSUs contained only one very large school district. For sampling students from public schools, a sample of these PSUs (each either an individual school district or a group of school districts) is selected at the first stage. A data file that identifies and provides extensive data on school districts and individual schools was purchased from QED and examined to construct the sampling frame of PSUs. The QED files are current and contain data on primary and secondary public schools as well as private and parochial schools. Private and parochial schools will be linked to public districts to ensure that these sampled schools fall within the same sample clusters as sampled public schools.

Although private schools can present special recruitment challenges, we had great success recruiting them into the 2005-06 HBSC sample and the current 2009-10 HBSC sample. While the sample of private and parochial schools is proportionately smaller than the sample of public schools, our recruitment rates for private schools are comparable to those for public schools. We use experienced recruiters and methods that explain the value of participating in the study and stress the need to ensure that “the voices” of private and parochial schools and students are adequately represented in the national findings. This strategy has been successful, and we anticipate it will be no different for NEXT.

Sample Selection Procedures.

A primary sampling unit (PSU) is either an individual school district or a group of school districts in adjacent counties. These were created from a population of around 14,000 school districts. At the first stage of sampling, a sample of PSUs was selected. The list of schools offering grade 10 was obtained only for the selected PSUs. There is no sampling of school districts within a selected PSU; the list of schools is formed from all the schools in the PSU without regard to school district. From this list, a sample of schools was selected. Only after a school is selected is its school district identified for purposes of contacting the school. This method of sampling reduces the cost of data collection because the sample of schools is not spread very widely across the U.S. Moreover, sampling schools directly would require a complete sampling frame of all schools in the U.S. offering grade 10, whereas in a multi-stage design the list is needed only for the selected PSUs; it would be more expensive to compile a complete and correct national list of schools offering grade 10 than to restrict the list to selected PSUs. We plan to contact a probability sample of up to 135 schools and expect that 90 schools will agree to participate in the survey, a minimum response rate of 67%, because we anticipate that some of the 135 schools will be ineligible or will not be approached. These estimates are based on our experience recruiting schools for the HBSC survey.

For determining the sample size at the initial wave, a recruitment or retention rate of 80% at each wave (conservative compared to previous work by us and others) is assumed, as well as a response rate of 95% (consistent with previous work by us and others) from those students who are successfully recruited or retained in the sample at each wave. The required number of completes at the end of wave 4 was estimated from the desired precision of the estimate of change between two time periods. The sample size should be such that we can reject the hypothesis of no difference in population percentages of characteristics of interest between two time periods (for example, year 3 and year 4) with 80% power when there is actually a difference of 5.3 percentage points, using a two-sided statistical test at the 5% level of significance. For this determination we assumed a correlation of around 0.5 between the two time periods. The sample size was first determined assuming a simple random sample of students, which gives a sample of around 700 students. Since the sample is selected using a multi-stage sampling design, we assumed a design effect of 1.5 based on previous HBSC surveys and increased the sample to 1,050 completes in the main sample. The margin of error of the estimated population percentage at the 95% confidence level at the end of wave 4, based on a sample size of 700 (or 1,050 with correction for the design effect), is plus or minus 3.7 percentage points.
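To make the arithmetic concrete, the following sketch (ours, not the study's; it assumes a worst-case percentage of 50% and standard normal-approximation formulas) reproduces the 700-complete requirement, the design-effect inflation to roughly 1,050, and the plus-or-minus 3.7-point margin of error:

```python
# Sketch of the sample-size reasoning above (assumes p = 0.5, the worst case).
from scipy.stats import norm

alpha, power = 0.05, 0.80
z_a, z_b = norm.ppf(1 - alpha / 2), norm.ppf(power)   # approx. 1.96 and 0.84

p, rho, delta = 0.5, 0.5, 0.053   # prevalence, wave-to-wave correlation, detectable change
# Per-student variance of the estimated change between two correlated waves:
unit_var = 2 * p * (1 - p) * (1 - rho)                # = 0.25 here
n_srs = (z_a + z_b) ** 2 * unit_var / delta ** 2
print(round(n_srs))                                   # ~700 completes under simple random sampling

deff = 1.5                                            # design effect from the 2005-2006 HBSC survey
print(round(n_srs * deff))                            # ~1,050 completes in the clustered design

# Margin of error for a single-wave percentage with ~700 effective completes:
print(round(100 * z_a * (p * (1 - p) / n_srs) ** 0.5, 1))   # ~3.7 percentage points
```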

The strategy for minority oversampling was based on the requirement of around 215 African-American students at the end of wave 4 out of the 1,050 completes in the main sample. We expect to get only around 180 African-American students at the end of wave 4, so the basic sample alone is insufficient. To obtain the additional minority students, we plan to identify schools with a high percentage of African-American students and select additional samples of students to screen. Originally it was planned to select additional primary sampling units for sampling Hispanic students, but this is no longer necessary: because the percentage of Hispanic students is slightly higher than that of African-American students, we expect to reach the required 215 Hispanic students without oversampling.

It is anticipated that minority response rates for the NEXT Generation Health Study will mirror those seen in the HBSC survey. The overall response rate to date in HBSC is 97.5%, and the response rate in minority oversample schools (those with a high percentage of minority students) is 89.4%. Additionally, 85.5% of students in minority oversample schools completed the survey, compared with 90.4% of students in the larger sample. These rates indicate that although there is a differential rate for minority students, high response and completion rates can be maintained in this population. The NEXT Generation Health Study will use the same student recruitment procedures employed in HBSC in order to replicate these response and completion rates.

The oversampling will be done at the time of data collection for the main sample, in schools in which 30% or more of enrolled students are African American. We estimate that we need to select an additional 430 students to identify the required number of African-American students; this will be done in around 20 of the 90 schools. The actual number of schools needed for oversampling will depend on the number of African-American students identified in the main sample, which will be monitored to determine how many additional schools are required.

A conservative estimate of the number of completes at each wave is shown in Tables 1a and 1b.

Table 1a: Expected Number of Completes in Each Wave of the Main Sample

A. Wave | B. Number Selected | C. Agree to Participate (80% of B) | D. Respondents (95% of C)
Wave 1  | 3,150              | 2,520                              | 2,392
Wave 2  | 2,394              | 1,915                              | 1,818
Wave 3  | 1,819              | 1,455                              | 1,381
Wave 4  | 1,382              | 1,105                              | 1,050



Table 1b: Expected Number of Completes at Each Wave of the Entire Sample


Wave   | Main Sample | Oversample | Total
Wave 1 | 2,392       | 328        | 2,720
Wave 2 | 1,818       | 249        | 2,067
Wave 3 | 1,381       | 189        | 1,570
Wave 4 | 1,050       | 144        | 1,194


In this survey, it is of interest to obtain estimates of change in population percentages of health characteristics (yes/no type) between two time periods. The sample size should be large enough to detect a difference of around 5 percentage points with 80% power using a two-sided statistical test. Assuming simple random sampling and a correlation of 0.5 between the two time periods, it was estimated that a sample of 700 completes would be required at the end of wave 4; the sample sizes at earlier waves are larger. Because the sample of students is selected using a multi-stage sampling design, the sample was increased to 1,050, assuming an average design effect of 1.5 based on the 2005-2006 HBSC survey results.

The expected starting sample size of 10th-grade students is 3,580: 3,150 students in the main sample and 430 in the oversample. If we contact 3,150 students in the main sample in participating schools, we expect that 80% of the students will agree to participate in the study, and of those who agree we expect 95% to respond to the survey at the first wave. This gives a student response rate of 76%, yielding a sample of 2,394 respondents, as shown in Table 1a. Similarly, we will contact 430 students in schools with a high proportion of minorities, of whom 328 are expected to respond. The 80% rate for retaining contacted students is conservative relative to previous studies, which achieved higher retention. This sample is selected from both public and private schools.
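As a check on the wave-by-wave arithmetic (our illustration, not study code), the following reproduces Table 1a to within rounding; the small discrepancies, such as 2,392 versus 2,394 at wave 1, reflect rounding in the published table:

```python
# Cascade the 80% participation and 95% response rates across four waves.
selected = 3150
for wave in range(1, 5):
    agree = round(selected * 0.80)      # agree to participate
    respond = round(agree * 0.95)       # complete the survey
    print(f"Wave {wave}: selected {selected:,}, agree {agree:,}, respond {respond:,}")
    selected = respond                  # next wave contacts the prior wave's respondents
```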

This research team has already conducted successful nationally representative cross-sectional surveys of high-school students as well as a number of longitudinal studies of students' health behaviors. Use of the internet and computer-assisted telephone interviews will increase our ability to track and survey students across the four-year period even when they are no longer in the school system. Other longitudinal studies, such as the NHLBI Growth and Health Study (NGHS), had a response rate of 78% but focused only on girls. NGHS's overall retention rate was 86% at year 5; the same techniques for tracking students, as well as new strategies (e.g., computer-assisted tracking and follow-up with the schools), will be used to achieve a high retention rate (e.g., ~90%). Traditional strategies will include: follow-up with the schools; maintaining detailed contact information for the subjects, their families, and two or three individuals likely to have contact with them in the future; and sending birthday and/or holiday cards, which will prompt notices of address changes if students move. If contact is lost, searches will begin using internet resources such as the “Ultimates” (national white pages, email directories) and Google searches. In addition to standard tracking procedures, the research team is exploring the use of social networking Web sites such as Facebook or MySpace, as well as current technologies favored by youth such as text messages and monthly music downloads (which would require students to provide a current email address to receive the download), to keep the students engaged in the study.

The sampling design for the selection of the 3,580 students in grade 10 closely follows the design adopted for the 2009-2010 HBSC U.S. National Survey. The students will be selected using a three-stage design. At the first stage, school districts within counties in each of the 9 Census divisions are grouped to form primary sampling units (PSUs).

A sample of PSUs is selected within each Census division with probability proportional to total enrollment. Within each selected PSU, schools offering grade 10 are listed, and a sample of schools is selected with probability proportional to enrollment. Within each selected school, one or two classes are selected at random. With a sample of 90 schools, a target of 3,150 students in the main sample, and an assumed average class size of 20 students, we need around 157 classes. Selecting two classes in every school would yield 180 classes, so we plan to select two classes from 67 schools (75%) and one class from the remaining 23 schools, giving a sample of 157 classes, as checked in the sketch below. The resulting weighted sample will proportionately represent each region of the country.
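A quick check of the class-allocation arithmetic (our illustration; the 20-student average class size is the assumption stated above):

```python
# Two classes in 67 schools plus one class in the remaining 23 schools.
two_class_schools, one_class_schools, avg_class_size = 67, 23, 20
classes = 2 * two_class_schools + one_class_schools
print(classes)                    # 157 classes
print(classes * avg_class_size)   # 3,140 students, approximating the 3,150-student target
```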

Principals are given the opportunity to select the “type” of class to participate in the survey. They are instructed that the class type selected must be (1) a non-tracked class and (2) one that all 10th-grade students take during that semester; as a result, all students should have an equal opportunity to be in a class selected for the survey. For example, a principal may choose health classes, PE classes, or homerooms, as long as the above requirements are met. We use the same strategy in HBSC, and most principals chose to offer non-academic classes for random selection. Students are excluded from the survey if they cannot read and understand the questions, which are written in English, or if they have developmental limitations that affect their ability to understand or provide age-appropriate responses. If a student can cognitively understand the questions but cannot read or write responses, arrangements will be made to read to the student or assist in writing responses, as we have done in HBSC. Students with permanent physical disabilities that prohibit the collection of height, weight, or waist circumference will be excluded from the anthropometric measurements and the home substudy. Our experienced staff verify that these requirements are met during their scheduling contacts with the school.

All students in selected classes are included in the sample. Table 2 gives the number of students in the 10th grade and the number of students to be selected in the sample including the oversample in each Census division.

Table 2: Distribution of the Population and Sample by Strata



Census Division    | Students in the Population, Grade 10 | Number of Students in the Sample
New England        | 176,507   | 163
Middle Atlantic    | 496,861   | 460
East North Central | 609,097   | 564
West North Central | 263,353   | 244
South Atlantic     | 694,916   | 644
East South Central | 214,404   | 199
West South Central | 450,698   | 417
Mountain           | 289,766   | 268
Pacific            | 670,778   | 621
Total              | 3,866,380 | 3,580


As indicated earlier, the PSUs are stratified by Census divisions. The total sample of students is allocated to each stratum in proportion to the total number of students in grade 10 in that Census division.

Estimation Procedure. To produce population-based estimates, each responding student will be assigned a sampling weight. This weight combines a base sampling weight, which is the inverse of the probability of selecting the student, with adjustments for nonresponse at the school and student levels. The probability of selecting a student is the product of the probability of selecting the school district, the probability of selecting the school within the district, and the probability of selecting the class in which the student is enrolled; the inverse of this overall probability gives the base weight. The various selection probabilities will be recorded and used to construct the sampling weights, and the base weights will be adjusted for nonresponse. All student-level estimates, including estimates of change, and all student-level analyses will use the student weights.

The objective is to select each student with a known probability of selection. Because of probability proportional to size (PPS) sampling at the first and second stages and the unequal number of classes selected in schools, the overall probabilities of selection for students are unequal. As described below, we will determine the overall probability of selecting each student in the sample, considering the three stages of sampling. The base sampling weight assigned to each student will be the inverse of that student's overall probability of selection.

The size measure for selecting primary sampling units using PPS sampling is total enrollment; the size measure for selecting schools offering grade 10 is enrollment in grade 10. We used PPS systematic sampling to select primary sampling units and schools within selected primary sampling units. The determination of the probability of selection at each stage is straightforward under PPS systematic sampling. For example, the probability of selecting a PSU (say PSU i) within a Census division is

π_i = n × M_i / M,

where n is the number of PSUs selected, M_i is the total enrollment in PSU i, and M is the total enrollment in all the PSUs in that Census division. Similarly, we can determine the probability of selecting a school within a selected PSU. Classes are selected within a selected school using equal-probability systematic sampling. As indicated earlier, the overall probability is determined by taking the product of the probabilities of selection at the three stages.
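For illustration, a minimal sketch of PPS systematic selection (ours, not the study's production code; it assumes no unit is large enough to be a "certainty" selection, which would be handled separately):

```python
import random

def pps_systematic(sizes, n):
    """Select n units with probability proportional to size via systematic sampling."""
    total = sum(sizes)
    interval = total / n                     # sampling interval on the cumulative-size scale
    start = random.uniform(0, interval)      # one random start within the first interval
    points = [start + k * interval for k in range(n)]
    selected, cum, j = [], 0.0, 0
    for i, size in enumerate(sizes):
        cum += size
        while j < n and points[j] <= cum:    # a point falling in this unit's range selects it
            selected.append(i)
            j += 1
    return selected

# Example: enrollments for 10 hypothetical PSUs in one Census division.
enroll = [1200, 800, 4300, 950, 2100, 600, 3300, 1500, 700, 2550]
print(pps_systematic(enroll, 3))
# Each PSU i has inclusion probability n * M_i / M, the quantity inverted for base weights.
print([round(3 * m / sum(enroll), 3) for m in enroll])
```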

The adjustment for nonresponse at each stage will be done using the original base weights assigned to each unit. For example, the adjustment at the school level involves adjusting the weights of responding schools so that the sum of the adjusted weights equals the sum of the weights of all selected schools, respondents and nonrespondents alike. Similarly, the weights of responding students will be adjusted to account for nonresponding students. Finally, all student weights will receive a poststratification adjustment using a raking procedure, so that the weighted counts of students in gender and race groups add to the known numbers of students in the population of grade-10 students.
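A minimal sketch of the school-level adjustment described above (illustrative weights only, not study data):

```python
# Inflate responding schools' base weights so they sum to the total base weight
# of all selected schools (respondents plus nonrespondents).
base_weights = {"A": 40.0, "B": 55.0, "C": 60.0, "D": 45.0}   # hypothetical schools
responded = {"A", "C", "D"}                                    # school B did not respond

factor = sum(base_weights.values()) / sum(base_weights[s] for s in responded)
adjusted = {s: round(base_weights[s] * factor, 2) for s in responded}
print(adjusted)   # adjusted weights sum to 200.0, the total for all selected schools
```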

Sampling Overview for In-Home Substudy (N=750)

The sampling frame for the selection of the sample of schools for the substudy will be all schools successfully recruited to participate in the basic survey. A brief description of the proposed sampling design is given below.

  1. In each of the nine strata (Census divisions), all recruited schools will be listed.

  2. Schools in relatively close geographic proximity will be grouped into clusters (or “communities”). Clusters will be formed to be approximately equal in size in terms of the number of students.

  3. On average, three clusters per Census division will be randomly selected, for a total of 27 communities.

  4. Within each “community” cluster, recruited schools will be listed and sorted by whether they are urban, suburban, or rural to assure representation in the sample. Using systematic sampling, two schools will be selected from each of the 27 clusters to obtain the sample of 54 schools.

  5. Students in the two classrooms originally randomly selected to participate in the basic survey will be eligible for selection in the subsample (expected number of students: ~19 per class, ~38 per school).

  6. At the study office, students in the selected classrooms will be categorized as “overweight” or “normal weight” based on the height and weight measurements collected during the main study.

  7. Seven overweight children and seven normal-weight children per school will be randomly selected across classes from the respective weight-status categories and recruited to the substudy.

For specific hypotheses using data from the substudy, the subsample of the longitudinal sample will be adequate to address the primary hypotheses relating to obesity and cardiovascular disease. Power analysis and sample size estimation for specific hypotheses were conducted using the Monte Carlo simulation procedures recommended by Muthen and Muthen (2000). Monte Carlo simulation is the most common and preferred method of determining the sample size needed for sufficient statistical power in multivariate analysis and structural equation modeling. In a Monte Carlo simulation, random samples of a specified size are generated repeatedly from a population with known parameters consistent with the proposed model, and path coefficients are estimated from each simulated sample. The percentage of simulated samples with significant parameters indicates the power of the study, and the required sample size can be determined accurately by varying the sample size across a series of simulations. The Monte Carlo study for the present study was conducted using Mplus version 3.0, which provides extensive simulation facilities for structural equation modeling.

The power analysis for determining sample sizes used a latent growth curve model of the relationship between student physical activity and peer physical activity, i.e., a linear model with four repeated measures of physical activity as the outcome and one-year intervals between measures. Peer behavior was specified as a covariate, with two additional covariates (gender and SES). Simulation was conducted using two peer effect sizes spanning the corresponding peer behaviors and outcomes in the study (substance use, physical activity, diet, obesity): a small effect size, defined by Cohen (1988) as 0.1 in standardized units, and a medium effect size of 0.3. The path loadings from the intercept to the four outcome measures were set at 1, and the loadings on the slope were set from 0 to 4, with each unit representing a one-year interval of assessment. Missing values were also generated in the simulation, with 15% random missingness on each variable.

Muthen and Muthen (2001) recommend several criteria for estimating appropriate sample sizes in power analyses for structural equation modeling: parameter bias should not exceed 10%, standard error bias should not exceed 5%, and coverage should remain between 90% and 98%. The Monte Carlo simulation for this study used 1,000 replications at various sample sizes. The results indicated that a final sample size of N = 550 for the linear model with a small effect size had statistical power of 96% to detect a peer effect, provided that missing values are random and below 15%. A separate simulation with a medium effect size indicated that a subgroup sample size of N = 150 would have power greater than 90% for detecting a peer effect. As a marker of clinical significance, a 0.3 to 0.5 SD between-group difference in physical activity should have a significant relation to health outcomes such as metabolic syndrome or adiposity. Thus, we would have the power to detect a clinically significant change in adiposity in analyses of the main sample and in analyses of selected subgroups. Subject retention should be higher for the in-home assessment than for the in-school sample because these participants will have already completed the Year 1 in-school assessment and consented to the additional in-home assessment. To assure a final sample size of 550, we will start with a sample of 750 in Year 1. The larger sample participating in the survey but not the home visits will provide power to examine smaller effects within multilevel models and comparisons across subgroups of interest. All criteria recommended by Muthen and Muthen (2001) were satisfied in the simulation studies.
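To convey the logic of the Monte Carlo approach in a deliberately simplified form (a single outcome and a single covariate rather than the four-wave latent growth model fitted in Mplus, so the power figures will not match the study's), consider:

```python
# Estimate power by simulation: generate samples with a known standardized
# effect, test it in each sample, and take the rejection rate as the power.
import numpy as np
from scipy import stats

def mc_power(n, beta, reps=1000, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(reps):
        x = rng.standard_normal(n)               # covariate (e.g., peer behavior)
        y = beta * x + rng.standard_normal(n)    # outcome with a known effect
        _, p = stats.pearsonr(x, y)              # equivalent to testing the slope here
        rejections += p < alpha
    return rejections / reps

print(mc_power(550, beta=0.1))   # small effect (Cohen's 0.1) at N = 550
print(mc_power(150, beta=0.3))   # medium effect (0.3) at N = 150
```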

B.2. Procedures for the Collection of Information

The initial wave of data collection will be conducted between January 2010 and July 2010. Data collection will be repeated annually each spring in waves 2 through 4 (i.e., January 2011 - July 2011, January 2012 - July 2012, and January 2013 - July 2013, respectively). A team of 80 professionally trained local data collectors will conduct multiple school and home visits. This number includes backups in case of illness or poor performance, as well as staff to cover make-up school visits due to student absences, if necessary. It also offers the flexibility to triple-staff schools that are anticipated to be difficult.

Administrator Surveys. School administrators will be asked to complete a brief survey describing the school's health-related environment, policies, and programs (see Attachment 2). Principals are asked to identify the most appropriate person on their staff to respond to questions about the school's health programs and curriculum; that individual is then contacted during data collection and asked to complete the survey. If the principal determines that he/she is the best possible respondent, he/she completes the School Administrator Survey. Areas covered include policies regarding physical education, food services, health education and health promotion, violence prevention, safety, and other education programs. The survey can be completed in less than 20 minutes online or using paper and pencil.

Student Surveys. The student survey can be completed in less than 45 minutes; in our pilot study with nine 14- and 15-year-olds, the average time to complete the survey was 35 minutes, including instructions. The longitudinal survey focuses on a limited set of health behavior outcomes and has an expanded focus on potential etiological factors. Items are included when they are deemed essential to the outcomes of interest and/or when they are likely to be included in the longitudinal surveys being conducted by other participating HBSC countries. Other questions are drawn from previous U.S. HBSC surveys or other U.S. surveys, have evidence of good reliability and validity, and address unique issues related to the health of students in the U.S. Topic areas covered in the survey include:

  • Eating habits, weight control, and body image;

  • Physical activity;

  • Sedentary behavior and sleep;

  • Substance use;

  • Dating violence;

  • Motor vehicle risks;

  • Dental health;

  • Family structure, environment, and communication;

  • Peer influences;

  • School environment;

  • Medicine use and health care;

  • Health status;

  • Demographics.

Student Survey Data Collection. The following assumptions and accommodations will be made to facilitate planning for the administration of the student surveys:

  • In about 75% of cases, two classes will be selected from each school;

  • Typically, a one-day site visit by two data collectors will be needed at each school to complete surveys, measurements, and saliva sample collection in the two selected classes;

  • Schools will be staffed by multiple data collectors when such a need is determined.

Student Survey Procedures (Year 1). The following steps will be taken to facilitate the administration of the student surveys in the schools:

  • Survey Coordinators (project staff who will oversee recruiting, site development, and data collection activities) and School Survey Liaisons (SSLs; teachers or school administrators who will be “hired” at each school to serve as liaisons between the NEXT research team and the school’s staff, students, and parents) will determine the best dates for the visit to each school within their region. SSLs will provide access to schools, assist with coordinating the visit, obtain consent, and arrange make-up surveys, but will not assist with the administration of student surveys.

  • The data collectors will receive four days of training, including consultation on managing classroom behavior.

  • The data collectors assigned to a school will receive the materials necessary to complete all data collection at each school, including the survey instruments, measurement equipment, saliva sample collection materials, envelopes, student incentives, administrative notes, shipping materials, and local maps.

  • Environmental planning will assure an appropriate and private survey environment.

  • School personnel will be asked to honor students’ privacy.

  • A standardized introduction to the survey will be used, stressing the importance and confidential nature of the survey, and pointing out that names are not attached, only ID numbers. Students will be reminded that their participation is voluntary.

  • Project staff will be trained to actively monitor and move about the room, letting students know they are present and available to answer questions while honoring their privacy. Teachers will be encouraged to remain in the classroom to monitor student behavior but, if present, will be strongly discouraged from looking at student surveys. Asking classroom teachers to remain in the classroom during survey administration to monitor student behavior is a strategy currently employed in HBSC. Project staff are trained to explain to the teacher that assistance with monitoring student behavior is greatly appreciated, but that to protect student privacy and confidentiality it is essential that he/she refrain from approaching students while they are taking the survey. Teachers are specifically asked not to move throughout the room while students are working on the survey or to respond to student questions. Completed surveys are then placed in an envelope and sealed in front of the student to reinforce the confidentiality of their answers. This strategy is successful in HBSC and will be replicated in the NEXT Generation Health Study.

  • Data collectors will be available to help students and reduce missing data.

  • Survey booklets will include no personal identifiers and upon completion will be sealed in envelopes.

  • Students will be thanked and given a ten dollar gift card for completing the survey.

  • Students who complete the survey early will be given word-finding games and similar puzzles; students who do not have permission to take the survey but are required to remain in the same room will be given math and language-arts packets provided by the school.

Survey Absentee Procedures. It is anticipated that some students in the longitudinal sample will be absent from class on the day of the visit. Their data will be obtained either by an in-school assessment conducted by the field staff, by an online survey using a computer in a private space in the school, or by computer-assisted telephone interview (CATI). Data collection staff will have clear directions on how to maintain the confidentiality of the survey, and the online instructions will also emphasize to students the importance of completing the survey in a private space. Access to the online survey will be password-protected, and passwords will be given to students in a sealed envelope indicating that it should not be opened until the student starts the survey in privacy. Students will be asked to destroy the password when they have completed the survey. The CATI survey is the third choice for the first-year survey and would be completed when the child is home.

Follow-up Survey Procedures (Years 2 through 4). Participants will complete follow-up surveys annually for three years after the initial in-school baseline survey. To accomplish this, an email will be sent to each participant with a secured, designated link to the online survey. Participants without access to a computer will be given the opportunity to complete the survey through a computer-assisted telephone interview. Our studies show that there were no significant differences on the HBSC survey due to response mode (HBSC forum; Seville, Spain, 2008).

Anthropometric Assessments (Years 1 through 3) and Saliva Sample Collection (Year 1): Informed parent consent and student assent will be obtained to assess height, weight, and waist circumference and to obtain saliva samples, in addition to consent for completing the survey. Saliva samples will be collected to obtain genetic material (see Attachment 13). Extraction, amplification, analysis, and storage of the genetic samples will be arranged through an independent project, ensuring that those doing the genetic analyses will not be able to link the results to individual students. Informal surveys of adolescents and previous research indicate that female students prefer to be measured by female adults and that males have no preference. The gender of the assessors will therefore be matched with the gender of the student, and female assessors will conduct the measurements when matching is not feasible. Two adults will be present during all assessments.

Assessments will take place in private rooms supplied by the school or, when such rooms are not available, in private areas created with privacy screens in larger spaces when they are not in use (e.g., lunchroom, gymnasium) (see Attachments 9, 11, and 12). Separate stations will be created to measure height, weight, and waist circumference using standard protocols (e.g., The HEALTHY Study Group, in press; Pratt et al., 2008).

Anthropometric Assessment Absentee Procedures: In-school assessments of height, weight, and waist circumference will be repeated annually using the same procedures as those for the First Year assessment. No attempt will be made to collect these measures from absentees or students who have moved to schools other than the initial target school nor will these assessments be repeated in the fourth year, after the majority of students have graduated from high school.

Online Dietary Recalls of the In-Home Sample (Years 1 through 4). Although the NEXT in-school and online surveys have a number of questions about diet, including eating at fast food restaurants and a brief food-frequency assessment of a few healthful and unhealthful foods, limits on the length of the survey do not permit estimates of daily caloric intake, the proportion of calories from fat, carbohydrates, and protein, or whether daily intake meets dietary guidelines. To obtain these estimates, the In-Home sample will provide an additional dietary assessment each year. In Years 1 through 4, the In-Home sample (750 students: 375 normal weight and 375 overweight or obese) will complete the NCI ASA24, an online 24-hour dietary recall, for three days (a random selection of two weekdays and one weekend day) each year. This method is consistent with NCI's recommendations for use of the ASA24 dietary recall. The ASA24 was developed by NCI to be consistent with the methods used in the NHANES in-person 24-hour dietary interviews conducted by trained dieticians, and it has been shown to have good reliability and validity for assessment of all nutrient groups. More details on the ASA24 can be found at http://riskfactor.cancer.gov/tools/instruments/asa24.html, and a demonstration of the instrument can be found at https://asa24.westat.com/. Parent consent and youth assent will be obtained for these online assessments, which will be conducted independent of the NEXT longitudinal survey.

Assessment of Physical Activity, Sedentary Behavior, and Sleep of the In-Home Sample (Years 1 through 4).

As is the case with dietary intake, because of within-individual variability of physical activity within a single day and across days, a single time sample may be inadequate to estimate individual levels of physical activity (Trost et al., 2000) and this variability may increase with age (Wickel et al., 2007). Although a single day may not be representative of a child’s level of activity, there can be patterns across days. For example, there may be individual tendencies for higher levels of physical activity at particular times of day or days of the week (Trost et al. 2000). Thus, a week-long period is likely to capture this variability. For these reasons it is important to assess physical activity at different times of day and across multiple weekdays and weekend days. The number of days, the length of observation within each day, and the time of day sampled necessary to obtain a reliable estimate depends on the method of assessment as well as the age of the children being assessed. The recommendations for accurate and generalizable assessment are for up to 10 to 12 hours of observation per day, for minimums of three to 15 days depending on the assessment method, the level of physical activity necessary to meet the criteria for a particular intensity, and the age of the youth (Baranowski et al., 2008; Sirard and Pate, 2001; Trost et al., 2000). When physical activity is assessed with accelerometers, recommendations are for five to nine days of monitoring for school-age children (Baranowski et al., 2008; Trost et al., 2000). We propose to assess physical activity using an accelerometer for seven consecutive days. Patterns of weekend activity can also vary across ages; thus, sampling weekend days may be important for estimates.

Using multiple methods to assess physical activity will increase the reliability, validity, and sensitivity of estimates of longitudinal changes in physical activity. Although the self-report items used in the NEXT survey have been shown to have good reliability and validity, self-report errors may vary systematically with cognitive development and could therefore introduce an age bias in longitudinal changes based on self-report. In addition, physical activity can vary significantly from day to day. To address these potential problems, in Years 1 through 4 physical activity will be assessed over a 7-day period with an accelerometer with multi-day memory. During the home visit with the In-Home cohort, the health researcher will explain the use of the accelerometer and provide the adolescent with an accelerometer, written instructions, and a paid return envelope. Participants will also be provided with a telephone number to call with any questions that arise during the week-long assessment.

One limitation of the accelerometer is that it cannot be worn during some sports activities, in the water, or while the participant is sleeping. The ActiWatch does not have these limitations (although watches may not be permitted during competition in some sports): it can be worn the entire day without concern about getting wet, and it causes minimal discomfort during sleep. During the home visit, each participant will also be provided with an ActiWatch along with instructions on how to use it and how to return it in the paid return envelope. Because the ActiWatch is worn on the wrist (the accelerometer is worn on the hip), it can over-estimate the energy expenditure of activities that primarily involve arm, rather than trunk, movement. The primary reason for providing the ActiWatch is to obtain data on sleep; recent research suggests that adolescent sleep patterns affect obesity and mental health.

The accelerometer and the ActiWatch provide data on the frequency, duration, and intensity of bouts of physical activity, but they do not provide information about the type of physical activity. The activity diary will complement the activity monitors. For example, the diary will identify the precise activity reflected in the readings of the activity monitor, e.g., whether vigorous physical activity was due to participation in a sport (basketball), a leisure activity (jogging), or active transport (biking to the store). The diary will indicate going to bed, while the activity monitor will indicate going to sleep, and the diary will also indicate the type of sedentary behavior (e.g., homework versus a video game). The diary provides context for specific behaviors (location, involvement of others), while the activity monitor provides a more precise measure of time of day, duration, and intensity. Together, they provide a much richer set of data on the daily activity of the adolescents. These data can also be used to compare methodologies and to contrast dimensions such as frequency, duration, and intensity of physical activity measured by self-report versus objectively. Each adolescent in the In-Home sample will be provided with a Physical Activity Recall form (see Attachment 10) and instructions on how to use it. The form has a date and time grid corresponding to the seven days when the accelerometer and ActiWatch are being worn. In addition to indicating the dates and times the devices are worn, the adolescent will indicate the primary activity within each grid cell, including low energy expenditure activities such as sleeping, watching television, playing computer games, using the internet, and text messaging. A list of standard activities will be provided. Data from the 7-day diary, the accelerometer, and the ActiWatch will be linked to provide insight into activity expenditure and the corresponding types of activities for the entire observation period.

In-Home Assessments of Adiposity, Cardiovascular Risk, and Metabolic Syndrome (Years 1 and 4). Home visits will be conducted in Year 1 and Year 4. Efforts at primary prevention of cardiovascular disease recognize the importance of serum cholesterol levels, and links between serum lipids and behaviors such as diet and exercise in adolescents deserve further research. Home visits will be conducted at a time and place that accommodate the preferences of the family. To obtain fasting blood samples, students will be asked to arrive at school on a specified day prior to eating breakfast to have their finger-stick blood collections completed; immediately afterwards, they will be provided breakfast. If a student is absent on the day of data collection, the blood sample will be collected at the home visit. Because adolescents in this age group vary in the extent to which they “sleep in” on weekends, we anticipate that a home visit prior to breakfast can be arranged; however, most home visits will be scheduled at the families' convenience (weekdays, weeknights, or weekends).

Following standard protocols (see Attachment 14), fasting serum samples will be obtained with a finger-stick technique from the longitudinal in-home cohort and collected in microtainer devices; 250 µL of serum will be sufficient for the quantification of the lipid fractions and other assays. The biological markers obtained for obesity, cardiovascular disease risk, and metabolic syndrome include fasting blood glucose, HbA1c, total cholesterol, triglycerides, LDL-C, HDL, C-reactive protein, uric acid, cotinine, height, weight, waist circumference, and blood pressure. Assessment of height, weight, and waist circumference will follow the same protocols as those used for the in-school assessment of the entire longitudinal cohort. Blood pressure will be assessed with a portable automated system (see Attachment 15). Blood samples will be packed in ice and shipped to a central lab for analysis. Extraction, amplification, analysis, and storage of the genetic samples have been arranged.

Parent consent and youth assent will be obtained for all of these assessments. The only results immediately available will be height, weight, waist circumference, and blood pressure; all other results will not be available until assays are conducted at the central laboratory. When blood pressures are in the at-risk range, parents (for youth under age 18) or youth (ages 18 and older) will be told of these results along with a recommendation to see their physician for subsequent evaluation and follow-up. When youth have high-risk blood pressure values, parents (or older youth themselves) will be told to seek urgent care. When youth have at-risk or high-risk levels of lipids and/or fasting blood glucose, parents (or youth ages 18 and older) will be contacted with similar recommendations for seeking additional care.

In-Home Surveys (Year 1). During the home visit, parents will complete a brief survey (see Attachment 8). The survey will include questions about the adolescent’s chronic illnesses and medicine use (prescription and over the counter) and family demographic information. Students will complete a brief survey about their prescription and over-the-counter medicine use (See Attachment 8).

B.3. Methods to Maximize Response Rates and Deal with Non-Response

The following procedures, which have been employed with great success in previous national surveys, will be used to recruit schools.

Special Permission Districts. Prior to beginning recruitment activities, “special permission districts” that require research applications for studies such as NEXT will be identified from the sample. Research applications will be submitted as soon as the sampled districts have been identified to allow sufficient time for districts to provide approval.

State-level Mailings. An initial study notification mailing will be sent to State education superintendents prior to contacting each school. The purpose of this initial mailing is to inform the top officials in each participating State about the survey and to inform them of the districts that have been selected for participation. The mailing will include a cover letter, a color brochure introducing the study, and letters of support from key organizations. The cover letter will briefly introduce the study, explain its purpose and importance, describe the expected use of study findings, and refer readers to the detailed information in the NEXT Survey Fact Sheet. The Fact Sheet will describe the following: objectives and importance of the study; desired study participants and eligibility requirements; data collection activities and schedule; incentives for participation; responsibilities of the data collection staff, schools, principals, and participants; and measures that will be taken to protect respondents’ privacy and confidentiality.

Mailing to Districts. An initial study notification mailing will be sent to district superintendents. This initial mailing will provide districts with information about the survey and our request to visit the schools selected in their district. The mailing will include a personalized cover letter similar to the one developed for state administrators and the NEXT Fact Sheet, describing the survey objectives and requirements. Mailings will also include letters of support from influential professional societies and organizations, a copy of a color brochure designed for parents, and a copy of the survey. To track the sampled districts, schools, principals, and classes, each will be assigned an ID code and entered into a tracking database from which status reports can be generated. Each mailing will have the ID code pre-printed to indicate the recipient.

Mailing to Principals. Once school districts have been notified, a mailing will be sent to the principals of the sampled schools. The mailing to principals will contain the same information that was sent to the district superintendents, including a cover letter and the NEXT Survey fact sheet describing the study and the role of the principal in its implementation, along with letters of support from influential professional societies and organizations similar to those noted for district mailings, a copy of a color brochure designed for parents, and a copy of the survey. The cover letter to principals will differ in that it will be targeted to the schools (rather than districts). The letter will be personalized and will again include contact information for the Survey Coordinators so that principals can call or e-mail them directly if they have questions. The mailing will inform the principals that the Survey Coordinators will contact them to discuss arrangements for NEXT survey administration.

Telephone Contact with Principals. Approximately one week after the estimated delivery of the mailing to school principals, the Survey Coordinators will initiate telephone calls to confirm that principals have received the mailing, answer any questions they might have, and obtain permission for NEXT administration. If at any time during the recruitment process a school official is non-responsive, refuses participation, or expresses strong concerns about confidentiality, the Survey Coordinators will address those concerns and work to obtain the school's participation. Weekly recruitment progress reports will be generated.

Recruitment of Students. Procedures for obtaining consent are consistent with those of similar U.S. national studies. Due to the longitudinal nature of NEXT and the sensitivity of questions relating to substance use and dating/intimate partner violence, active parental consent will be required. Parents will be provided a letter introducing the survey, a color brochure describing the study, an explanation of future commitments to follow-up activities, and the consent form requesting permission for their children to participate. The designated School Survey Liaison and classroom teacher are asked to distribute and collect consent forms. This same process has been successfully executed in HBSC, where the overall consent response rate is 97.5%. In the past, in recognition that this is an additional task for school staff and consistent with recent and past OMB-approved procedures for HBSC, the School Survey Liaison was offered an incentive for the time and effort contributed to the study; our experience has been that staff members were very willing to take on this task. In cases where a district specifically prohibits using school staff to assist with the distribution and collection of consent materials, NEXT health researchers would have to visit the school and perform this task. In the current proposed study, consistent with current OMB recommendations, no incentives will be provided.

The consent forms and letters will be sent home with students and classroom teachers will be instructed to encourage students to return the consent forms as quickly as possible. Returns will be monitored by the SSL. For parents who have not returned signed consent forms within three days, a second consent form and letter will be sent home with the student, reminding the parent of the study and requesting that they return the form. This process will be repeated for parents who have still not returned the form after one week. Students will be informed verbally and in writing that they may skip any or all questions or refuse to participate in the survey, in which case an alternative activity approved by the school administration will be provided. Prior to the school visit the assigned NEXT project coordinator will work with the designated School Survey Liaison to identify the most appropriate alternate activity for students in a randomly selected classroom who do not participate in the study. Examples of alternate activities used in HBSC include: students complete word games at their desk (provided by the research team) while other students complete the survey; students perform school work at their desk while other students complete the survey; students are relocated to a study hall in a separate classroom while other students complete the survey. These same options will be discussed with the School Survey Liaison at schools participating in the NEXT Generation Health Study. Separate parental consent and student assent will be obtained from In-Home Study participants, first verbally during a scheduling telephone call and then in writing at the start of the home visit.

Main Study Incentives

SCHOOLS

Grade    | Incentive
Grade 10 | $500
Grade 11 | $250
Grade 12 | $250

STUDENTS

Year of Participation | Completing Survey | Completing Weight, Height, and Waist Circumference Measurements (and Saliva Sample in Year 1) | Total by Year
10th grade        | none | $10                       | $10
11th grade        | $10  | $10                       | $20
12th grade        | $10  | $10                       | $20
After high school | $10  | No measurement conducted  | $10
Overall Total     | $30  | $30                       | $60



SUBSTUDY STUDENTS

Year of Participation | Completing Home Visit (Height, Weight, Blood Pressure, Waist Circumference) and Home Surveys | Completing Dietary Questionnaire for Three Days | Wearing Accelerometer and ActiWatch® for All Seven Days | Completing Daily Activity Diary for All Seven Days | Fasting Blood Draw | Total by Year
10th grade        | $10      | $10/day          | $25  | $25  | $10 | $100
11th grade        | No visit | $10/day          | $25  | $25  | $10 | $90
12th grade        | No visit | $10/day          | $25  | $25  | $10 | $90
After high school | $10      | $10/day          | $25  | $25  | $10 | $100
Overall Total     | $20      | $120 for 12 days | $100 | $100 | $40 | $380



Response Rates and Nonresponse Bias Analysis


School Response Rate

We plan to select a sample of 3,150 students from a sample of 90 schools in the 9 Census divisions. Because this is a longitudinal survey, we anticipate the response rate to be lower than in a one-time survey. We therefore selected a sample of 135 schools: the 90 schools that we plan to contact and a reserve sample of 45 schools. This avoids the problem of going back to the sampling frame to select additional schools in each Census division if schools decline to participate in the survey.

We will make every effort to get the 90 schools in the first sample to participate in the survey, and we do not expect to use all 135 schools selected. We expect to exceed the minimum 67% school response rate mentioned above and hope to achieve a response rate of 80% for the school survey.

For the sampling of students at the second stage, we plan to select an entire class or classes within selected schools. We planned the number of students to contact assuming a response rate of 80%, to ensure enough students in the sample in case nonresponse is higher than in one-time surveys because of the longitudinal nature of the study. This assumed rate is lower than what has been achieved in previous school surveys among schools that agreed to participate. Because entire classes are selected rather than samples of students within classes, the rate at which students agree to participate is likely to be closer to 95% than 80%, and the response rate among those who agree is expected to be higher still.

As indicated earlier, the assumed response rates serve mainly to establish the required initial sample size. We expect the overall student response rate to be around 72%, not the conservative rate used to determine the initial sample.
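As a worked check of the figures above, a brief sketch follows; the split of the 72% overall rate into school and student components is purely an illustrative assumption, since the text does not state how that figure was derived.

```python
# Sample-size arithmetic for the school and student samples.
schools_planned = 90
reserve_schools = 45
students_planned = 3_150

print(schools_planned + reserve_schools)   # 135 schools selected in total
print(students_planned / schools_planned)  # 35.0 students per school, on average

# Illustrative assumption only: an 80% school rate combined with a 90%
# student rate would yield the ~72% overall student response rate cited above.
print(0.80 * 0.90)  # 0.72
```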

Nonresponse Bias Analysis

We plan to conduct a nonresponse bias analysis in accordance with the NCES guidelines.

School Nonresponse Bias

The Abt statisticians assigned to NEXT are Dr. K.P. Srinath and Dr. Martin Frankel. Dr. Srinath will oversee the sampling and weighting processes for NEXT, including the development and implementation of imputation procedures. As part of the NEXT team, Dr. Frankel will provide expert technical support to Dr. Srinath in the areas of sampling and weighting. Dr. Srinath and Dr. Frankel were responsible for the sampling and weighting process for the HBSC 2006 study.

Dr. K.P. Srinath currently oversees sampling and estimation procedures for a wide variety of projects, providing guidance regarding construction of sampling frames, stratification, sample size determination, sample allocation and sample selection, and developing detailed weighting specifications for programming staff. Dr. Srinath has contributed to a number of important methodological studies and analyses. Dr. Srinath holds a Ph.D. in biostatistics from the University of California, Los Angeles and is an elected member of the International Statistical Institute.

Dr. Martin R. Frankel, a senior statistical scientist at Abt, has 30 years of experience applying statistical sampling and analysis to social and business issues. He is nationally recognized for his expertise in the design, execution, and analysis of major national sample surveys for a number of government agencies and commercial enterprises. He is also well known for his designs of longitudinal surveys in the field of education and his knowledge of NCES Statistical Standards. Dr. Frankel served on an invited Standards Review Panel for NCES, in which capacity he provided advice to help NCES make the Standards more effective. He is the coauthor of two important books, Inference from Survey Samples and Total Survey Error, and has done pioneering work in the construction of multistage samples for ED. He is one of a small number of statisticians whose work essentially sets standards in the survey industry. Dr. Frankel has a Ph.D. in mathematical sociology from the University of Michigan.

Nonresponse Bias Analysis in NEXT

Bias in a survey estimate because of nonresponse consists of two components: the nonresponse rate, and the difference between respondents and nonrespondents in the population parameter being estimated. For example, if we estimate a population percentage by selecting a simple random sample and computing the sample percentage, and there is nonresponse, the bias in the sample percentage due to nonresponse is given by

$$\operatorname{Bias}(p_r) = (1 - R)\,(P_r - P_{nr})$$

where $p_r$ is the sample percentage based on respondents, $R$ is the response rate, $P_r$ is the population percentage among the respondents, and $P_{nr}$ is the population percentage among the nonrespondents. Therefore, it is important to examine both the response rate and the differences between the responding and nonresponding groups when analyzing nonresponse bias in the estimates. Below we describe the steps we intend to follow in analyzing the bias due to nonresponse by some sampled schools in NEXT. These steps are in accordance with the statistical standards established by the National Center for Education Statistics (NCES) for nonresponse bias analysis (http://nces.ed.gov/StatProg/2002/std4_4.asp).
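To make the formula concrete, here is a minimal sketch in Python; the response rate and percentages are hypothetical illustrations, not study figures.

```python
# Nonresponse bias for a sample percentage: Bias(p_r) = (1 - R) * (P_r - P_nr).
def nonresponse_bias(response_rate: float,
                     pct_respondents: float,
                     pct_nonrespondents: float) -> float:
    return (1.0 - response_rate) * (pct_respondents - pct_nonrespondents)

# Hypothetical example: an 80% response rate with 30% prevalence among
# respondents and 40% among nonrespondents understates the true value
# (0.8 * 30 + 0.2 * 40 = 32) by 2 percentage points.
print(nonresponse_bias(0.80, 30.0, 40.0))  # -> -2.0
```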

1. Examination of Response Rates

We will examine both the overall response rate and the response rates for various subgroups, per guideline 4-4-2A of the NCES Statistical Standards. High response rates, both for the entire sample and for subgroups, might indicate that no further analysis of nonresponse bias is needed (Bose, 2001). Large differences in the response rates for subgroups serve as indicators that potential bias may exist (Brick & Bose, 2001). We plan to examine school response rates by: (1) Census division; (2) rural versus urban location; (3) enrollment (large schools vs. small schools); (4) proportion of minority students; (5) school poverty index; and (6) school type (public, Catholic, and private). It is possible to compute rates by subgroup because this information is available for both respondent and nonrespondent schools within the sampling frame. As an example, if the response rates for schools with high-income students (low poverty index) and schools with low-income students (high poverty index) are very different, then any difference between these schools in characteristics of interest (such as the percentage of students who are obese or who have low physical activity) would result in biased estimates.


For each of these variables, we plan to examine selected characteristics, such as obesity, low physical activity, and tobacco use, based on the respondents in each group. If group differences are found for both the selected characteristic and response rates, there is reason to believe that there is bias in the estimates. We will also investigate the sampling frame characteristics of these schools in each subgroup. We will make appropriate weighting adjustments to reduce this bias.
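A minimal sketch of this subgroup check, assuming the school frame is held in a pandas DataFrame; the column names and values below are hypothetical.

```python
import pandas as pd

# One row per sampled school; frame variables are known for respondents
# and nonrespondents alike.
schools = pd.DataFrame({
    "responded":     [1, 0, 1, 1, 0, 1],
    "census_div":    ["NE", "NE", "S", "S", "W", "W"],
    "poverty_index": ["low", "high", "low", "high", "low", "high"],
})

# Overall and subgroup response rates; large gaps across subgroups flag
# potential nonresponse bias.
print(schools["responded"].mean())
print(schools.groupby("poverty_index")["responded"].mean())
print(schools.groupby("census_div")["responded"].mean())
```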

2. Comparison of Sample and Frame Estimates

Per NCES guideline 4-4-2C, we will use sampling weights based on the selection probabilities of responding schools, without any nonresponse adjustment, together with data from the responding schools, to compute population estimates of characteristics that are available on the sampling frame but were not used for stratification when schools were selected. These estimates will be compared with the corresponding population values. For example, the weighted total number of students by grade based on the respondents can be compared to the total on the sampling frame. If there are large differences after taking sampling error into account, this may indicate nonresponse bias. We will also compute estimates of students in responding schools by race/ethnicity and compare them to the totals computed from the population of schools on the frame to determine whether there is any bias in the estimates.
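As an illustration of this frame comparison, the sketch below weights respondent frame counts by the base (selection) weight and compares the result to a frame total; all names and numbers are hypothetical.

```python
import pandas as pd

# Responding schools only, with base weights (inverse selection
# probabilities) and frame counts of 10th-grade students.
respondents = pd.DataFrame({
    "base_weight":      [120.0, 95.0, 150.0],
    "grade10_students": [210, 180, 260],
})

# Weighted respondent total, with no nonresponse adjustment applied.
weighted_total = (respondents["base_weight"]
                  * respondents["grade10_students"]).sum()

frame_total = 3_800_000  # hypothetical frame count of 10th graders
print(weighted_total, frame_total)  # a large gap suggests nonresponse bias
```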

3. Comparison of Estimates Based on Respondents to Estimates from External Sources

Per NCES guideline 4-4-2C, we will compare NEXT estimates of the prevalence of selected health behaviors with estimates from the 2009 Health Behavior in School-Aged Children (HBSC) Survey, the Youth Risk Behavior Survey (YRBS), and the Monitoring the Future Survey to determine whether there are large differences among the survey estimates. A large difference that cannot be attributed to sampling error may indicate bias in the estimates. This approach is limited, however, as differences may not be solely due to nonresponse.
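One simple way to gauge whether such a difference exceeds sampling error is the comparison sketched below; the estimates, standard errors, and the assumption of independent samples (with design effects ignored) are all hypothetical simplifications.

```python
import math

next_est, next_se = 16.0, 1.5  # hypothetical NEXT prevalence (%) and its SE
yrbs_est, yrbs_se = 12.0, 1.0  # hypothetical external-survey prevalence and SE

diff = next_est - yrbs_est
se_diff = math.sqrt(next_se**2 + yrbs_se**2)  # independent samples assumed
print(diff / se_diff)  # |z| well above ~2 suggests more than sampling error
```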

4. Comparisons of Respondents by Successive Levels of Recruitment Effort

Per NCES guideline 4-4-2D, we plan to compare schools that agree to participate in the survey after the first contact with those that agree only after several attempts, or that refuse at first and agree later. Estimates of student-level characteristics will be computed from each successive wave of participating schools (i.e., adding respondents in the order of the level of effort used to recruit the school), using the sampling weights based on probabilities of selection. If the estimates based on the initial sample and successively larger samples show a trend of either increasing or decreasing, this may indicate nonresponse bias.

For example, if the percentage of students who are obese increases significantly as the number of responding schools increases, this might indicate that we are underestimating the percentage of students who are obese.
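A minimal sketch of this level-of-effort comparison, recomputing a weighted prevalence estimate as each recruitment wave of schools is added; all column names and values are hypothetical.

```python
import pandas as pd

schools = pd.DataFrame({
    "recruit_wave": [1, 1, 2, 2, 3],  # 1 = agreed at first contact
    "base_weight":  [100.0, 120.0, 90.0, 110.0, 95.0],
    "pct_obese":    [12.0, 15.0, 18.0, 17.0, 22.0],
})

# Cumulative weighted estimate after each successive wave of recruitment.
for wave in sorted(schools["recruit_wave"].unique()):
    cum = schools[schools["recruit_wave"] <= wave]
    est = (cum["base_weight"] * cum["pct_obese"]).sum() / cum["base_weight"].sum()
    print(wave, round(est, 1))  # a steady upward trend would signal bias
```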

5. Nonresponse Propensity Model

As suggested in NCES guideline 4-4-2B, we will examine the possibility of constructing a propensity score model that estimates, for both respondents and nonrespondents, the probability that a sampled school responds to the survey. The estimated propensity scores come from a logistic regression model. The survey statisticians at Abt Associates have experience working with propensity score models for addressing problems of noncoverage and nonresponse (Srinath et al., 2009). The model will be based on variables available for both nonresponding and responding schools; Census division, rural/urban status, enrollment, school type (Catholic/private/public), proportion of minority students, and poverty index are among the variables to be considered. Schools will be grouped by their estimated propensity scores, and within each group we will compare the frame characteristics of responding and nonresponding schools. This may help characterize the students in schools that do not respond. For example, if nonresponding schools with low propensity scores turn out to be rural, low-income schools, the characteristics of responding schools in the same group will provide information on the bias introduced by the nonresponding schools. In addition to assessing bias, this grouping provides a method of forming weighting classes for adjusting the weights to reduce nonresponse bias.
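A minimal sketch of the propensity approach, assuming frame variables are available for responding and nonresponding schools alike; the column names, values, and choice of two propensity classes are hypothetical.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

frame = pd.DataFrame({
    "responded":  [1, 0, 1, 1, 0, 0, 0, 1],
    "urban":      [1, 0, 1, 0, 0, 1, 0, 1],
    "enrollment": [900, 300, 1200, 450, 280, 1000, 350, 800],
})

# Logistic regression of response status on frame variables.
X = frame[["urban", "enrollment"]]
model = LogisticRegression().fit(X, frame["responded"])
frame["propensity"] = model.predict_proba(X)[:, 1]

# Group schools into classes by estimated propensity; within each class,
# compare frame characteristics of respondents and nonrespondents.
frame["prop_class"] = pd.qcut(frame["propensity"], q=2,
                              labels=False, duplicates="drop")
print(frame.groupby(["prop_class", "responded"])["enrollment"].mean())
```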

Student Nonresponse

For the analysis of possible bias due to student nonresponse, we plan to record the reasons for nonresponse, which may relate to the longitudinal nature of the survey or to characteristics of interest in the survey. We plan to track these rates for various subgroups, such as students in rural schools versus urban schools.

The SSL in each school will provide student characteristics within each classroom, which will enable us to examine potential differences between responding and nonresponding students. We will follow up with nonresponding students to determine whether they differ with respect to important characteristics of interest; this follow-up will be conducted on a subsample of nonrespondents, if not the entire group.

We plan to adjust the sampling weights of responding schools and students to account for nonresponding schools and students, and we will consider forming weighting classes for these adjustments.
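A minimal sketch of one such weighting-class adjustment, in which respondent base weights within each class are inflated by the inverse of the class's weighted response rate; column names and values are hypothetical.

```python
import pandas as pd

sample = pd.DataFrame({
    "weight_class": ["A", "A", "A", "B", "B", "B"],
    "responded":    [1, 1, 0, 1, 0, 0],
    "base_weight":  [100.0, 110.0, 90.0, 120.0, 80.0, 100.0],
})

# Weighted response rate per class: responding weight over total weight.
total_w = sample.groupby("weight_class")["base_weight"].sum()
resp_w = (sample[sample["responded"] == 1]
          .groupby("weight_class")["base_weight"].sum())
rate = resp_w / total_w

# Respondents absorb the weight of nonrespondents in their class, so each
# class's total weight is preserved; nonrespondents get weight zero.
sample["adj_weight"] = sample["responded"] * (
    sample["base_weight"] / sample["weight_class"].map(rate))
print(sample[["weight_class", "responded", "adj_weight"]])
```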


B4. Tests of Procedures or Methods to be Undertaken

The 10th-grade student survey can be completed in less than 40 minutes. We conducted a pilot assessment with a sample of 9 volunteers ages 14 to 15. Based on interviews with these volunteers, we made changes to the survey wording and length to ensure that students understand the questions and can complete the survey in a timely manner. Procedures for anthropometric measures and saliva sample collection were also pilot tested.

B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The roles of outside consultants in collecting the data, performing preliminary analyses, and staffing this project are discussed in earlier sections of this application (B2, B3).

In addition to the Prevention Research Branch, our Division includes the Epidemiology and the Biostatistics and Bioinformatics Branches. All proposals undergo extensive methodological review within the Division before moving forward.

Other consultants for this study include research members from the 40 HBSC countries who, as members of HBSC focus groups, reviewed and recommended questions from the 2009/2010 HBSC study (OMB No.: 0925-0557, exp. date: 1/31/2012) according to their specialty interests. Those HBSC member names have not been included here. The HBSC Scientific Development Group required that all HBSC questions be piloted and reviewed externally before they could be included in the HBSC protocol. Beyond the review of focus group questions, the HBSC protocol required global external review of the significance of research topics, concepts, clarity of language, and validity of measures to address those topics. Many of these reviews were completed by e-mail.

Consultations for this research project have been obtained incrementally since its inception. The most recent consultations for this 2009/2010 survey occurred between October 2007 and June 2009. The initial concept and subsequent proposal were reviewed by two different external expert panels, which evaluated the justification, design, and methods of the study. A third of the panel members were research methodologists or research statisticians. NICHD obtained external statistical review of five proposals covering both methods and sample designs. The protocol, methods, and assessments were also reviewed by the NICHD Director of Intramural Research and a panel of independent extramural investigators selected by the Director. Several levels of review and evaluation have been completed by participating institutes (NHLBI, NIAAA), including reviews by experts both internal and external to the National Institutes of Health. For example, in the review by the NHLBI Board of External Experts, approval was nearly unanimous (with one dissenting vote requesting additional measures of sub-clinical indicators, which would have required complex and expensive procedures and substantially increased respondent burden). The proposal was also reviewed by the NHLBI advisory council.

Finally, as part of the IRB process, the proposal received another external review at NICHD organized by the Office of Intramural Research. Reviewers were drawn from three categories: longitudinal methodology, pediatrics, and pediatric cardiology. Two of the reviews were highly favorable and raised no criticisms of the methodology. The third arrived after the OMB application was submitted and provided recommendations for improving the clarity of the proposal. This reviewer requested additional details about sampling and recruitment; however, consistent with the other reviews, this reviewer was very positive about our original incentive plan.

In addition, all assessment procedures were distributed for review, comment, and endorsement to representatives of the broader education and health promotion community, including national, state, and local education agencies and those involved in the health and welfare of children. These consultations included 31 representatives of state, local, and national education agencies.

The protocols and surveys have been approved by the NICHD Institutional Review Board (IRB).

