
Supporting Statement B


Programmatic Review for USACE Sponsored Hurricane Evacuation Public Behavioral Surveys


OMB Control Number XXX-XXX




Collections of Information Employing Statistical Methods



The agency should be prepared to justify its decision not to use statistical methods in any case where such methods might reduce burden or improve accuracy of results. When the question “Does this ICR contain surveys, censuses, or employ statistical methods?” is checked "Yes," the following documentation should be included in Supporting Statement B to the extent that it applies to the methods proposed:


1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.



The potential respondent universe will consist of residents living in coastal areas where public officials may call for an evacuation when a hurricane threatens. All study proposals must include a description of the survey's particular respondent universe. The sample population is typically identified using available hurricane risk data (including data on areas at risk from hurricane storm surge flooding, previous hurricane evacuation studies or hurricane response plans, and established hurricane evacuation zones) and in coordination with the State and local governments within the study area that are responsible for hurricane emergency management and evacuation decision making.


Based on experience with previous hurricane evacuation public behavior surveys, we estimate approximately 1,500 to 3,000 completed surveys for each hurricane evacuation behavioral study effort, conducted by telephone, online survey, mail-back, or face-to-face interview. In recent evacuation behavioral surveys using commercially available published listings of landline telephone numbers with geocoded physical addresses, 37.5% of the numbers were reached successfully. The attrition is partly due to numbers that have been disconnected, commercial numbers incorrectly classified as residential, and so forth, but most of it is due to residents not answering their phones. Of those who did answer, approximately 40% agreed to participate in the most recent surveys, yielding an overall response rate of 15%. That figure is better than the 9% national average for landline surveys, mainly due to greater interest in the subject matter of the survey. Response rates for future survey efforts are expected to be similar and at or above the levels needed to obtain statistically viable results. USACE Districts will be encouraged to consider a range of approaches to maximize response rates and to integrate those approaches into the survey methodology described in the supporting statement for each information collection considered under this clearance.
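
As a cross-check on these figures, the overall rate is a simple product of the contact and cooperation rates. The short Python sketch below illustrates the arithmetic, using the 2,600-complete target from the Southeast Louisiana example described below; the dialing-volume figure is illustrative only, not a survey requirement.

```python
# Illustrative arithmetic only; the rates are the figures cited above.
contact_rate = 0.375      # share of listed numbers reached successfully
cooperation_rate = 0.40   # share of those reached who agree to participate

overall_response_rate = contact_rate * cooperation_rate
print(f"Overall response rate: {overall_response_rate:.1%}")   # 15.0%

# Implied dialing volume for a given completes target (illustrative):
target_completes = 2600
print(f"Numbers to dial: {target_completes / overall_response_rate:,.0f}")  # ~17,333
```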


For the Southeast Louisiana Hurricane Public Evacuation Behavior Survey, which is being used as a supporting example for this submittal and is representative of future evacuation surveys that will be conducted in other hurricane evacuation study areas, the study will collect 2,600 responses from residents of 13 Louisiana Parishes: Plaquemines, St. Bernard, Jefferson, Orleans, Lafourche, Terrebonne, St. Charles, St. John the Baptist, St. Tammany, St. James, Tangipahoa, Livingston, and Ascension.


Data will be collected via a combination of random digit dialing of landline phones, random digit dialing of cell phones, listed sample, and an online survey (with the link to the online survey delivered by mail).



2. Describe the procedures for the collection of information including:

* Statistical methodology for stratification and sample selection,

* Estimation procedure,

* Degree of accuracy needed for the purpose described in the justification,

* Unusual problems requiring specialized sampling procedures, and

* Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


All submissions will be carefully evaluated to ensure consistency with the intent, requirements, and boundaries of this programmatic clearance. Proposed collection instruments and procedures must comply with OMB's "Guidance on Agency Survey and Statistical Information Collections" (January 20, 2006). Each submission must include a specific description of:

  • the sampling plan and sampling procedure (including stratification and selection methods for individual respondents);

  • how the instrument will be administered to respondents;

  • the planned analysis; and

  • expected confidence intervals.



Districts submitting information collection requests under this programmatic clearance process are strongly encouraged to pretest any information collection instruments to be used. Further, we will strongly encourage use of the programmatic clearance to obtain approval for any pretesting that falls under the requirements of the Paperwork Reduction Act (i.e., pretesting in which more than nine individuals are surveyed). In such cases, requests for approval to pretest surveys will be subject to the same requirements (i.e., a supporting statement, a copy of the instrument, etc.) as a standard information collection.


All submissions under the program of expedited approval must fully describe the survey methodology. The description must be specific and address, as appropriate, each of the following: (a) the respondent universe, (b) the sampling plan and all sampling procedures, including how individual respondents will be selected, (c) how the instrument will be administered, (d) the expected response rate and confidence, and (e) strategies for dealing with potential non-response bias. A description of any pre-testing and peer review of the methods and/or instrument is highly recommended. Further, all submissions under this clearance process will describe how data will be presented to managers and any others who will use the results of the surveys, particularly in cases where response rates were lower than anticipated. In such cases, the Districts must take appropriate steps to ensure that the results are not generalized outside the population of interest and that explanations are provided with data presentations and reports so that users of the data understand any possible biases associated with the data.


The primary purpose of collections conducted under this clearance is to provide data that will be used, in conjunction with other information, to derive numerical values for certain evacuation behaviors, which in turn will be used in transportation modeling of evacuation clearance times as well as in shelter planning and public outreach. In general, all collections under this clearance will be designed according to accepted statistical practices and sampling methodologies, will gather consistent and valid data representative of the target population, will address non-response bias issues, and will achieve the response rates needed to obtain statistically useful results.


Although the accuracy of population estimates is important, estimates calculated from the sample survey will not be the sole determinant of the numerical values used for modeling and planning. They will be used in conjunction with a broader set of empirically derived generalizations, equations, and adjustments from the evacuation behavior literature.
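
Although this statement does not prescribe a single accuracy formula, the sampling-error implications of the sample sizes used in these studies can be illustrated directly. The sketch below is a simplification that assumes simple random sampling within a stratum and the worst-case proportion p = 0.5; design effects from stratification or weighting would widen these intervals.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for an estimated proportion."""
    return z * math.sqrt(p * (1 - p) / n)

# Sample sizes drawn from the Southeast Louisiana example below:
print(f"n = 100 (single parish): +/-{margin_of_error(100):.1%}")    # ~9.8%
print(f"n = 2,600 (full sample): +/-{margin_of_error(2600):.1%}")   # ~1.9%
```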


In general, greater importance is placed on estimates for evacuation zones that will evacuate more frequently and that would be most hazardous if residents failed to evacuate. Thus, the population of each zone is not highly relevant to the targeted sample size. Because it is critical to know a respondent's evacuation zone before making the call, landline telephone numbers are used for the bulk of the data collection. Drawing a purely random sample and assigning respondents to zones after the fact would not yield the desired spatial distribution of respondents and would grossly over-represent residents inland of any evacuation zone. Instead, GIS shapefiles of evacuation zone boundaries are overlaid onto a database of geocoded landline telephone numbers. This strategy has been employed successfully in studies of this type throughout the U.S. coastal zone for more than a decade; the more general method, without GIS software, has been used successfully for more than three decades.
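
The overlay amounts to a point-in-polygon assignment of each geocoded telephone listing to an evacuation zone. The sketch below shows one way to perform it with the open-source geopandas library; the file names and column names are hypothetical, and actual studies may use any comparable GIS software.

```python
import geopandas as gpd
import pandas as pd

# Evacuation zone polygons (hypothetical file), e.g., from State response plans.
zones = gpd.read_file("evacuation_zones.shp")

# Purchased listing of landline numbers with geocoded coordinates (hypothetical file).
listings = pd.read_csv("geocoded_landlines.csv")  # columns: phone, lon, lat
points = gpd.GeoDataFrame(
    listings,
    geometry=gpd.points_from_xy(listings["lon"], listings["lat"]),
    crs="EPSG:4326",
).to_crs(zones.crs)

# Point-in-polygon join: each number inherits the zone it falls inside.
assigned = gpd.sjoin(points, zones[["zone_id", "geometry"]],
                     how="left", predicate="within")

# Numbers falling outside every zone are dropped before sampling, which is
# what prevents over-representation of residents inland of the zones.
sample_frame = assigned.dropna(subset=["zone_id"])
```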


For the Southeast Louisiana Hurricane Public Evacuation Behavior Survey, which is being used as a supporting example for this submittal and is representative of future evacuation surveys that will be conducted in other hurricane evacuation study areas, the sampling and completed surveys will be broken out by Parish as shown in the table below. Breakout by Parish allows for the study results to be used effectively not only by Federal and State Emergency managers but also by Local Government Emergency Managers.




Parish                             Population   Expected Completion Rate   Sample Size
Plaquemines                            23,879                        15%           100
St. Bernard                            41,567                        15%           100
Jefferson                             434,123                        15%           200
Orleans                               369,888                        15%           200
Lafourche                              96,965                        15%           100
Terrebonne                            111,713                        15%           100
St. Charles                            52,502                        15%           100
St. John the Baptist                   44,787                        15%           100
St. Tammany                           239,193                        15%           200
St. James                              21,717                        15%           100
Tangipahoa (South of I-12 only)       123,662                        15%           100
Livingston (South of I-12 only)       131,865                        15%           100
Ascension                             112,126                        15%           100





Items of note for the Southeast Louisiana Survey:


  • 1,600 of the target 2,600 completes are made up of the parish minimum completion requirements.

  • We will deliver at least 100 completes in every parish, with 200 completes in the three most populated parishes.

  • For the additional 1,000 completes, we will order landline and cell phone sample in proportion to each parish's share of the total target population (a sketch of this allocation follows this list).

  • For example, roughly 21% of that sample will target Orleans Parish, 24% will target Jefferson Parish, and so on.

  • This does not mean we will obtain 210 additional completes from Orleans Parish; it is merely the sample purchase strategy.

  • Finally, while the majority of the sample is designed to be Random Digit Dialing (RDD), we may need to purchase listed sample for the sparsely populated parishes of Plaquemines and St. James.

    • If, after a month of RDD calling, we are not coming close to the target of 100 completes in each of those parishes, we may buy listed sample to ensure we hit the target.
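
The proportional purchase strategy in the bullets above can be reproduced from the parish population table. The sketch below is illustrative only, with populations copied from that table; the printed shares govern sample purchase, not complete quotas.

```python
# Parish populations from the table above.
populations = {
    "Plaquemines": 23879, "St. Bernard": 41567, "Jefferson": 434123,
    "Orleans": 369888, "Lafourche": 96965, "Terrebonne": 111713,
    "St. Charles": 52502, "St. John the Baptist": 44787,
    "St. Tammany": 239193, "St. James": 21717, "Tangipahoa": 123662,
    "Livingston": 131865, "Ascension": 112126,
}
total = sum(populations.values())  # 1,803,987

additional_completes = 1000  # sample purchase target beyond parish minimums
for parish, pop in populations.items():
    share = pop / total
    print(f"{parish:22s} {share:5.1%} -> {share * additional_completes:5.0f} records")
# Orleans comes out near 21% and Jefferson near 24%, matching the
# percentages cited in the bullets above.
```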


Sample will be collected via the following methods:


  • We plan to collect no more than 1,820 (70%) completes via a combination of random digit dialing of landline numbers, listed sample, and online completes.

  • We plan to collect at least 780 (30%) completes via random digit dialing of cell phone numbers. 

    • Cell phone sample is less geographically targeted, so we will buy RDD sample that comes closest to matching our target area.

  • All RDD and listed phone numbers will receive at least eight attempts (2 evening, 2 day, 2 weekend, 2 anytime) unless a final disposition is reached first.

  • Online/Listed Sample Details:

    • A limited number of respondents will receive a letter in the mail that explains the survey and provides a link for filling the survey out online.

      • The letter will also tell them to expect a call if they have not yet filled out the survey online.

    • For this subset of respondents, we will use listed registered voter files in 10 of the 13 selected parishes, restricted to respondents whose phone numbers have area codes that do not match the area codes common to the area.

      • We will not buy registered voter sample for Tangipahoa, Livingston, and Ascension Parishes, because we are targeting only a geographic subset of respondents in those parishes (such as south of I-12 for Livingston) and the vendor for the listed sample may not be able to geo-target at that level.

      • We may buy a small amount of listed sample for Ascension, since we should be able to exclude the 70769 zip code.

    • Registered voter files are useful in that they contain home address, name, and phone number.

    • We will buy 2,000 such records (or as many records as meet the criteria, if fewer than 2,000 are available). All records will receive the letter, the survey link, and the follow-up phone call.

      • If more than 2,000 records are available, we will work with the sample vendor to buy proportional sample; for example, 21% of the 2,000 purchased records would be in Orleans if possible.

    • Why are we buying a small portion of listed sample?

      • As the number of U.S. households that are wireless-only continues to grow, traditional RDD sampling methods miss more and more respondents.

      • Respondents who are wireless-only AND have cell phone numbers from outside the area in which they live are not captured in RDD-only polling.

      • Therefore, if a cell-phone-only respondent does not have a Louisiana area code, he or she will not be reached by random digit dialing; listed sample is the only way to reach such respondents.

      • For example, many Orleans Parish residents are cell-phone-only but have a 281 (Houston) area code from when they evacuated for Katrina: they got a cell phone in Houston and kept the number.



3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


For surveys designed to infer from a sample to a population, the Corps requires that proposals address issues of potential non-response. Surveys must incorporate best practices to maximize initial response rates (e.g., multiple follow-ups or call-backs, minimal hour burden). Further, specific strategies for detecting and analyzing non-response bias are to be included in the submission form accompanying survey instruments. These may involve the use of survey logs, in which observable characteristics of all those initially contacted on-site are recorded, and/or a short interview asking a small number of questions of both respondents and non-respondents.


The Corps requires that the results of non-response bias analyses be included in technical reports, and that the likely effects of this bias (if any) on the interpretation of data must be made clear to decision makers. In some cases, it may be feasible to balance or post-weight a sample to align important sample statistics, e.g., demographic or zip code characteristics, with known population parameters. However, this does not guarantee that there will not be non-response bias in attitude, knowledge, or belief variables.
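
Post-weighting of the kind described above can be done by dividing known population shares by achieved sample shares within each stratum. The sketch below is a minimal illustration with hypothetical strata and shares; it is not a prescribed weighting procedure for collections under this clearance.

```python
import pandas as pd

# Known population distribution for a weighting variable (hypothetical shares).
population_share = pd.Series({"18-34": 0.30, "35-54": 0.35, "55+": 0.35})

# Achieved (hypothetical) sample: younger respondents under-represented.
sample = pd.DataFrame({"age_group": ["18-34"] * 15 + ["35-54"] * 35 + ["55+"] * 50})
sample_share = sample["age_group"].value_counts(normalize=True)

# Post-stratification weight: population share over sample share.
weights = population_share / sample_share
sample["weight"] = sample["age_group"].map(weights)

# Weighted estimates now reproduce the population mix on age group, but, as
# noted above, this does not remove non-response bias in attitude, knowledge,
# or belief variables that is unrelated to the weighting variables.
```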


For the Southeast Louisiana Hurricane Public Evacuation Behavior Survey, which is being used as a supporting example for this submittal and is representative of future evacuation surveys that will be conducted in other hurricane evacuation study areas, additional considerations are listed below:


  • All RDD and listed phone numbers will receive at least eight attempts (2 evening, 2 day, 2 weekend, 2 anytime) unless a final disposition is reached first (a sketch of this attempt rule follows this list).

    • Making more than 8 call attempts does not result in a substantially increased response rate.

    • We use the WinCATI system (http://www.sawtooth.com/index.php/software/wincati) to track when call attempts were made in order to ensure we have a higher probability of reaching respondents when they are home/available.

    • Calls will be made between 9 a.m. and 9 p.m. on weekdays, 10 a.m. and 6 p.m. on Saturdays, and noon and 5 p.m. on Sundays.

  • For reference, the LSU Public Policy Research Lab (PPRL), which will conduct this survey, has been surveying Louisiana daily since its founding in 2002. Since 2008, PPRL has conducted the Behavioral Risk Factor Surveillance System (BRFSS) for Louisiana on behalf of the Department of Health and Hospitals and the CDC.

    • One of the requirements of BRFSS is making 15 call attempts on each number (or until a final disposition is reached.)

    • Only 12% of our completes for BRFSS (on average) occur past the 9th attempt on a number. Roughly 33% of our completes occur on the first attempt.

    • We therefore know that it is a drain on resources (and an annoyance to constituents/respondents) to call a number more than 8 times.

    • The budget for this project is too low to justify making more than 8 attempts, as it will not result in a noticeable change in response rate.

  • As for increasing response rates:

    • At PPRL there are 54 call stations all running the latest version of industry standard WinCATI.

    • We have roughly 60 employees (and will likely have over 100 for this project, as we are currently increasing staffing), all trained in soft-refusal conversion.

    • All employees begin their work at PPRL with a 3-hour training session focused on the CATI system itself and on soft-refusal conversion (including paper-based training, CATI-based training, and several different role-playing scenarios).

      • All callers are trained in how to keep respondents interested. As most of our surveys are for government agencies, non-profit groups, and educators, our callers do this by emphasizing how the survey is designed to help the respondent’s community.

      • For BRFSS, the survey is intended to inform government as to which health ailments disproportionately affect certain areas. As the callers are trained to say, for example: "If this survey uncovers that the rate of diabetes (for example) is very high in St. Bernard Parish (for example), then local, state, and federal government will invest more money in diabetes prevention and treatment in that specific area. So, are you willing to help your local community?"

      • The callers use other examples that are relevant to the survey in question.

      • For this survey it will be easy for callers to communicate the benefits to respondents. For example: "This survey is designed to help lawmakers evaluate current hurricane evacuation and preparedness plans and find ways to improve them."

    • A soft-refusal conversion is built into the current survey script.

    • Our callers are also trained to be reluctant to code a valid phone number as a "hard refusal" (which would remove it from the sample) unless the respondent is very adamant (or belligerent) in refusing to take the survey.

  • Additionally, this project is designed to deal with potential under-representation of the cell-only, no-local-number population, which estimates suggest could be as high as 9% in Louisiana (http://jssam.oxfordjournals.org/content/1/1/45.full.pdf+html). This is why we are using a small amount of listed sample and including a letter/online option for those respondents.

  • Lastly, while the target is at least 30% of responses coming from cell phone numbers, we will buy a significant amount of cell phone sample and are willing to obtain more than 30% of our final completes via cell phone calling. Current estimates of the cell-only population in Louisiana stand at 33% (http://www.cdc.gov/nchs/data/nhsr/nhsr061.pdf).
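
The eight-attempt rule in the first bullet above can be expressed as a simple quota over calling slots. The sketch below is an illustration only; attempt tracking in the actual study is handled inside WinCATI, and the slot and disposition labels here are hypothetical.

```python
from collections import Counter

SLOT_QUOTAS = {"evening": 2, "day": 2, "weekend": 2, "anytime": 2}
FINAL_DISPOSITIONS = {"complete", "hard_refusal", "disconnected", "business"}

def next_slot(attempts):
    """Return the next calling slot for a number, or None if calling stops.

    `attempts` is a list of (slot, disposition) tuples for prior calls.
    """
    if any(disp in FINAL_DISPOSITIONS for _, disp in attempts):
        return None  # final disposition reached; stop calling
    used = Counter(slot for slot, _ in attempts)
    for slot, quota in SLOT_QUOTAS.items():
        if used[slot] < quota:
            return slot
    return None  # all eight attempts exhausted

# Example: a number with one evening no-answer gets a second evening try.
print(next_slot([("evening", "no_answer")]))  # evening
```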


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.



Before surveys are conducted, the data collection instruments and the survey process are carefully reviewed and pretested for simplicity and relevance as a means to reduce respondent burden and maximize the validity of responses. In addition, pretesting will help verify respondent comprehension, identify sources of measurement error, and refine estimates of burden hours. Pretests are primarily done with groups of fewer than ten respondents. Ideally, participants in pretests should be drawn from the same respondent universe as the full sample; if this is not feasible, a similar respondent universe should be used. Training for interviewers is usually held prior to the implementation of the survey and typically includes role-playing and an actual field survey under supervised conditions. Once the interview process has begun, field supervisors periodically debrief interviewers to identify any problems encountered, including any unnecessary burdens being placed on respondents.


The sampling strategy generally described in this supporting statement has been applied successfully, and the results have been employed as intended, in repeated surveys of this nature since the mid-1980s. Refinements have been made to the method as databases and software have evolved. Almost all questions that will be included in the Southeast Louisiana hurricane public behavior survey questionnaire, and in future surveys for other hurricane evacuation study areas to be conducted under this clearance, have been used in previous USACE surveys, either for hurricane evacuation studies or for post-hurricane evacuation assessment surveys. Some minor variations and additions have been made, and will continue to be made, to tailor the questionnaires to the specific geographical study area. New questions are generally tested internally by the research firm that will be executing the survey.


Items of note for the Southeast Louisiana Survey:


  • We will briefly test the survey with a target of obtaining 5 completes from live calls.

    • The testing is intended to ensure that data collection and retention are working within the WinCATI system.

    • The testing is also intended to check the ‘flow’ of the survey instrument itself. If there are any sections or questions that are unclear or confusing to respondents, we may consider some minor edits for clarity.

      • Or we may fix the issues by adding additional training for our callers around the found issues.

    • If we make no changes to the instrument after this test, we will keep the 5 records in the final data set.

    • If we do make changes to the instrument after this test, we will delete these 5 records and not count them toward the final target of 2,600 completes.





5. Provide the names and telephone numbers of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


The names and contact information of the responsible USACE liaison and the principal investigator(s) who will collect and analyze the data are included on all submission forms received under the programmatic approval. In addition, the following individuals were consulted on statistical and other design aspects of this program.


The commander of each Corps Division is ultimately responsible for approval of the sampling strategy, questionnaire, and analysis plan for surveys conducted in his or her division. Corps District staffs will consult with experts from local universities and/or contractors in developing specific survey and analytical plans.


The names and contact information for the Southeast Louisiana Hurricane Public Evacuation Behavior Survey, which is being used as a supporting example for this submittal and is representative of future evacuation surveys that will be conducted in other hurricane evacuation study areas, are listed below:


Crorey Lawton

Plan Formulator

USACE – New Orleans District

Phone: (504) 862-1281

Email: James.M.Lawton@usace.army.mil


Carla Quinn

Program Manager

USACE- Baltimore District

Phone: (410) 962-2941

Email: Carla.m.Quinn@usace.army.mil


Robert Massey

Director, Hurricane and Emergency Management Program

Dewberry

Phone: (678) 537-8637

Email: bmassey@Dewberry.com


Dr. Betty Morrow

Professor Emeritus

Florida International University

Phone: (305) 385-7364

Email: betty@bmorrow.com



Dr. Hugh Gladwin

Associate Professor

Florida International University

Phone: (305) 919-5839

Email: gladwin@fiu.edu


Dr. Shirley Laska

Professor Emerita

University of New Orleans

Phone: (504) 280-1254

Email: slaska@uno.edu


Brant Mitchell

Director of Research and Operations

Stephenson Disaster Management Institute – LSU

Phone: (225) 578-5939

Email: bmitch9@lsu.edu



Michael A. Climek

Operations Manager

LSU Public Policy Research Lab

Phone: (225) 578-7499

Email: mclimek@lsu.edu
