Supporting Statement Part B

U.S. Army Corps of Engineers

Flood and Coastal Storm Damage Survey – OMB 0710-0017

B.  COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

If the collection of information employs statistical methods, it should be indicated in Item 17 of OMB Form 83-I, and the following information should be provided in this Supporting Statement:

1.  Description of the Activity

The potential respondent universe will consist of business owners, public officials, and residents of communities experiencing flood or coastal storm damages or who are at risk of flooding. All study proposals must include a description of the survey's particular respondent universe.


Based on experience with the portfolio of Flood Risk Management projects and with Corps surveys previously used under the now-discontinued Programmatic Clearance (0710-0001), we estimate approximately 3,000 completed surveys annually. The majority of these will be conducted through face-to-face interviews, with the remainder completed by mail-back and online respondents.

The target population for this collection is homeowners and business owners in various floodplains throughout the U.S. Random samples of property owners will be selected using address-based sampling (ABS) and the United States Postal Service's Computerized Delivery Sequence File (CDSF). The Corps of Engineers will use floodplain delineations developed by FEMA and by local flood control agencies to pinpoint properties at risk of flooding.
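To illustrate how such a sample might be drawn, the sketch below (Python/pandas) filters a hypothetical extract of the ABS frame to addresses flagged as lying within a mapped floodplain and then draws a simple random sample; the file name, column names, and sample size are placeholder assumptions, not an actual CDSF layout or a prescribed design.

import pandas as pd

# Hypothetical extract of the address-based sampling frame; each record is assumed
# to carry a flag indicating whether it falls inside the mapped floodplain.
frame = pd.read_csv("address_frame.csv")

# Restrict the universe to flood-prone addresses.
at_risk = frame[frame["in_floodplain"] == 1]

# Draw a simple random sample (placeholder sample size of 600 addresses).
sample = at_risk.sample(n=600, random_state=42)
sample.to_csv("survey_sample.csv", index=False)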

The number of entities in the universe covered by each survey, and in any sample of that universe (including each stratum of samples that are stratified), will be provided in tabular form. Expected response rates for survey data collection will be indicated, based on past results of similar surveys and other factors such as the length and complexity of the survey questionnaire.

2.  Procedures for the Collection of Information

a. Statistical methodologies for stratification and sample selection;

The sample selection and stratification may vary depending on the characteristics of the floodplain. For example, a homogeneous floodplain containing identical properties may enable smaller samples to be drawn. Conversely, a floodplain containing a variety of diverse properties (e.g., ranch homes alongside multi-story apartment buildings) would likely require stratified samples to reasonably capture the flood-prone properties that are a key part of the Corps' economic analysis. Since most communities contain a greater number of residential structures than other building types, stratified samples would likely be applied in order to capture the non-residential structures. In addition, areas closest to the river or shore (the flood impact area) would be sampled at a higher rate than those farthest from the river or shore.
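A minimal sketch of this kind of differential stratified selection is shown below, assuming the frame carries hypothetical proximity and structure-type columns; the stratum labels and sampling fractions are placeholders chosen only to illustrate higher rates in the flood impact area and for scarcer non-residential structures.

import pandas as pd

frame = pd.read_csv("address_frame.csv")  # hypothetical frame with stratum columns

# Placeholder sampling fractions by (proximity, structure type) stratum.
rates = {
    ("impact_area", "non_residential"): 0.50,
    ("impact_area", "residential"): 0.20,
    ("fringe", "non_residential"): 0.25,
    ("fringe", "residential"): 0.05,
}

# Draw an independent simple random sample within each stratum.
sample = (
    frame.groupby(["proximity", "structure_type"], group_keys=False)
         .apply(lambda g: g.sample(frac=rates[g.name], random_state=1))
)

# The design weight for each sampled record is the inverse of its stratum's sampling fraction.
sample["weight"] = [1.0 / rates[key] for key in zip(sample["proximity"], sample["structure_type"])]

The design weights carry through to estimation so that over-sampled strata (e.g., non-residential structures in the impact area) do not dominate the damage estimates.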

b. Estimation procedures;

The estimation procedures may vary depending on the characteristics of the floodplain, the size of the potential flood event, and whether the survey is conducted in the aftermath of a specific flood event.

c. Degree of accuracy needed for the purpose discussed in the justification;

The survey responses will provide meaningful data on flood damages, which are used in the formulation and justification of flood improvement projects as well as in reporting on the damages prevented by Corps infrastructure projects. The confidence levels for these estimates will vary with the type of estimate and with the precision of the associated floodplains. While the precision of these parameters is difficult to predict in advance, based on past experience with similar models, the study team believes that reasonably precise estimates can be obtained with 200 or more responses.
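As a rough back-of-the-envelope check, not taken from the study team's models, the worst-case half-width of a 95 percent confidence interval for an estimated proportion with roughly 200 completed surveys can be computed as follows.

import math

n, p, z = 200, 0.5, 1.96            # responses, most conservative proportion, 95% z-value
margin = z * math.sqrt(p * (1 - p) / n)
print(f"+/- {margin:.3f}")          # about +/- 0.069, i.e. roughly 7 percentage points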

d. Unusual problems requiring specialized sampling procedures; and

No specialized sampling procedures will be used. Respondents will be selected using stratified random sampling of households and/or business owners from several strata (e.g., type of structure, proximity to the river).

e. Use of periodic or cyclical data collections to reduce respondent burden.

Most surveys are conducted in response to a special study need and are, therefore, one-time requests, not requiring annual or even periodic reporting.

3.  Maximization of Response Rates, Non-response, and Reliability

Response rates are maximized through careful attention to detail in the development not only of the survey questionnaire but of the entire survey implementation process. For example, for mail questionnaires, Don Dillman’s “Total Design Method (TDM)” is typically used, including: (1) “multi-wave” mailings of the original questionnaire, postcard reminders, and follow-up mailings to those still not responding; (2) authoritative, informative, and persuasive cover letters; and (3) carefully worded and formatted questionnaires. The TDM has long served as a general framework for designing both mail and telephone surveys. In recent years, it has been recast as the tailored design method and applied to the design of Internet and mixed-mode surveys as well as postal surveys.


Based on the Corps of Engineers' experience administering Flood Risk Management surveys, previously under the now-discontinued Programmatic Clearance (0710-0001), we anticipate response rates at or above the levels needed to obtain statistically viable results. Response rates for flood damage surveys following storms in Houston, Sacramento, North Carolina, and the Upper Midwest have typically ranged from 30% for business surveys to 50% for residential surveys, with much higher rates for public damage surveys. Response rates are boosted by repeat visits to contact respondents for face-to-face interviews on alternative weekdays and weekends, and during evening as well as daytime hours.

Several measures will be taken to encourage sampled individuals to respond to the mail survey, including:

  • Branded survey materials with color USACE logos;

  • Multiple follow-up reminders after the initial invitation; and

  • Provision of a toll-free number in survey correspondence to address any questions.

Despite these measures, the response rate for the web survey may be as low as 30%, raising potential concerns about non-response bias. Demographic differences between respondents and non-respondents will be addressed by calibrating design weights through iterative proportional fitting, or “raking” (Kolenikov 2014; Battaglia, Hoaglin, and Frankel 2009), to match demographic controls from the American Community Survey (e.g., gender, age, ethnicity, and education) within each of the sampling strata.
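A minimal sketch of that raking step, written against a hypothetical respondent file with hypothetical ACS control totals, is shown below; the column names, control categories, and convergence settings are illustrative assumptions rather than the study's actual specification.

import pandas as pd

def rake(df, weight_col, controls, max_iter=50, tol=1e-6):
    """Iterative proportional fitting: rescale weights so that the weighted total
    in each demographic category matches its control total.

    controls maps a column name to {category: target weighted total}; every
    category present in df is assumed to appear among the targets."""
    w = df[weight_col].astype(float).copy()
    for _ in range(max_iter):
        max_change = 0.0
        for var, targets in controls.items():
            current = w.groupby(df[var]).sum()                    # weighted totals by category
            factors = {cat: targets[cat] / current[cat] for cat in targets}
            adj = df[var].map(factors)                            # per-record adjustment factor
            w = w * adj
            max_change = max(max_change, float((adj - 1).abs().max()))
        if max_change < tol:                                      # stop once adjustments are negligible
            break
    return w

# Example with placeholder control totals for one sampling stratum.
respondents = pd.read_csv("respondents.csv")                      # hypothetical respondent file
controls = {
    "gender": {"female": 5100, "male": 4900},
    "age_group": {"18-44": 4500, "45-64": 3400, "65+": 2100},
}
respondents["raked_weight"] = rake(respondents, "design_weight", controls)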

Even after controlling for demographic differences between respondents and non-respondents, flood victims who have easy access to adjusters' claims and adequate time to respond would be more likely both to respond to the survey and to provide accurate information. The potential for this type of bias will be investigated through a targeted non-respondent follow-up (NRFU) survey. The NRFU survey will consist of a subset of questions from the main survey, including general questions about participation in outdoor recreation and demographics. The survey will be formatted as an oversized postcard and sent to a sample of 500 non-respondents via priority mail. Responses to the NRFU survey will be compared to responses to the main survey to assess the potential for non-response bias.
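One simple way to carry out that comparison, sketched below under the assumption that a common yes/no item appears on both instruments, is a chi-square test of homogeneity between main-survey and NRFU respondents; the file and column names are placeholders.

import pandas as pd
from scipy.stats import chi2_contingency

main = pd.read_csv("main_survey.csv")        # placeholder main-survey responses
nrfu = pd.read_csv("nrfu_postcards.csv")     # placeholder NRFU postcard responses

item = "flood_damage_reported"               # hypothetical item asked on both forms
main["group"] = "main"
nrfu["group"] = "nrfu"
combined = pd.concat([main[["group", item]], nrfu[["group", item]]], ignore_index=True)

# Compare the item's distribution across the two groups; a small p-value would
# signal that respondents and non-respondents may differ on this item.
table = pd.crosstab(combined["group"], combined[item])
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")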

Finally, to further examine bias, flood damage estimates generated from the survey data will be compared to estimates generated through other Corps studies having similar flood characteristics and floodplain inventories. If large differences are observed, the survey data may be calibrated to align more closely with those found in the other studies.

4.  Tests of Procedures

Before surveys are conducted, the questionnaires and the survey process are carefully reviewed and pretested for simplicity and relevance. Pretests are primarily done with groups of fewer than ten respondents. Training for interviewers is usually held prior to the implementation of the survey and typically includes role-playing in an actual field setting under supervised conditions. Once the interview process has begun, field supervisors periodically debrief interviewers to identify any problems encountered, including any unnecessary burdens being placed on respondents.


Most of the survey questions in this package have been used in previous survey efforts, and many have been modified based on reviews of previous survey experience. The bank of questions was also shared with the Corps of Engineers' Flood Risk Management Planning Center of Expertise and the Economics Community of Practice. Before new surveys are conducted, the questionnaires and the survey process are carefully reviewed and pretested for simplicity and relevance. For example, the recent generic survey now includes questions on social media as a disseminator of information.

5.  Statistical Consultation and Information Analysis

a. Provide names and telephone number of individual(s) consulted on statistical aspects of the design.

Survey design and statistical analysis experts consulted in developing Corps guidance manuals include:


  • Dr. Allan Mills of Virginia Commonwealth University (804) 828-0100,


  • Jason Weiss of AECOM Corporation (301) 502-8457,


  • Kurt Keilman of the South Pacific Division, Corps of Engineers (415) 503-6596,


  • Meredith Bridgers and Kevin Knight of the Institute for Water Resources, Corps of Engineers, (703) 428-8458 and (703) 428-7250, respectively,



  • Lance Awsumb of Northwestern Division, Corps of Engineers (312) 846-5588.

b. Provide name and organization of person(s) who will actually collect and analyze the collected information.

The commander of each Corps Division is ultimately responsible for approval of the sampling strategy, questionnaire, and analysis plan for surveys conducted in his or her division. Corps District staffs will consult with experts from local universities and/or contractors in developing specific survey and analytical plans.


References

Battaglia, M. P., D. C. Hoaglin, and M. R. Frankel. 2009. “Practical Considerations in Raking Survey Data.” Survey Practice 2(5).

Kolenikov, S. 2014. “Calibrating Survey Data Using Iterative Proportional Fitting (Raking).” The Stata Journal 14(1): 22–59.




