FRM Supporting Statement Part B FINAL COPY 3.25.2016

Flood Risk Management

OMB: 0710-0017


Supporting Statement B


Programmatic Review for USACE Sponsored Flood and Coastal Storm Damage Surveys


OMB Control Number 0710-XXXX



Collections of Information Employing Statistical Methods


The agency should be prepared to justify its decision not to use statistical methods in any case where such methods might reduce burden or improve accuracy of results. When the question “Does this ICR contain surveys, censuses, or employ statistical methods?” is checked "Yes," the following documentation should be included in Supporting Statement B to the extent that it applies to the methods proposed:


1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


The potential respondent universe will consist of business owners, public officials, and residents of communities experiencing flood or storm damages. All study proposals must include a description of a survey’s particular respondent universe.


Based on experience with the existing Programmatic Clearance (0710-0001), we estimate approximately 7,000 completed surveys annually, conducted primarily face-to-face but also including some mail-back and online respondents. Respondents will primarily be business owners, public officials, and residents who have experienced flooding or are potential flood victims.


The number of entities in the universe covered by the survey, and in any sample of that universe (including each stratum of samples which are stratified) will be provided in tabular form. Expected response rates for survey data collection will be indicated, based on past results of similar surveys and other factors such as the length and complexity of the survey questionnaire.



2. Describe the procedures for the collection of information including:

* Statistical methodology for stratification and sample selection,

* Estimation procedure,

* Degree of accuracy needed for the purpose described in the justification,

* Unusual problems requiring specialized sampling procedures, and

* Any use of periodic (less frequent than annual) data collection cycles to reduce burden.

All submissions will be carefully evaluated to ensure consistency with the intent, requirements, and boundaries of this programmatic clearance. Proposed collection instruments and procedures must comply with OMB's "Guidance on Agency Survey and Statistical Information Collections" (January 20, 2006). Descriptions of the sampling methods and statistical reporting must include a specific description of:


  • the sampling plan and sampling procedure (including stratification and selection methods for individual respondents);

  • how the instrument will be administered to respondents;

  • the planned analysis; and

  • desired confidence intervals and estimation procedures.


Districts submitting information collection requests under this programmatic clearance process are strongly encouraged to pretest any information collection instruments to be used. Pretests will normally emphasize intensive debriefing of fewer than 10 respondents to identify questionnaire problems. Further, we will strongly encourage use of the programmatic clearance to obtain approval for any pretesting that falls under the requirements of the Paperwork Reduction Act (i.e., when more than nine individuals are surveyed). Such approval will normally be required to pilot test survey implementation procedures. In these cases, requests for approval to pretest surveys will be subject to the same requirements (i.e., a supporting statement, a copy of the instrument, etc.) as a standard information collection.


The Corps of Engineers division offices will conduct an administrative review of each request and oversee technical reviews of each request to ensure statistical validity and soundness. All information collection instruments will be designed and deployed based upon acceptable statistical practices and sampling methodologies, where appropriate, and will be used to obtain consistent, valid data that are representative of the target populations, account for non-response bias, and achieve response rates and sample sizes at or above levels needed to obtain statistically useful results.
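As a rough illustration of how "sample sizes at or above levels needed" can be determined, the sketch below computes the sample size required to estimate a proportion within a desired margin of error, with a finite-population correction. The community size, margin, and confidence level are hypothetical, not taken from this document.

```python
import math

def required_sample_size(N, margin=0.05, z=1.96, p=0.5):
    """Sample size needed to estimate a population proportion within
    +/- margin at the confidence level implied by z (1.96 ~ 95%),
    with a finite-population correction for a universe of N units.
    p = 0.5 is the conservative (maximum-variance) assumption.
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / N))   # finite-population correction

# Hypothetical universe: a community with 2,000 affected households.
n = required_sample_size(2000)
```

Because expected response rates for flood damage surveys run well below 100%, the number of households contacted would need to exceed this target accordingly.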


All submissions under the program of expedited approval must fully describe the survey methodology. The description must be specific and address, as appropriate, each of the following:


  • the respondent universe;

  • the sampling plan and all sampling procedures, including how individual respondents will be selected (when appropriate, the sampling plan shall require either a simple random sample or a systematic sample with a random starting point; for systematic samples, the sampling interval shall be determined based on the desired sample size);

  • how the instrument will be administered;

  • the desired response rate and confidence; and

  • strategies for dealing with potential non-response bias.


A description of any pretesting and peer review of the methods and/or instrument is highly recommended. Further, all submissions under this clearance process will describe how data will be presented to managers and any others who will use the survey results, particularly in cases where response rates are lower than anticipated. In these cases, program managers must take steps to ensure that results are not generalized beyond the population of interest and that explanations accompany data presentations and reports so that users of the data understand any possible biases associated with the data.
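The systematic-sample option described above (a random starting point, with the interval determined by the desired sample size) can be sketched as follows; the sampling frame of parcels is hypothetical.

```python
import random

def systematic_sample(frame, n):
    """Draw a systematic sample of n units from a sampling frame.

    The sampling interval k is determined by the desired sample size,
    and the starting point is chosen at random within the first
    interval, as the sampling plan requires.
    """
    k = len(frame) // n          # sampling interval
    start = random.randrange(k)  # random starting point in [0, k)
    return frame[start::k]       # every k-th unit thereafter

# Hypothetical frame: 1,000 residential parcels, target sample of 100.
parcels = [f"parcel-{i:04d}" for i in range(1000)]
sample = systematic_sample(parcels, 100)
```

A systematic sample is often easier to execute in the field than a simple random sample, at the cost of assuming no periodic pattern in the frame's ordering.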


In its technical and administrative review, the Corps will work with researchers to ensure that information-collection procedures are appropriate for the intended uses of the data, including selection of the appropriate unit of analysis.


Stratification is often used to increase the efficiency and effectiveness of the sampling design and to reduce survey burden. For example, residential flood damage surveys are always stratified by residential structure type (e.g., number of floors, with or without basement). Most surveys are conducted in response to a one-time study need and therefore do not require annual reporting.


3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


For surveys designed to infer from a sample to a population, the Corps requires that proposals address issues of potential non-response. Surveys must incorporate best practices to maximize initial response rates (i.e., multiple follow-ups or call-backs, minimal hour burden). Further, specific strategies for detecting and analyzing non-response bias are to be included in the submission form accompanying survey instruments. These may involve the use of survey logs in which observable characteristics of all those initially contacted on-site are recorded and/or a short interview asking a small number of questions to survey respondents and non-respondents. Investigators conducting telephone surveys may use their most experienced interviewers to convert “soft refusals” to completed interviews in order to maximize response rates.
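The survey-log strategy described above can be sketched as a simple comparison of an observable characteristic between respondents and non-respondents; the log entries and the single-family indicator are hypothetical.

```python
from statistics import mean

def nonresponse_gap(log):
    """Compare an observable characteristic, recorded in the survey log
    for everyone initially contacted, between respondents and
    non-respondents. A large gap on an observable trait signals
    potential non-response bias warranting further analysis.
    """
    resp = [rec["value"] for rec in log if rec["responded"]]
    nonresp = [rec["value"] for rec in log if not rec["responded"]]
    return mean(resp) - mean(nonresp)

# Hypothetical log: value 1 = single-family structure, 0 = other.
log = [
    {"responded": True,  "value": 1},
    {"responded": True,  "value": 1},
    {"responded": True,  "value": 0},
    {"responded": False, "value": 0},
    {"responded": False, "value": 0},
    {"responded": False, "value": 1},
]
gap = nonresponse_gap(log)
```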


The Corps requires that the results of non-response bias analyses be included in technical reports, and that the likely effects of this bias (if any) on the interpretation of data be made clear to managers. In some cases, it may be feasible to balance or post-weight a sample to align important sample statistics with known population parameters (e.g., demographic or ZIP code characteristics). However, this does not guarantee the absence of non-response bias in attitude, knowledge, or belief variables.
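Post-weighting as described above can be sketched as follows: each stratum's respondents receive a weight that aligns the weighted sample with known population shares. The strata and figures are hypothetical.

```python
def post_stratification_weights(sample_counts, population_shares):
    """Compute per-respondent post-stratification weights so that the
    weighted sample matches known population shares (e.g., by structure
    type or ZIP code).

    sample_counts: completed interviews per stratum
    population_shares: known population proportion per stratum
    """
    n = sum(sample_counts.values())
    return {
        stratum: (population_shares[stratum] * n) / count
        for stratum, count in sample_counts.items()
    }

# Hypothetical: households with basements are over-represented
# among respondents relative to the known population mix.
counts = {"with_basement": 60, "no_basement": 40}      # 100 interviews
shares = {"with_basement": 0.45, "no_basement": 0.55}  # population mix
weights = post_stratification_weights(counts, shares)
```

As the surrounding text notes, weighting corrects imbalance only on the variables used to build the weights; it cannot rule out bias on unobserved attitude or belief variables.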


Response rates are maximized through careful attention to detail in the development not only of the survey questionnaire but of the entire survey implementation process. For example, for mail questionnaires, Dillman's "Total Design Method" is typically used, including: 1) "multi-wave" mailings of the original questionnaire, postcard reminders, and follow-up mailings to those who have not yet responded; 2) authoritative, informative, and persuasive cover letters; and 3) carefully worded and formatted questionnaires.


Based on our experience with the existing Programmatic Clearance, we anticipate response rates at or above levels needed to obtain statistically viable results. Response rates for flood damage surveys have typically ranged from 30% for business surveys to 50% for residential surveys, with much higher rates for public damage surveys. Response rates are boosted by repeat visits to contact respondents for face-to-face interviews on weekdays and weekends, during both daytime and evening hours.


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


Before surveys are conducted, the questionnaires and the survey process are carefully reviewed and pretested for simplicity and relevance. Pretests are primarily done with groups of fewer than ten respondents. Interviewer training is usually held before survey implementation and typically includes role-playing in an actual field setting under supervised conditions. Once interviewing begins, field supervisors periodically debrief interviewers to identify any problems encountered, including any unnecessary burdens placed on respondents.


Nearly all of the survey questions in this package have been used in previous survey efforts, and in many cases they have been modified based on review of that prior survey experience.


5. Provide the names and telephone numbers of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


The commander of each Corps Division is ultimately responsible for approving the sampling strategy, questionnaire, and analysis plan for surveys conducted in his or her division. Corps District staff will consult with experts from local universities and/or contractors in developing specific survey and analytical plans. Survey design and statistical analysis experts consulted in developing Corps guidance manuals include Dr. Allan Mills of Virginia Commonwealth University; Jason Weiss of URS Corporation; Kurt Keilman of the South Pacific Division, Corps of Engineers; and Lance Awsumb of the St. Paul District, Corps of Engineers.



File Type: application/vnd.openxmlformats-officedocument.wordprocessingml.document
File Title: Supporting Statement for Programmatic Clearance for NPS-sponsored Public Surveys
Author: mmcbride
File Created: 2021-01-24
