
March 11, 2021


Supporting Statement for

Paperwork Reduction Act Submissions



OMB Control Number: 1660 - 0105

Title: National Household Survey on Disaster Preparedness

Form Number(s): FEMA Form 008-0-FY-21-103 Telephone

FEMA Form 008-0-FY-21-104 Web


General Instructions


A Supporting Statement, including the text of the notice to the public required by 5 CFR 1320.5(a)(1)(iv) and its actual or estimated date of publication in the Federal Register, must accompany each request for approval of a collection of information. The Supporting Statement must be prepared in the format described below, and must contain the information specified in Section A below. If an item is not applicable, provide a brief explanation. When Item 17 of the OMB Form 83-I is checked “Yes”, Section B of the Supporting Statement must be completed. OMB reserves the right to require the submission of additional information with respect to any request for approval.


Specific Instructions


B. Collections of Information Employing Statistical Methods.



When Item 17 on the Form OMB 83-I is checked “Yes”, the following documentation should be included in the Supporting Statement to the extent it applies to the methods proposed:


1. Describe (including numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection has been conducted previously, include the actual response rate achieved during the last collection.


The Federal Emergency Management Agency (FEMA) Individual and Community Preparedness Division (ICPD) will collect preparedness information from the public via web-based and/or telephone surveys on an annual basis. This collection of information, which began in 2007, is necessary to increase the effectiveness of awareness and recruitment campaigns, messaging and public information, community outreach efforts, and strategic planning initiatives. Each year, the National Household Survey on Disaster Preparedness will measure the public’s knowledge, attitudes, and behaviors relative to preparing for all hazards and for specific hazards. The specific hazards selected may differ in each annual administration. They may include but are not limited to the following natural and human-made hazards: tornado, hurricane, flood, earthquake, wildfire, extreme heat, winter storms and extreme cold, volcano, tsunami, landslide, power outages, pandemic, and urban events. General knowledge, attitudes, and behaviors for all hazards will be assessed at the national level using a nationally representative sample (national sample), while knowledge of specific hazards will be assessed at a target area level (regions or hazard areas).

The annual survey will consist of up to 7,000 respondents. The mix of respondents across national, regional, and hazard-specific areas, and the combination of modalities implemented (landline, cell phone, and web-based), will vary across years. Different sampling methods will be employed for the national core sample and for the regional and hazard-specific samples. Additionally, samples will be drawn independently for the landline, cell phone, and web-based surveys. Decisions on modality implementation for subsequent years will be based on the comparability and variability of responses as well as response rates. For example, while the program office plans a web-based-only implementation in Years 2 and 3, it may choose to use landline and cell phone samples if the web-based cohort proves not comparable to the phone-based cohort in Year 1.

Table 1. Example of Survey Implementation

| Sample Type | Web Based Interviews (FEMA Form 008-0-FY-21-104) | Telephone Based Interviews (FEMA Form 008-0-FY-21-103) | Total |
|---|---|---|---|
| Year 1 (mixed-mode survey option) | | | |
| National sample | 2,000 | 2,000 | 4,000 |
| Hazard-specific sample (500 per hazard area) | 3,000 | N/A | 3,000 |
| TOTALS Year 1 | 5,000 | 2,000 | 7,000 |
| Year 2 | | | |
| National sample | 2,000 | N/A | 2,000 |
| Regional sample (500 per state) | 4,500 | N/A | 4,500 |
| Hazard-specific sample | 500 | N/A | 500 |
| TOTALS Year 2 | 7,000 | 0 | 7,000 |
| Year 3 | | | |
| National sample | 2,000 | N/A | 2,000 |
| Regional sample (500 per state) | 4,000 | N/A | 4,000 |
| Hazard-specific sample (500 per hazard area) | 1,000 | N/A | 1,000 |
| TOTALS Year 3 | 7,000 | 0 | 7,000 |



The potential respondent pool for the entire sampling frame includes the entire civilian, non-institutionalized U.S. adult population (aged 18 years and older) having internet access, owning a cell phone, or residing in telephone-equipped dwellings. This population does not include adults in penal or mental institutions or other institutionalized settings; adults living in a dwelling without a telephone, cell phone, or internet access; or adults who do not speak English or Spanish well enough to be interviewed.

In previous years, the survey was implemented by telephone only (both landline and cell phone). The 2020 telephone survey response rate was less than one percent. Starting in 2021, the program office will implement a mixed-mode approach using telephone-based (both landline and cell phone) and/or web-based surveys. With the adjusted fielding approach, the expected response rate for the collection as a whole is three to five percent.


2. Describe the procedures for the collection of information including:


-Statistical methodology for stratification and sample selection:


National Sample

To ensure sufficient national representation in the National Core Sample, a proportional geographic stratification approach will be implemented. In total, ten strata will be created based on the nine U.S. Census Bureau divisions and one territorial stratum.

  • U.S. Census Strata: Pacific, Mountain, West North Central, West South Central, East North Central, East South Central, New England, Middle Atlantic, and South Atlantic.

  • Territorial Stratum: U.S. territories not included in the U.S. Census strata.

The sample allocation across the ten strata will be proportional to the population size of each stratum. Using proportional sample allocation, a target will be set for the number of surveys to be completed in each stratum, as illustrated in the sketch below. In addition, each stratum should strive for proportional representation across the following demographic variables when possible: age, gender, language, race, ethnicity, disability, education, income, employment, household composition, and geographic settlement type.
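To make the allocation arithmetic concrete, the sketch below shows one way to compute per-stratum targets. The population figures and the largest-remainder rounding rule are illustrative assumptions, not figures from this collection:

```python
def allocate_proportional(populations, total_surveys):
    """Allocate survey targets across strata in proportion to population,
    using a largest-remainder rule so targets sum exactly to the total."""
    total_pop = sum(populations.values())
    raw = {s: total_surveys * p / total_pop for s, p in populations.items()}
    targets = {s: int(v) for s, v in raw.items()}  # floor of each raw share
    leftover = total_surveys - sum(targets.values())
    # hand remaining interviews to the strata with the largest fractional parts
    for s in sorted(raw, key=lambda k: raw[k] - targets[k], reverse=True)[:leftover]:
        targets[s] += 1
    return targets

# Illustrative adult population counts in millions (not actual Census figures)
strata_pop = {"Pacific": 53.6, "Mountain": 24.9, "West North Central": 21.4,
              "West South Central": 40.6, "East North Central": 46.9,
              "East South Central": 19.2, "New England": 14.8,
              "Middle Atlantic": 41.1, "South Atlantic": 65.8, "Territories": 3.6}
print(allocate_proportional(strata_pop, 2000))
```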

FEMA Region-Specific Sampling Approach

States and territories of the United States are divided into ten FEMA Regions. Each FEMA Region has between four and nine states or territories.1 To ensure sufficient representation across states and territories in Region-specific samples, a proportional geographic stratification approach may be implemented by state.

The sample allocation across the states may be proportional to the population size of the state or territory. Using proportional sample allocation, a target may be set for the number of surveys to be completed in each state or territory within the Region. The total number of surveys for each Region's sample may be informed by the number of states in the Region, with a guideline of 500 respondents per state. For example, a Region with five states could include 2,500 respondents, but the number of respondents in each state or territory may vary based on the population distribution among states and territories in that Region.

Hazard-Specific Samples


The hazard samples will be selected independently from the National sample and may be selected independently from the Regional sample. Strategic hazard selection will be based on the frequency of disaster declarations and/or emerging needs or priorities. Accordingly, hazard selections will vary from year to year. Sample populations for each hazard may be defined based on the National Risk Index2, recent events, or emerging needs and priorities. For each hazard, FIPS codes or ZIP codes will be defined from which to select eligible respondents. Each hazard sample will include approximately 500 completed survey responses.

Telephone Sampling. The sample of telephone numbers will be selected (without replacement) from all area code exchange combinations for the corresponding geographic area, following the list-assisted telephone sampling method. The cell phone sample will be drawn separately (without replacement) from dedicated exchanges for cell phones for the targeted areas. Phone numbers for businesses or other commercial establishments will be dropped from consideration for the survey. Due to continuous porting of numbers from landline to cell and cell to landline, some numbers from landline exchanges may turn out to be cell phones and conversely, some numbers sampled from the cell phone exchanges may be landline numbers. It is expected that such numbers will be relatively rare and the vast majority of landline and cell phone numbers will be from the corresponding frames. It is also possible that an individual respondent may have a telephone number in one region while he/she may actually be living in another region. The physical location of respondents and what type of phone they are using will be confirmed based on their response to survey questions.

A sufficient number of phone numbers will be acquired to account for the high proportion of inactive, ineligible, and non-responsive numbers and refusals that are anticipated. We assume an overall 40:1 completion ratio with respect to the number of phone numbers that will need to be dialed in order to obtain one completed survey. The sample will comprise roughly 70 percent cell phones and 30 percent landlines.
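As a back-of-envelope illustration of that arithmetic (the 40:1 ratio and the 70/30 split come from the paragraph above; the 2,000-complete target is taken from the Year 1 telephone column of Table 1):

```python
completes_needed = 2_000   # Year 1 telephone completes (Table 1)
dial_ratio = 40            # phone numbers dialed per completed survey (40:1)

numbers_needed = completes_needed * dial_ratio    # 80,000 records to acquire
cell_numbers = round(numbers_needed * 0.70)       # ~70 percent cell: 56,000
landline_numbers = numbers_needed - cell_numbers  # ~30 percent landline: 24,000
```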

For landline telephones, there will be an additional level of within-household sampling. A modified “next birthday” method will be used to select one eligible person from all eligible adults in each household to participate in the survey. If that person is not available, then the person answering the phone will be asked to complete the survey, if otherwise eligible to participate. This is much less intrusive than the purely random selection method or grid selection that requires enumeration of all household members to make a respondent selection. This method also reduces the selection bias that would otherwise favor the individuals most motivated to answer the phone. For respondents reached on cell phones, there will not be any additional stage of sampling. The person answering the call will be selected for the survey if he/she is found otherwise eligible. A quota sampling method is used until the target number of interviews is completed within each category (geographic region, landline/cell).
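A minimal sketch of the within-household step, assuming each eligible adult is recorded with a hypothetical birth month and day: the adult whose birthday falls soonest on or after the interview date is selected.

```python
from datetime import date

def next_birthday_pick(adults, today=None):
    """Select the eligible adult whose birthday falls soonest on or after
    today, wrapping past year end ('next birthday' method)."""
    today = today or date.today()
    def sort_key(adult):
        m, d = adult["birth_month"], adult["birth_day"]
        # birthdays earlier in the calendar than today wrap to next year
        return ((m, d) < (today.month, today.day), m, d)
    return min(adults, key=sort_key)

household = [{"name": "A", "birth_month": 3, "birth_day": 2},
             {"name": "B", "birth_month": 11, "birth_day": 17}]
print(next_birthday_pick(household, today=date(2021, 6, 1))["name"])  # "B"
```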

Web-based Sampling. Potential web-based survey participants will be randomly selected across online panels using a web-based application programming interface. Potential participants will not know ahead of time whether they will be selected to participate in a particular survey. Invitations are sent to members of the panels who have previously indicated they are residents of the United States. Residency information will be reconfirmed to ensure they are still residents of the United States and its territories, along with location information so the data can be properly weighted based upon the proportional population breakouts. Random sampling is based on select demographic variables (e.g., region, gender, age, race). To ensure the best conformity between the sample and the target group, quota sampling is used as needed. Once the required criteria for the survey have been established, a random sample is pulled to receive the email invitations to participate in the survey. Quotas on age, gender, and geographic region can be pre-set to limit the number of responses within each category.
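The pre-set quota mechanics described above might look like the following sketch; the quota cells, counts, and field names are hypothetical:

```python
from collections import Counter

# Hypothetical pre-set quota cells: (age band, region) -> maximum completes
quotas = {("18-34", "Pacific"): 120, ("35-54", "Pacific"): 150}
filled = Counter()

def screen(respondent):
    """Admit a panelist only while the matching quota cell has room."""
    cell = (respondent["age_band"], respondent["region"])
    if filled[cell] >= quotas.get(cell, 0):
        return False          # cell full (or not targeted): screen out
    filled[cell] += 1
    return True
```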

Panel recruitment utilizes probability-based recruitment and draws from a variety of sources. The online panels have been recruited through a diversified network rather than through a single source to avoid 'professional' panelists. To ensure that recruitment is as broad, diversified, and exhaustive as possible, a wide range of different methods and sources are used. Both broadly targeted and more narrowly targeted campaigns are used to ensure the necessary diversity and to ensure that specific hard-to-reach target groups are represented on the panel.


-Estimation procedure:


Survey Weights

Post-data-collection statistical methods will be employed to counter known biases inherent in the sampling methods, increasing the overall generalizability of results. Base weights will be constructed using the inverse probability of selection based on the sampling frame. For example, for the national sample, this means aggregating the population counts from the U.S. Census Bureau over states for each census division (including the constructed stratum for territories), calculating the relative population proportion of each census division, and then taking the inverse.

Table 2. Base Weight Construction Example for National Sample

| | Count | Proportion | Weight |
|---|---|---|---|
| Census Division 1 population (sum of state populations for Division 1) | X1 | p1 = X1 / Y | w1 = 1 / p1 |
| Census Division 2 population (sum of state populations for Division 2) | X2 | p2 = X2 / Y | w2 = 1 / p2 |
| … | … | … | … |
| Total U.S. + territories population | Y | 1 | Y |
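A minimal sketch of the Table 2 computation, with hypothetical division populations standing in for actual U.S. Census Bureau counts:

```python
# Hypothetical division populations in millions of adults; actual values
# would come from U.S. Census Bureau estimates.
division_pop = {"New England": 14.8, "Middle Atlantic": 41.1, "Territories": 3.6}

Y = sum(division_pop.values())                              # total population
proportions = {d: x / Y for d, x in division_pop.items()}   # p_i = X_i / Y
base_weights = {d: 1 / p for d, p in proportions.items()}   # w_i = 1 / p_i
```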



We anticipate distinct, non-overlapping sampling frames for the national, regional, and hazard samples; however, even if there is overlap in the areas selected, distinct weights will be constructed similarly for the national, regional, and hazard samples. Just as U.S. Census Bureau division populations are used for the national base weights, we would use state populations for regional base weights and county populations for hazard base weights. No separate weighting methods based on collection modality (telephone vs. web) are necessary.

Post Stratification Weighting

For each set of base weights, we will run post-stratification weighting to adjust for the following demographic variables when possible: age, gender, language, race, ethnicity, disability, education, income, employment, household composition, and geographic settlement type. For the national sample, we use the U.S. Census Bureau division proportions of these demographics as the targets for the raking procedure that produces the post-stratification weights. Similarly, we would use regional, state, territory, or county demographics to rake the regional or hazard weights.
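To illustrate the raking step, the sketch below implements a basic iterative proportional fitting loop over marginal targets. The variables and target shares are hypothetical, the code assumes every target level appears in the sample, and production weighting would normally use a vetted survey package rather than hand-rolled code:

```python
import numpy as np

def rake(weights, sample, margins, max_iter=100, tol=1e-8):
    """Adjust weights so weighted sample margins match population margins,
    one variable at a time (raking / iterative proportional fitting)."""
    w = np.asarray(weights, dtype=float).copy()
    for _ in range(max_iter):
        max_shift = 0.0
        for var, targets in margins.items():
            total = w.sum()  # fixed within this variable's pass
            for level, share in targets.items():
                mask = sample[var] == level
                current = w[mask].sum() / total  # assumes level is present
                factor = share / current
                w[mask] *= factor
                max_shift = max(max_shift, abs(factor - 1.0))
        if max_shift < tol:
            break
    return w

# Hypothetical two-variable example with four respondents
sample = {"gender": np.array(["F", "F", "M", "M"]),
          "age":    np.array(["18-44", "45+", "18-44", "45+"])}
margins = {"gender": {"F": 0.52, "M": 0.48},
           "age": {"18-44": 0.45, "45+": 0.55}}
print(rake(np.ones(4), sample, margins))
```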

-Degree of accuracy needed for the purpose described in the justification:


We plan to complete about 5,000 telephone interviews per administration, including about 2,000 interviews using a national-level sample and around 500 interviews in each of six selected hazard profiles. The survey estimates of unknown population parameters (for example, population proportions) based on a sample size of 5,000 will have a precision (margin of error) of about ±1.4 percentage points at the 95 percent confidence level. This is under the assumption of no design effect and under the most conservative assumption that the unknown population proportion is around 50 percent. The margin of error (MOE) for estimating an unknown population proportion P at the 95 percent confidence level can be derived from the following formula:


MOE = 1.96 × √(P(1 − P) / n), where “n” is the sample size (i.e., the number of completed surveys).
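A quick check of the ±1.4 point figure under the stated assumptions (n = 5,000, P = 0.50):

```python
import math

n, P = 5_000, 0.50
moe = 1.96 * math.sqrt(P * (1 - P) / n)
print(f"{moe:.4f}")  # 0.0139, i.e., about ±1.4 percentage points
```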


The sampling error of estimates for this survey will be computed using statistical software (e.g., SPSS, SAS) that calculates standard errors of estimates by accounting for any complexity in the sample design and the resulting set of unequal sample weights.

-Unusual problems requiring specialized sampling procedures:


Unusual problems requiring specialized sampling procedures are not anticipated at this time. If response rates fall below the expected levels, additional sample will be released to generate the targeted number of completed surveys. However, all necessary steps to maximize response rates will be taken throughout the data collection period, so such situations are not anticipated.

-Any use of periodic (less frequent than annual) data collection cycles to reduce burden:



During each annual administration of the survey, independent samples will be drawn and so the probability of selecting the same respondent in multiple administrations will be extremely low.


3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.


Maximize response for phone surveys. The survey team will develop a comprehensive plan to maximize response rates across all telephone-based surveys, to include the following:

  • A call design that will ensure call attempts are made at different times of the day and different days of the week to maximize contact rates

  • Conducting an extensive interviewer briefing prior to the field period that educates interviewers about the content of the survey as well as how to handle reluctance and refusals

  • Strong supervision plan that will ensure that high-quality data are collected throughout the field period

  • Using troubleshooting teams to address specific data collection problems that may occur during the field period

  • Customizing refusal aversion techniques


A maximum of six calls will be made to each phone number to reach the specific person we are attempting to contact. In the rare circumstance that an individual does not complete the interview, follow-up calls will be made to complete the interview with that selected person.

Maximize response for web-based surveys. The web-based survey will utilize standard techniques to maximize response rates. Potential respondents will receive a pre-notification letter detailing the purpose of the survey, followed by an email survey notification containing an embedded hyperlink to the survey. The survey questionnaire will provide clear instructions for completing the survey and reiterate the purpose of the survey and how results will be used. The survey will be open for a period of six to eight weeks to allow sufficient time for response. Reminder email notifications will be sent to non-respondents each week, providing the survey hyperlink and encouraging response. The survey will allow respondents to resume completing the survey where they left off. A web-based graphical user interface will be utilized to facilitate navigation and completion of the survey across multiple devices. Web-based participants will receive points for participation from the online panels from which they are recruited.

Issues of Non-Response. Survey-based estimates for this study will be weighted to minimize any potential bias, including any bias that may be associated with unit-level nonresponse. All estimates will be weighted to reduce bias, and it will be possible to calculate the sampling error associated with any subgroup estimate in order to ensure that the accuracy and reliability are adequate for the intended uses of any such estimate. Based on prior experience conducting similar surveys, and given that the modes of data collection for the proposed survey are telephone and web, the extent of missing data at the item level is expected to be minimal.


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


The telephone-based questionnaires and processes have been used in previous data collection efforts (2013–2020), with minor adjustments made to the survey questions. Pilot tests for both phone and web-based surveys will be conducted with no more than nine respondents for each survey instrument. The pilot tests will check for correct skip patterns and procedures, as well as the comprehensibility of questions. Data and feedback from the pilot surveys will be analyzed and used to make necessary revisions to the survey tools to enhance the user-friendliness, reliability, and utility of the questionnaires.

5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Deloitte Consulting LLP has been contracted by FEMA's Individual and Community Preparedness Division to conduct this survey data collection and analyze the results. The individuals consulted on the statistical aspects of the design are:


Joseph M. Faulk, MPA

Preparedness Data Lead

Federal Emergency Management Agency

Office: (202) 212-7723

[email protected]


Steve Precker

Teracore, Inc.

700 12th St NW

Washington, DC 20005

Mobile: (571)338-9101

[email protected]


Leanne Goldstein, DrPH

Survey Statistician

Deloitte Consulting

1919 N Lynn St.

Arlington, VA 22209

(202) 714-4609

[email protected]


Marc Penz

System Administrator

Zogby Analytics

901 Broad Street Suite 307

Utica, NY 13501

[email protected]



1 https://www.fema.gov/about/organization/regions

2 https://hazards.geoplatform.gov/portal/apps/MapSeries/index.html?appid=ddf915a24fb24dc8863eed96bc3345f8


