Supplemental Questions for DOC/NOAA Customer Survey Clearance
(OMB Control Number 0648-0342)
An evaluation of the Impact Based Warning tool in the
National Weather Service Central Region
1. Explain who will be conducting this survey. What program office will be conducting the survey? What services does this program provide? Who are the customers? How are these services provided to the customer?
The proposed information collection will be conducted by a team of social scientists from several Sea Grant College Programs (Sea Grant) and the NOAA Coastal Services Center to evaluate the effectiveness of the Impact Based Warning (IBW) tool in the National Weather Service (NWS) Central Region. Specifically, the information collection will be guided by Caitie McCoy (Illinois/Indiana Sea Grant), Hilarie Sorensen (Minnesota Sea Grant), Katherine Bunting-Howarth (New York Sea Grant), Jane Harrison (Wisconsin Sea Grant), and Chris Ellis (NOAA Coastal Services Center (Center)).
Environmental stewardship, long-term economic development, and responsible use of America’s coastal, ocean, and Great Lakes resources are at the heart of Sea Grant’s mission. Sea Grant is a nationwide network, administered through NOAA, of 32 university-based programs that work with coastal communities. The National Sea Grant College Program engages this network of the nation’s top universities in conducting scientific research, education, training, and extension projects designed to foster science-based decisions about the use and conservation of our aquatic resources. One of Sea Grant’s four primary focus areas is hazard-resilient coastal communities, for which a primary audience is state, regional, and local emergency managers. This focus area ties directly to the NWS Central Region geography, which includes the Great Lakes states of Minnesota, Wisconsin, Illinois, Indiana, Michigan, and northwest portions of Ohio.
The Center serves the needs of coastal and natural resource management programs and professionals (e.g., state natural resource management agencies and staff, conservation organization staff, and colleagues throughout NOAA line offices) through the development and delivery of data and information products, decision-support tools, professional development training, evaluation, and technical assistance on a variety of topics. Data and information products and decision-support tools are delivered per customer requests via online systems (e.g., clearinghouse, direct download). Professional development training is offered in three distinct areas: geospatial technology, coastal issues, and process skills. Delivery methods include face-to-face instructor-led training held at the Center and at local host sites, self-guided web-based training, and instructor-led web-based training. Evaluation and technical assistance services are conducted remotely (electronically) or on-site, depending on the needs of the specific customer and the specific topic.
2. Explain how this survey was developed. With whom did you consult during the development of this survey on content? Statistics? What suggestions did you get about improving the survey?
The proposed research approach was piloted and evaluated in five NWS Weather Forecast Offices (WFOs) in 2012, and the proposed work is based upon the methodology used in the pilot study by Gallupi, Losego, and Montoz (2013). The survey instrument and focus group questions for the proposed study were developed by reviewing the list of cleared questions for the Generic Clearance (OMB Control No. 0648-0342) and through consultation with Gallupi et al., who identified limitations of the pilot survey in question clarity, question order, variable measurement, and respondents’ ability to answer items meaningfully. A deliberate effort has been made to keep the proposed list of questions as short as possible, to ask only about tangible actions, and to ensure relevance to the primary evaluation audience, the National Weather Service.
Further, our research team possesses extensive expertise in survey design and administration, focus group facilitation, and product evaluation, and has collaborated closely in producing the proposed survey and focus group question list.
3. Explain how the survey will be conducted. How will the customers be sampled (if fewer than all customers will be surveyed)? What percentage of customers asked to take the survey will respond? What actions are planned to increase the response rate? (Web-based surveys are not an acceptable method of sampling a broad population. Web-based surveys must be limited to services provided by Web.)
The survey will be created using the SurveyMonkey® software program (for which the Coastal Services Center has received NOS Webmaster approval) and administered via an emailed link that will direct sampled participants to the SurveyMonkey® website. Because this work evaluates a tool used for extreme weather events in the NWS Central Region, sampling will take place only in the states of that region. The research team will obtain staff listings from the NWS WFOs responsible for releasing the Impact Based Warning information, as well as staff listings for the state, regional, and local emergency management offices and local media outlets within the broadcast area of the event; the NWS Central Region headquarters will provide the staff listing information. A random sample will be drawn from these staff listings. The number of responses and their content will be recorded, so an actual response rate can be calculated for reporting purposes. The questionnaire has been kept as brief as possible to encourage a higher response rate. The total estimated sample size is 300. Based on previous interactions with comparable audiences, the response rate is expected to be relatively high (approximately 70 percent).
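For illustration only, the random selection from the combined staff listings and the response rate calculation could be carried out roughly as follows; the file name, field names, and counts below are hypothetical placeholders, not part of the approved procedure.

    import csv
    import random

    # Hypothetical combined staff listing provided by NWS Central Region headquarters,
    # one row per person (e.g., name, email, audience category).
    with open("staff_listings.csv", newline="") as f:
        roster = list(csv.DictReader(f))

    SAMPLE_SIZE = 300              # total estimated sample size from the plan
    random.seed(2013)              # fixed seed so the draw can be reproduced
    sample = random.sample(roster, min(SAMPLE_SIZE, len(roster)))

    # After the collection closes, the actual response rate is simply
    # completed responses divided by invitations sent.
    completed_responses = 210      # placeholder count
    print(f"Response rate: {completed_responses / len(sample):.0%}")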
A purposive sample will be used to select focus group participants from each of the three target audience sub-categories (NWS weather forecast office staff, regional and local emergency managers, and broadcast media personnel). Because the target population is highly diverse in professional roles and responsibilities, it is important to select the individuals who are most involved in messaging activities around Impact Based Warnings to the public. Only individuals who have experienced an extreme weather event (e.g., a tornado or severe thunderstorm warning) will be targeted for focus group participation. The purpose of the focus groups is to facilitate discussion around the actual IBW message content in order to ascertain which components were understandable, actionable, meaningful, and applicable to the event type. These data, collected after the survey, will provide rich context for the quantitative survey responses, which will describe the general utility of the IBW.
4. Describe how the results of this survey will be analyzed and used. If the customer population is sampled, what statistical techniques will be used to generalize the results to the entire customer population? Is this survey intended to measure a GPRA performance measure? (If so, please include an excerpt from the appropriate document.)
Analysis of survey data will use basic descriptive statistics (e.g., mean and median scores) and correlation with relevant variables. This information collection seeks to assess the message utility of the NWS Impact Based Warning tool and to determine which factors affect message utility. Because the respondent universe is quite small, the collection will include a census of individuals within the target population for each extreme weather event location.
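As a minimal analysis sketch, assuming the completed responses are exported to a tabular file with numeric rating items (the file and column names below are hypothetical), the descriptive statistics and correlations could be computed as follows:

    import pandas as pd

    # Hypothetical export of completed survey responses.
    responses = pd.read_csv("ibw_survey_responses.csv")

    # Basic descriptive statistics (mean and median scores) for the rating items.
    rating_items = ["message_clarity", "message_usefulness", "likelihood_of_action"]
    print(responses[rating_items].agg(["mean", "median"]))

    # Correlation of message utility with other relevant variables,
    # e.g., years of experience in the respondent's current role.
    print(responses[["message_usefulness", "years_experience"]].corr())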
As described above, a purposive sample will be used to select focus group participants from each of the three target audience sub-categories (NWS weather forecast office staff, regional and local emergency managers, and broadcast media personnel), focusing on the individuals with the greatest involvement in messaging activities around Impact Based Warnings to the public. In the event that no extreme weather events occur this season, we propose to conduct a comparable information collection based on a hypothetical, scenario-based event, although extreme weather events are almost certain to occur and the described work is expected to proceed as planned.
Data from this information collection will be aggregated with other data for GPRA reporting:
GPRA Measure: Percentage of US coastal states and territories demonstrating 20% or more annual improvement in resilience capacity to weather and climate hazards (%/year).
PART B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g. establishments, State and local governmental units, households, or persons) in the universe and the corresponding sample are to be provided in tabular form. The tabulation must also include expected response rates for the collection as a whole. If the collection has been conducted before, provide the actual response rate achieved.
The population for the resource being evaluated consists of federal employees (National Weather Service, Weather Forecast Office staff), regional and local emergency managers (designated below as local government), and broadcast media providers. Following is the breakdown of the population of interest.
Population (based on a 70% response rate) | % of participants (total 100%)
Federal Government | 30% (n = 82 survey respondents)
Local Government | 50% (n = 112 survey respondents, 24 focus group participants)
Broadcast Media | 20% (n = 31 survey respondents, 24 focus group participants)
We anticipate a 70% response rate based on similar past survey work. For the focus groups, we would generally hope for greater participation, but because focus group activity will take place after an extreme weather event, we anticipate that a number of invitees will have logistical conflicts that prevent them from attending and participating.
The survey will be distributed via email. The focus group questionnaire will be administered in person during the focus group sessions. Both will ask about the comprehension and effectiveness of information shared through Impact Based Warning communications. The estimated time necessary for each respondent to complete the questionnaire is 20 minutes, based on trials with a small pilot sample (fewer than ten people). The total estimated public burden associated with this information collection is 147 hours (survey responses: 225 participants x 20 minutes = 75 hours; focus group responses: 48 participants x 90 minutes = 72 hours). The SurveyMonkey® software program will keep track of the total number of completed survey responses.
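The burden figure can be checked with the arithmetic below (a simple verification, not part of the collection itself):

    # Estimated public burden in hours.
    survey_hours = 225 * 20 / 60        # 225 survey respondents x 20 minutes = 75 hours
    focus_group_hours = 48 * 90 / 60    # 48 focus group participants x 90 minutes = 72 hours
    print(survey_hours + focus_group_hours)  # 147.0 hours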
2. Describe the procedures for the collection, including: the statistical methodology for stratification and sample selection; the estimation procedure; the degree of accuracy needed for the purpose described in the justification; any unusual problems requiring specialized sampling procedures; and any use of periodic (less frequent than annual) data collection cycles to reduce burden.
The survey will be created using SurveyMonkey® and administered via an emailed link, which will direct respondents to the SurveyMonkey® website. Completed surveys received via SurveyMonkey® will be downloaded to a password-protected workspace at the Coastal Services Center, accessible only to staff assigned to this project. Respondents will be instructed not to provide identifying information on the survey (names, social security numbers, dates of birth, etc.), and any identifying information placed on surveys will be removed.
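As an illustrative sketch of the de-identification step, assuming the SurveyMonkey® export is a tabular file and that any identifying fields appear as columns such as those listed below (hypothetical labels):

    import pandas as pd

    # Hypothetical downloaded export of survey responses.
    raw = pd.read_csv("ibw_survey_export.csv")

    # Remove any identifying columns before analysis; only columns that are
    # actually present in the export are dropped.
    identifying = ["name", "email", "phone", "date_of_birth"]
    cleaned = raw.drop(columns=[c for c in identifying if c in raw.columns])

    cleaned.to_csv("ibw_survey_deidentified.csv", index=False)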
A purposive sample will be used to select focus group participants from each of the three target audience sub-categories (NWS weather forecast office staff, regional and local emergency managers, and broadcast media personnel). Because the target population is highly diverse in professional roles and responsibilities, it is important to select the individuals most involved in messaging activities around Impact Based Warnings to the public. NWS forecast office staff will assist in the participant sampling process, drawing on their knowledge of local emergency management and broadcast media staff in their areas.
3. Describe the methods used to maximize response rates and to deal with nonresponse. The accuracy and reliability of the information collected must be shown to be adequate for the intended uses. For collections based on sampling, a special justification must be provided if they will not yield "reliable" data that can be generalized to the universe studied.
The intent of this information collection is to assess user feedback on the utility of the NWS Impact Based Warning communication. To improve response rates, the survey has been made as concise as possible. Testing for nonresponse bias will be a challenge because no identifying information will be collected that would allow for follow-up activities. However, the demographic information collected on respondents’ organizations will give some indication of which organizations were less likely to respond, based on the known percentages of each group. The intended approach will yield an informed, representative sample of the respondent universe, and the information gained will be valuable in improving severe weather communication. This survey allows an equal and independent opportunity for members of the target population to provide feedback on the Impact Based Warning tool.
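One rough way to use those organizational percentages for a nonresponse check (a sketch assuming the planned 30/50/20 percent shares from the table above; the response counts are placeholders) is to compare the observed distribution of completed responses with the expected one:

    from scipy.stats import chisquare

    # Placeholder counts of completed responses by organization type
    # (federal government, local government, broadcast media).
    observed = [82, 112, 31]
    total = sum(observed)

    # Expected counts if responses mirrored the planned population shares.
    expected = [0.30 * total, 0.50 * total, 0.20 * total]

    # A large, significant discrepancy flags audiences less likely to respond.
    statistic, p_value = chisquare(f_obs=observed, f_exp=expected)
    print(statistic, p_value)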
4. Describe any tests of procedures or methods to be undertaken. Tests are encouraged as effective means to refine collections, but if ten or more test respondents are involved OMB must give prior approval.
Draft versions of this survey were circulated for review and comment to nine representative members of the target population. Reviewers were asked to offer feedback on the length, appropriateness, and clarity of the questions, the content, and other aspects that could improve the questionnaire. Comments from reviewers were helpful and resulted in design and content changes that clarified questions and simplified instructions.
5. Provide the name and telephone number of individuals consulted on the statistical aspects of the design, and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.
The implementation of the information collection and data analysis will be coordinated by Dr. Chris Ellis at the NOAA Coastal Services Center, available by telephone at (843) 740-1195 or by email at [email protected].