Supplemental and Part B Questions


NOAA Customer Surveys


OMB: 0648-0342

Supplemental Questions for DOC/NOAA Customer Survey Clearance (OMB Control Number 0648-0342)


1. Explain who will be conducting this survey. What program office will be conducting the survey? What services does this program provide? Who are the customers? How are these services provided to the customer?


The National Ocean Service (NOS) Center for Operational Oceanographic Products and Services (CO-OPS) will be conducting the study. CO-OPS provides a multitude of services internally and externally, including a network of water level gauges, real-time oceanographic data for navigation, and observations of tidal currents. It serves a broad range of customers, including coastal program managers, emergency managers, NWS meteorologists, and the general public. These services are provided primarily through the CO-OPS website. Specific to this survey, CO-OPS will seek feedback from the Sentinel Sites program, which currently has five locations across the U.S. and includes coastal and emergency managers, the insurance industry, researchers, and many others (http://oceanservice.noaa.gov/sentinelsites/). We created a prototype 'Dashboard' product at their request. The Dashboard displays tidal, flood, and alert information at each of our National Water Level Observation Network (NWLON) stations. The risk communication specialist within CO-OPS will conduct the survey; she gained extensive knowledge of and experience with surveys through her doctoral program.


2. Explain how this survey was developed. With whom did you consult during the development of this survey on content? statistics? What suggestions did you get about improving the survey?


This survey was developed by listening to preliminary, informal feedback from Sentinel Site members throughout the process, working with the project lead, and applying best practices for survey design. We sought to determine overall satisfaction with the product as well as more detailed opinions about, and understanding of, specific aspects of the Dashboard. The risk communication specialist took on the bulk of development because of her expertise in survey design, but she consulted with the prototype project lead (Christopher Paternostro), the project lead for phase 2 (Paul Fannelli), and the CO-OPS Resilience program manager (Audra Luscher). They provided scientific guidance, wording changes, and general advice.


3. Explain how the survey will be conducted. How will the customers be sampled (if fewer than all customers will be surveyed)? What percentage of customers asked to take the survey will respond? What actions are planned to increase the response rate? (Web-based surveys are not an acceptable method of sampling a broad population. Web-based surveys must be limited to services provided by Web.)


The survey will be conducted via web-based software, if available, or through email. We will survey the 34 Sentinel Site members who participate in regular calls and act as representatives for their areas and/or organizations. We expect that nearly all of these members will participate; if the response rate is low, we understand that the results may not be generalizable to the entire population. We also understand that other, less closely involved Sentinel Site members are not represented and that results cannot be generalized to include them. To increase the response rate, we will discuss the survey with participants on an upcoming call and send follow-up emails after the survey has been distributed.


4. Describe how the results of this survey will be analyzed and used. If the customer population is sampled, what statistical techniques will be used to generalize the results to the entire customer population? Is this survey intended to measure a GPRA performance measure? (If so, please include an excerpt from the appropriate document.)


The results of the survey will be analyzed informally: qualitatively, to identify themes and categories in the open-ended questions, and quantitatively, to run basic statistics (means, frequencies) on the ratings. Results will be used to guide further development of the prototype as CO-OPS continues with a second phase of the project. The logistics and scope of this phase are still being explored and developed, but the ultimate goal will likely be expansion to other locations and a more tailored, effective design. Because this project is at such an early stage, feedback from our users is crucial. Since we do not need to understand the entire customer population (all those involved in the Sentinel Sites program), we will not attempt to generalize our results using statistical techniques. We are most interested in feedback from members closely involved in the development of the Dashboard. This survey is not intended to measure a GPRA performance measure.
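For illustration only, a minimal Python sketch of the kind of basic tabulation described above (the ratings shown are hypothetical placeholders, not actual survey responses):

    from collections import Counter
    from statistics import mean

    # Hypothetical 1-5 ratings for a single Dashboard satisfaction question;
    # real values would come from the 34 Sentinel Site respondents.
    ratings = [5, 4, 4, 3, 5, 4, 2, 5]

    # Basic descriptive statistics: mean rating and frequency of each rating value.
    print(f"Mean rating: {mean(ratings):.2f}")
    for value, count in sorted(Counter(ratings).items()):
        print(f"Rating {value}: {count} response(s)")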



B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


  1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g. establishments, State and local governmental units, households, or persons) in the universe and the corresponding sample are to be provided in tabular form. The tabulation must also include expected response rates for the collection as a whole. If the collection has been conducted before, provide the actual response rate achieved.


We are not attempting to generalize to a larger population; we are only interested in the 34 Sentinel Site members who attend regular calls and participated in Dashboard development. We expect nearly all of these members to respond.


Organization/Affiliation                                   Number of Participants
NOAA                                                       19
United States Geological Survey (USGS)                     2
Academic                                                   3
National Estuarine Research Reserve System (NERRS)         4
NGO (The Nature Conservancy and Dauphin Island Sea Lab)    2
State (NC and CA)                                          2
Fish and Wildlife Service (FWS)                            2
Total                                                      34



2. Describe the procedures for the collection, including: the statistical methodology for stratification and sample selection; the estimation procedure; the degree of accuracy needed for the purpose described in the justification; any unusual problems requiring specialized sampling procedures; and any use of periodic (less frequent than annual) data collection cycles to reduce burden.


This will be a census of the 34 interested Sentinel Site members who attend regular calls and participated in Dashboard development. As the project expands, we may consider more complex sampling strategies to allow for generalizability.


3. Describe the methods used to maximize response rates and to deal with nonresponse. The accuracy and reliability of the information collected must be shown to be adequate for the intended uses. For collections based on sampling, a special justification must be provided if they will not yield "reliable" data that can be generalized to the universe studied.


To maximize the response rate, we will discuss the survey with participants on an upcoming call, explaining our intentions and the importance of participation. We will also send regular follow-up emails after the survey has been distributed.


4. Describe any tests of procedures or methods to be undertaken. Tests are encouraged as effective means to refine collections, but if ten or more test respondents are involved OMB must give prior approval.


We will not be doing any tests on the current survey, other than the processes we went through to develop the instrument.


5. Provide the name and telephone number of individuals consulted on the statistical aspects of the design, and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


As the risk communication specialist for CO-OPS, with a strong background in statistics and mathematics (two degrees in meteorology and one degree in a social science), I (Danielle Nagele, agency contact) will lead collection and analysis of the survey. Because we will only be running very basic statistics on the survey results, we did not need to consult a statistical expert. I plan to determine the mean and frequencies for the rated questions and analyze the qualitative questions for themes. We are not interested in replicability, generalizability, or publication of these data; they are only meant to provide a basic understanding of feedback from a specific user group.

