Supplemental and Part B

CPC Supp and Part B questiions_Updated_V2.docx

NOAA Customer Surveys

OMB: 0648-0342

Supplemental Questions for DOC/NOAA Customer Survey Clearance (OMB Control Number 0648-0342)


  1. Explain who will be conducting this survey. What program office will be
    conducting the survey? What services does this program provide? Who are the customers? How are these services provided to the customer?


The survey will be conducted by the University of Maryland’s Environmental Decision Support Science Lab at the Earth System Science Interdisciplinary Center for the National Oceanic and Atmospheric Administration’s (NOAA) Climate Prediction Center (CPC). CPC provides medium- and long-term temperature and precipitation outlooks to visitors of the CPC webpage. The survey will be conducted under the Cooperative Institute for Climate and Satellites (CICS) grant “Identifying Users, Diagnosing Understandability Challenges, and Developing Prototype Solutions for NOAA Climate Prediction Center’s Temperature and Precipitation Outlooks” (NA14NES4320003). This study examines how to better communicate the risk and uncertainty associated with the forecasts of the NOAA National Weather Service CPC, with the goal of informing potential adjustments or revisions to CPC’s temperature and precipitation outlook visuals.


  2. Explain how this survey was developed. With whom did you consult during the development of this survey on content? statistics? What suggestions did you get about improving the survey?


The survey was developed by the Environmental Decision Support Science Lab, which has extensive experience in developing, conducting, and analyzing research surveys. The survey has undergone internal pre-tests conducted by the Environmental Decision Support Science Lab. These pre-tests identified errors in the wording or implementation of the survey structure and, in a few cases, prompted modifications to more clearly articulate survey questions or response options.


  3. Explain how the survey will be conducted. How will the customers be sampled (if fewer than all customers will be surveyed)? What percentage of customers asked to take the survey will respond? What actions are planned to increase the response rate?


All visitors to the CPC website will be invited to participate in the survey. Respondents opt in if they wish to provide feedback, so a traditional response rate percentage and methods to increase the response rate are not relevant given the sampling approach. When the survey link is clicked, respondents are randomly assigned to either the baseline or a modified image, drawn from one of the two sets of five outlook time ranges (temperature and precipitation).
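For illustration only, the random assignment described above could be sketched as follows. The condition labels and the assign_condition helper are hypothetical placeholders; the actual randomization is performed by the survey software.

```python
import random

# Illustrative condition space only; the real labels live in the survey software.
VARIABLES = ["temperature", "precipitation"]                            # the two outlook sets
TIME_RANGES = ["range_1", "range_2", "range_3", "range_4", "range_5"]   # placeholders for the five outlook time ranges
VERSIONS = ["baseline", "modified"]                                     # image version shown to the respondent

def assign_condition(rng=random):
    """Randomly assign a respondent to one outlook set, one time range, and one image version."""
    return {
        "set": rng.choice(VARIABLES),
        "time_range": rng.choice(TIME_RANGES),
        "version": rng.choice(VERSIONS),
    }

if __name__ == "__main__":
    print(assign_condition())
```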


  4. Describe how the results of this survey will be analyzed and used. If the customer population is sampled, what statistical techniques will be used to generalize the results to the entire customer population? Is this survey intended to measure a GPRA performance measure? (If so, please include an excerpt from the appropriate document.)


The survey is intended to discover differences, resulting from any mismatch between the existing visualizations and user needs, between responses to the baseline outlook visuals and responses to the modified images. The diagnostic methods used to determine the needed modifications incorporated the design-problem taxonomy developed by Dasgupta et al. (2015), which divides design problems into encoding and decoding steps. Encoding problems mostly depend on design choices in mapping data to visual attributes (e.g., color, shape). They include, among other things, issues associated with the choice of color maps or chart type. Decoding problems move beyond design choices and mainly impact the effectiveness or efficiency of information extraction. These can be issues associated with screen resolution or a lack of annotations to place appropriate emphasis. The survey includes a set of multiple-choice questions that assess whether respondents correctly interpret various aspects of the outlooks, both baseline and modified. This survey is not intended to inform a GPRA performance measure.



B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


  1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g. establishments, State and local governmental units, households, or persons) in the universe and the corresponding sample are to be provided in tabular form. The tabulation must also include expected response rates for the collection as a whole. If the collection has been conducted before, provide the actual response rate achieved.


The respondent universe includes all users of CPC products on the website. The survey link will be removed from CPC’s website either when statistically significant trends emerge regarding a difference between correct responses on the baseline images and correct responses on the modified images or on the intended completion date of March 31, 2018.


  2. Describe the procedures for the collection, including: the statistical methodology for stratification and sample selection; the estimation procedure; the degree of accuracy needed for the purpose described in the justification; any unusual problems requiring specialized sampling procedures; and any use of periodic (less frequent than annual) data collection cycles to reduce burden.


No stratification methods are used, as the link can be used by any user of the CPC website. When a respondent clicks on the link, the survey software randomly assigns the respondent to the baseline or a modified image. The statistic calculated is the percentage of respondents who correctly answer questions about the image they are shown. This is a one-time data collection, with the potential for a follow-up survey that will help determine the effectiveness of the modifications identified by the initial survey results.
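As a minimal sketch of how the calculated statistic could be compared between the two groups, the example below applies a two-proportion z-test (via statsmodels) to aggregated counts of correct answers per image version. The counts shown are illustrative placeholders, not survey results, and the particular test is an assumption rather than a confirmed part of the study plan.

```python
from statsmodels.stats.proportion import proportions_ztest

# Placeholder counts for one outlook question; real values would come from the survey data.
correct = [130, 158]   # respondents answering correctly: [baseline image, modified image]
shown = [200, 205]     # respondents shown each image version

# Two-sided two-proportion z-test on the share of correct interpretations.
z_stat, p_value = proportions_ztest(count=correct, nobs=shown)

print(f"baseline: {correct[0] / shown[0]:.1%} correct, "
      f"modified: {correct[1] / shown[1]:.1%} correct, p = {p_value:.3f}")
```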


  3. Describe the methods used to maximize response rates and to deal with nonresponse. The accuracy and reliability of the information collected must be shown to be adequate for the intended uses. For collections based on sampling, a special justification must be provided if they will not yield "reliable" data that can be generalized to the universe studied.


As mentioned above, our main concern is to collect a large enough sample of responses to detect statistically significant differences in understandability results. We will not know how many people visited the website and saw the survey invitation, so we will not be able to determine a precise response rate, and we are not concerned about nonresponse as long as we obtain an adequate number of responses within our planned time frame.
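As a rough illustration of how "a large enough sample" could be quantified, the sketch below runs a standard power calculation for comparing two proportions. The assumed correct-interpretation rates (60% baseline versus 70% modified), significance level, and power target are hypothetical planning values, not figures from this study.

```python
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

# Hypothetical planning values: detect an improvement from 60% to 70% correct
# interpretation with 80% power at a 5% significance level.
effect_size = proportion_effectsize(0.70, 0.60)  # Cohen's h for the two proportions

n_per_group = NormalIndPower().solve_power(
    effect_size=effect_size, alpha=0.05, power=0.80, ratio=1.0, alternative="two-sided"
)
print(f"approximate respondents needed per image version: {n_per_group:.0f}")
```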


  4. Describe any tests of procedures or methods to be undertaken. Tests are encouraged as effective means to refine collections, but if ten or more test respondents are involved OMB must give prior approval.


There will be no tests of procedures or methods with respondents from the CPC website.


  5. Provide the name and telephone number of individuals consulted on the statistical aspects of the design, and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Dr. Michael D. Gerst, +1 301 405 9908

Dr. Melissa A. Kenney, +1 301 405 3226

University of Maryland’s Environmental Decision Support Science Lab
