
SUPPORTING STATEMENT

NOAA CUSTOMER SURVEYS

OMB CONTROL NO. 0648-0342



A. JUSTIFICATION


1. Explain the circumstances that make the collection of information necessary.


This is a request for extension of a generic clearance for voluntary customer surveys to be conducted by NOAA program offices. In accordance with Executive Order 12862 (Setting Customer Service Standards), the National Partnership for Reinventing Government, and good management practices, NOAA offices seek to continue gathering customer feedback on services and products, which can be used in planning for service and product modification and prioritization.


Under this generic clearance, individual offices would continue to use approved questionnaires and to develop new questionnaires, as needed, by selecting subsets of the approved set of collection questions and tailoring those specific questions to be meaningful for their particular programs. Each proposed questionnaire would be submitted to the NOAA Clearance Officer through a fast-track request for approval process. If the Clearance Officer finds that the proposal appears to be consistent with the generic clearance, the proposal would be forwarded through the Department of Commerce to NOAA's OMB Desk Officer for fast-track review. The generic clearance will not be used to survey any entities NOAA regulates unless precautions are taken to ensure that respondents understand that they are not at any risk for not responding or for the contents of their responses; e.g., in no survey of such a population will the names and addresses of respondents be required. Currently, there are no such surveys being submitted for approval.


Two sets of survey questions (included as supplementary documents) are used for generation of program-level questionnaires:


1) “Quantitative Questions” seeks to obtain numerical ratings from respondents on their satisfaction with various aspects of the product or service they obtained: satisfaction with the quality of the product, the courtesy of the staff, the format of and documentation for data received, and similar standard types of questions. The offices using such questions are able to determine which aspects of their programs need improvement, or have improved. The rating system is intended to aid respondents in identifying their relative level of satisfaction in particular areas; it is not generally intended to be used to establish numerical performance goals or as part of any complex statistical analyses over time. The potential benefits of the latter are outweighed by the difficulties in ensuring that the data are unbiased and fully representative of customers.


2) “Qualitative Questions” are more focused on who is using the product and service, how it is being used, and the medium or format in which the respondent would like to see data provided. The respondent is also given an opportunity to make specific suggestions on what new products or services should be offered or on how existing products or services could be improved.



2. Explain how, by whom, how frequently, and for what purpose the information will be used. If the information collected will be disseminated to the public or used to support information that will be disseminated to the public, then explain how the collection complies with all applicable Information Quality Guidelines.


The responses to the quantitative questionnaires will be used by the sponsoring program office to determine customers’ satisfaction with the level of service and products delivered, and to identify perceived weaknesses in those products or services, or gaps in services. This information will be used to help direct program improvement efforts.


The uses of the qualitative questions are somewhat different. Rather than seeking information on the degree of customer satisfaction, the objectives are more complex. Questions 1, 5, 6, and 7 seek information on what product/service was received, suggestions about improving the product or its format, and suggestions for other products or services. This information will assist the program office in better identifying the needs of customers by providing more specific data. For instance, responses concerning formats will be used to help determine which products the users are most interested in seeing through the program Web site. Questions 2, 3, 4, and 9 seek information that will help the program office identify the types of users for specific products and how they use those products.


These NOAA line offices: National Ocean Service (NOS), National Marine Fisheries Service (NMFS), National Environmental Satellite, Data and Information Service (NESDIS), National Weather Service (NWS) and Oceanic and Atmospheric Research (OAR) have been expending significant effort to review, report on, and act on the information gathered from their surveys. Many programs have used the NOAA Web site Customer Survey, posted on their individual program area sites, soliciting responses specifically about those sites. Responses have informed a wide variety of modifications to the content, scope and navigation of the program Web sites.


Some surveys solicit comments on how to maintain or improve access to program data. Others provide information about customer usage and user diversity, and allow notification to subsets of user respondents of program or data changes applicable to them, rather than broadcasting emails to the complete user universe. Still others gather feedback on experimental products, to be used in product modification as indicated.


The currently approved and ongoing surveys for which NOAA is requesting renewal are listed in the table below and described in following pages. In 2014 and early 2015, there were eight new surveys approved by OMB under this generic collection: three from NOS, four from NWS and one from NESDIS.










Survey Name: Annual/Annualized Responses; Burden Hours

1. NOAA Web site Customer Satisfaction (ongoing; administered through NOS for all line offices other than NWS; results sorted by and available to each participating program): 72,000 responses (5 min.); 6,000 hours

2. NWS Experimental Products/Services (ongoing): 93,000 responses (5 min.); 7,750 hours

3. Oceanic and Atmospheric Research (OAR) Tropical Atmosphere Ocean Array Web Data Distribution (ongoing): 6,259 responses (1 min.); 104 hours

4. NOS Chart Users Survey (new survey every year): 2,600 responses (10 min.); 433 hours

5. NOS Coastal Services Center, for NOS and also in collaboration with NWS for extreme weather event surveys (averages 3 short-term surveys annually; an NWS service assessment survey is currently underway and ongoing): 1,000 responses (average 10 min.); 167 hours

6. NOS Center for Operational Oceanographic Products and Services multi-phase survey: 100 responses (average 30 min.); 50 hours

7. NWS Hazardous Weather Product Surveys (currently underway: Arrival of Tropical Storm Force Winds Interviews, Data Collection for Hazard Simplification Project/Significant Weather Event Survey, Data Collection for Extra-Tropical Surge, Hurricane Arthur Assessment): 1,539 responses (average 12 min.); 308 hours

Additional burden estimated for new surveys in 2015: 5,700 responses; 2,000 hours

ANNUAL TOTALS: 182,198 responses; 16,812 hours

TOTALS for 3 years: 546,594 responses; 50,436 hours


The NOAA Web site Customer Satisfaction survey is administered by the NOS and implemented on 18 separate Web sites, for programs under NOS, NMFS, NESDIS and OAR. Data is collected into a single database, separated by the individual Web site on which it is implemented. An administrator is able to view survey results for each Web site and act upon these results as appropriate.


National Weather Service


Web site Experimental Products/Services Surveys


NWS makes extensive use of the survey to collect user feedback on proposed changes, additions, or terminations of Official and Experimental Products/Services. Under NOAA’s Partnership Policy, NWS is required to collect user feedback on changes to environmental information and services. For proposed new products and services, the survey responses have provided invaluable feedback from private and commercial users, as well as government partners, to local, regional, and national decision makers. The survey allows for fairness and openness in proposed changes, and assists decision makers in determining what actions and services are appropriate for NWS to provide. All survey responses are carefully evaluated and considered in determining the appropriate action. For proposed terminations of services/products, the surveys have provided user input, allowing NWS to ensure data requirements were met while still being able to consolidate Web services. Since July 31, 2012, there have been approximately 280,000 Customer Survey responses for Official and Experimental Products/Services.


Oceanic and Atmospheric Research (OAR)


The Tropical Atmosphere Ocean (TAO) Project survey is at http://www.pmel.noaa.gov/tao/data_deliv/reg.html. The TAO Array Web Data Distribution feedback request form is used as a metric of customer usage and diversity of users of TAO/TRITON/PIRATA/RAMA data. Feedback helps us improve the product and the Web site delivery mechanisms to better meet users’ specific research needs.

National Ocean Service (NOS)


Chart Users Survey: NOAA is responsible for producing and distributing the nautical charting products covering the coastal waterways of the United States and its territories.  The users of these navigational products can be broadly described as commercial mariners and recreational boaters.

This survey is conducted every year, with an average of 2,500 respondents each time. Updated surveys are submitted for OMB approval before dissemination. Answers to survey questions by users of NOAA's nautical products have been used to revise and modify these products and services to better meet user needs. The 2015 survey was approved in April 2015.


Dashboard Prototype Customer Feedback, approved January 16, 2015. The National Ocean Service (NOS) Center for Operational Oceanographic Products and Services (CO-OPS) will be conducting the study. CO-OPS provides a multitude of services internally and externally, including a network of water level gauges, real-time oceanographic data for navigation, and observation of tidal currents. It serves a broad range of customers, including coastal program managers, emergency managers, NWS meteorologists, and the general public. These services are provided to customers primarily through the CO-OPS Web site. Specific to this survey, CO-OPS will be seeking feedback from the Sentinel Sites program, which currently has five locations across the U.S. and comprises coastal and emergency managers, the insurance industry, researchers and many others (http://oceanservice.noaa.gov/sentinelsites/). At the program's request, CO-OPS created a prototype ‘Dashboard’ product, which displays tidal, flood, and alert information at each of the National Water Level Observation Network (NWLON) stations.


NOS with NWS


Service Assessments, approved September 12, 2013

The ongoing information collection is being led by Dr. Vankita Brown of the NWS, supported by an interdisciplinary team of social scientists representing various NOAA line offices, other federal agencies and academic institutions. The program office responsible for the proposed information collection is the NWS Office of Climate, Water, and Weather Services.

The NWS conducts Service Assessments (both nationally and regionally focused) to evaluate its performance after significant hydrometeorological, oceanographic, or geological events.


NWS also collects routine feedback around other events that generally lead to more localized impacts (e.g., dense fog, frost/freeze, high surf, etc.). Service Assessments may be initiated when one or more of the following criteria are met:

- Major economic impact on a large area or population
- Multiple fatalities or numerous serious injuries
- Extensive national public interest or media coverage
- Unusual level of attention to NWS performance


National Weather Service


Extra-tropical (ET) Surge and Inundation Social Science Research Project, approved October 24, 2014


The NWS is planning to issue a new product, a storm surge watch/warning, to communicate life threatening storm surges. The goal is for this new product to be introduced for tropical systems in 2015 and for ET systems in 2017. The NWS wants to ensure smooth implementation of this new product and is therefore seeking approval to collect information on how ET storm surge dangers are currently communicated to communities (both in text and graphics); to understand where potential misunderstandings or confusion now occur—or could occur if a storm surge watch/warning product were to be implemented starting in 2017; and to discern any other barriers and conflicts that NWS should be aware of from both an operational and communications perspective prior to implementation of the storm surge watch/warning for ET events. The primary customers of this new product are broadcast meteorologists, emergency managers, and groups that make decisions that affect public safety (e.g., school boards, transportation managers, fire and rescue, tribal authorities).


Significant Weather Event Communication Survey, approved December 8, 2014


This survey will show respondents a few images that the NWS is considering for conveying significant weather events. The survey will ask for respondents’ opinions on how well the images provide important information about the events.


Arrival of Tropical Storm Force Winds Social Science Research Project, approved January 13, 2015


The NWS is seeking input on what its partners need in—and how they would use—a graphic and data depicting the arrival of tropical storm force winds. The primary customers of this new product are broadcast meteorologists and emergency managers.


Hurricane Arthur Social Science Research Project, approved January 23, 2015


In 2014, the National Hurricane Center (NHC) issued a new, experimental storm surge graphic (the “Potential Storm Surge Flooding Map") for the first time—during Hurricane Arthur, the first named storm of the 2014 hurricane season. The graphic depicts areas along the Gulf and Atlantic coasts of the United States that have a significant risk of life-threatening inundation by storm surge from a tropical depression, storm, or hurricane. It urges those in the marked areas to follow evacuation and other instructions from local officials. During Hurricane Arthur, the NHC issued a map outlining coastal areas of North Carolina that had the potential to see storm surges of greater than 3 feet above ground level during the storm.


The NHC is seeking input on how the map was used by its partners (primarily decision-makers, broadcast meteorologists, emergency managers, and community members) during Hurricane Arthur. This input will be useful in helping the NHC address any confusion or concerns identified by partners in using or interpreting the map, so as to ensure the continued development and implementation of this experimental product.


NESDIS


Tropical Cyclone Program survey, approved January 26, 2015


The NESDIS Tropical Team works on projects and tasks to help improve the products provided to its global customers. These projects also benefit the satellite analysts at the Satellite Analysis Branch (SAB), a division of NESDIS, by making the analysis less difficult. Michael Turk, the Tropical Team Lead, and Jamie Kibler, the User Service Lead for SAB, will be conducting and monitoring the Tropical survey with the assistance of the World Meteorological Organization (WMO), the United Nations' authoritative voice on weather. The WMO will disseminate the survey by mail to customers for the SAB Tropical Team. The survey pertains to the program's products, including tropical bulletins, microwave classifications, and others.


NOAA will retain control over the information and safeguard it from improper access, modification, and destruction, consistent with NOAA standards for confidentiality, privacy, and electronic information. See response to Question 10 of this Supporting Statement for more information on confidentiality and privacy. The information collection is designed to yield data that meet all applicable information quality guidelines. Prior to dissemination, the information will be subjected to quality control measures and a pre-dissemination review pursuant to Section 515 of Public Law 106-554.




3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological techniques or other forms of information technology.


Currently, most surveys are conducted via email. Some surveys are mailed to their customer lists. Web site customer satisfaction and some product satisfaction surveys are posted on the applicable Web sites, with monitoring to eliminate most, if not all, frivolous responses.


4. Describe efforts to identify duplication.


A team with representatives from all of NOAA’s major organizations helped to develop the questions and identify any current efforts. While there may be other customer surveys planned that will be the subject of separate clearance requests, NOAA is confident that the procedures in place ensure that no current or future survey will duplicate any other similar survey within the program area involved. The NOAA PRA Clearance Officer alerts those planning a survey, through this OMB Control No. or others, about similar or overlapping surveys being planned or conducted.


Three years ago, DOC implemented an overlapping generic information collection which includes customer surveys (OMB Control No. 0690-0030). NOAA uses this vehicle for surveys for which less robust results are needed, since that clearance requires very little description or documentation.


5. If the collection of information involves small businesses or other small entities, describe the methods used to minimize burden.


While small businesses will be respondents to some of the surveys, the burden on any respondent is expected to be minimal. Response to all surveys will continue to be voluntary.


6. Describe the consequences to the Federal program or policy activities if the collection is not conducted or is conducted less frequently.


If these surveys were not conducted, the program offices would have significantly less information for determining which areas of their programs should be modified, and how they might be modified, to provide better service to the public. The frequency of surveys will vary. Some will be conducted once a year, while others will be ongoing. The ongoing approach is deemed especially useful when asking questions about specific products and formats, rather than about general satisfaction with a program. This more frequent feedback may allow the program office to get helpful information from respondents at the time a product is received or a Home Page is used, rather than later as part of an annual survey.


7. Explain any special circumstances that require the collection to be conducted in a manner inconsistent with OMB guidelines.


Respondents who choose to complete surveys on the Web may be responding in less than thirty calendar days from when they receive the request. In those surveys where a questionnaire is attached to each product delivery, a person who frequently orders products will receive more than one request quarterly, but all responses are voluntary.


8. Provide information on the PRA Federal Register Notice that solicited public comments on the information collection prior to this submission. Summarize the public comments received in response to that notice and describe the actions taken by the agency in response to those comments. Describe the efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


A Federal Register Notice soliciting public comments was published on February 17, 2015 (80 FR 8288). No comments were received.


Comments were sought from participants in several of the Arrival of Tropical Storm Force Winds Interviews, part of the ongoing NWS Hazardous Weather Product Surveys.


An Alaska emergency manager commenter had "no negative impressions" of the interview. This far removed from the interview, he does not have a detailed memory of the specific discussions, though he remembers that they were meaningful. One discussion he specifically remembered was learning how NWS sends out information and the role of commercial companies. He now tells people that if they have a complaint (with the weather information they get from an application) they should go to the NWS Web site, because that is where the private company gets its information (from the NWS). He also remembered discussion about the zone vs. municipality boundary and how that can cause his system to alert as if all of Anchorage were to receive heavy snow when really it is just the snow-prone portion of the municipality (the temperate rain forest portion).


From a Minnesota Broadcast Meteorologist: “No comments.”


9. Explain any decisions to provide payments or gifts to respondents, other than remuneration of contractors or grantees.


No payment or gift will be given to any respondent.


10. Describe any assurance of confidentiality provided to respondents and the basis for assurance in statute, regulation, or agency policy.


There will be no assurance of confidentiality; however, provision of contact information is optional.


11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private.


No sensitive questions will be asked.


12. Provide an estimate in hours of the burden of the collection of information.


A total of 182,198 individual responses is expected annually, with an average response time of five to six minutes, resulting in 16,812 burden hours. The three-year total, as entered in ROCIS per instructions, will be 546,594 responses and 50,436 hours.

It is possible that there may be more than one response per respondent per year; this information is not tracked, but each entry on Web-based surveys has a time stamp.  More than one entry per respondent can be reasonably expected as new information and products appear. However, entries in close succession, which generally could be construed as frivolous, are eliminated.
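The annual and three-year totals above can be reproduced from the per-survey figures in the table in Question 2. A minimal sketch (survey names abbreviated; per-survey hours rounded to the nearest whole hour, matching the table):

```python
# Reproduce the burden estimates from (annual responses, avg. minutes per response).
surveys = {
    "NOAA Web site Customer Satisfaction": (72_000, 5),
    "NWS Experimental Products/Services": (93_000, 5),
    "OAR TAO Array Web Data Distribution": (6_259, 1),
    "NOS Chart Users Survey": (2_600, 10),
    "NOS Coastal Services Center": (1_000, 10),
    "NOS CO-OPS multi-phase survey": (100, 30),
    "NWS Hazardous Weather Product Surveys": (1_539, 12),
}

# Annual totals include the additional 2015 estimate of 5,700 responses / 2,000 hours.
responses = sum(r for r, _ in surveys.values()) + 5_700
hours = sum(round(r * m / 60) for r, m in surveys.values()) + 2_000

print(responses, hours)          # annual totals: 182198, 16812
print(responses * 3, hours * 3)  # 3-year totals for ROCIS: 546594, 50436
```

This also confirms that the three-year response total is 546,594 (three times the annual 182,198), consistent with the table.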


13. Provide an estimate of the total annual cost burden to the respondents or record-keepers resulting from the collection (excluding the value of the burden hours in Question 12 above).


There will be no cost to respondents. For mailed surveys, envelopes with pre-paid postage will be supplied.


14. Provide estimates of annualized cost to the Federal government.


All surveys will be conducted and analyzed in-house as part of program planning and thus there is no additional cost beyond regular staff time.


15. Explain the reasons for any program changes or adjustments.


Adjustments:


Several of the surveys included in the previous collection are not to be renewed and others have been approved since the last renewal.


This generic clearance continues to be used for a wide variety of surveys, especially for NWS Web sites; however, also for NWS, a private company is now managing their Web site customer survey. Thus, there are slight decreases in total responses and hours from those estimated in 2012.


16. For collections whose results will be published, outline the plans for tabulation and publication.


Aggregated results will be posted on the applicable Web sites as needed, to share with users as part of information to be imparted about planned program or product changes.

17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons why display would be inappropriate.


All surveys will display the OMB expiration date.


18. Explain each exception to the certification statement.


No exceptions are requested.




File Type: application/msword
File Title: SUPPORTING STATEMENT
Author: Richard Roberts
Last Modified By: Sarah Brabson
File Modified: 2015-04-27
File Created: 2015-04-07
