SUPPORTING STATEMENT
NOAA CUSTOMER SURVEYS
OMB CONTROL NO. 0648-0342

A. JUSTIFICATION

1. Explain the circumstances that make the collection of information necessary.
This is a request for renewal of a generic clearance for voluntary customer surveys to be
conducted by NOAA program offices. In accordance with Executive Order 12862, the
National Performance Review, and good management practices, NOAA offices seek to
continue gathering customer feedback on services and/or products, which can be used in
planning for service/product modification and prioritization.
Under this generic clearance, individual offices would continue to use approved questionnaires
and develop new questionnaires, as needed, by selecting subsets of the approved set of
collection questions and tailoring those questions to be meaningful for their particular
programs. Each proposed questionnaire would be submitted to the NOAA Clearance Officer
under a fast-track approval process. If the latter finds that the proposal appears to be consistent
with the generic clearance, the proposal would be forwarded through the Department of
Commerce to NOAA’s OMB Desk Officer for fast-track review. The generic clearance will not
be used to survey any bodies NOAA regulates unless precautions are taken to ensure that
respondents understand that they are not at any risk for not responding or for the contents of
their responses; e.g., in no survey of such a population will the names and addresses of
respondents be required. Currently, no such surveys are being submitted for approval.
Two sets of survey questions (attached at the end of this document) are used for generation of
program-level questionnaires:
1) The “Quantitative Questions” set seeks numerical ratings from respondents on their
satisfaction with various aspects of the product or service they obtained – satisfaction with the
quality of the product, the courtesy of the staff, the format of and documentation for data
received, and similar standard types of questions. The offices using such questions are able to
determine which aspects of their programs need improvement, or have improved. The rating
system is intended to aid respondents in identifying their relative level of satisfaction in
particular areas; it is not generally intended to be used to establish numerical performance
goals or as part of any complex statistical analyses over time. The potential benefits of such
analyses are outweighed by the difficulty of ensuring that the data are unbiased and fully
representative of customers.
2) The “Qualitative Questions” set focuses more on who is using the product or service, how it
is being used, and the medium or format in which the respondent would like to see data
provided. The respondent is also given an opportunity to make specific suggestions on what
new products or services should be offered or on how existing products or services could be
improved.

2. Explain how, by whom, how frequently, and for what purpose the information will be
used. If the information collected will be disseminated to the public or used to support
information that will be disseminated to the public, then explain how the collection
complies with all applicable Information Quality Guidelines.
The responses to the quantitative questionnaires will be used by the sponsoring program office
to determine customers’ satisfaction with the level of service and products delivered and to
identify perceived weaknesses in those products or services. This information will be used to
help direct program improvement efforts.
The uses of the qualitative questions are somewhat different. Rather than seeking information on
the degree of customer satisfaction, the objectives are more complex. Questions 1, 5, 6, and 7
seek information on what product/service was received, suggestions about improving the product
or its format, and suggestions for other products or services. This information will assist the
program office in better identifying the needs of customers by providing more specific data. For
instance, responses concerning formats will be used to help determine which products the users
are most interested in seeing through the program website. Questions 2, 3, 4, and 9 seek
information that will help the program office identify the types of users for specific products and
how they use those products.
The NOAA line offices (National Ocean Service (NOS), National Marine Fisheries Service
(NMFS), National Environmental Satellite, Data and Information Service (NESDIS), National
Weather Service (NWS), and Oceanic and Atmospheric Research (OAR)) have been expending
significant effort to review, report on, and act on the information gathered from their surveys.
Many programs have used the NOAA Website Customer Survey, posted on their individual
program area sites, to solicit responses specifically about those sites. Examples of how the
response information is used include a wide variety of modifications to the content, scope, and
navigation of the program websites.
Some surveys solicit comments on how to maintain or improve access to program data. Others
provide information about customer usage and user diversity, and allow the program to notify
subsets of respondents about program or data changes applicable to them, rather than
broadcasting emails to the complete user universe. Still others gather feedback on experimental
products, to be used in product modification as indicated.
The currently approved surveys for which NOAA is requesting renewal are listed in the table
below. Following are examples of how information collected by specific surveys has been used.
Copies of all surveys have been posted in ROCIS.

Survey Name | Annual/Annualized Responses (time per response) | Burden Hours
1. NOAA Website Customer Satisfaction (administered through National Ocean Service (NOS), all line offices other than National Weather Service (NWS); results sorted by and available to each participating program) | 1,839 (5 min) | 153
2. NOAA Website Customer Satisfaction (NWS) | 7,010 (5 min) | 584
3. NWS - Experimental Products/Services | 24,050 (5 min) | 2,004
4. NWS National Climatic Data Center - North American Drought Monitor | 30 (5 min) | 2
5. NWS International Flight Folder Documentation Program | 67 (20 min) | 22
6. Oceanic and Atmospheric Research (OAR) - Tropical Atmosphere Ocean Array Web Data Distribution | 6,259 (1 min) | 104
7. OAR - Ocean Surface Current Analyses – Real Time Data Feedback Request Form | 515 (1 min) | 9
8. NOS – Chart Users Survey | 1,000 (15 min) | 250
9. NOS Coastal Services Center - Coastal Decision-Support Tool, Data and Information Resource, and Technical Assistance Customer Survey – approved 10-4-07, change approved 5-7-08, no results to report yet | 184 (20 min) | 61
10. NOS National Geodetic Survey County Scorecard, Phase 3 | 400 (15 min) | 100
11. National Environmental Satellite, Data and Information Service (NESDIS) Data Center Customer Satisfaction Survey | 17,610 (3 min) | 881
12. NESDIS Washington Volcanic Ash Advisory Center Products and Services Customer Survey – recently approved (6-3-08) | 38 (15 min) | 10
13. NESDIS Office of Satellite Data Processing and Distribution - National/Naval Ice Center Customer Feedback Survey – recently approved (3-31-08) | 98 (10 min) | 16
TOTALS | 59,100 | 4,196

The NOAA Website Customer Satisfaction survey is administered by NOS and
implemented on 18 separate Websites, for programs under NOS, NMFS, NESDIS and OAR.
Data is collected into a single database, separated by the individual Website on which it is
implemented. An administrator is able to view survey results for each Website and act upon
those results as appropriate. The following examples show how the survey data has been or
will be used:
Survey implemented on: http://oceanexplorer.noaa.gov
One example of how the customer satisfaction survey results have been used to improve the
Ocean Explorer Website came from a text response to Question 12 (“Do you have suggestions
about improving the content and organization of our site?”): “In the FAQ there was reference
to a 'Large Images' collection, in which could be found high resolution images. I was unable to
find that section.” The respondent asked for an easier way to find high resolution images. As a
result, an addition was made to the captions of all images in the galleries that had high
resolution versions available. This addition was: (HR) = “High Resolution” image available.
Survey implemented on: National Ocean Service http://oceanservice.noaa.gov
The National Ocean Service Website was last redesigned five years ago. Beginning in the
summer of 2008, we will start work to completely overhaul the site and will use comments
from the survey to guide that work. The following two comments were received in text
responses to Question 12: “I would like to find a way to contact you via internet or phone.”
and “make contact information a little more easier, with the ability to upgrade or change
information”. Both comments will lead us to focus on how we provide contact information on
our Website and to make sure that it is clearly accessible to our visitors. Other text responses
to Question 12 tell us that we should focus on social media on our site to make it more
accessible. We will continue to use the survey results to validate the changes we make and to
keep updating the site to meet the demands of our visitors.
National Weather Service
Website Customer Satisfaction and Experimental Products/Services Surveys
The generic website surveys are used by local web content providers, and by Regional and
National web managers, to improve the usability of National Weather Service (NWS) Web
pages to better meet user needs and expectations. The surveys have been used to support
greater standardization of navigation across multiple office sites, to improve user interfaces to
forecast and warning information, and to collect comments from users on the accessibility of
data and information. The survey comments have often helped improve Web applications and
page coding, boosting performance and reducing server demands. Since July 31, 2005, 7,009
generic website satisfaction survey responses have been submitted.
NWS also makes extensive use of the survey to collect user feedback on proposed changes,
additions, or terminations of Official and Experimental Products/Services. Under NOAA’s
Partnership Policy, we are required to collect user feedback on changes to environmental
information and services. For proposed new products and services, the survey responses have
provided invaluable feedback from private and commercial users, as well as government
partners, to local, Regional, and National decision makers. The survey allows for fairness and
openness in proposed changes, and assists the decision makers in determining what actions and
services are appropriate for NWS to provide. All survey responses are carefully evaluated and
considered in determining the appropriate action. For proposed terminations of
services/products, the surveys have provided user input, allowing NWS to ensure data
requirements were met while still being able to consolidate Web services. Since July 31, 2005,
there have been 24,050 Customer Survey responses for Official and Experimental
Products/Services.
The North American Drought Monitor Customer survey is available at:
http://www.ncdc.noaa.gov/oa/climate/monitoring/drought/nadm/survey.php.
The North American Drought Monitor (NADM) customer survey form is being used to build a
database of user impressions regarding the usefulness and accessibility of the NADM website
and associated drought monitoring products. To date, it has proven helpful in gaining a clearer
understanding of the usefulness of the NADM monthly drought monitoring products to
NOAA’s customers and in identifying areas for improvement to better meet user needs. Areas
targeted for improvement based on user feedback include more timely dissemination of the
monthly NADM maps and an increase in product frequency from monthly to bi-weekly.
International Flight Folder Documentation Program
This survey was first completed during 2006 and will be repeated periodically. The first
issuance was used primarily to gauge which of the program’s weather products were important
to customers’ international flight operations and whether customers were aware of the
automated product request line. Other questions asked what additional methods, if any,
customers use to obtain weather information. From the results, the program learned which
products were essential to customers’ weather packages, and it was clear that the service was
important to customers’ operations. Results also showed that several users do not rely solely
on this program for their weather information; most have other sources for supplementary or
backup purposes, such as paid providers, various internet sites, or their own internal company
weather offices. Nearly a quarter of the respondents did not know about the automated request
line, which prompted the program to send instructions on its use to all customers and to post
them on the program website.
Oceanic and Atmospheric Research (OAR)
The Tropical Atmosphere Ocean (TAO) Project survey is at
http://www.pmel.noaa.gov/tao/data_deliv/reg.html.
The TAO Array Web Data Distribution feedback request form is used as a metric of customer
usage and of the diversity of users of TAO/TRITON/PIRATA/RAMA data. Feedback helps us
improve the product and the website delivery mechanisms to better meet users’ specific
research needs. In one recent instance, survey data allowed us to communicate with the small
number of affected users about an error identified in a small subset of older data, resulting in a
corrected dataset being made available to them.
The Ocean Surface Current Analyses – Real Time (OSCAR) survey is at
http://www.oscar.noaa.gov/datadisplay/datadownload.htm.
The OSCAR Data feedback request form is used as a metric of customer usage and user
diversity. The information has allowed the OSCAR project to assess the uses to which the data
is being put, in order to evaluate the current usefulness of the data and the ways in which the
website can be improved to provide better customer service and satisfaction. User suggestions
have resulted in our expanding the geographic coverage of the dataset, increasing the
resolution of the data provided, and improving the website’s functionality.
National Ocean Service (NOS)
Ecosystem-Based Management (EBM) Training Assessment
This survey has been completed and is not included in this renewal request; however, the
following summary demonstrates the planning value of the survey.
1. Need for Training
The results from this survey showed that 88% of participants think their organizations, or
organizations with whom they work, need EBM professional development training. Almost
92% of respondents indicated that they or someone in their organization would attend a
two-to-four day EBM course.
2. Training Format
The preferred course format indicated in this survey was for participants from a particular
place to learn how to formulate a strategic plan for implementing EBM. It would seem that the
most benefit might derive from structured interactive workshops: training for community
members, science experts, and relevant agency staff members from a defined area, followed by
an interactive problem-solving workshop to identify and address locally relevant issues.
Another format suggestion in our survey results was a preference for specific, real-world
examples as the primary technique for the training: “…the inclusion of practical applications
and real-world examples of EBM. These include examples of success and failure, and how
EBM worked, caused the problem, or may have thwarted a problem if implemented.”
3. Training Content
This survey indicated a wide array of training needs. Similarly, a study conducted by the EBM
Tools (EBMT) Network (a non-governmental organization) found a strong interest in
developing capacity in almost all EBM sectors and processes. Thus, the Coastal Services
Center audience has needs similar to those in broader coastal and terrestrial settings.
The following content areas were prevalent and may serve as the initial focus for content
development:
a. Collaborative Process
Seventy-seven percent of respondents said that they would like to apply improved
skills from EBM training to resolving complex issues through the collaborative
process. The EBMT survey reported a strong interest in developing capacity in
engaging communities and stakeholders in group decision making. Collaborative
process may also be a useful tool for addressing a key reported obstacle to
implementation of EBM: in this survey, 72% of respondents said that getting
different local, state, and federal agencies with different institutional climates and
mandates to work together was the biggest obstacle to implementing EBM. A
focus on a common vision and goal, established through collaborative process, can
foster positive governance and institutional relationships. This call for
collaborative-process capacity building fits well with existing NOAA Coastal
Services Center strengths and reinforces a section of course development the
Center began before gathering the survey information.
b. Ecosystem Function and Sustainability
The highest training need in this survey was how to incorporate dynamic
ecosystem processes or ecological sustainability into EBM decision making. One
approach that training could take to address this issue is to include instruction that
would facilitate the development of conceptual ecological models: models that
would help identify desired ecosystem attributes and services, the primary drivers
and stressors on the system, and the connections of stressors to attributes and
services through causal linkages. These models would help practitioners identify
what they know and don’t know about the system, become the basis for adaptive
management actions, and provide a way to move forward without complete
ecosystem function knowledge.
c. EBM Process
Also high on survey respondents’ list of training needs were how to plan and
develop an EBM approach to management and how to implement such an
approach. The EBMT survey results reported the lack of established methods for
implementing EBM as one of the most severe implementation obstacles. A
conceptual EBM procedural framework is currently under development and
should soon be available for use (Kimberly Heiman, Communication Partnership
for Science and the Sea, personal communication). Draft versions of the
framework indicate that it will work well as a training aid and as the center of a
module on a practical EBM process.
This survey clearance has provided the Coastal Services Center with extremely useful
information as we move forward with EBM course development.
Chart Users Survey: NOAA is responsible for producing and distributing the nautical charting
products covering the coastal waterways of the United States and its territories. The users of
these navigational products can be broadly described as commercial mariners and recreational
boaters.
This survey was conducted in 2006 and will be conducted again in 2008. Answers to survey
questions by users of NOAA's nautical products have been used to revise/modify these products
and services to better meet user needs. One specific example is that the program has made the
decision to move forward with a modernization effort for the US Coast Pilot publication, moving
it to an online format.
National Geodetic Survey (NGS) County Scorecard
Using the information gathered from this survey in the past two years, NGS has:
1. Reviewed and strengthened the NGS Workshop Program based on feedback.
2. Scheduled a State Plane Coordinates and Datum Transformations Workshop on May 20, 2008
in Upper Michigan in direct response to the analysis of feedback from local geospatial
representatives.
3. Responded to the specific questions from survey respondents.
4. Given survey results to State Advisors to use for local analysis.
National Environmental Satellite, Data and Information Service (NESDIS)
NESDIS Data Center Customer Survey: The survey served as an initial step in opening
communication with, and receiving feedback from, our users. The results were very valuable
in enabling NOAA’s National Data Centers to evaluate the quality of our services and products
and their accessibility to users in all categories. The survey was followed by a NESDIS Data
Users’ Workshop, during which users made 500 recommendations that were consolidated into
approximately 180 common recommendations. An action plan was developed to address each
of the recommendations.
Responses to the survey question regarding the ease with which data were retrieved from our
website indicated that many customers found access easy, while others did not. Because some
of our customers responded that access was not as easy as it could have been, and based on
additional comments about ease of use, NOAA’s National Climatic Data Center formed a team
that revised our homepage. While some customers still have difficulty navigating the website,
it is far better than before the improvements were made. We are continuing to work on the
user-friendliness of the website and have implemented some map tools.
Other comments indicated that customers wanted to be able to view data online before ordering
it. We have implemented this on some of our systems, and customers are able to view available
products before they order certified copies. Many more products are also available for
downloading at no charge.
As explained in the preceding paragraphs, the information gathered has utility. NOAA will
retain control over the information and safeguard it from improper access, modification, and
destruction, consistent with NOAA standards for confidentiality, privacy, and electronic
information. See the response to Question 10 of this Supporting Statement for more
information on confidentiality and privacy. The information collection is designed to yield
data that meet all applicable information quality guidelines. Prior to dissemination, the
information will be subjected to quality control measures and a pre-dissemination review
pursuant to Section 515 of Public Law 106-554.
3. Describe whether, and to what extent, the collection of information involves the use of
automated, electronic, mechanical, or other technological techniques or other forms of
information technology.
Currently, most surveys are conducted via email. Some surveys are mailed to program
customer lists. Website customer satisfaction surveys and some product satisfaction surveys
are posted on the applicable websites, with monitoring to eliminate most, if not all, frivolous
responses.

4. Describe efforts to identify duplication.
A team with representatives from all of NOAA’s major organizations helped to develop the
questions and to identify any existing survey efforts. While other customer surveys may be
planned that will be the subject of separate clearance requests, NOAA is confident that the
procedures in place ensure that no current or future survey will duplicate any similar survey
within the program area involved.
5. If the collection of information involves small businesses or other small entities, describe
the methods used to minimize burden.
While small businesses will be respondents to some of the surveys, the burden on any respondent
is expected to be minimal. Response to all surveys will continue to be voluntary.
6. Describe the consequences to the Federal program or policy activities if the collection is
not conducted or is conducted less frequently.
If these surveys were not conducted, the program offices would have significantly less
information for determining which areas of their programs should be modified, and how, to
provide better service to the public. The frequency of surveys will vary: some will be
conducted once a year, while others will be ongoing (such as Data Center questions sent out
with deliveries of data and some Website questionnaires). The ongoing approach is deemed
especially useful when asking questions about specific products and formats, rather than about
general satisfaction with a program. This more frequent feedback may allow the program
office to obtain helpful information from respondents at the time a product is received or a
Home Page is used, rather than later as part of an annual survey.
7. Explain any special circumstances that require the collection to be conducted in a
manner inconsistent with OMB guidelines.
Respondents who choose to complete surveys on the Web will necessarily be responding in
less than thirty calendar days from when they receive the request. In surveys where a
questionnaire is attached to each product delivery, a person who frequently orders products
will receive more than one request per quarter, but all responses are voluntary.
8. Provide information on the PRA Federal Register Notice that solicited public comments
on the information collection prior to this submission. Summarize the public comments
received in response to that notice and describe the actions taken by the agency in response
to those comments. Describe the efforts to consult with persons outside the agency to
obtain their views on the availability of data, frequency of collection, the clarity of
instructions and recordkeeping, disclosure, or reporting format (if any), and on the data
elements to be recorded, disclosed, or reported.
A Federal Register Notice soliciting public comments was published on April 15, 2008 (73 FR
20256). No comments were received.

9. Explain any decisions to provide payments or gifts to respondents, other than
remuneration of contractors or grantees.
No payment or gift will be given to any respondent.
10. Describe any assurance of confidentiality provided to respondents and the basis for
assurance in statute, regulation, or agency policy.
There will be no assurance of confidentiality; however, provision of contact information is
optional.
11. Provide additional justification for any questions of a sensitive nature, such as sexual
behavior and attitudes, religious beliefs, and other matters that are commonly considered
private.
No sensitive questions will be asked.
12. Provide an estimate in hours of the burden of the collection of information.
A total of 59,100 individual responses is expected annually, with response times ranging from
one to twenty minutes per survey (a weighted average of roughly four minutes per response),
resulting in 4,196 burden hours.
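As an illustrative check (a minimal Python sketch, not part of the collection itself; the response counts and per-response times are taken from the table in the response to Question 2), the totals can be reproduced as follows:

    # Annualized responses and per-response time in minutes for
    # surveys 1-13, from the table in the response to Question 2.
    surveys = [
        (1839, 5), (7010, 5), (24050, 5), (30, 5), (67, 20),
        (6259, 1), (515, 1), (1000, 15), (184, 20), (400, 15),
        (17610, 3), (38, 15), (98, 10),
    ]

    total_responses = sum(n for n, _ in surveys)                    # 59,100
    exact_hours = sum(n * m for n, m in surveys) / 60               # about 4,197
    avg_minutes = sum(n * m for n, m in surveys) / total_responses  # about 4.3

    # The 4,196-hour total in the table sums each survey's burden after
    # rounding to whole hours, so it differs slightly from the exact
    # figure computed above.
    print(total_responses, exact_hours, avg_minutes)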
It is possible that there may be more than one response per respondent per year; this information
is not tracked, but each entry on web-based surveys has a time stamp. More than one entry per
respondent can be reasonably expected as new information and products appear. However,
entries in close succession, which generally could be construed as frivolous, are eliminated.
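For illustration only, the following is a minimal Python sketch of how such time-stamp screening might work. The supporting statement does not describe the actual mechanism, so the grouping key (a hypothetical per-session identifier) and the one-minute threshold are assumptions, not a description of NOAA's systems:

    from datetime import timedelta

    def screen_entries(entries, min_gap=timedelta(minutes=1)):
        # entries: list of (session_id, timestamp) pairs; session_id is a
        # hypothetical grouping key (e.g., a browser session token), and
        # timestamp is a datetime. The actual surveys may group entries
        # differently.
        kept = []
        last_kept = {}  # session_id -> timestamp of the last kept entry
        for session_id, ts in sorted(entries, key=lambda e: e[1]):
            prev = last_kept.get(session_id)
            if prev is None or ts - prev >= min_gap:
                # Keep the entry only if enough time has passed since the
                # previous kept entry from the same source.
                kept.append((session_id, ts))
                last_kept[session_id] = ts
        return kept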
13. Provide an estimate of the total annual cost burden to the respondents or recordkeepers resulting from the collection (excluding the value of the burden hours in #12
above).
There will be no cost to respondents. For mailed surveys, envelopes with pre-paid postage will
be supplied.
14. Provide estimates of annualized cost to the Federal government.
All surveys will be conducted and analyzed in-house as part of program planning and thus there
is no additional cost.
15. Explain the reasons for any program changes or adjustments reported in Items 13 or
14 of the OMB 83-I.
Several of the surveys included in the previous collection are not being renewed, and others
have been approved since the last renewal. This generic clearance is now used for a wider
variety of surveys than was the case three years ago, because program staff contacting the
NOAA PRA Clearance Officer about proposed surveys are routinely referred to this clearance
when it appears applicable. Thus, the estimated burden hours have increased from 1,337 to
4,196, an increase of 2,859 hours.

16. For collections whose results will be published, outline the plans for tabulation and
publication.
Aggregated results will be posted on the applicable websites as needed, to share with users as
part of the information provided about planned program or product changes.
17. If seeking approval to not display the expiration date for OMB approval of the
information collection, explain the reasons why display would be inappropriate.
All surveys will display the OMB expiration date.
18. Explain each exception to the certification statement identified in Item 19 of the
OMB 83-I.
No exceptions are requested.
