SUPPORTING STATEMENT
2009 NOAA COASTAL SERVICES CENTER COASTAL RESOURCE MANAGEMENT
CUSTOMER SURVEY
OMB CONTROL NO. 0648-0308

A. JUSTIFICATION

1. Explain the circumstances that make the collection of information necessary.
One of the fundamental commitments of the NOAA Coastal Services Center (Center) is to serve
the technology and information needs of its customers, the coastal resource management
community. To achieve this service goal, customer feedback is critical. The Center receives such
input through various means, the most important of which is the Coastal Resource Management
(CRM) Customer Survey (CRM Survey).
In continuing compliance with Executive Order 12862, Setting Customer Service Standards, this
survey will be used by the NOAA Coastal Services Center to obtain information from its
customers—state and territorial coastal and marine resource managers—regarding their
information needs based on their coastal resource management responsibilities, technology and
information management capabilities, and critical resource management issues.
The CRM Survey will allow the Center to determine specific services its customers want,
understand the customers’ level of technical expertise, and document priority issues most
relevant to their missions. The information will be used by the Center to guide strategic
planning, professional development, and informed delivery of future products and services for
the Nation’s coastal resource management community.
Other offices within NOAA and the National Ocean Service (NOS) have collected information
from segments of the respondent universe that this survey addresses. However, great effort has
been made to tailor the questions herein to pertain specifically to the types of functions the
Center supports, such as geographic information systems (GIS), remote sensing, and social
science. No other office within NOAA has collected the same information from the same
universe of respondents.
This request is for a renewal of this information collection.
2. Explain how, by whom, how frequently, and for what purpose the information will be
used. If the information collected will be disseminated to the public or used to support
information that will be disseminated to the public, then explain how the collection
complies with all applicable Information Quality Guidelines.
Purpose, Delivery, and Frequency
The 2009 survey represents the Center’s fifth triennial effort to assess customer management
issues, technical capabilities, and technical assistance needs at the national level. Previous CRM
surveys were administered in 1996, 1999, 2002, and 2006. The 1996 CRM Survey targeted the
information management sector of the coastal resource management community, the primary
customer base of the Center’s formerly named Coastal Information Services Branch. The 1999
CRM Survey targeted the information management and program or site management sections of
the coastal resource management community. The 2002 CRM Survey targeted coastal resource
management staff responsible for a broader array of coastal resource management topics:
respondents included lead staff responsible for program management, education and outreach,
research, natural resource management, planning, permitting and regulatory enforcement, and
information technology (GIS or remote sensing). The 2006 CRM Survey began seeking
information related to previous collections in order to effectively monitor trends of priority
coastal management issues, applications of technology, and key product and service needs that
the Center is uniquely equipped to address. The 2009 version seeks to accomplish objectives
comparable to those of the 2006 survey in a much more concise and user-friendly format.
Use of Past Results
In the planning and development stages for the 2009 CRM Survey, a cross-Center survey
working group was established with representatives from each of the Center’s four divisions.
The Center survey team and Center management staff critically reviewed the questions and
results of previous CRM Surveys in terms of their past, present, and future usefulness to
individual Center program areas and their application to the Center as a whole. The team’s
approach was that if they could not justify the usefulness of a question to their specific program
or to the Center, it should be omitted from the survey. Specific questions or content areas from
the previous 2006 CRM Survey that were identified as obsolete or of reduced utility were
omitted from use in the current 2009 version. Key results from the previous CRM surveys have
been cited repeatedly during presentations about the Center and have influenced the development
of new program initiatives. The following paragraphs describe which results have been used and
how they have been applied.
The 1996 CRM Survey was designed primarily to collect information about the computer
systems and software that the coastal resource management community uses. The Center has
used this information to determine which software programs and formats should be incorporated
into the production of products and for designing technical training. The information collected
about GIS has been critically important to the Center, as it has been used to seek and acquire
support from partners for several locally-based GIS support projects, such as the Protected Areas
GIS.
The first section of the 1996 CRM Survey asked about “Coastal Information Management,
Problems, and Opportunities.” This section contained questions about the respondent agency’s
coastal responsibilities, the types of problems for which it managed data, and the management
obstacles limiting its efforts. In reviewing the results for this section, it was apparent that
further information about the agency’s management responsibilities would help the Center better
plan and develop products and services. This led to the changed design of the 1999 CRM
Survey—one part focusing on management issues and the second part focusing on technology
and information management.
The section on “Communication Pathways and Data/Information Exchange” asked about the
customer’s use of metadata and their methods of transmitting and sharing data. This information
was also used to direct product development. Using the results, the Center was able to determine
what formats and amounts/volumes of data could effectively be transmitted via the Center’s Web
site. The section also asked whether Center customers had coastal data or information that could
be added to the Center’s Coastal Information Directory, an electronic database for accessing
coastal data and information.
The “Current and Planned Activities and Products” section asked customers to rate their level of
interest in various planned activities. This information influenced the development of
training programs, a competitive funding program for innovative coastal resource management
solutions, and ways the Center can better meet the needs of customers.
The 1999 CRM Survey focused on re-evaluating the level of use and need for technology-based
decision support in terms of GIS and remote sensing, gaining a better understanding of Center
coastal resource manager customers’ natural resource management roles, responsibilities, and
issues, and gauging interest and need for technical training, including:
•	technology tools (e.g., ArcView, remote sensing),
•	process skills (e.g., facilitation training, needs assessment, conflict management), and
•	content-specific areas (e.g., integrated coastal resource management, Coastal Zone Management Act, smart growth).

Both the information management and natural resource management portions of the 1999 CRM
Survey asked in-depth questions about five broad categories of coastal issues: habitat, coastal
development, hazards, water quality, and resource management. The information management
portion asked about the offices’ collection, derivation, use, and management of spatial data
pertinent to these issues. Most respondents indicated that they use spatial data that has been
collected, derived, or managed by others. Seventy-two percent of the respondents indicated that
they use spatial data to manage habitat issues. Approximately half of the respondents use spatial
data to manage the other coastal issues (e.g., coastal development, hazards, water quality, natural
resource management).
The remainder of the information management portion of the 1999 survey assessed the level of
use, kinds of software, numbers of staff and level of expertise related to the use of spatial data
and geographic information technologies, development of Federal Geographic Data Committee
(FGDC) metadata, and training needs of Center customers. The Center invests a great deal in the
development of specialized decision support tools using GIS. Understanding the software most
commonly used enables the Center to develop products most useful to the majority of its
customers. The overall level of GIS expertise within an office, including the number of people
who use it or have been trained to use it, suggests the level of investment that an office has put
into spatial data access. The combined results of these questions have helped the Center identify
where gaps exist and where to focus its resources. For example, respondents reported that when
GIS and remote sensing capabilities are not available in-house, they turn to partnerships with
federal, state, or local agencies or academia to obtain these technical services. This information
is useful in guiding efficient planning for Center programs, for building partnerships with
customers and other federal agencies, and for development of grant funding opportunities for the
coastal resource management community. Information pertaining to software use and Internet
access has been critical for the development of Web-based tools and practical applications,
extensions, and data for Center customers.

The natural resource management portion of the 1999 survey asked about the offices’ roles (i.e.,
lead, coordinating, or independent) in addressing coastal natural resources management issues,
technical resources most useful in addressing offices’ coastal resource management
responsibilities, education and outreach efforts, and training needs. In general, most offices
reported playing a coordinating role when managing coastal issues. With this role, many offices
emphasized the need for accurate coastal data to properly address issues. Respondents also
reported a high interest and need for technical training, both in technology tools and in process
skills and specific coastal issue content. A majority of respondents also reported having
developed education programs and volunteer programs. At least two thirds of respondents
reported interest in training for public involvement, outreach planning, communication
planning, and conflict management. Since the 1999 survey, the Center has developed
training programs to meet the needs of Center customers offered both at the Center and in
coordination with Center customer offices “in the field” in different parts of the country.
The 2002 survey also asked respondents about collaboration with other allied organizations and
to indicate practice, preference, and needs for information sharing and training delivery.
Respondents indicated that partnership building and outreach and education were a high priority
for their offices and that communication with colleagues and attending professional meetings,
conferences, workshops, and trainings are the most frequent ways to share new ideas and
information. There was considerable interest in training on specific coastal zone management
issues and process skills offered both remotely and at the Center, with identified training needs in
the areas of needs assessment, resource valuation, and heritage resource management. These
results suggested that the Center should continue to dedicate resources toward development and
improvement of these products and services.
The 2006 survey was the first administered via e-mail. It was conducted to determine coastal
resource stakeholders’ opinions on, and interactions with, the Center. This survey was a
precursor to tracking trends in specific priority topic areas and preferred data content, as
compared to access, sharing, and decision support tool development. The results of the 2006
survey were used extensively throughout numerous NOAA and NOS offices for strategic
planning purposes, including the Office of Ocean and Coastal Resource Management (OCRM)
and the National Centers for Coastal Ocean Science (NCCOS).
Projected Use of New Results
Understanding natural resource management issues, technical capabilities, and use of geospatial
data and geographic information technologies, and social science needs are important to the
Center. The Center is committed to helping the coastal resource community apply GIS and
remote sensing technology to support coastal decision-making. The Center is also committed to
providing high quality information, technical assistance, and training services. To meet these
challenges, the Center needs feedback about the issues facing the coastal resource management
community, how Center customers use and manage spatial data, what issues are being addressed,
and what priorities exist. The 2009 survey is substantially shorter than the 2006 iteration in an
effort to maximize response and refine Center inquiry to items most essential to effective future
service delivery. The sections of the new survey are described below.
Section 1 asks respondents to indicate their previous contact with the Center. This will help the
Center gauge the means by which respondents have interacted with Center products and services
(e.g., attending Center workshops, using Center data products) and provide a means by which to
compare responses across the survey.
Section 2 begins by asking respondents to report the priority level of a number of coastal
management issues. The information resulting from this list of topic areas will aid the Center in
developing products and services most useful to Center customers relative to each listed topic.
The results will also indicate what issues different offices are tasked with, and help the Center
prioritize future spatial data provision.
The section continues with questions from previous Center surveys in assessing the awareness,
use, usefulness, and preferences regarding various technologies, technology tools (e.g., GIS,
remote sensing, visualization), and related services. This information will be used to continue to provide
products and services (e.g., information resources, demonstration projects, training) that match
needs and capabilities of the Center’s customers. Trends and patterns of use can be depicted
over the course of the five CRM Surveys. This information will aid the Center in evaluating its
ability to meet the needs of the coastal resource management community as the use of GIS,
remote sensing, and other technology gains ever-increasing presence in the natural resource
management arena and as this dynamic segment of the computer and technology industry
evolves.
Section 3 focuses on tools and information resources, including new and emerging technologies,
such as social networking, Web services, and data warehouses and portals. This continues
information tracking since 2002 about respondents’ familiarity with applied social science
methods and management processes. Additionally, it will allow for tracking new technology
trends as they gain increasing use and prominence in the management community. Identification
of trends and patterns of use of these methods and processes can aid the Center in evaluating its
ability to meet the needs of the coastal resource management community as these methods and
processes become more widely practiced.
Section 4 asks respondents to offer information related to their office, their position, the region
of the country they serve, and their years of experience as coastal resource management
professionals. These questions will aid the Center in improving products and services that are
well-suited to the coastal resource management communities in terms of delivering information
content at appropriate technical levels for a particular segment (e.g., education and outreach
program leaders). These questions will also enable the Center to create products and services
more appropriately suited to the needs and expertise of the Center’s customer base. This also
provides a means by which to group and compare responses (e.g., based on position type, years
in the field). This section also provides an opportunity for respondents to offer additional
comments related to the survey itself, its administration process, its burden, or any other topic.
This can provide valuable feedback for future information collections. The Center has used this
information in the past to direct changes in the survey and other products and services.
Our intended sample for the 2009 survey has been refined to include all of the Center's core
audiences (e.g., Sea Grant College Programs, National Estuarine Research Reserves (NERRs),
National Estuary Programs (NEPs), State, regional, and local coastal managers) and fewer
peripheral audiences and partner groups (e.g., National Park Service, US Fish & Wildlife
Service, US Army Corps of Engineers). This reflects the Center’s effort to focus on its
most critical customers. This will enable the Center to more effectively target products and
services to core audiences of the coastal management community, which will result in greater
service to these audiences, and cost savings over time.
As explained in the preceding paragraphs, the information to be gathered has utility. NOAA’s
National Ocean Service will retain control over the information and safeguard it from improper
access, modification, and destruction, consistent with NOAA standards for confidentiality,
privacy, and electronic information. See the response to Question 10 of this Supporting Statement
for more information on confidentiality and privacy. The information collection is designed to
yield data that meet all applicable information quality guidelines. Although the information
collected is not expected to be disseminated directly to the public, results may be used in
scientific, management, technical, or general information publications. Should NOAA’s National
Ocean Service decide to disseminate the information, it will be subject to the quality control
measures and pre-dissemination review pursuant to Section 515 of Public Law 106-554.
3. Describe whether, and to what extent, the collection of information involves the use of
automated, electronic, mechanical, or other technological techniques or other forms of
information technology.
The survey will be administered via a web survey instrument. Mail surveys will no longer be
used as they are an added expense and can require more time to complete. Invitations with a link
to the survey will be sent to members of the coastal resource management and related communities. Upon
request, respondents will be mailed a paper version of the survey to complete and return in a
postage paid envelope. If requested, accommodations will also be made to facilitate completion
of the survey via telephone.
4. Describe efforts to identify duplication.
No other existing similar information collections were found. Other GIS-related survey reports
were identified, but none had coastal resource management-specific content. None addressed the
same assemblage of topics.
5. If the collection of information involves small businesses or other small entities, describe
the methods used to minimize burden.
The completion of the proposed collection will not have a significant economic impact on the
respondents.
6. Describe the consequences to the Federal program or policy activities if the collection is
not conducted or is conducted less frequently.
The Center’s first operating principle is to be “customer driven.” Without regular input from the
coastal resource management community, the Center would risk investing in projects and
services that have little relevance to the coastal resource management community’s needs or that
are delivered in formats not usable by the customer. Conducting this survey will provide the
Center with consistent information from its customer base, recognizing and tracking differences
in technical capability and management responsibility by agency type and by region. Survey
results enable the Center to be more efficient in the development of specific products and
services that meet the needs and capabilities of our customers. Given the rapid evolution of
technology, the collection could not be conducted any less frequently.
7. Explain any special circumstances that require the collection to be conducted in a
manner inconsistent with OMB guidelines.
The collection will be conducted consistently with OMB guidelines.

8. Provide information on the PRA Federal Register Notice that solicited public comments
on the information collection prior to this submission. Summarize the public comments
received in response to that notice and describe the actions taken by the agency in response
to those comments. Describe the efforts to consult with persons outside the agency to
obtain their views on the availability of data, frequency of collection, the clarity of
instructions and recordkeeping, disclosure, or reporting format (if any), and on the data
elements to be recorded, disclosed, or reported.
A Federal Register Notice published on January 30, 2009 (74 FR 5640) solicited public
comments. No substantive comments pertinent to the collection were received.
Center personnel trained in survey research and design, along with MRAG Americas, Inc.,
created the survey instrument. Technical literature consulted in the planning and development of
the instrument and survey administration included How to Conduct Your Own Survey (Salant and
Dillman, 1994) as well as numerous other survey instruments and technical references. Salant
and Dillman have conducted extensive research on all aspects of survey design and
implementation for over a decade and their methods of distribution and follow-up have
consistently achieved positive results.
Pilot testing of the instrument was completed in April 2009. Pilot participants included
representative members from across the coastal resource management community. Pilot testing
included timing of respondents, identification and discussion of unclear instructions and question
content, asking respondents about the length of the instrument, and discussing suggestions for
improvements. Fewer than 10 external, non-federal employees participated in the testing and
subsequent discussions.
9. Explain any decisions to provide payments or gifts to respondents, other than
remuneration of contractors or grantees.
No plans exist for payments or gifts to survey respondents.
10. Describe any assurance of confidentiality provided to respondents and the basis for
assurance in statute, regulation, or agency policy.
Each survey response will contain a survey identification number for tracking and response
calculations. Responses will not be reported individually, only in aggregate. Respondents are
assured that their names will not be placed on their completed surveys or subsequent reports. A
summary of results will be posted on the Center’s websites, similar to those of past surveys
(available at: http://www.csc.noaa.gov/survey/).
11. Provide additional justification for any questions of a sensitive nature, such as sexual
behavior and attitudes, religious beliefs, and other matters that are commonly considered
private.
The instrument contains no questions of a sensitive nature.

12. Provide an estimate in hours of the burden of the collection of information.
The estimated one-time burden for the survey is 125 hours. This reflects 500
respondents with an average completion time of 15 minutes, including the time for reviewing
instructions and gathering the requested information.
Respondents are likely to be program managers, department heads, and content area specialists
within their respective organizations, equivalent to a General Schedule (GS) grade 12, step 1.
Using this grade to estimate the hourly rate of the respondent ($28.62), the maximum estimated
annualized cost to the respondent for the hour burden of each collection (i.e., 0.25 hours) is $7.16
per respondent; the maximum cost for the information collection for a 100 percent response rate
(i.e., 500 respondents) is $3,578.
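As an illustrative check (not part of the original statement), the burden and cost figures above can be reproduced directly from the stated inputs; rounding to the nearest cent and dollar is assumed:

```python
# Sketch verifying the Question 12 burden arithmetic; all inputs are from the text.
respondents = 500
hours_per_response = 15 / 60   # 15-minute average completion time = 0.25 hours
hourly_rate = 28.62            # estimated respondent hourly rate (GS-12 step 1 equivalent)

total_burden_hours = respondents * hours_per_response          # 125.0 hours
cost_per_respondent = hours_per_response * hourly_rate         # $7.155, reported as $7.16
total_cost = respondents * hours_per_response * hourly_rate    # $3,577.50, reported as $3,578

print(total_burden_hours, cost_per_respondent, total_cost)
```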
13. Provide an estimate of the total annual cost burden to the respondents or recordkeepers resulting from the collection (excluding the value of the burden hours in Question
12 above).
Responding to the survey requires no cost or record keeping.
14. Provide estimates of annualized cost to the Federal government.
This information collection effort is supported through external contract services for data
collection and analysis and in-house staff time. The estimated annualized cost for this
information collection is $21,948 (i.e., contract services, in-house staff time, supplies).
Estimates presented below represent the costs per annum for the term of the approval.
Annualized Cost to Federal Government

  Supplies                                                        $100
  Printing                                                        $100
  Data management and database development                     $16,166
  Project staff (ZP4), 100 hrs @ $38.46/hr                      $3,846
  Project supervisor (ZA4), 20 hrs @ $46.73/hr                    $936
  Administrative staff support, 20 hrs @ $20.00/hr                $400
  Cover design, layout & editing, 10 hrs @ $40.00/hr              $400
  TOTAL                                                        $21,948
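As a quick illustrative cross-check (not part of the original statement), the line items stated above sum to the reported $21,948 annualized total:

```python
# Sketch summing the Question 14 cost line items; figures are from the text.
line_items = {
    "Supplies": 100,
    "Printing": 100,
    "Data management and database development": 16166,
    "Project staff (ZP4)": 3846,             # 100 hrs @ $38.46/hr
    "Project supervisor (ZA4)": 936,         # 20 hrs @ $46.73/hr, as stated
    "Administrative staff support": 400,     # 20 hrs @ $20.00/hr
    "Cover design, layout & editing": 400,   # 10 hrs @ $40.00/hr
}
print(sum(line_items.values()))  # 21948
```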
15. Explain the reasons for any program changes or adjustments.
The previously approved request had a maximum one-time burden of 250 hours, based on a
30-minute response time. This request adjusts the maximum burden to 125 hours, based on a
15-minute response time, for the same maximum number of respondents.
16. For collections whose results will be published, outline the plans for tabulation and
publication.
A summary of results will be posted on the Center’s websites similar to past surveys (available
at: http://www.csc.noaa.gov/survey/).

17. If seeking approval to not display the expiration date for OMB approval of the
information collection, explain the reasons why display would be inappropriate.
The expiration date and OMB control number will be displayed on the survey instrument.
18. Explain each exception to the certification statement identified in Item 19 of the
OMB 83-I.
There are no exceptions to the certification statement identified in Item 19.

File Type: application/pdf
File Title: SUPPORTING STATEMENT
Author: Richard Roberts
File Modified: 2009-05-26
File Created: 2009-05-26
