Supporting Statement B CALO 6-17-2016.final


Cape Lookout National Seashore Cultural Resource Values and Vulnerabilities Assessment

OMB: 1024-0278


Supporting Statement B


Cape Lookout National Seashore Cultural Resource

Values and Vulnerabilities Assessment


OMB Control Number 1024-NEW



1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of organizations (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


We will use a census sampling method to conduct both surveys. For the partner survey, all members with valid email addresses will constitute the entire universe for this sample, and each will be given the same opportunity to participate in this study. For the expert survey, all names provided through a “snowball” method will constitute the entire universe for that segment of the study.


The universe for this collection will include a combined list of 480 individuals representing the partners and experts in this sample. We acknowledge that the number of subject matter experts in this field is limited to a known and tight-knit community of researchers; because the universe for this collection is therefore limited, this will be a census survey in which we survey all of the names in both sample populations. No attempt will be made to generalize the results beyond the scope of this study and this universe of respondents.


Partners: The Friends of Portsmouth Island and the Core Sound Waterfowl Museum & Heritage Center are considered to have connections to the history and culture of CALO and to have a vested interest in cultural resource management there. All members (n=400) of both groups will make up the sample for the Partner Survey.


Experts: We will use a modified Delphi (or snowballing) method to develop a list of historic preservation experts in the Piedmont and Coastal Plain of Virginia, North Carolina, South Carolina, Georgia, and Florida for the expert survey. The list will start by asking known NPS cultural resource professionals to provide the names and contact information of at least three professionals in each of the following categories: 1) Atlantic and Gulf States State Historic Preservation Offices; 2) non-governmental organizations and consulting firms; and 3) academic investigators researching the topic in the geographical areas mentioned above. We will follow the protocol of contacting individuals on the developing list until the names begin to repeat and the list is exhaustive.
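The snowball list-building protocol described above can be sketched as a simple saturation loop. This is only an illustration of the logic, not project software; `ask_for_referrals` is a hypothetical stand-in for contacting a professional and recording the names they provide.

```python
# Sketch of the snowball (modified Delphi) list-building loop: contact each
# person on the developing list, add any new names they provide, and stop
# when referrals stop producing new names (i.e., names begin to repeat).

def build_expert_list(seed_names, ask_for_referrals):
    """Return the exhaustive roster reached at saturation."""
    contacted = set()
    to_contact = list(seed_names)
    roster = set(seed_names)
    while to_contact:
        person = to_contact.pop(0)
        if person in contacted:
            continue
        contacted.add(person)
        for name in ask_for_referrals(person):   # hypothetical contact step
            if name not in roster:               # a previously unseen name
                roster.add(name)
                to_contact.append(name)
    return sorted(roster)
```

For example, with seed "A" and referrals A→{B, C}, B→{C}, C→{}, the loop terminates once every referral is already on the roster.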


The table below shows the approximate expected sample size, response rate and the expected number of responses for each version of the survey.



Table 1. Respondent universe and expected sample size



| Sample | Respondent Universe | Response Rate | Expected Number of Responses |
|---|---|---|---|
| Partner Survey | 400 | 40% | 160 |
| Expert Survey: (1) Atlantic and Gulf States State Historic Preservation Offices | 15 | 40% | 6 |
| Expert Survey: (2) Non-governmental organizations and consulting firms | 45 | 40% | 18 |
| Expert Survey: (3) Academics | 20 | 40% | 8 |
| Expert Survey subtotal | 80 | 40% | 32 |
| Total | 480 | | 192 |



2. Describe the procedures for the collection of information including:

* Statistical methodology for stratification and sample selection,

* Estimation procedure,

* Degree of accuracy needed for the purpose described in the justification,

* Unusual problems requiring specialized sampling procedures, and

* Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


Statistical methodology for stratification and sample selection


Based on our estimations, we conservatively anticipate a response rate of 40% for both surveys. For the partner survey, this estimate is based on the response rate (46%) achieved in a 2012 National Park Foundation study that used census sampling of its member database. For the expert survey, the estimate is based on the response rate (46%) achieved in a Delphi study of experts that required a similar time commitment (about 1 hour). This effort will produce sample sizes that are considered robust in the aggregate, with acceptable margins of error of between ±2% and ±5% at the 95% confidence level for all aggregate samples and potential sub-samples.


Degree of accuracy for the purpose described in the justification

Partner Survey

Based on the results of similar surveys (National Park Foundation, 2012) using similar methods of administration (repeat email contact) and similar question content (knowledge and perceptions), we expect a response rate of at least 40%, resulting in a total of 160 completed surveys. The expected standard errors associated with the simple survey results (proportions) would be ±0.025 or better at the 95% confidence level (based on an estimated proportion of 0.5).


Expert Survey

Based on the results of a study that used a Delphi process to elicit professionals’ perceptions (Moss, Seekamp, & Sparling, 2013), we expect a response rate of at least 40%, resulting in 32 completed surveys for this collection. The expected standard errors associated with the simple survey results (proportions) would be ±0.025 or better at the 95% confidence level (based on an estimated proportion of 0.5).
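As a minimal sketch (not the study's actual computation), the standard error of a sample proportion under a census design can be computed with a finite population correction. The conservative assumption p = 0.5 follows the text above, and the sample (n) and universe (N) sizes are taken from Table 1.

```python
import math

def proportion_se(p, n, N):
    """Standard error of a proportion with a finite population correction,
    appropriate when a census samples a large share of a finite universe."""
    se = math.sqrt(p * (1 - p) / n)        # simple random sampling SE
    fpc = math.sqrt((N - n) / (N - 1))     # finite population correction
    return se * fpc

# Conservative p = 0.5; n = expected completes, N = universe (Table 1).
partner_se = proportion_se(0.5, 160, 400)  # Partner Survey
expert_se = proportion_se(0.5, 32, 80)     # Expert Survey
```

Multiplying the standard error by 1.96 gives the corresponding margin of error at the 95% confidence level.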






3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


For the Partner Survey, we will rely on the following script, printed in each participating organization’s newsletter, to announce the survey. The newsletter will be sent to everyone (n=400) on the mailing list used as the sample universe for this survey.



Dear member of the Friends of Portsmouth Island/Core Sound Waterfowl Museum and Heritage Center,


Cape Lookout National Seashore is seeking your input on the cultural resources within Portsmouth Village and Lookout Village. As a member of our organization, you are invited to complete a survey that will arrive by email within the next few days. The email will contain directions and a link to an online survey. To be included in this study, be sure that we have your current email address on file. The survey will include questions about:

      • Your connections to, and the values you associate with, the historic villages;

      • Your thoughts on the condition of the resources within the villages;

      • Your perceptions of the vulnerability of resources within the villages to changes such as erosion, wind damage, and sea level rise;

      • The extent of change you’ve noticed within the villages and how those changes impact your attachment to the villages; and

      • Your perspective on potential management strategies.



We will use a modified Dillman (2014)1 method to maximize the response rate for this survey. Following the newsletter announcement, the partner organizations will send their entire membership email lists an initial contact message introducing the study and providing a link to the survey. Three days after the initial email, they will send a reminder; seven days after that, they will send another reminder that includes the URL and the survey end date; and on the fourteenth day, a final reminder will be sent. All email contacts will include an opportunity for individuals who do not wish to participate in the study to “opt out.” These individuals will be excluded from further reminders but will be sent a follow-up non-response survey request.


Addressing potential non-response bias

Based on previous experience using an online survey opt-out option, we estimate that 10% of the sample (n=40) will use this option; of those, we estimate that 25% (n=10) will complete the non-response bias survey. All respondents who opt out will receive a request to complete a non-response survey (see attached document with contact emails and scripts). The non-response bias survey is estimated to take a total of 3 minutes to complete.


Statistical tests will be conducted to determine whether differences exist between the study population and the population of non-responders. If non-response bias is found, the data will be weighted to reduce its effect. Results of the non-response analysis will be reported, and any implications for generalizing survey results to the study population will be discussed.
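One common choice for such a test, shown here only as an illustration (the document does not specify which test will be used), is a two-proportion z-test comparing responders with non-response survey completers on a shared characteristic. The counts below are hypothetical.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for the difference between two sample proportions,
    e.g., the share of responders vs. non-responders with some attribute."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)  # pooled proportion under H0: p1 == p2
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical illustration: 96 of 160 responders vs. 4 of 10 non-response
# survey completers share a given characteristic.
z = two_proportion_z(96, 160, 4, 10)
```

If |z| exceeds 1.96, the difference is significant at the 0.05 level, suggesting non-response bias and supporting the weighting adjustment described above.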


Expert Survey


  1. A modified approach to Dillman’s (2014) Tailored Method Design for Internet surveys will be used to maximize response rates.

  • We will start by using the mailing list to send a pre-notice message via email. This message will be used to explain the purpose of the survey and to inform each participant that we will be following up with them via telephone, and to enable participants to provide an ideal telephone contact date and time.

  • We will then call all participants to explain the study and ask for their participation. This call will enhance response rates and improve participants’ understanding of the extended time commitment of the survey. We will also explain that responses will be aggregated to the employment-type category and to the group level.

  • We will send the initial email with the survey link and two reminder emails (day 3 and day 7) to enhance response rates. Each email will include a survey close date. Specifically, three contact attempts will be made by researchers:

    1. An email with link to the survey.

    2. A reminder email with a link to the survey. (3 days later)

    3. A final reminder email with a link to the questionnaire. (7 days after the initial contact)
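The three-contact schedule above amounts to a simple date calculation, sketched below; the launch date is hypothetical.

```python
from datetime import date, timedelta

# Three-contact email schedule for the expert survey: initial email,
# day-3 reminder, day-7 final reminder. The launch date is hypothetical.
launch = date(2016, 7, 1)
schedule = {
    "initial email with survey link": launch,
    "reminder email": launch + timedelta(days=3),
    "final reminder email": launch + timedelta(days=7),
}
```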

Addressing potential non-response bias

If a contacted individual declines to participate in the study (an estimated n=48) during our solicitation calls, we will verbally ask them to answer the following three questions.

  1. Why do you not wish to participate?

  2. How long have you worked in your current position?

  3. How long have you worked in cultural resource management?

Statistical tests will be conducted to determine whether differences exist between the study population and the population of non-responders. Results of the non-response analysis will be reported, and any implications for generalizing survey results to the study population will be discussed.


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


A thorough literature review was conducted prior to designing the instruments. The “Place Attachment” scale questions (Q1 & Q2) in the Partner Survey were selected and/or modified from the 2013-2015 NPS Pool of Known Questions and the existing literature, with the exception of three new Place Attachment items (listed below), and were tailored to cultural resources at Cape Lookout National Seashore. Q1a-c, Q1d-i, and Q2a-d are standardized questionnaire items measuring specific constructs related to place attachment, taken from the Place Attachment scale in Topic Area 4 of the Pool of Known Questions (pp. 39-40). Q2e-i were adapted from previous research2 focusing on communities adjacent to protected lands (i.e., not just visitor studies). The three new questions (Q1j-k, Q2j) were developed to represent a new dimension related to cultural resources (place attachment questions have typically assessed only individuals’ attachment to natural resources) and will be measured on a Likert-type scale of agreement:

  1. Cape Lookout is an important part of our history as a nation.

  2. Preserving the history and culture associated with the cultural resources at Cape Lookout is important for future generations.

  3. The history and culture associated with Cape Lookout is unique and unlike other historic barrier island communities in the region.

All other questions were developed from reviews of CALO’s Foundation Document and other NPS cultural resource management documents (including the Preserving Coastal Heritage Workshop Summary). All questions were reviewed by NPS personnel with expertise in cultural resource management, climate adaptation planning, and the resources at CALO (Dr. Janet Cakir, Cat Hoffman, and Pat Kenney). Dr. Mae Davenport (social scientist, University of Minnesota) conducted a peer review.

Pretesting of the Partner Survey instrument was performed by 6 graduate students at NC State University. An additional review of the final Expert Survey instrument was performed by 2 historic preservation experts (Beth Byrd, National Trust for Historic Preservation/Washington Harbor District Alliance; and Cynthia Walton, NPS Historian, Southeast Regional Office), who were also asked to comment on the number of structures that would be feasible to review during the survey (5 structures per informant, with each structure taking approximately 10 minutes to evaluate). It was determined that it is impractical to ask respondents to evaluate all of the historic structures at CALO; one reviewer suggested 5 structures per informant and the other suggested 10. We selected the lower recommendation to decrease the time burden. A final pretest was conducted with an academic expert to confirm that the survey takes about 45 minutes to complete. We revised question wording based on comments received during NPS review, peer review, and pretesting.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Collection and analysis grantees:


Principal Investigator

Erin Seekamp, Ph.D., Associate Professor

Department of Parks, Recreation and Tourism Management

NC State University

919-513-7407

Co-Principal Investigator

Jordan W. Smith, Ph.D., Assistant Professor

Department of Environment and Society

Utah State University

Logan, UT 84322



1 Dillman, D.A., J.D. Smyth, and L.M. Christian. 2014. Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method, fourth ed. John Wiley and Sons, Inc., Hoboken, New Jersey.


2 Smith, J.W., Davenport, M.A., Anderson, D.H., & Leahy, J.E. (2011). Place meanings and desired management outcomes. Landscape and Urban Planning, 101(4), 359-370.
