
Supporting Statement A


National Park Service Centennial National Household Survey


OMB Control Number 1024-0254

General Instructions


A completed Supporting Statement A must accompany each request for approval of a collection of information. The Supporting Statement must be prepared in the format described below, and must contain the information specified below. If an item is not applicable, provide a brief explanation. When the question “Does this ICR contain surveys, censuses, or employ statistical methods?” is checked "Yes," then a Supporting Statement B must be completed. OMB reserves the right to require the submission of additional information with respect to any request for approval.



1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection.

In August 2016, the National Park Service (NPS) submitted an Information Collection Request (ICR) to reinstate OMB Control Number 1024-0254. At that time, we requested approval to pretest the information that would be used in the final version of the survey. This ICR requests approval to conduct the third iteration of the NPS Comprehensive Survey of the American Public.

2016 marked the 100th anniversary of the National Park Service. This defining moment offered an opportunity to reflect on and celebrate our accomplishments as we move forward into a new century of stewardship and engagement. Discussions concerning the relevancy of the national parks have established the need for a third comprehensive survey, one that will provide both data comparable to the results of the last survey and insights into the issue of relevancy beyond visitation.

The first Comprehensive Survey of the American Public (CSAP1) was conducted in 2000 by Northern Arizona University. In 2006, the NPS Social Science Branch sponsored the second iteration of the survey (CSAP2). The surveys were designed to obtain information on public attitudes and behaviors related to services provided by national parks, as well as on demographic characteristics of recent visitors and non-visitors to the National Park System.[1] Each survey collected information through telephone interviews with more than 4,000 respondents across the United States. Both surveys addressed visitors' and non-visitors' behavior, perceptions, and knowledge related to the services and recreation opportunities offered in sites managed by the NPS.

On August 24, 2006, President Bush issued a memorandum to Secretary Kempthorne calling on the NPS to further enhance the national parks during the decade leading up to the 2016 centennial celebration. In that memorandum, the President stated:

Therefore, I direct you to establish specific performance goals for our national parks that when achieved, will help prepare them for another century of conservation, preservation, and enjoyment. These goals should integrate the assessments of the past five years used in monitoring natural resources and improving the condition of park facilities.

In 2007, Secretary Kempthorne proposed five overarching goals to guide the NPS over the next nine years leading up to its 100th anniversary:

  • Lead America in preserving and restoring treasured resources.

  • Demonstrate environmental leadership to the nation.

  • Offer superior recreational experiences where visitors explore and enjoy nature and the great outdoors, culture, and history.

  • Foster exceptional learning opportunities connecting diverse groups of people to parks.

  • Achieve management and partnership excellence to match the magnificence of the treasures entrusted to its care.

These goals were intended to enhance the future and relevancy of the NPS. All parks and programs were mandated to develop strategic plans that would embrace these goals to enhance the public's experiences and level of awareness of our agency. With this in mind, we propose that a third iteration of the Comprehensive Survey of the American Public be conducted. This survey will include questions from the original surveys as well as updated questions that will capture the views of a national audience concerning the current relevancy of the NPS, information that would otherwise be unavailable.

This collection will consist of the following elements:

1) Household Survey: a telephone survey of a random sample of U.S. residents (adults), disproportionately stratified by the seven NPS administrative regions. The target total number of completed surveys will be 3,500.

2) Youth Engagement Survey: in addition to surveying adults, we plan to interview youth (ages 12-17) living in households where an adult completed the survey.

3) Non-response Bias Survey: all potential respondents who refuse to participate in the full survey will be asked to answer a few questions that will be used in the non-response bias analysis.

Legal Authorities:

  • National Park Service Protection, Interpretation, and Research in System (54 U.S.C. §100702)

  • National Parks Omnibus Management Act of 1998 (16 U.S.C. §§5931-5937)



2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection. Be specific. If this collection is a form or a questionnaire, every question needs to be justified.


In the spring of 2017, the NPS requested approval to pretest the questions and methodology that would be used in the 2018 Comprehensive Survey of the American Public. Researchers at the University of Wyoming's Wyoming Survey & Analysis Center (WYSAC) completed a two-phase process to develop the final version of the questionnaire. During the first phase, WYSAC conducted 30 cognitive interviews to test the wording of new questions that were not included in previous iterations of this survey. In the second phase, the findings from the cognitive interviews were used to pretest the entire questionnaire. During the pretest process, questions were not eliminated but were further refined to create the final survey instrument. In addition to refining the questions, the purpose of the pretest was to estimate respondent burden and inform any modifications to the methodology so that the final survey could be administered in about 18 minutes.


The pretest produced the following findings:

  • a lower than anticipated raw response rate for the full-length survey,

  • a lower than anticipated yield of completed surveys with children aged 12 to 17,

  • a higher than initially estimated yield of non-response bias interviews, and

  • an average telephone interview duration of 27.5 minutes.

The purpose of the final survey is to generate trend data comparable with the findings from CSAP1 and CSAP2. Additionally, the results of this iteration will be used to measure the value to the public of current NPS programs that were not in place during the CSAP1 and CSAP2 surveys. The purpose of each survey section is summarized in Table 1 below.



Table 1. Summary of Survey Sections and their Intended Purpose

Household Survey

Introduction: The questions in this section will be used to gauge overall satisfaction with the quality of services offered and to assess the public's opinion of how well the NPS is managing national park sites. This section also identifies the respondent's location of residence, age, and the number of children between 12 and 17 living in the home; the response to the last question will be used to prompt the request for participation in the Youth Engagement Survey.

Park Visitation: Using the same definition of a visitor used in CSAP1, we will establish visitor and non-visitor sub-samples. All respondents will be asked about their intention to visit a national park in the next 12 months. Questions asked only of visitors will be used to elicit responses related to:

  • reasons for the last visit,

  • resources used to plan the last visit,

  • the use and importance of in-park programs and services,

  • willingness-to-pay for park visitation, and

  • NPS relevancy.

Non-Visitation: For the purpose of this collection, non-visitors are defined as those who have never visited, those whose last visit was more than two years ago, or those who visited in the last two years but were unable to correctly name a unit of the National Park System. Questions will be asked to assess the various reasons for lack of visitation to national park sites.

Program Awareness: Questions in this section will be used to understand respondents' engagement with NPS programs outside of traditional park visits. The questions will explore the relevance and value these programs have to the public. The programs covered are education, preservation, conservation, and recreation.

Demographics: Questions in this section will be used to estimate the representativeness of the sample, to enable proper weighting of the final data set, and to allow for cross-sectional analysis of the data.

Youth Engagement Survey: Questions in this section will measure visitation by youth (aged 12 to 17) to units of the National Park System, assess their engagement with online content offered by the NPS, and capture their personal experience with both. Demographic questions include age and gender.

Non-Response Bias Survey: Questions in this set will be used to test for the presence of non-response bias.


3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden and specifically how this collection meets GPEA requirements.

This will be a telephone survey, and all information will be collected using a Computer Assisted Telephone Interviewing (CATI) system. The CATI system has been selected for this collection because of its survey-management functionality. This system logs interviewer activity, schedules repeat calls, selects interviewees randomly, removes numbers from the call queue, reassigns calls to bilingual interviewers as needed, and produces operational reports. CATI permits direct electronic data entry (reducing processing time, data-entry error, and costs), thereby offering quick data turnaround. Coding procedures can be programmed into the computer, which not only reduces the cost of office coding but also allows for better data quality.
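To make the survey-management functions concrete, the sketch below illustrates the behaviors described above (random selection, repeat-call scheduling, removal of exhausted numbers, and activity logging). It is a minimal illustration only, not the contractor's actual CATI software; all class and field names are hypothetical.

```python
import random
from collections import deque
from datetime import datetime

# Hypothetical sketch of the CATI behaviors described above: random
# selection of interviewees, repeat-call scheduling, removal of numbers
# from the queue, and interviewer activity logging.
class CallQueue:
    def __init__(self, phone_numbers, max_attempts=5):
        random.shuffle(phone_numbers)            # select interviewees randomly
        self.queue = deque(phone_numbers)
        self.attempts = {n: 0 for n in phone_numbers}
        self.max_attempts = max_attempts
        self.log = []                            # interviewer activity log

    def next_number(self):
        return self.queue.popleft() if self.queue else None

    def record_attempt(self, number, outcome, needs_spanish=False):
        self.attempts[number] += 1
        self.log.append((datetime.now(), number, outcome, needs_spanish))
        if outcome == "no_answer" and self.attempts[number] < self.max_attempts:
            self.queue.append(number)            # schedule a repeat call
        # completed, refused, or exhausted numbers are not re-queued
```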

4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.

There is no duplication of effort or information. Although the NPS conducts more than 20 information collections per year, this is the only national survey funded for the purpose of giving both visitors and non-visitors an opportunity to help the NPS improve its efforts at reaching new audiences.

Other federal recreation surveys, such as the National Survey of Fishing, Hunting, and Wildlife-Associated Recreation (U.S. Fish and Wildlife Service) and the National Survey on Recreation and the Environment (U.S. Forest Service), provide information on outdoor recreation participation in general, but do not provide information that can be used to understand the issues of relevancy and the public's perception of the NPS.

5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.

This information collection will not impact small businesses or other small entities.

6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.

The NPS Centennial Initiative called for an agency-wide commitment to reaching new audiences. The consequences of not collecting this information will be three-fold: 1) the NPS will continue to rely on outdated and anecdotal information to address the issues of non-visitation and under-representation of diverse groups; 2) the NPS will lack the reliable information needed to represent the views and opinions of a new generation of visitors and to assist in post-centennial planning efforts; and 3) the NPS will be forced to evaluate its relevancy among visitors and non-visitors using comprehensive information about the national public that is more than 15 years old.

7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

* requiring respondents to report information to the agency more often than quarterly;

* requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

* requiring respondents to submit more than an original and two copies of any document;

* requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

* in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

* requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

* that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

* requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.

No special circumstances exist.

8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice [and in response to the PRA statement associated with the collection over the past three years] and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.


Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years — even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.

On December 24, 2015, we published a Federal Register Notice (80 FR 80384) announcing that we would submit an information collection request to OMB for approval to pretest the questions and methodology for the Comprehensive Survey of the American Public. In that same notice, we stated that we would submit the final version of the survey instrument to OMB-OIRA at the conclusion of the pretest. We solicited public comments for 60 days, ending February 22, 2016. We did not receive any comments related to this request.

In addition to the 60-day FRN, three individuals outside of the agency were asked to review the survey instruments and methods used for the pretest efforts. Based on their expertise, these individuals were asked to assess the clarity and overall effectiveness of the questions for the cognitive interviews and pretest. Based upon their reviews and editorial suggestions, WYSAC conducted 30 cognitive interviews to evaluate the wording of new questions and completed 92 telephone surveys to pretest the entire questionnaire. Following the conclusion of the pretesting, the same individuals were asked to review the final version of the survey and to weigh in on the proposed strategy (a split-sample design) to reduce respondent burden during the telephone surveys. A report of the findings from the pretest is attached as a supplementary document in ROCIS.

Table 2. Non-Federal Reviewers Asked to Provide Feedback

Associate Research Scientist, Wyoming Survey & Analysis Center, University of Wyoming

Professor of Statistics, Wyoming Survey & Analysis Center, University of Wyoming

Senior Director, Resource Systems Group (RSG)


The purpose of the pretest was to detect any problems with the questionnaire design, such as ambiguous wording, misinterpretation of questions, inability to answer a question, sensitive questions, and any other problems associated with the questionnaire or with the process of administering the survey. It also provided an opportunity to give feedback to the interviewers to ensure that the proper data collection protocols were followed. Based upon the results of the pretest, we incorporated edits to improve the current version of the instrument. The comments we received and the responses we provided are listed below.



Reviewer Feedback

Comment 1: Question PV7c requires further attention. "Personal share" should be clearly defined; coming up with a working definition of the term would be helpful.

BEFORE: As you know, some of the costs of travel such as gasoline, hotels, rental cars, and airline tickets often increase. Would you still have made your most recent visit to a national park if your personal share of total trip costs were $ [bid amount] more than the amount you spent on this trip?


Response 1: The question was changed to read as follows:

AFTER: As you know, some of the costs of travel such as gasoline, hotels, rental cars, and airline tickets often increase. Would you still have made your most recent visit to a national park if your total trip costs were $ [bid amount] more than the amount you spent on this trip?


Comment 2: It would help the data collection process and improve the quality of the data collected to elaborate the language used in the stem of Questions NV17-29 so that, to the extent possible, respondents will think in a similar way when answering the questions about the importance of leisure-time activities, considering not just actual engagement but also how much they would like to engage.


BEFORE: The next series of questions are concerned with what people do during their leisure or free time. I’m going to list things that people might do during their leisure, or free time. We are interested in how important these are to you personally. For each one, tell me how important it is to you personally to engage in the activity during your leisure, or free time.


Response 2: The wording was changed as follows:

AFTER: I’m going to list things that people might do during their leisure, or free time. We are interested in how important these are to you personally. For each one, tell me how important it is to you to engage in the activity during your leisure, or free time. For some of the activities you may be already very much engaged (that is you dedicate significant time and attention to them). Some, you may think of as very important to you personally, even if currently you are not able to dedicate the time and attention to them that you would like.




Comment 3: The cognitive interviews revealed that respondents had problems with the indirect way of asking about race/ethnicity (Question NV12).

Respondent comments:

  • "It sounded like it was talking about ethnicity. If it was, then they should be more explicit."

  • "I think it's ridiculous. It's such a ridiculous question that I don't even know how I should answer it."

  • "Instead of saying people who work there, it should say people who visit."

  • "It's clear, it's just silly. It's just irrelevant. Who would go to a national park on the basis of who works there? It's just silly."

  • "…is too vague, the question needs to be written better. Need to be very specific."

This question should either be eliminated or reworded.


BEFORE: We're interested in why people don't visit national parks or don't visit more often. I'm going to read a series of statements. I'd like you to think of your own experiences, and tell me how much you agree or disagree with each statement.


I don’t visit national parks or don’t visit more often because:

 “I don’t have much in common with people who work in the national parks.”


Response 3: Changed the response item to read:

AFTER: “The people who work in the national parks are of a very different racial/ethnic background than mine.”


Comment 4: Questions CP1-7 may benefit from providing guidance on the thinking process of those living in big metropolitan areas.

BEFORE: Next, I am going to ask about your awareness of your community’s experience with cultural programs provided by the National Park Service. Cultural programs include assistance with the preservation of local historic buildings and sites which commemorate American history and culture or significant events and people. Has your community…


Response 4: Changed wording as follows:

AFTER: The following questions are about your awareness of your community’s experience with cultural programs provided by the National Park Service.


When thinking about your community, think about your town or city, if you live in a small town or city. Or, if you live in a metropolitan place like New York City, Los Angeles, or Chicago, think about your section (or side) of town or the area you live in. But do not limit your thinking to your immediate neighborhood.

Findings from the Pretest

Finding 1: The average duration of the pretest telephone interviews was 27.5 minutes. This finding dictates the need to shorten the duration of the interviews in order to get closer to the intended target length of 18 minutes, on average.

Recommendation: Use the split-sample method of administering the questionnaire, so that not all respondents receive all questions. This approach will reduce the duration of the average interview, as illustrated in the sketch below.
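To make the split-sample idea concrete, the sketch below assigns each respondent the core sections plus a random subset of the program modules named in Finding 2. The module names come from the survey itself; the assignment rule shown (core plus two of five modules) is a hypothetical illustration, not the final design.

```python
import random

# Illustrative split-sample assignment: every respondent answers the core
# sections; each respondent also receives a random subset of the program
# modules, shortening the average interview.
CORE = ["Introduction", "Park Visitation", "Demographics"]
PROGRAM_MODULES = ["EP", "CP", "RP", "NNL", "PA"]  # modules from Finding 2

def assign_modules(n_modules=2, rng=random):
    """Return the list of question modules for one respondent."""
    return CORE + rng.sample(PROGRAM_MODULES, n_modules)

# With 2 of 5 modules per interview, each program module is asked of
# roughly 40% of respondents, cutting module-related interview time.
print(assign_modules())
```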


Finding 2: The pretest proved invaluable in assessing the utility of the new (beyond park visitation) modules introduced to the survey: Education Programs (EP), Cultural Heritage Programs (CP), Recreation Programs (RP), National Natural Landmarks (NNL), and Overall Program Awareness (PA). Analysis of the answers to those questions indicates that all five modules worked very well as a way to measure NPS relevance and how the agency connects to the public. This finding supports the decision to keep all five modules in the final survey and to rely on other approaches to reduce interview duration.

Recommendation: Since the NPS is mostly interested in understanding its current relevancy beyond the boundaries of the areas it manages, and the pretest suggests that the new questions have high utility for future program development and project management, retain all five modules.


Finding 3: The pretest suggests a higher than initially expected yield of non-response bias surveys, a lower than initially estimated raw response rate for the household survey, and a lower than initially estimated yield of youth-engagement surveys.

Recommendation: Adjust the expected raw response rates, the required initial sample sizes, and the number of completions (final sample sizes) accordingly.


9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.

No payments or gifts will be provided to respondents.

10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


This work will be conducted in accordance with the Paperwork Reduction Act and under the guidance of the National Park Service. We will not provide any assurances of confidentiality; however, all responses will be anonymous. No personally identifiable information (name or telephone number) will appear in the context of the results or in any of our reports or findings. The database containing all contact information, and any information that could be used to identify individuals, will be completely destroyed at the end of the data collection period.

11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.

No questions of a sensitive nature will be asked.


12. Provide estimates of the hour burden of the collection of information. The statement should:

* Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

* If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.

* Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included under “Annual Cost to Federal Government.”


This is a one-time collection that will consist of three distinct telephone surveys (Household, Youth Engagement, and Non-response Bias). A random sample of 140,000 telephone numbers (cell and landline) will be obtained from Marketing Systems Group (MSG). This number represents the estimated initial sample size needed to obtain the desired 3,500 completed full-length interviews. We anticipate a combined total of 8,443 completed responses across the three surveys, for an estimated 1,380 annual respondent burden hours (Table 3). This estimated burden is based upon the time to introduce, initiate, and complete the telephone interview.



Household Survey. The sample for this collection will be disproportionately stratified by the seven NPS administrative regions. We will aim to complete no fewer than 3,500 household surveys (500 per region). Based on the results of the pretest and the decision to use the split-sample method to reduce interview duration, it is estimated that, on average, the interviews will take about 18 minutes to complete. Assuming a 2.5% raw response rate, we estimate that we will need 140,000 phone numbers to meet the target number of completed household surveys.


Youth Engagement Survey. Based on the pretest, we estimate that 550 households will be eligible for the Youth Engagement Survey. Telephone interviews will be attempted with young people ages 12-17 living in the same residence as the adult completing the household survey. Based on the pretest, we assume an effective response rate of about 30% (n = 165) and estimate that, on average, a Youth Engagement Survey will take about 4 minutes to complete.


Non-response Bias Survey. During the initial contact, the interviewer will ask each respondent refusing to complete the full survey to answer a small set of questions taken directly from the survey; these responses will be used to measure non-response bias. Based on the pretest, we assume a raw response rate of about 3.5%, from which we expect to obtain 4,778 completed non-response bias surveys. It is estimated that these interviews will take about 4 minutes to complete.
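The arithmetic behind these estimates, summarized in Table 3 below, follows directly from the stated sample sizes, response rates, and completion times. The short sketch below simply reproduces that arithmetic:

```python
# Reproduces the burden arithmetic in Table 3 from the stated assumptions.
surveys = {
    # name: (initial sample, raw response rate, minutes per completion)
    "Household Survey":         (140_000, 0.025, 18),
    "Youth Engagement Survey":  (550,     0.30,  4),
    "Non-response Bias Survey": (136_500, 0.035, 4),
}

total_completed = total_hours = 0
for name, (sample, rate, minutes) in surveys.items():
    completed = round(sample * rate)          # e.g. 140,000 x 2.5% = 3,500
    hours = round(completed * minutes / 60)   # e.g. 3,500 x 18 / 60 = 1,050
    total_completed += completed
    total_hours += hours
    print(f"{name}: {completed:,} completions, {hours:,} hours")

print(f"TOTAL: {total_completed:,} completions, {total_hours:,} hours")
```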


Table 3. Estimates of Hour Burden

Survey                   | Sample size | Raw response rate | Completed surveys | Time to complete (minutes)* | Total annual burden (hours)
Household Survey         | 140,000     | 2.5%              | 3,500             | 18                          | 1,050
Youth Engagement Survey  | 550         | 30%               | 165               | 4                           | 11
Non-response Bias Survey | 136,500     | 3.5%              | 4,778             | 4                           | 319
TOTAL                    |             |                   | 8,443             |                             | 1,380

* Completion time includes initial contact time: the time to request participation, read instructions, and introduce the survey.

The U.S. Department of Labor defines a volunteer as an "individual who performs hours of service for civic, charitable, or humanitarian reasons, without promise, expectation or receipt of compensation for services rendered."[2] For the purposes of this collection, we are using the estimated value of volunteer time as calculated in the April 20, 2017, news release from the Independent Sector organization (http://independentsector.org/news-post/value-volunteer-time). According to this news release, the value of volunteer time, $24.14 per hour, is based on the average hourly earnings of all production and non-supervisory workers on private non-farm payrolls (based on yearly earnings provided by the Bureau of Labor Statistics).[3] Independent Sector does not include a multiplier for benefits for volunteers; therefore, we will assume that $24.14 accounts for both wages and benefits. The total estimated annual dollar value of the burden hours for this collection is $33,313.

Table 4. Total Estimated Hour Burden and Dollar Value of Survey

Survey                   | Total burden (hours) | Dollar value of a burden hour (incl. benefits) | Total dollar value of burden hours
Household Survey         | 1,050                | $24.14                                         | $25,347
Youth Engagement Survey  | 11                   | $24.14                                         | $266
Non-response Bias Survey | 319                  | $24.14                                         | $7,700
TOTAL                    | 1,380                | $24.14                                         | $33,313

13. Provide an estimate of the total annual [non-hour] cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden shown in items 12).

* The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information [including filing fees paid]. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.

* If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collection services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.

* Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government or (4) as part of customary and usual business or private practices.

There are no non-hour costs to respondents resulting from this collection of information.


14. Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information.

The cost to the Federal Government for this collection is estimated to be $236,662. This includes the cost to the Federal Government of salaries and benefits for administering this information collection ($3,818) plus non-federal staff and operational expenses ($232,844). Table 5 below shows the Federal staff, grade level, and time associated with this information collection. We used Office of Personnel Management Salary Table 2017-DEN[4] to determine the hourly rates for federal employees and multiplied the hourly rate by 1.6 to account for benefits.


Table 5. Federal Employee Salaries and Benefits

Position                          | GS Level | Hourly rate | Hourly rate incl. benefits (1.6 x hourly rate) | Estimated time (hours) | Annual cost
Chief, NPS Social Science Program | GS-14    | $59.66      | $95.46                                         | 40                     | $3,818
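The figures in Table 5 follow directly from the stated method (hourly rate multiplied by 1.6 for benefits, then by the estimated hours):

```python
hourly_rate = 59.66               # GS-14 hourly rate, OPM Salary Table 2017-DEN
loaded_rate = hourly_rate * 1.6   # benefits multiplier -> $95.46 (rounded)
annual_cost = loaded_rate * 40    # 40 estimated hours -> about $3,818
print(f"${loaded_rate:.2f}/hour, ${annual_cost:,.0f} per year")
```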


All employee costs are contracted through RSG and the University of Wyoming. The estimates in Table 6 below include the contracting and operational expenses associated with this collection, totaling $232,844. To calculate labor cost, the fully loaded actual wages of the designated University of Wyoming personnel are used. The fully loaded wages include employer-paid benefits and the 17.5% indirect cost rate approved by the University of Wyoming for Cooperative Ecosystem Studies Units (CESU) projects.

Table 6. Non-Federal Employee and Operational Expenses

Non-Federal Employees                   | Cost
Senior Social Scientists                | $23,000
Social Scientists                       | $57,682
Interview Specialists                   | $121,373
IT and Administrative support personnel | $4,416
Subtotal                                | $206,471

Operating Expenses                      | Cost
Travel (including airfare and lodging)  | $1,363
Spanish translation of questionnaires   | $2,800
Cost of sample of telephone numbers     | $13,250
Long distance telephone charges         | $8,000
Software licenses (prorated)            | $1,000
Subtotal                                | $26,413

TOTAL                                   | $232,844


15. Explain the reasons for any program changes or adjustments in hour or cost burden.


This is a reinstatement of a previously approved collection that was used to conduct a national survey. After consulting with the DOI and the OIRA Desk Officer, the NPS determined that the best approach would be to decouple the pretest from the final survey and submit them separately under the same control number.


The adjustment in the annual respondent burden is due to the removal of the 54 previously approved burden hours and 153 responses associated with the pretest effort. We are requesting 1,380 hours to conduct the full version of the final survey, resulting in a net increase of 1,326 hours.

Table 7. Summary of Program Change

                     | Completed Telephone Interviews           | Annual Respondent Burden Hours
Survey               | Previously approved | Requested | Change | Previously approved | Requested | Change
Cognitive Interviews | 30                  | 0         | -30    | 13                  | 0         | -13
Household Survey     | 90                  | 3,500     | +3,410 | 38                  | 1,050     | +1,012
Youth Survey         | 9                   | 165       | +156   | 1                   | 11        | +10
Non-response Survey  | 24                  | 4,778     | +4,754 | 2                   | 319       | +317
TOTAL                | 153                 | 8,443     | +8,290 | 54                  | 1,380     | +1,326


16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


Table 8. Proposed Time Schedule

Task                                                             | Anticipated start date | Anticipated end date
Obtain sample of telephone numbers                               | February 2018          | February 2018
Finalize CATI programming                                        | February 2018          | February 2018
Train interviewers on specifics of survey                        | February 2018          | March 2018
Conduct telephone interviews                                     | March 2018             | June 2018
Export data, clean data file, and prepare data set for weighting | July 2018              | July 2018
Complete post-stratification/calibration (weighting)             | August 2018            | August 2018
Data analysis                                                    | September 2018         | October 2018
Report                                                           | November 2018          | December 2018


17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


The expiration date of OMB approval will be included in the introductory script associated with this collection so that it can be provided to respondents during the explanation of the study.


18. Explain each exception to the topics of the certification statement identified in "Certification for Paperwork Reduction Act Submissions".

There are no exceptions to the certification statement.

[1] http://www.nature.nps.gov/socialscience/docs/CompSurvey2008_2009TechReport.pdf

[2] See: http://www.forpurposelaw.com/appreciating-volunteers/

[3] http://independentsector.org/resource/the-value-of-volunteer-time/

[4] https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2017/DEN.pdf


