SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION

FOR THE “2007 National Park Service Comprehensive Survey of the American Public”











OMB Control Number -









Prepared by




Patricia A. Taylor, Ph.D.


University of Wyoming, the Department of Sociology and

The Wyoming Survey and Analysis Center


James H. Gramann, Ph.D.

Social Science Program

National Park Service






August 4, 2007











Supporting Statement for a New Collection RE: 2007 National Park Service Comprehensive Survey of the American Public


OMB Control Number: 1024-new


A. Justification


1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.


From the Organic Act of 1916 to enabling legislation for specific parks, the National Park Service (NPS) has received a viable Congressional mandate for collecting information to assist in the management of national parks, monuments, and historic sites. Specifically, 16 U.S.C. 1 through 4 (NPS Organic Act of 1916) provides the authority for the Director of the NPS to manage the parks. Part 245 of the Department of the Interior Manual delegates to the Director of the NPS the Secretary of the Interior’s authority to supervise, manage, and operate the National Park System. The National Parks Omnibus Management Act of 1998 (Public Law 105-391, Section 202; 16 U.S.C. 5932) requires that units of the NPS be enhanced by the availability and utilization of a broad program of the highest quality science and information. The NPS Management Policies 2006, Section 8.11.1, further states that the NPS will facilitate social science studies that support the NPS mission by providing an understanding of park visitors, the non-visiting public, gateway communities and regions, and human interactions with park resources.


The 2007 NPS Comprehensive Survey of the American Public will produce high-quality information for National Park System managers and policy-makers to aid in the further development of programs and resources within the park system. This survey continues a data series begun in 2000, when the first comprehensive survey of the American public was conducted for the NPS.


Relevant documents are contained in the attachments to this statement. Attachment A provides a copy of The Organic Act of 1916. Attachment B contains a copy of the General Authorities Act of 1970. Attachment C contains the National Parks Omnibus Management Act of 1998. Attachment D contains section 8.11.1, “Social Science Studies,” of the NPS Management Policies. Attachment E contains the 60-day Federal Register notice published on December 6, 2006. The complete telephone interview script is included in Attachment F. Scripts for developmental work on the Comprehensive Survey, including cognitive interviewing scripts and focus group scripts, are included in Attachment G and Attachment H.


2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection. [Be specific. If this collection is a form or a questionnaire, every question needs to be justified.]


The Comprehensive Survey of the American Public is the second such survey conducted by the NPS. The first was done in 2000. Unlike the first survey, the 2007 survey is envisioned as a baseline data collection for a series of similar surveys repeated every 5 years. Where appropriate, some questions from the 2000 survey are included in the 2007 information collection, but others have been significantly revised, and many new questions have been added.


The Comprehensive Survey of the American Public will be the only national information collection by the NPS that describes visitors and non-visitors to units of the National Park System. Information on non-visitors, including their demographic characteristics and reasons for their non-visitation, is especially important in designing programs to more effectively reach under-served populations.



Comprehensive Survey Cycle

Ideally, the Comprehensive Survey should be conducted every 5 years, a time period long enough to identify important trends in key measures. The 5-year interval is similar to the schedule adopted by the Fish and Wildlife Service for its “National Survey of Hunting, Fishing, and Wildlife-associated Recreation” and the cycle employed by the Forest Service in its “National Visitor Use Monitoring Program.”


General Uses by NPS and Stakeholders

The leadership of the NPS will use the information gathered from the Comprehensive Survey to help determine the perceptions of the American public regarding national parks and future policies for the NPS. Congressional committees referred to the earlier survey during hearings on visitation trends and as background for briefings on under-served populations. Copies of technical reports from the 2000 survey are requested frequently by the media, members of academia, and stakeholder groups who have interests in barriers to visitation, patterns of national park-going, and tourism and recreation behavior in general.


The NPS Comprehensive Survey is especially timely given recent concerns in the NPS, Congress, and communities near parks over declining visitation to the National Park System. The survey will provide strategic information that will assist the NPS in maintaining 21st century relevance by ensuring that parks, programs, and visitors reflect the diversity of America.


Structure of the Comprehensive Survey

The NPS Comprehensive Survey will be conducted as a national telephone survey in 2 waves: 1,750 interviews in October and November 2007 and 1,750 in February and March 2008, resulting in a national sample of 3,500 completed interviews (500 in each of the 7 administrative regions of the NPS). These waves will allow both summer and winter park visitors to be surveyed within a short time of their park experience.

The Comprehensive Survey is divided into two sections: 1) questions asked of all respondents who agree to participate in the survey, and 2) additional questions asked only of “recent visitors” to the National Park System, as determined by screening questions. Automatic branching programmed into the computer-assisted telephone interviewing routine will ensure that the correct questions are asked of the appropriate respondents.
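

To illustrate how such branching works, the following sketch (written in Python, with simplified, hypothetical question identifiers and skip rules that do not reproduce the actual Attachment F script) shows the general form of the logic the CATI software will apply.

# Simplified illustration of CATI-style skip logic; question IDs and rules are hypothetical.
ORDER = ["Q1", "Q2", "Q3", "Q4", "Q5", "Q6", "Q7", "Q8", "Q9", "Q10", "Q16", "Q17"]  # abbreviated ordering

def next_question(current_id, answers):
    """Return the next question to ask, skipping visitor-only items for non-visitors."""
    recent_visitor = answers.get("Q5") == "yes"        # simplified screening result
    position = ORDER.index(current_id)
    nxt = ORDER[position + 1] if position + 1 < len(ORDER) else None
    if not recent_visitor and nxt == "Q6":             # satisfaction item: recent visitors only
        nxt = "Q7"
    if not recent_visitor and nxt == "Q10":            # visitor-only block: skip ahead
        nxt = "Q17"
    return nxt

# Example: a respondent screened as a non-visitor skips Q6.
print(next_question("Q5", {"Q5": "no"}))               # prints Q7

In the actual survey, this logic resides in the survey center's CATI programming rather than in stand-alone code; the sketch is intended only to convey how branching keeps irrelevant questions out of a respondent's path.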

The 2007 survey will replicate as closely as possible the sampling, methodology, and some of the questions from the 2000 survey. This will allow important trend data to be collected, such as the percentage of US households making recent visits to the National Park System. In addition, many new questions have been included at the request of various programs within NPS to provide information on issues of current importance. Replicated questions and new questions are indicated in the telephone interview script in Attachment F.


In addition to the core variables, the new questions include:

  • Leisure travel patterns (Questions #7, #8, #10, #11, #12, #16)

  • These are requested by the NPS Office of Tourism and will provide data valuable in building partnerships between NPS and the tourism industry and in examining dynamics of leisure travel that may contribute to recent downturns in visitation to the National Park System.

  • Use of and satisfaction with interpretive services (Questions #14, #14a)

  • These questions, requested by the NPS Interpretive Development Program, are designed to provide one source of information useful in planning and evaluating interpretive programs in response to the PART evaluation of visitor services.

  • Volunteering and donation knowledge and behavior (Questions #23 [all parts], #24)

  • These questions are sponsored by the NPS Partnership and Visitor Experiences Directorate to provide information useful in assessing current partnering efforts and developing new programs to encourage volunteering and other forms of citizen support and philanthropy.

  • Perceptions of natural and cultural soundscape management (Questions #25, #26, #27)

  • These questions are sponsored by the NPS Air Resources Division and are designed to provide the first national look at attitudes toward natural and cultural soundscape management in the NPS. These questions will be further tested and developed through the focus groups and cognitive interviewing proposed in part B4 of this supporting statement.

  • Attitudes toward current natural resource and recreation management issues (Question #28)

  • This question is sponsored by the NPS Natural Resources Program Center and is a measure of general public attitudes toward natural resource and recreation management issues of current importance to the NPS. This question also will be further tested and developed through the focus groups and cognitive interviewing proposed in part B4.


The rationales for questions are more fully described under “Question Justifications.”


Cell Phone Over-Sample

Currently, about 13% of US households use cell phones exclusively, with projections of up to 25% by fall 2008. The increasing popularity of cell phones calls into question the adequacy of conventional land-line sampling frames from which households are selected through random digit dialing (RDD). Looking to the future, survey methodology will need a mechanism for sampling additional cell users. In this survey, an add-on sample of cell phone users will serve as a benchmark for comparing sampling differences with the RDD results. The cell user sample will be compared to the land-line sample on the demographic characteristics of respondents, park visitation rates, and attitudinal variables. The cell phone “over-sample” will add 500 cases to the planned RDD sample of 3,500 completed interviews. However, the additional cell cases will not be analyzed as part of the national sample. Their purpose is to test for differences in answers to survey questions based on mode of response. This information is needed by NPS to determine whether changes in measures tracked over time represent actual shifts in knowledge, attitudes, or behavior, or are instead artifacts of differences in responses between cell-only households and households with land-lines.


In each wave, 250 cell phone interviews will be completed. The cell phone sample will not be stratified by region, since the cell-only sample will be too small to support within-region analyses.
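

The mode comparison described above can be illustrated with a brief sketch (Python, using made-up counts; the actual analysis will cover demographic and attitudinal items in addition to visitation rates). It applies a standard two-sample test of proportions to the land-line and cell-only visitation rates.

import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sample z-test for a difference in proportions (e.g., park visitation rates)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return p1, p2, (p1 - p2) / se

# Hypothetical counts of recent visitors in each sample (not real data).
p_rdd, p_cell, z = two_proportion_z(x1=1120, n1=3500, x2=175, n2=500)
print(f"RDD rate {p_rdd:.1%}, cell-only rate {p_cell:.1%}, z = {z:.2f}")

A statistically significant difference on comparisons of this kind would suggest that trend estimates based on land-line RDD sampling alone should be interpreted with caution.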


Question Justifications

The rationale for each survey question is described below. Numbers are keyed to the question numbers in the telephone interview script in Attachment F.


QUESTIONS #1 - #5 ARE ASKED OF ALL RESPONDENTS

Screening Questions: #1 through #5. These questions are necessary to define the path through which the remaining questions are presented to the respondent. Each respondent is contacted through a randomly selected telephone number. Questions #1 through #4 solicit responses regarding the individual’s availability to answer questions and give household members information about the research project. Question #1 briefly describes the purpose of the study to potential respondents and solicits cooperation. Question #2 determines whether or not the phone number is for a private household. Questions #1 and #2 also establish whether the phone is a cell phone or a land line; these are necessary screening questions for obtaining the sample of cell phone-only households. Question #3 records the number of adults in the household, which is used to determine which adult to interview. Question #4 will have three variations: most recent birthday, next birthday, and a randomized selection of a household member based on age (oldest, second oldest, third oldest, etc.). These procedures represent the state of the art in randomizing which household member answers the survey. Results of the three variations can be analyzed to check for bias in substantive results and in representativeness of the US population. Because older individuals and women are more likely to answer the telephone, these selection methods provide a means of randomizing which household respondent completes the questionnaire.
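

The within-household selection procedures can be sketched as follows (Python). The helper names and the day-of-year approximation are illustrative only; the actual selection is carried out by the interviewer following the scripted wording of Question #4.

import random

def select_by_age_rank(num_adults, rng=random):
    """Age-rank variation: randomly choose the k-th oldest adult (1 = oldest)."""
    return rng.randint(1, num_adults)

def select_by_birthday(birthdays, today=(10, 15), method="most_recent"):
    """Birthday variations: birthdays and today are (month, day) tuples.
    Returns the index of the adult with the most recent (or next) birthday."""
    def days_ago(md):
        # crude day-of-year distance, adequate for illustration
        return ((today[0] * 31 + today[1]) - (md[0] * 31 + md[1])) % 372
    if method == "most_recent":
        return min(range(len(birthdays)), key=lambda i: days_ago(birthdays[i]))
    return min(range(len(birthdays)), key=lambda i: (372 - days_ago(birthdays[i])) % 372)

# Example: three adults with birthdays in March, July, and December.
print(select_by_age_rank(3))
print(select_by_birthday([(3, 2), (7, 19), (12, 25)], method="most_recent"))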


Questions #5, #5a, #5a1, and #5a2 provide information on whether or not respondents visited an NPS unit within the 2 years prior to the survey. Answers to this question are checked against a master list of the 391 units of the National Park System available to interviewers and are used to identify “recent visitors” to a park unit. (Consistent with the 2000 survey, “recent visitors” are defined as those who visited a unit within the past 2 years that they could correctly identify.) Several questions in the survey dealing with the most recent visit to a park are asked only of this subset of respondents.


Question #5b identifies those who have not visited within the past 2 years, but did visit a unit within the previous 3-5 years. A few questions not asked of “non-visitors” are asked of recent visitors, as well as of those who visited within the past 3-5 years.


Question #5c asks if respondents have ever visited a unit of the National Park System in their lifetime. This identifies “non-recent” visitors.


Question #5d asks those identified as visitors to recall how old they were when they first visited a park unit. People who have early experiences with national parks are likely to have attitudes and behaviors that differ from those who didn’t visit a park until they were adults.


QUESTION #6 IS ASKED ONLY OF “RECENT VISITORS”

Satisfaction with Park Experience: Question #6 replicates a measure of satisfaction with park experiences from the 2000 survey. Responses will allow the NPS to measure trends in visitor satisfaction since 2000.


QUESTIONS #7 - #9 ARE ASKED OF ALL RESPONDENTS

Travel Questions: The plan to visit a national park unit in the next 12 months (Question #7) repeats an item from the 2000 survey, with a slight rewording to conform to a widely used format for asking behavioral-intention questions. Planning to visit in the future is a general indicator of continued, active interest in the National Park System.


Questions #8 and #9 are new questions that capture information regarding the leisure-travel preferences of the American public. Question #8 asks respondents whether they have vacationed overnight away from home during the previous 2 years. This question is extensively employed in household surveys conducted by the travel industry. Question #9 asks respondents whether they like specific types of vacations or vacation destinations, including national parks, other outdoor recreation areas (such as state parks and national forests), and additional types of destinations. This information is sought by the NPS leadership and stakeholders in the travel industry. The purpose of this question is to determine how national parks rate as preferred destinations compared to the growing number of other leisure-travel destinations, such as cruises and casinos. Greater competition and loss of “market share” to these other destinations may be one explanation for recent declines in visitation to the National Park System.


(For those individuals who answered “no” to questions 5a1 or 5a2, the path on the survey will move the interviewer to question #17 after questions #8 and #9 have been answered by all respondents.)




QUESTIONS #10 - #16 ARE ASKED ONLY OF “RECENT VISITORS”

Visits to National Park System Units: Questions #10 through #16 focus on experiences in national park units and are asked only of those respondents identified as “recent visitors.” Question #10 is an open-ended question (also used in 2000) that asks respondents to recall their main reasons for their last visit to a park unit. This information is important to park managers who must make decisions about which services and opportunities to provide for visitors. The most common responses to the 2000 Comprehensive Survey are provided for the interviewers, and other responses are coded as appropriate. Question #11 is a new question that asks park visitors to comment on how much different experiences added to their enjoyment during their most recent visit to a National Park System unit. The answers to this question will aid park managers in determining the importance of different mission-relevant experiences for their visitors. All items in #11 are drawn from the “Recreation Experience Preference” scales developed, tested, and validated by the US Forest Service.


Questions #12 and #13 are new questions in 2007 that ask respondents to describe the type and size of groups they traveled with on their last visit to a park. These questions are important to parks, the travel industry, and gateway communities because different group types (e.g., family, organized, single person) make different demands on services and facilities in and around parks. These questions are asked in a format similar to that used in the NPS Visitor Services Project surveys.


Questions #14 and #15 ask respondents to recall the recreational activities they engaged in during their most recent visit to a national park site. A similar question was asked in 2000, but the format has changed significantly to include the activities of all personal group members and to list activities that have become mission-relevant since the last survey. For example, some of the listed activities reflect the NPS’s concern with providing opportunities for healthful recreation as described by the US Surgeon General. Questions #15 and #15a focus specifically on use and evaluation of interpretive programs and services. Responses to these questions will assist the NPS in assessing the percentage of visitor groups using these services, as well as which types of services contribute significantly to visitor enjoyment.


Question #16 asks respondents to list the forms of transportation they used to reach the park they most recently visited. This question tests the notion that many visits now made to distant parks involve flying to a destination, renting a vehicle, and driving to the park. This is thought to be a departure from the historic pattern (prior to airline deregulation) of relying on a personal vehicle to reach distant parks and could partially explain decreases in visitation to units of the National Park System that are located “along the way” on major cross-country land routes (e.g., Badlands National Park, Carlsbad Caverns National Park).


QUESTIONS #17 AND #18 ARE ASKED OF ALL RESPONDENTS

Understanding Non-visitors: The decline in per-capita visits to the National Park System since the 1980s has prompted widespread discussion in the media, gateway regions, and Congress over why Americans do not visit parks in greater numbers, especially in light of significant increases in both the US population and the number of units in the National Park System. Question #17 replicates a question from the 2000 survey that asks both recent visitors and non-visitors to comment on why they don’t visit national park sites more often. Items #17a through #17l repeat responses used in the 2000 Comprehensive Survey. Items #17m through #17o are added to capture additional reasons that have been the topic of recent discussions. These include children losing interest in national parks, a “been there-done that” mindset among baby boomers, and the distractions provided by electronic media. A final response (#17p) results from focus group work with African Americans conducted for this survey, as well as previous focus groups with Hispanic Americans and Asian Americans conducted by the Army Corps of Engineers.


Question #18 is an open-ended follow-up to #17, allowing respondents to suggest measures that the NPS could take to encourage them to visit units of the National Park System. This question replicates a question from the 2000 survey. Fourteen of the coded response categories come from the most frequently mentioned responses to the first Comprehensive Survey. Four new coded responses reflect issues emerging from the African American focus group and from other stakeholders (i.e., organized hunting groups).


QUESTIONS #19 - #22 ARE ASKED ONLY OF “RECENT VISITORS”

Visitors’ Use of Accommodations: Questions #19 through #21 ask respondents to provide information about overnight stays (if any) during their most recent visit to a park site. Information on the percentage of visiting groups staying overnight in a park or in the surrounding area is of vital interest to park managers, concessioners, and gateway communities. These questions were asked in the 2000 survey, but in a format that combined overnight stay with mode of stay. In the 2007 survey, the questions are separated so that those who did not stay overnight can branch quickly to the next set of questions.


Information Sources for Trip Planning: Question #22 repeats a question from the 2000 survey asking recent visitors about information sources they used to plan their last visit to a park. This information provides important feedback to parks, gateway communities, and the tourism industry on the most effective ways to reach visitors with important trip-planning information.


QUESTIONS #23 and #24 ARE ASKED OF ALL RESPONDENTS

Ways to Help Parks: Questions #23a – #23d are sponsored by the NPS’s Partnership and Visitor Experiences Directorate. These questions ask respondents if they are aware of various ways that people can assist parks in carrying out their missions, including volunteering time or services and donating money. Such assistance is becoming increasingly important to parks, and this question will allow NPS to determine which types of people are aware (or not aware) of these opportunities.


Question #24 asks respondents directly if they have assisted parks in any way. The answer to this question will be analyzed to investigate how assistance varies by geographic area and demographic profiles.


QUESTIONS #25 - #28 ARE ASKED OF RESPONDENTS WHO HAVE VISITED A UNIT OF THE NATIONAL PARK SYSTEM WITHIN THE PREVIOUS 5 YEARS (INCLUDING “RECENT VISITORS”)

Questions #25 - #27 dealing with natural soundscapes, and Question #28 dealing with other park management issues, are new questions requested by the NPS Air Resources Division and the NPS Natural Resource Program Center. Clearance is requested to further develop these questions with focus groups and cognitive interviewing prior to fielding the survey (see Part B4).


Natural and Cultural Soundscapes: The NPS manages the physical properties of natural and cultural soundscapes in parks to protect sounds that are biologically important (e.g., mating calls and warning calls of wildlife) and to provide an historically authentic experience for visitors. Parks also educate visitors about the importance of protecting natural and cultural sounds. However, there has never been a national survey of park visitors that has addressed the significance of natural and cultural soundscapes to park experiences. The 2007 survey will fill this knowledge gap by collecting data on the importance of natural and cultural soundscapes to visitors. Further, these questions can be replicated in subsequent national surveys to help assess the effectiveness of programs informing visitors about soundscape management.


Questions #25 and #27 ask respondents who have visited a park unit within the last 5 years about the importance of hearing natural sounds and cultural sounds (e.g., musket fire, folk songs) as part of their park experience. Question #26 asks respondents whether or not the NPS should manage technologically created modern sounds, such as engine noise and cell phones, in order to protect natural soundscapes in parks.


Natural Resource and Recreation Management Issues: Question #28 assesses general attitudes toward NPS efforts to manage natural resources and recreation in the National Park System. The purpose is to provide a broad indicator of public opinion on these issues. Many of the efforts asked about require cooperation and negotiation with other jurisdictions and authorities, including stakeholders in gateway regions. The information collected in this question will be helpful in these negotiations.


QUESTIONS #D1 - #D9 ARE ASKED OF ALL RESPONDENTS

Demographic and Background Information: Questions #D1 through #D9 will provide information on the representativeness of the telephone sample and on the characteristics of park visitors and non-visitors that can be used to compare responses. Question #D1a is asked of respondents in the cell-phone sample to determine where they live (because this is a cell-phone interview, the area code does not necessarily indicate place of residence). Question #D1b is needed so that the sample of cell phone users can be weighted to compensate for the over-representation of multi-cell households in the sample. Similarly, Question #D1c is needed so that the RDD sample can be weighted to compensate for the over-representation of multi-phone households.

Of particular interest are potential differences between visitors and non-visitors in race and ethnicity (#D4 and #D5) and how these characteristics are associated with perceived barriers to visitation.

Question #D7 on physical disabilities provides information useful in assessing NPS efforts to comply with the Americans with Disabilities Act. Relevant, non-overlapping disability categories drawn from the most recent version of the American Community Survey are employed in this question.


Questions #D2, #D3, #D4, #D5, #D6, #D8, #D9 will also be used in assessing the representativeness of the RDD sample by comparing the characteristics of the respondents and households with census data. Finally, questions #D10 and #D11 will record the respondents’ gender as well as the primary language of the interview (English or Spanish). This information is recorded by the interviewer, but is not asked of the respondents.



3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden [and specifically how this collection meets GPEA requirements].


The collection of information in this study does not involve automated means or electronic submission, but burden for the 4,000 respondents will be reduced through the use of hardware and software for computer-assisted telephone interviewing (CATI). Telephone numbers will be available to the software and dialed automatically for the RDD numbers in the main sample (but dialed manually for numbers in the cell phone sample, as required by law). A live interviewer will be on the line at all times, so respondents will be unaware of the automatic dialing, since they will not encounter a delay after answering. The interview script will be programmed into the software, so that as respondents answer questions, the computer screens will automatically move through the script following the appropriate skip logic. Therefore, respondents will not be burdened with written data entry or with being asked questions irrelevant to their particular circumstances.


As an additional technological means of reducing respondent burden, some subsets of questions may be randomly rotated, thereby reducing the length of the questionnaire for any particular respondent. Use of CATI software means the rotation will be invisible to respondents. The results of pre-testing the instrument, as described below, will guide the decision of whether and which questions to rotate, subject to the constraint that items for which regional-level comparisons are desired (as opposed to national analysis only) will not be rotated, but will instead be asked of all respondents.
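

The rotation scheme can be sketched briefly (Python; the module groupings shown are hypothetical and the final rotation design will depend on pre-test results). Core items needed for regional comparisons are always asked, while each respondent receives a random subset of the rotatable modules.

import random

CORE_ITEMS = ["Q5", "Q6", "Q17", "Q28"]                      # hypothetical: always asked
ROTATABLE_MODULES = [                                        # hypothetical groupings
    ["Q11"],
    ["Q23a", "Q23b", "Q23c", "Q23d"],
    ["Q25", "Q26", "Q27"],
]

def build_instrument(n_modules=2, rng=random):
    """Return one respondent's question list: all core items plus a random subset of modules."""
    chosen = rng.sample(ROTATABLE_MODULES, k=n_modules)
    return CORE_ITEMS + [q for module in chosen for q in module]

print(build_instrument())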


4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.


There is no duplication of information with this study, as the Comprehensive Survey of the American Public is the only national survey that focuses on issues of importance to the NPS. Moreover, it is the only national survey that contacts non-visitors to National Park System units, as well as recent visitors. Non-visitors comprise a population of vital interest to the NPS.


The last NPS Comprehensive Survey was completed in 2000 and will be 8 years old by the completion date of this survey. Therefore, the data from that survey will have outlived their usefulness to park managers and decision-makers.


Other federal recreation surveys, such as the National Survey of Hunting, Fishing, and Wildlife-associated Recreation conducted by the Fish and Wildlife Service and the National Survey of Recreation and the Environment carried out by the Forest Service, provide information on outdoor recreation participation in general, but do not cover the types of issues of central concern to the NPS.


Travel industry surveys tend to use self-selected groups of respondents recruited to serve on Web panels. Such surveys do not represent the American public; at best, they represent only Internet users, who are usually offered significant incentives for participation.


The NPS Visitor Services Project and the Visitor Survey Card are on-site surveys which address how well visitors are served at National Park System units, but neither of these surveys contacts non-visitors.



5. If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-I), describe any methods used to minimize burden.


There is no likely impact on small businesses or other small entities from the collection of these data.



6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


Should the federal government not collect these data, programs, services, and programmatic changes contemplated by the NPS will have to proceed without empirical information about the attitudes and behaviors of the American public toward the National Park System.



7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

* requiring respondents to report information to the agency more often than quarterly;

* requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

* requiring respondents to submit more than an original and two copies of any document;

* requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

* in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

* requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

* that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

* requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.


These circumstances are not applicable to this collection of data. The survey is a telephone survey, so frequency of reporting, preparation or submission of documents, retaining of records, and revealing of trade secrets do not apply in any way. It is a statistical survey designed to produce valid and reliable results that can be generalized, as discussed in Part B. No pledge of confidentiality is made to respondents.



8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency’s notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice [and in response to the PRA statement associated with the collection over the past three years] and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.


Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported. [Please list the names, titles, addresses, and phone numbers of persons contacted.]


Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years – even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.


Attachment E contains a copy of the announcement of this information collection published in the Federal Register on Tuesday, December 5, 2006, page 70785. One e-mail comment was received in response to this notice, from a New Jersey citizen who felt that the University of Wyoming Survey and Analysis Center (WYSAC) was unqualified to conduct the survey. This citizen also felt that the NPS should survey visitors or non-visitors, but not both.


In response to this comment, we note that a team of 4 technical reviewers selected WYSAC from 11 institutions in the network of Cooperative Ecosystem Studies Units that expressed interest in conducting the survey. In addition, WYSAC had successfully completed a national household survey in 2005 on the pricing of the new interagency public lands pass. And, as previously detailed, information on both visitors and non-visitors is of vital strategic interest to the NPS.


In addition to the Federal Register Notice, the sponsoring agencies notified the following stakeholders:


American Hiking Society, Celina Montorfano; [email protected]


America Outdoors; [email protected]


American Recreation Coalition, Derrick Crandall; [email protected]


Destry Jarvis; [email protected] 540-338-6970


Eastern National Parks Association, Chesley Moroz, President/CEO; [email protected]


Evan Hirsche; [email protected]


National Forest Foundation, Bill Possiel; [email protected]


National Parks Conservation Association, Tom Kiernan, President; [email protected]


National Park Foundation, Matt Grandstaff; [email protected]


National Park Hospitality Association, Tod Hull; [email protected]


Outdoor Industry Association, Frank Hugelmeyer, President; [email protected]


Travel Industry Association, Rick Webster; [email protected]


US Army Corps of Engineers, [email protected]


Western States Tourism Policy Council/National Association of Government Communicators/National Association of RV Parks and Campgrounds, Aubrey King; [email protected]


No responses were obtained from the first set of direct contacts, but we contacted all members of the above list by e-mail at least twice more in an attempt to elicit responses from professionals in the recreation and tourism industry. Responses were subsequently received from 4 of the organizations above: American Recreation Coalition (ARC), America Outdoors, National Park Hospitality Association (NPHA), and the National Parks Conservation Association (NPCA).


Two of the respondents (ARC and NPCA) wanted to be reassured that the results of the survey would be communicated to them directly. One respondent suggested a number of questions, which, with one exception, either already were or now are part of the survey. America Outdoors suggested a question on attitudes toward fees to enter parks. This question is quite “layered,” in that there are several different kinds of fees (the annual parks pass, the specific fee for one park, additional access fees for special areas, and passes for the disabled). Moreover, the recent national survey conducted for the Departments of the Interior and Agriculture on the interagency America the Beautiful Pass addressed these issues only one year earlier. Therefore, this question is not included in the 2007 NPS Comprehensive Survey. Finally, the NPHA was concerned with the “creation” of resources, such as soundscapes, and suggested that such questions be removed from the survey. The General Authorities Act of 1970 and the 1978 amendment to that Act known as the Redwoods amendment, as well as the National Parks Omnibus Management Act of 1998, contain the basis of the NPS management policies on natural resources, including soundscapes. Further, the soundscape management policy of the NPS is detailed in Section 4.9 of the NPS “Management Policies 2006,” which states (NPS 2006:56) that “Using appropriate management planning, superintendents will identify what levels and types of unnatural sound constitute acceptable impacts on park natural soundscapes.” This survey will assist in that planning process.


As part of the questionnaire development, a focus group with 9 African American participants was held in Denver on Tuesday, February 20, 2007. Comments from the focus group proved useful in developing response categories for the park visitation questions, particularly those dealing with reasons for non-visitation. In part B4 of this supporting statement, we propose a second focus group, composed of Hispanic citizens, to develop further insights into issues relevant to that important population segment and to evaluate the Spanish-language version of the telephone interview.



9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


No payment or gift will be provided to the telephone survey respondents. Participants in focus groups will receive a cash payment of $50 and participants in the cognitive interviews will receive a cash payment of $35 to compensate them for their time and travel expenses.



10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


No assurance of confidentiality will be provided to respondents, since the Department of the Interior does not have the statutory authority to protect confidentiality or to exempt the survey from a request under the Freedom of Information Act. Instead, those who inquire about this issue will be told that their answers will be used only for statistical purposes. They will also be told that reports prepared from this study will summarize findings across the sample so that responses will not be associated with any specific individuals. Respondents will be informed further that the University of Wyoming will not provide information that identifies respondents to anyone outside the study team, except as required by law.


Indeed, no personally identifying information will be obtained from any of those surveyed. The only potential link between the survey responses and any personal identification is the telephone number, and all phone numbers will be stripped from the data file by the Manager of the Survey Research Center at the University of Wyoming before the data file is made available for analysis to others on the University of Wyoming research team. Therefore, the survey administration might reasonably be characterized as anonymous. However, since the phone number is a potential link, confidentiality will not be promised to respondents in this survey.



11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


No questions are included in the survey that would be commonly considered as sensitive or private information. In addition, all respondents are advised that their answers are voluntary.



12. Provide estimates of the hour burden of the collection of information. The statement should:

* Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

* If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I.

* Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 14.


The focus group of Hispanic Americans will involve approximately 12 participants recruited through a newspaper advertisement. We anticipate that the focus group will last 90 minutes. Participants will incur an estimated 45-minute burden in round-trip travel to and from the focus-group location. Therefore, the total hour burden for the focus group is estimated at 27 hours (12 participants x 2.25 hours). The small-group cognitive interviewing will also involve about 12 participants and is expected to last about 60 minutes. These participants will also incur an estimated 45-minute burden in round-trip travel, for a burden of 21 hours (12 participants x 1.75 hours). The full telephone interview will be pre-tested on no more than 20 respondents. At approximately 15 minutes per interview, this produces an additional burden of 5 hours.


The total hour burden for developmental work and pre-testing is 53 hours (27 + 21 + 5).


In addition, we will complete a maximum of 3,500 land-line telephone interviews, plus 500 interviews in cell-only households. Each interview will typically take 15 minutes to complete. Members of the research team base this time estimate on experience with other telephone surveys, as well as on dry runs through the survey with our own staff.


This implies a maximum of 1,000 hours of telephone interviewing.


The completion rate is estimated at 50% for the land-line survey and 40% for the cell-phone survey. (See section B1 for more detail on completion rates.) Assuming all numbers initially called are eligible, approximately 4,750 initial refusals will not be converted into completed interviews after call-backs. Together, the initial refusal and call-back conversion attempts will take about one minute per household.


This adds another 79 burden hours to the survey.


Therefore, the burden hours for all work sum to 1,132. At a wage rate of $20 per hour, this amounts to a total respondent cost of approximately $22,640, or about $5.60 per participant/respondent.
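

For transparency, the burden arithmetic above can be restated as a short calculation (Python); all inputs are the estimates given in this section.

focus_group_hours = 12 * (90 + 45) / 60        # 12 participants, 90-minute session + 45-minute travel = 27
cognitive_hours   = 12 * (60 + 45) / 60        # 12 participants, 60-minute interview + 45-minute travel = 21
pretest_hours     = 20 * 15 / 60               # 20 pre-test interviews at 15 minutes = 5
interview_hours   = (3500 + 500) * 15 / 60     # 4,000 completed interviews at 15 minutes = 1,000
refusal_hours     = 4750 * 1 / 60              # ~4,750 refusal/call-back contacts at 1 minute each, about 79

total_hours = round(focus_group_hours + cognitive_hours + pretest_hours + interview_hours + refusal_hours)
print(total_hours, total_hours * 20)           # 1132 hours; $22,640 at $20 per hour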



13. Provide an estimate of the total annual [non-hour] cost burden to respondents or recordkeepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14).

* The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information [including filing fees paid]. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.

* If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collection services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.

* Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.


The cost burden on respondents and record-keepers, other than hour burden, is zero.



14. Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies also may aggregate cost estimates from Items 12, 13, and 14 in a single table.


The overall study is divided into two parts. The first part includes the development of the survey instrument, the statement to OMB, consultations with NPS personnel, coding the survey for CATI operations, and pre-testing the CATI survey instrument. The second phase includes the telephone surveying plus cleaning, editing, and analysis of the survey data.


The total cost for the project is $189,312 as itemized below.


ACTIVITY                         Total Staff Time*    Total Labor Costs    All Other Costs**
Telephone Survey and Analysis    33.3 months          $116,691             $72,621

TOTAL: $189,312


* This figure is obtained by multiplying the percentage of the year (or partial year) worked by the number of FTEs for each type of WYSAC personnel employed on the survey.


**Other costs include travel, supplies, G&A, sample purchase, telecommunications, report writing, and CESU indirect costs.



15. Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-I.


There is an increase in burden from this new collection of information.


16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


Data will be post-weighted to adjust for the numbers of telephones and adults in each household (as reported by the respondents), and further weighted to bring the sample of respondents into line with the age, gender, race, and Hispanic-origin proportions within each region, based on Census data. Additionally, the sample will be post-weighted to produce a national dataset in which the proportion of interviews completed in each NPS administrative region reflects that region’s share of eligible households. This dataset will be used to describe results in a national technical report. In addition, 7 regional technical reports will be prepared based on the sets of 500 completed interviews in each administrative region.
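

The weighting steps described above can be outlined in a brief sketch (Python with pandas). The column names are hypothetical, the design-weight formula follows the usual RDD convention of weighting up households with more adults and down households with more phone lines, and the exact specification will be that of the study team.

import pandas as pd

def weight_sample(df, census_props):
    """df: one row per completed interview, with hypothetical columns
    'region', 'n_phone_lines', 'n_adults', and 'demo_cell' (an age/sex/race/ethnicity cell).
    census_props: {(region, demo_cell): Census population proportion}."""
    df = df.copy()
    # Design weight: one adult is interviewed per household, so larger households get more weight;
    # households reachable on more phone lines were more likely to be sampled, so they get less.
    df["design_wt"] = df["n_adults"] / df["n_phone_lines"].clip(lower=1)
    # Post-stratification: scale each region-by-demographic cell to its Census proportion.
    df["cell_share"] = (df.groupby(["region", "demo_cell"])["design_wt"].transform("sum")
                        / df["design_wt"].sum())
    df["target_share"] = df.apply(lambda r: census_props[(r["region"], r["demo_cell"])], axis=1)
    df["final_wt"] = df["design_wt"] * df["target_share"] / df["cell_share"]
    return df

Post-stratification of this kind assumes that, within each weighting cell, respondents are similar to non-respondents on the measures of interest.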


Information will be reported in frequency distribution tables associated with each survey question. For the national technical report, comparisons will be made between the 7 administrative regions and between visitors and non-visitors. For the 7 regional reports, frequency responses of visitors and non-visitors will be presented.


In addition to the national technical report and the 7 regional reports, a technical report comparing results from the 2000 Comprehensive Survey with identical questions from the current survey will be prepared. To maximize comparability, only data from the spring survey wave in 2008 will be compared with 2000 results, since interviewing for the 2000 survey was conducted in the spring only.


A technical report describing the land-line sample and the cell phone sample will compare the two groups in terms of their demographic characteristics and responses to selected questions. These comparisons will be made at the national level only, since the cell phone sample will not be large enough to support analyses down to the level of administrative region.


Finally, several in-depth thematic reports may be commissioned that analyze subsets of questions, such as those dealing with leisure travel patterns, soundscape management, and comparisons between racial and ethnic groups on responses to questions about reasons for visiting or not visiting national parks. The completion of these reports is subject to the availability of funding in FY08 and FY09.


Before publication, all technical reports will be peer-reviewed by subject matter experts from academia, industry, and government.

Project Schedule

First meeting of research team - September 2006


Discussions of survey draft instrument – November 2006 – February 2007


Federal Register publication, 60-day public commenting period – December 2006 – January 2007.


Focus group with African Americans – February 2007


Review of comments from public – March 2007


Revisions of survey instrument – July 2007


Preparation of OMB supporting statement – July 2007


OMB review, Federal Register publication and 30-day public commenting period – August 2007


Small group cognitive interviewing – September 2007


Hispanic focus group – September 2007


Fall survey – October – November 2007


Cleaning of fall data, first reports – January 2008


Spring survey – February – March 2008


Final tabulations – June 2008


Final reports – September 2008



17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


We are not seeking such approval.



18. Explain each exception to the certification statement identified in Item 19, "Certification for Paperwork Reduction Act Submissions," of OMB Form 83-I.


There are no exceptions to the certification statement.


