Passback Responses on Known Pool

Programmatic Approval for National Park Service-Sponsored Public Surveys

OMB: 1024-0224

NPS Response to OMB Passback on Pool of Known Questions (NPS Generic ICR)


I. OMB Concern: Controversial policy questions


KNOW14 (awareness of NPS fire policy)

OMB: The question is about a controversial issue.

NPS Response: The question has been removed from the pool, although NPS believes asking visitors if they are aware of a long-established NPS policy is not controversial.


FVIS9, FVIS10 (willingness to tolerate occasional smoke or temporarily blackened stumps for prescribed burns)

OMB: The questions are about a controversial issue, and “willing to tolerate” appears to be leading.

NPS Response: The questions have been removed from the pool.


KNOW13 (awareness of NPS non-native species policy)

OMB: The question is about a controversial issue.

NPS Response: The question has been removed from the pool, although NPS believes asking visitors if they are aware of a long-established NPS policy is not controversial.


OPMGMT6 (support control and removal of non-native species?)

OMB: The question is about a controversial issue.

NPS Response: The question has been removed from the pool.


SOUND5 (hear aircraft on this trip?)

OMB: The question is about a controversial issue.

NPS Response: The question has been removed from the pool.


Crowding (series of questions about crowding and related issues)

OMB: Generally, questions in this section are leading (i.e., they imply that crowding is bad).

NPS Response: As noted in Part A of the Supporting Statement (pp. 13-16), the NPS is mandated by law to establish visitor carrying capacities for each of its units and to manage designated wilderness to provide outstanding opportunities for solitude. These questions are necessary to support this management process.


Crowding is defined in the social science literature as a negative evaluation of social density. Research in outdoor settings over the past four decades has shown this evaluation to be context-dependent. Relatively high social densities may be acceptable to visitors at scenic overlooks in the frontcountry, but the same social density in the backcountry may be unacceptable. Even in backcountry and wilderness areas, social densities may be evaluated differently, depending on whether one is at a trailhead or an overnight campsite.


Visitors often are able to avoid exposure to unacceptably low or high densities by voluntarily “displacing” themselves to times and areas in a park that correspond to their desired experiences. Preserving a diversity of settings characterized by different levels of social density is an important visitor capacity management tool because it offers people the freedom to choose the location or time in a park most enjoyable to them. These questions help maintain this ability by telling managers which areas of a park visitors evaluate as crowded and uncrowded.


In the vast majority of cases, visitor capacities are not set for an entire park, but for specific areas within a park based on legal mandates (e.g., wilderness), physical capacity (e.g., of parking lots), and prescribed visitor experiences (e.g., primitive vs. developed areas). Large parks seek to offer a range of capacities. Only in the case of the smallest units of the National Park System (e.g., a historic house) would a single visitor capacity be set for an entire park. In such cases, the capacity depends on straightforward physical limits, such as the number of visitors who can be accommodated in a room during a guided tour.


All questions in this section provide information to managers that allow them to identify where and when park visitors feel crowded or uncrowded and how visitors cope with unacceptable levels of social density. They are essential to meet statutory responsibilities and to fulfill the NPS mission to provide enjoyable visitor experiences while maintaining resources unimpaired for the enjoyment of future generations.



II. OMB Concern: Would a respondent be able to answer the question meaningfully?


Age3c & Age3d (number of visits by group members in last 12 months and lifetime)

OMB: Suggest deleting since a respondent may not know or recall.

NPS Response: NPS recognizes that it is not always appropriate to ask respondents to estimate their visits to a park over their entire lifetime. However, for some destination parks that represent once-in-a-lifetime visits (e.g., Denali National Park, Grand Canyon National Park), the question is reasonable and can be answered for most group members. Conversely, the question would be unreasonable in the Georgetown District of the C&O Canal National Historical Park, where many visitors walk, run, or bike on the towpath daily or weekly. Usually, the survey includes either the “last 12 months” or the “lifetime” question, but not both, depending on the nature of the park.


KNOW15 (in your and your group members’ opinion, what is the national significance of this park?)

OMB: How can one know your group members’ opinions?

NPS Response: The wording has been changed to “your opinion,” deleting “and your group members’ . . .”


PA3 (importance of site to you and your group)

OMB: How can the respondent know how important the site is to the group?

NPS Response: The wording has been changed to “you,” deleting “and your group.”



III. OMB Concern: Formatting of questions


Income 1 (household income by category)

OMB: Too fine a gradient for low income.

NPS Response: The first two income categories have been combined, eliminating “Less than $14,999” and “$15,000 to $24,999” and replacing these with a single category, “Less than $24,999.”


Income 2 (household income by category, alternate grouping)

OMB: What is the rationale for cuts?

NPS Response: The question has been removed from the pool.


TPLAN5 (how much time planned to spend in park)

OMB: Open-ended question for minutes? Usefulness of this measure?

NPS Response: The existing format has worked well in many VSP surveys. The survey question is used by park managers who are planning the length and frequency of visitor programs or who are developing tour itineraries requiring different lengths of stay. Such data are especially useful for assisting visitors with trip planning. It is common for employees in visitor centers or at entrance stations to be asked, “We only have X time to spend here. What should we do?” By knowing the range of times individuals expect to spend in a park, parks are able to tailor programs to meet different timeframes.


TPLAN7 (from list above the most important reason for visiting the park)

OMB: Wouldn’t a check-box format make more sense instead of writing in?

NPS Response: As space allows, the options will be reproduced for a check-box format. However, it is usually the case, especially on the VSPs, that there is not enough space to reproduce the options from a preceding list, so this open-ended format is employed.


TPLAN8 (safety measures taken when preparing for trip)

OMB: Wouldn’t a check-box format make more sense?

NPS Response: As space allows, the options will be reproduced for a check-box format. However, it is usually the case, especially on the VSPs, that there is not enough space to reproduce the options from a preceding list, so this open-ended format is employed.


TPLAN11b (sources of information used when planning future trips)

OMB: “prior to future visits?” is a confusing phrase.

NPS Response: The wording has been changed to “on a future visit.”


TPLAN17 (who in group made the decision to visit the park?)

OMB: The use of male and female head of household may be heteronormative; please be aware of context.

NPS Response: The wording has been changed to “head of household.”





TPLAN25 (what locations were you unable to spend enough time in?)

OMB: Too broad.

NPS Response: This question will only be asked as a follow-up to ITIN3, which provides a checklist of areas visited. The wording of TPLAN25 has been modified to read, “From the preceding list, what locations were you and your group unable to spend enough time in?”


ACT18 (number of times participated in activities in park over past 12 months)

OMB: Wouldn’t it be best to have a column for never participating in an activity? This may reduce noise.

NPS Response: The question has been reformatted to add a “Never” column.


ACT24 (activities expected to be done, done on this visit, and done on past visits)

OMB: Rearrange a, b, and c such that chronology is intuitive. Also, what if there is no “most important [activity] to your visit to this park”?

NPS Response: The question has been reformatted, switching the order to ask: a) past activities, b) expected/planned activities, and c) activities on this trip. A “None” option has also been added to part d of the question about the “most important” activity.


ITIN1 (amount of time in hours or days spent at each location in park)

OMB: Change format of column b) to “Time spent” and have “Days OR Hours.”

NPS Response: This question has been modified in two ways. First, the instructions have been shortened and simplified to emphasize that respondents should enter the number of hours OR days they spent in each location in the park. Second, the final section of the question asking respondents to list the total amount of time they spent in the park has been removed. If needed, this question can be asked separately (see TRIP11).


TRANS6 (use of transportation modes in the park)

OMB: Change column a to “did not use” to reduce noise.

NPS Response: The wording in column a has been changed to ask “Yes” or “No” for use of each transportation option.


TRIP11 (how long stayed in park on this visit?)

OMB: There is confusing formatting with number of days/hours.

NPS Response: The question has been modified to add “OR” for clarity, i.e., ___ Number of hours, if less than 24 hours OR ___ Number of days, if 24 hours or more.


SOUND3 (how did modern sounds affect ability to hear and enjoy the following sounds?)

OMB: Clarify that the first three columns are associated with experiencing noise such that “no effect” and “did not experience” are differentiated.

NPS Response: The order of the responses has been rearranged and a separating line created within the table to clarify further.





TpB1 (likelihood of different outcomes of doing a specific activity, e.g., taking a shuttle)

OMB: Would NPS ask all these questions in a single question or pick-and-choose depending on the site?

NPS Response: The items are meant to be illustrative and will differ depending upon the planned behavior of interest (e.g., using a park shuttle). As described on pp. 10-11 of Part A of the Supporting Statement, these items measure the respondents’ assessment of the likelihood of different outcomes occurring as a result of participating in a certain behavior. In the Theory of Planned Behavior, the likelihood of salient outcomes is one predictor of participating in the behavior of interest. Research shows that people typically consider from five to nine outcomes simultaneously. Each of these outcomes is measured by multiple indicators (usually two or three items for each outcome) in order to provide a basis for statistical tests of reliability.
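
To illustrate what a reliability check on such multi-item measures can look like, the sketch below computes Cronbach’s alpha, a common internal-consistency statistic, for a small set of hypothetical items. The NPS response does not specify which reliability statistic or software is used, so the statistic, data, and values here are assumptions for illustration only.

# Illustrative only: a common internal-consistency check (Cronbach's alpha)
# for a set of items measuring the same TpB outcome. The item responses
# below are hypothetical, not drawn from any NPS survey.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scale responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses from five visitors to three items measuring one outcome
responses = np.array([
    [6, 7, 6],
    [2, 3, 2],
    [5, 5, 6],
    [7, 6, 7],
    [3, 2, 3],
])
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")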


CRWDATT3 (likelihood of experiencing solitude at different numbers of encounters with others)

OMB: Shouldn’t 5-point scales be used instead of 9-point scales?

NPS Response: The 9-point scales are well-established in the literature relative to crowding and VERP (Visitor Experience and Resource Protection) measures. NPS will attempt to keep scale lengths consistent throughout instruments. Deviations will occur based upon the literature informing a measure or the need for item replication for comparison between studies.


CRWDATT5 (likelihood of experiencing solitude at different time spans without seeing others)

OMB: Shouldn’t 5-point scales be used instead of 9-point scales?

NPS Response: Same as CRWDATT3.


CRWDATT11 (how different experiences affected your sense of being in wilderness)

OMB: All items are written in the same direction; mix?

NPS Response: Three of the six items have been rewritten to reverse direction, e.g., “Number of hiking groups you saw while you were hiking on the trails” has been changed to “The amount of time you were able to hike without seeing other hiking groups.” These items may differ slightly based upon the specifics of other questionnaires, but NPS will continue to mix items unless justification is otherwise provided.


VERP1, VERP7, VERP13 (acceptability of photo and sound simulations)

OMB: Shouldn’t 5-point scales be used instead of 9-point scales?

NPS Response: The 9-point acceptability scale has been used since the first VERP studies in 1993. The number of meta-analyses and comparisons across multiple data sets is increasing in the visitor capacity literature (see Leisure Sciences, 30(2), March-April 2008, a special issue on meta-analysis in outdoor recreation research). Therefore, it is important to maintain comparability in measures, including response scales, in these types of surveys.


EVALSERV9 (evaluation of reservation services in parks)

OMB: The question seems very long, essentially five questions per reservation service. What is the efficacy of this question if the respondent gets tired of it and skips it? Are there any ways to shorten this?

NPS Response: We have dropped “assistance from reservation staff” as one of the evaluative criteria, since this item overlaps with the remaining four. In addition, typically there are only a small number of reservation systems in parks (e.g., one for NPS campgrounds and one for concession lodging), so the number of systems that individuals could potentially use and rate is limited.


EVALSERV11 (evaluations of commercial services in parks on several dimensions)

OMB: The question is too long; see the comment on EVALSERV9.

NPS Response: One of the four response columns for each commercial service has been deleted (value for money paid). If this question is needed, it may be asked as a separate item.


EVALSERV12, 13, 14 (open-ended explanations for evaluations of commercial services given in EVALSERV11)

OMB: The questions seem very broad and time consuming. Essentially, they take a long multiple choice and turn it into an even longer short answer.

NPS Response: For EVALSERV12 and EVALSERV13, the second part of these two open-ended questions, “What was the problem?”, has been deleted. The nature of any problems experienced will likely be evident in responses to the first part of these questions, which asks for an explanation of “worse than expected” and “very poor” ratings. It is important to maintain these types of questions to allow people to comment systematically on services and to help managers by providing specific information that can be used to improve visitor services. In the absence of such questions, respondents often write unsolicited comments in the margins or attach a letter to the returned questionnaire, which is more difficult to process during data entry.


EVALSERV19 (indicate how safe you felt from crime, accidents, and natural hazards during this visit)

OMB: Please include a “neither safe/unsafe” or at least a “don’t know” column.

NPS Response: A “Neither Safe nor Unsafe” category has been added.


EVALSERV21 (rate importance, quality, and past use of services/facilities)

OMB: The question is too long, see comments on EVALSERV9.

NPS Response: The question has been shortened by dropping the “used on previous visit” column.


EVALTRAN6 (check improvements that would increase likelihood of using shuttle in the future)

OMB: Include phrase “check all that apply” in the introduction.

NPS Response: This phrase has been added.


TRANS2 (have you used transit in any National Park?)

OMB: Clarify meaning of “transit” and time reference (ever?). Reliability established?

NPS Response: The question has been removed from the pool.


LEARN1 (how much understanding of topic improved as a result of interpretive program)

OMB: “Improved” suggests deficiency. “Increased” more neutral?

NPS Response: The wording has been changed to “increased.”



TRIPC26 (group expenditures in different categories)

OMB: Consideration of check-box format? Evaluation of past responses? What type of heaping/rounding occurs? How addressed/granularity of results needed?

NPS Response: This VSP expenditure module is very popular with parks and provides expenditure profiles for different types of visitor segments (day-users, campers, overnight lodgers, etc.). These profiles serve as input into the NPS Money Generation Model-version 2 (MGM2). The format has worked well for a number of years.


MGM2 is a conservative, peer-reviewed, IMPLAN-based economic impact tool authored by Dr. Daniel Stynes of Michigan State University. (Dr. Stynes does similar work for the Forest Service and the Army Corps of Engineers). Data elicited by this question are transmitted to Dr. Stynes, who produces an individual report on the economic impact of visitor spending in the gateway region around the park.


Non-response to the expenditure question is usually less than 10%. Item non-response within the question is handled conservatively by treating lines left blank as zeroes, rather than as missing values. This decreases overall spending averages by about 7% compared to treating blanks as missing. Outliers are evaluated and either omitted or adjusted. Spending averages with and without outliers are reported. When available, gross receipts from concession services in corresponding categories (e.g., lodging) provide an approximate check on spending estimates based on survey data. These comparisons have been quite close. In the most recent report for Yellowstone National Park, the survey-based estimate of in-park spending was about 5% less than reported concession receipts, well within the 95% confidence interval of the visitor spending estimate.
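
For illustration only, the following minimal sketch shows the two treatments of blank expenditure lines described above (blanks as zero versus blanks as missing). The dollar figures are invented for demonstration and are not VSP data.

# A minimal sketch of the two ways of handling blank expenditure lines
# described above: treating blanks as zero spending versus dropping them
# as missing. The figures are made up for illustration.
import numpy as np

reported = np.array([120.0, np.nan, 45.0, np.nan, 300.0, 80.0])  # np.nan = blank line

as_zero = np.nan_to_num(reported, nan=0.0).mean()   # conservative: blanks count as $0
as_missing = np.nanmean(reported)                    # blanks excluded from the average

print(f"Average treating blanks as zero:    ${as_zero:.2f}")
print(f"Average treating blanks as missing: ${as_missing:.2f}")
# Treating blanks as zero always yields the lower (more conservative) average.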


The major use of the MGM2 estimates by parks is to demonstrate to partners in gateway regions the economic activity attributable to spending by park visitors. Given this use, the NPS considers the level of accuracy in the estimates to be acceptable. For significant policy or planning applications (e.g., estimating the economic impact of alternatives in an EIS), NPS recommends that other approaches be used (e.g., trip diaries, sales tax data).


A copy of the MGM2 report for Yellowstone is attached.



IV. OMB Concern: Redundant questions


Income 3 (open-ended household income question)

OMB: The question is redundant with the income-category question.

NPS Response: The open-ended format is used in the American Community Survey. It is included in the pool for use in those cases where space constraints make the check-box format impractical.





FVIS3, FVIS4, FVIS5, FVIS6 (what services, facilities, opportunities would you prefer on a future visit?)

OMB: Are these duplicative of trip planning questions?

NPS Response: The questions are not duplicative. Asking individuals what activities or programs they may be interested in during a future visit allows the NPS to explore new programs or interpretive services and to determine where visitor interest lies relative to possibilities the park may offer. These future-visit questions are unique because they allow individuals to report interests that may not have been available at the park during their current or past visits.


TRIPC6 (where coming from and going to on today’s visit?)

OMB: Questions on entrances are repeated in TRIPC7, TRIPC8, and TRIPC17.

NPS Response: While there are similarities between these questions, they will not be asked in the same survey. There are multiple ways of asking these questions based upon the purpose of the survey. TRIPC6 is a good way of getting all the stay and entrance/exit information in one question, whereas TRIPC7, TRIPC8, and TRIPC17 can be used when parks are not interested in all the information but are specifically interested in one aspect, such as where people entered the park or where they stayed the night before. Additionally, some questions (e.g., TRIPC8) are better suited for on-site interviews, while others work better for post-trip surveys.


TRUST2 (how much do you trust the NPS at the national level and at the specific park level)

OMB: Asking about trust of the NPS at the site level is the same as question TRUST1. Also, should you ask “trust the National Park Service to _____”?

NPS Response: The area-specific management items in TRUST1 are designed to predict overall trust in the National Park Service (as measured in TRUST2) at both the national level and the park level. Thus, TRUST1 items are independent variables and TRUST2 items are dependent variables. This type of analysis helps park managers understand the specific park issues that have the most influence on the overall trust evaluations measured in TRUST2.


V. OMB Concern: Practical utility of the questions


ITIN2 (which one site was the most important to your visit to the park?)

OMB: What does “most important to your visit to [NPS site]” mean? Could a more specific word be used to convey the intention more clearly?

NPS Response: The question has been modified to read, “Which one site was the most important reason for your visit to the park?”


LEARN3 (what was the most important information that you learned about the park?)

OMB: What does “most important information” mean?

NPS Response: This question is intended to elicit the respondent’s perception of what he/she considers to be significant new information learned during the visit. The question has been modified to read, “What was the most important new information that you learned about the park?”


KNOW1-KNOW12 (prior to this visit, were you and your group aware the site was managed by the NPS, aware of other nearby NPS units, aware of the difference between a national park and national forest, etc.?)

OMB: Evaluation of responses to these types of questions? Does anyone admit to not knowing these items? (Social desirability)

NPS Response: Experience shows that in some cases visitors’ awareness of park management is high, and in some cases it is low. (This question is not asked in parks, such as Yellowstone or Yosemite, where knowledge of the managing agency can be presumed.) The following are results from VSP surveys over the past four years showing the percent of respondents aware that the park was a unit of (or managed by) the National Park Service:


Dayton Aviation National Historical Park: 36%

Apostle Islands National Lakeshore: 78%

Effigy Mounds National Monument: 57%

Manzanar National Historic Site: 45%

John Day Fossil Beds National Monument: 50%

Congaree National Park: 58%

San Francisco Maritime NHP: 36%

Lincoln Home National Historic Site: 42%

Chickasaw National Recreation Area: 71%

Timpanogos Cave National Monument: 50%

Cuyahoga Valley National Park: 76%

John Fitzgerald Kennedy NHS: 35%

Devils Postpile National Monument: 73%

Monocacy National Battlefield: 69%

Katmai National Park and Preserve: 75%

Big Cypress National Preserve: 61%

Mount Rushmore National Memorial: 84%


Questions about other types of awareness, although asked less frequently, also show a range of responses, as illustrated below (percent answering yes):


Aware Mount Wanda is part of John Muir NHS? 37%

Aware of difference between national forest and park?

at Devils Postpile: 54%

at Timpanogos Cave: 61%

Aware Keweenaw NHP existed before visiting? 52%

Aware Dayton, OH has an NPS site? 50%

Aware of national wildlife refuge near Effigy Mounds? 26%


KNOW3 (aware visiting area managed by NPS prior to visit?)

OMB: What information is NPS eliciting?

NPS Response: It is extremely useful for a park manager to know the proportion of visitors who are aware of NPS management and of the different agencies managing nearby or adjacent sites. Visitors often confuse who manages a park and therefore have more difficulty locating accurate information about the site (online, verbal, or printed). In addition, confusion over the managing authority can result in more misunderstandings of the rules and regulations at the site. Thus, questions asking about individuals’ knowledge of management agencies are often coupled with questions concerning familiarity with NPS regulations and policies.


KNOW6 (prior to visit, aware of nearby NPS sites?)

OMB: What information is NPS eliciting? The question seems really awkward.

NPS Response: See response to KNOW3 and KNOW1-KNOW12.


KNOW10 (prior to visit, aware that X% of park was designated wilderness?)

OMB: What information is NPS eliciting?

NPS Response: Designated wilderness is managed differently from frontcountry areas of a park, and often differently from other backcountry areas as well. Visitors may be unaware of these differences. For this reason, it is important for the NPS to know whether visitors are aware that part of a park may be designated wilderness and that the regulations governing behavior there may differ.


KNOW11 (prior to visit, aware of listed values of wilderness?)

OMB: What information is NPS eliciting?

NPS Response: While KNOW10 addresses individuals’ overall awareness of the existence of wilderness areas, KNOW11 explores respondents’ knowledge of the specific characteristics of a wilderness area, as well as their learning during the trip and their future interest in learning more. Individuals’ prior knowledge, current learning, and future interest are critical to NPS’s management of designated wilderness areas, as they provide information to inform interpretive programming and media.


KNOW12 (prior to visit, aware that different organizations administer nearby or adjacent sites?)

OMB: What information is NPS eliciting?

NPS Response: See response to KNOW3.


KNOW13 (aware of NPS non-native species policy prior to visit?)

OMB: What information is NPS eliciting?

NPS Response: The question has been removed from the pool because OMB considers it to be about a controversial issue.


KNOW14 (awareness of NPS fire policy prior to visit?)

OMB: What information is NPS eliciting?

NPS Response: The question has been removed from the pool because OMB considers it to be about a controversial issue.


PART1 (are you or any members of your group a member of the [specific name of friends group]?)

OMB: What is the purpose of this question?

NPS Response: Many parks receive assistance from voluntary associations generically referred to as “friends groups.” The question determines what percentage of the sampled population belongs to a park’s friends group. It also determines if visitors who do not belong would be interested in joining such a group, and, if not, the reasons why. The purpose is to provide information to park managers and leaders of friends groups that can assist them in sustaining or growing their membership, which benefits the parks associated with these groups.


ACT29 and ACT30 (of park programs and media, which one was most meaningful to you? Why?)

OMB: How to define “meaningful?”

NPS Response: One of the objectives of NPS interpretive programming is to provide “meaningful” experiences to visitors. The NPS is interested in learning what visitors consider to be meaningful experiences. The purpose of these two questions is to solicit this information from respondents.


PA1 (scale items measuring various dimensions of place attachment)

OMB: How would the responses be reasonably interpreted?

NPS Response: The construct of place attachment measures two dimensions: place identity and place dependence. The combination of these dimensions comprises and defines the concept of place attachment, with place identity referring to a specific place being central to an individual’s life, and place dependence being the individual’s reluctance to substitute another site for participation in his or her chosen recreation activities (Moore & Graefe 1994). The concept of substitution is pivotal to place attachment in recreation settings because recreation sites may be recognized as unique settings beyond their ability to facilitate recreation experiences for specific activities.


In the past, the idea that recreation settings were interchangeable and reproducible, together with the assumption that recreation was activity-driven, led to the conclusion that sites with similar attributes affording individuals opportunities to participate in specific recreation activities all had nearly the same value. This view has been replaced by one that attempts to recognize the emotional, symbolic, and spiritual value of resources in planning. Thus, place attachment has become an important instrument in realizing the unique benefits of parks.


Because place attachment is a complex construct, multiple indicators are required to ensure internal reliability and validity. Many recent publications outline well-tested measures, including format, layout, and number of items (Kyle, Absher, & Graefe, 2003; Kyle et al., 2003; Williams & Vaske, 2003).


When this construct is included in a study, factor analysis is used to determine how the items load on the two dimensions. From this, managers can begin to understand how visitors perceive their specific sites. It is important that managers recognize the emotional ties visitors have to a site because individuals who are highly attached to a place normally have an increased level of concern regarding how a place is being used and managed (Williams et al. 1992). In addition, it is helpful for managers to know if individuals are not identifying with a specific site, so that priority can be placed on educating and connecting visitors to the unique resources and attributes of a specific place.
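
As an illustration of the factor-analysis step described above, the sketch below fits a two-factor model to simulated responses. The item wordings, sample size, and software (scikit-learn) are assumptions for demonstration only and do not describe the actual NPS analysis.

# Illustrative only: fitting a two-factor model to simulated place-attachment
# items to see how items load on place identity and place dependence.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n = 200  # simulated respondents

# Two latent dimensions: place identity and place dependence
identity = rng.normal(size=n)
dependence = rng.normal(size=n)

# Six hypothetical scale items, three built from each dimension (plus noise)
items = np.column_stack([
    identity + rng.normal(scale=0.5, size=n),    # hypothetical identity item 1
    identity + rng.normal(scale=0.5, size=n),    # hypothetical identity item 2
    identity + rng.normal(scale=0.5, size=n),    # hypothetical identity item 3
    dependence + rng.normal(scale=0.5, size=n),  # hypothetical dependence item 1
    dependence + rng.normal(scale=0.5, size=n),  # hypothetical dependence item 2
    dependence + rng.normal(scale=0.5, size=n),  # hypothetical dependence item 3
])

fa = FactorAnalysis(n_components=2).fit(items)
print(np.round(fa.components_, 2))  # rows = factors, columns = item loadings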


Moore, R.L. & A.R. Graefe. (1994). Attachments in recreation settings: The case of rail-trail users. Leisure Sciences, 16(1), 17-31.


Williams, D.R., M.E. Patterson, J.W. Roggenbuck, & A.E. Watson. (1992). Beyond the commodity metaphor: Examining emotional and symbolic attachment to place. Leisure Sciences, 14, 29-46.

The other references cited above are included in the Supporting Statement reference list.


SOUND10 (listening exercise measuring visitors’ responses to sounds in park environments)

OMB: (Specific query not provided to NPS)

NPS Response: These questions were tested in Grand Teton and Yosemite national parks in 2006, with OMB clearance. They expand the focus of social science research on soundscapes from understanding the detectability and acceptability of human-caused sounds in parks to visitors’ responses to natural sounds. Although the NPS has been measuring natural ambient and human-caused sound levels in parks for over 20 years, far fewer studies have examined visitors’ responses and evaluations of natural sounds. Anecdotally, it is known that some visitors seek out natural soundscapes as an intrinsic part of their experience, while others accustomed to urban acoustic environments are uncomfortable in the absence of familiar human-caused sounds. The information from these questions enhances park planning and soundscape management efforts by assessing visitor responses to both natural and human-caused sound and identifying potential indicators and standards of quality for soundscape conditions in parks.


TpB3 (how much do you think each of the following people would think that you should ride the [shuttle bus] in the park?)

OMB: What information is NPS eliciting?

NPS Response: The question is illustrative and may be applied to behaviors other than riding a park shuttle. The question is only asked when the Theory of Planned Behavior (TpB) is applied to a study. As described in Part A of the Supporting Statement (pp. 10-11), the TpB states that individuals’ behavior is influenced by: 1) their attitudes toward the behavior; 2) their subjective norms regarding the action; and 3) their perceived control over engaging in the behavior. Each of these components is composed of other factors. Specifically, attitudes toward a behavior are influenced by beliefs about the outcomes of engaging in that behavior, weighted by evaluations of those outcomes as positive or negative. Subjective norms are based on an individual’s “normative beliefs,” which consist of beliefs about what people who are important to the person think should be done in a particular situation, weighted by the individual’s motivation to comply with these people. Finally, perceived control is a function of “control belief strength” and “control belief power” (Ajzen, 2005). TpB3 measures a key part of the subjective norm component by providing information about the respondent’s perception of what people important to him or her think the respondent should do in a particular situation. Combining the subjective norm component with the other theoretical components allows managers to better predict and understand why visitors do or do not engage in specific behaviors.
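
As a purely illustrative sketch of the subjective-norm structure described above (normative beliefs weighted by motivation to comply, following Ajzen), the example below uses invented referents and values; it is not an NPS scoring procedure.

# Illustrative only: subjective norm as the sum of normative beliefs weighted
# by motivation to comply with each referent. All values are hypothetical.
referents = ["family members", "close friends", "other visitors"]

# e.g., -3..+3: how strongly each referent is believed to favor riding the shuttle
normative_beliefs = [2, 1, -1]

# e.g., 1..7: how motivated the respondent is to comply with each referent
motivation_to_comply = [6, 4, 2]

subjective_norm = sum(b * m for b, m in zip(normative_beliefs, motivation_to_comply))
print(f"Subjective norm score: {subjective_norm}")  # higher = stronger normative pressure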




