Identifying Capacity for Local Community Participation in Wildlife Management Planning: White-tailed Deer in Northeastern National Park Service Units

OMB: 1024-0251


Supporting Statement for a New Collection RE: Identifying Capacity for Local Community Participation in Wildlife Management Planning: White-tailed Deer in Northeastern NPS Units


OMB Control Number 1024-new


A. Justification


1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.


The survey data to be collected will assist NPS managers in fulfilling recent policy directives for public participation by indicating how to tailor participatory processes to best meet the social reality of the management context. NPS policies recognize that because “…parks are integral parts of larger regional environments, the service will work cooperatively with others to anticipate, avoid and resolve potential conflicts…and address mutual interests in the quality of life of community residents” (National Park Service 2006:13). In addition, NPS policies place emphasis on public participation in wildlife management planning, especially by local publics (National Park Service 2003, 2006). Federal agencies also are required to engage stakeholders whenever any action is considered that may significantly impact the environment (National Environmental Policy Act [NEPA] 1969; National Park Service 2001a).


Attachment A contains the relevant sections of the NPS Management Policies, as well as the appropriate section of the National Environmental Policy Act outlining the expectations of the NPS relevant to public participation. In order to accommodate this mandate for public participation, the NPS will solicit opinions regarding white-tailed deer management from homeowners in selected communities around national parks in the Northeast Region.


2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection. [Be specific. If this collection is a form or a questionnaire, every question needs to be justified.]


This study will provide the National Park Service (NPS) and park managers with critical public input regarding deer issues in and around northeastern NPS units. The study will use a mail survey of homeowners in communities near parks to assess: (1) the degree to which experience, individual capacity, and perceptions of institutional capacity affect residents’ intention to participate in deer management planning, (2) the degree of cognitive coorientation between park managers and stakeholders about deer and deer management, and (3) social and demographic attributes of residents with different degrees of intention to participate and/or coorientation to managers. This information will assist park staff in improving communication with the public in formative phases of wildlife management planning (prior to and/or including formal public scoping related to an Environmental Impact Statement).


Justification for each specific survey question follows, organized by topic and question number.


Your experiences with <PARK NAME>, deer and your community. Questions #1 through #7. These questions are necessary to control for respondents’ level of actual experience with the park (Questions #1, 2 and 4) and with deer (Questions #5 and 6). In addition, Questions #3 and #7 establish respondents’ perceptions of the role of the park in their community. These questions are necessary to determine respondents’ level of involvement with the park, which is one component of individual capacity for participation.


Your opinions about deer in the park and community. Questions #8 through #13. Questions #8 through #11 are necessary to assess additional components of individual capacity for participation, specifically problem recognition and level of involvement. Question #8 asks about perceptions of deer behavior, both within and outside the park, as one element of problem recognition. For this survey, problem recognition is recognition by respondents of the problem as defined by managers. NPS managers listed concern about deer behavior as a potential reason to consider management action. They also would like to know the degree to which local community residents perceive deer in the park to be different from deer in the community. Because managers recognize that the same deer move between the park and the local community, Question #8 is designed to determine the degree to which geographic context affects respondents’ perceptions about deer. Questions #9 through #11 assess the level of concern about impacts from deer, as measures of both problem recognition and level of involvement. These questions also differentiate between concerns in the park and in the community. Questions #12 and #13 assess problem definition, level of involvement, and coorientation with NPS managers. Deer management is not a clear-cut technical problem, but instead a wicked problem (Rittel and Webber 1973) with many interrelated elements. These questions are necessary to understand the range of beliefs about deer and impacts of concern to respondents, and how those beliefs align with those of managers.


Your experiences with park management. Questions #14 through #20. Question #14 will be used to control for actual experience with public participation. Question #15 asks about intention to participate, which will be the dependent variable in the regression models. Questions #16 through #18 assess constraint recognition (a component of individual capacity). Constraint recognition is the degree to which a person perceives internal or external limitations to carrying out a type of behavior, which could affect their intention to participate. Questions #19 and #20 assess perceptions of institutional capacity. Question #19 adapts goal statements from NPS Director’s Order 75-A: Civic Engagement and Public Participation (2003) and operational statements from NPS Director’s Order 52A: Communicating the National Park Service Mission (2001b) to assess the degree to which process credibility affects intention to participate. Question #20 adapts elements of Meyer’s Credibility Index (Meyer 1988) to determine the degree to which source believability and community affiliation affect intention to participate.


Background information. Questions #21 through #28. Information from these questions is necessary to assess the social and demographic attributes of residents with different degrees of intention to participate and coorientation to managers. Question #28 is not actually a question but provides respondents the opportunity to submit additional comments.


3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.


This information will be primarily collected via mail-back questionnaire. Telephone interviews will be conducted with a small number of non-respondents to the mail survey. No automated data collection will take place.


4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.


The questions in the survey instrument have been narrowly tailored to address gaps in data about residents of communities near parks and to provide the information needed to estimate perceived impacts associated with deer and natural resource management in and around parks in the northeastern US. These knowledge needs were identified by NPS Natural Resource Management personnel, public participation practitioners who work regularly with federal land management agencies, and through preliminary qualitative inquiry with residents of communities near three of the five parks to be surveyed (see OMB Approval #1024-0224, NPS #05-047). A thorough review of previous survey efforts revealed that this type of data does not exist.


5. If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-I), describe any methods used to minimize burden.


The data collection will not impact small businesses or other small entities.


6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


Should these data not be collected, future participatory processes would be designed without empirical evidence about relevance to potential audience members. This could result in receiving public input that is not representative of the public at large or designing participatory processes that incite controversy rather than identify constructive management solutions.


The sampling schedule and target sample size efficiently collect the data needed for robust estimation and comparison between study sites. Further restriction of the sample size or schedule would risk compromising the significance and reliability of the resulting information.


7. Explain any special circumstances that would cause an information collection to be conducted in a manner:

* requiring respondents to report information to the agency more often than quarterly;

* requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

* requiring respondents to submit more than an original and two copies of any document;

* requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

* in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

* requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

* that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

* requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.


These circumstances are not applicable to our collection of data. Ours is a one-time, mail-back survey, so frequency of reporting, preparation or submission of documents, retention of records, and disclosure of trade secrets do not apply in any way. It is a statistical survey designed to produce valid and reliable results that can be generalized, using only data classifications to be reviewed and approved by OMB. The mailing letters offer no pledge of confidentiality, although, as discussed in section A(10) below, the survey responses are virtually anonymous.


8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice [and in response to the PRA statement associated with the collection over the past three years] and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.


Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years — even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.


A notice was published in the Federal Register on September 18, 2006 (Federal Register Volume 71, Number 180, pages 54686-54687). Attachment B contains a copy of this notice. Attachment C contains a copy of the 30-day Federal Register Notice, published on January 19, 2007.


One request for a draft survey was received from D.J. Schubert, Wildlife Biologist at the Animal Welfare Institute. A number of comments were submitted in response to the draft survey. These comments and responses to them are listed below.


Comment: To adequately assess public opinion, the survey should be broadened to include park users and a representative sample of the public nationwide.

Response: This survey is not meant to be a metric of general public opinion. The survey is intended to assess only local community beliefs about and level of interest in deer and deer issues in and around these parks.


Comment: Those who receive the survey may understand it to be an indication that it is the first step towards lethal management action, and the introductory remarks are inadequate.


Response: This survey is not designed to be a tool for making decisions about different action alternatives. The survey is intended to assess only local community beliefs about and level of interest in deer and deer issues in and around these parks and is not equivalent to public scoping as required by the National Environmental Policy Act (NEPA, 42 U.S.C. § 4321 et seq.). If any of the parks decide to consider formal management action related to deer, including lethal action, a full public scoping process would be undertaken. In response to the above comment, a section to this effect is included in the cover letters that accompany the survey. At this time, only Valley Forge NHP is undertaking a Deer Management Environmental Impact Statement, and this park has begun a separate public scoping process.


Comment: The survey should include more knowledge questions to assess the reasons behind people’s beliefs about both the NPS and deer and questions that assess people’s experience using non-lethal deer management alternatives. Some of the data collected in the survey may be difficult to interpret and may provide misleading results unless additional data is collected and the survey is amended.


Response: In designing the survey, we worked closely with professionals who specialize in survey design and considered tradeoffs between likelihood of response and survey length, clarity of questions, and depth of understanding. We are not attempting to determine the full suite of people’s reasons for holding the beliefs that they do. We recognize that people’s history of experience, knowledge, and values (among others) will play a large role in the way they respond to question items. To fully assess all the reasons behind each response is beyond the scope of any survey. Instead, our goal is to identify the climate for communication with the park; i.e., what are the main concerns of local community members and how are these similar or different from the park. Future dialogue with park staff would be needed to determine the full suite of reasons behind these concerns.


Comment: The format of Question 8 could be confusing; Question 10 should be worded more neutrally and should be presented as two questions for clarity; and Question 11 asks people to make value judgments that may be based on different criteria for different people.


Response: Questions 8, 10 and 11 are similar in format to questions that have been used in previous surveys conducted by Cornell University’s Human Dimensions Research Unit (HDRU) and did not appear to pose problems of clarity. Question 11 is meant to evaluate the degree to which respondents agree with statements related to deer, not the criteria they use to make those judgments. In response to specific comments above, we reworded Question 10 to be more neutral.


Comment: Resolving deer-related concerns in national parks is dictated by law, regulation, and policy, and management cannot deviate from such standards, regardless of public opinion.


Response: Each of the study sites for this survey is a park where formal deer management is not currently in place. Formative research with NPS managers identified local community members as playing a crucial role in the development of issues (like those related to deer) from vague concerns to topics meriting management action (Leong and Decker 2005). This survey is designed to help managers identify salient problem elements and communication needs, should they decide to move forward with deer management. By identifying these needs a priori, this survey will help managers improve the quality of future public participation and civic engagement processes that are mandated by federal policies as a vital part of the decision-making process (National Park Service 2001a, b, 2003, 2006). These policies also recognize that local communities may have different concerns than the general public and that it is important to consider these concerns in addition to national concerns. Results from this survey cannot be used to make recommendations about management actions because (1) the management problem has not yet been formally defined (except in the case of Valley Forge NHP), and (2) no questions will be asked about potential actions.


No other comments were received for this one-time information collection as a result of the Federal Register notice.


Input was sought from a number of stakeholders and others interested in the research project, including interviewees identified in previous preliminary qualitative inquiry with residents of communities near three of the five parks to be surveyed (see OMB Approval #1024-0224, NPS #05-047). Comments from two individuals were received as a result of this request for input.


Comments regarding the sampling frame were received from Gerard Stoddard, President of the Fire Island Association. He observed that there are many long-term renters who would not be reached by a survey focusing on homeowners. He also noted that Fire Island communities are inside, not near, the park. We recognize that there are many stakeholders who are interested in the management of Fire Island National Seashore, from homeowners to long-term renters, short-term renters, campers, boaters, and other day users. We chose to focus on homeowners for this survey because preliminary qualitative inquiry indicated that homeowners had property concerns that were somewhat different from those of renters (see OMB Approval #1024-0224, NPS #05-047). Long-term renters were included in the preliminary qualitative inquiry, and their perspectives helped shape the questions included on the survey instrument. Language describing the study area of interest and a map showing park boundaries were added to the questionnaire to clarify the relationship between Fire Island communities and Fire Island National Seashore boundaries.


Another comment regarding the sampling frame was received from Ronald Martin, President of the Fire Island Pines Property Owners Association. He pointed out that opinions and experiences regarding deer may differ between communities on Fire Island and those on Long Island, and believed that results should be geographically segmented. In response to this comment, geographic information about responses will be collected so that the analysis can be segmented accordingly.


9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


No payments or gifts will be provided to respondents.


10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


No assurance of confidentiality will be provided to respondents, since the Department of the Interior does not have the statutory authority to protect confidentiality or to exempt the survey from a request under the Freedom of Information Act. Instead, those who inquire about this issue will be told that their answers will be used only for statistical purposes. They will also be told that reports prepared from this study will summarize findings across the sample, so that responses will not be associated with any specific individuals. Respondents will be informed further that the researchers will not provide information that identifies respondents to anyone outside the study team, except as required by law.


Names and addresses will be collected from public information via tax rolls. That personal information will only be used to send questionnaires and follow-up correspondence. Addresses of individuals will be destroyed upon completion of data collection and prior to any reports being published; therefore, the survey is virtually anonymous. However, since addresses are a potential link, the researchers will not promise confidentiality.


11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


No questions of a sensitive nature will be asked. In addition, respondents are advised that their answers are voluntary.


12. Provide estimates of the hour burden of the collection of information. The statement should:

* Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

* If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I.

* Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 14.


This information collection requires only one response per respondent. Mail survey instruments will be sent to 1,200 households at each of the five study sites (study total = 6,000). Approximately 1,000 of these are expected to be deliverable per site (study total = 5,000), and 40% of deliverable surveys (400 per site; study total = 2,000) are expected to be completed and returned. The burden estimate is 20 minutes per respondent, based on consultation with a sample of fewer than 10 potential respondents and with survey research specialists at Cornell University. Therefore, based on direct calculation, the annual burden hour estimate for the mail survey is 667 hours.


Follow-up telephone interviews will be conducted with 100 non-respondents per park, for a total of 500 individuals. This information collection also requires respondents to respond only one time. Burden estimate is five minutes per respondent, based on consultation with a sample of fewer than 10 potential respondents and with survey research specialists at Cornell University. Therefore, based on direct calculation, the annual burden hour estimate for the non-response follow-up interviews is 42 hours.


The total annual burden hour estimate for the study (mail survey + follow-up interviews with non-respondents) is 709 hours.
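The arithmetic behind these burden estimates can be verified with a short calculation (a sketch only; all figures are taken directly from the estimates above):

```python
# Burden-hour check for the Item 12 estimates.
SITES = 5                    # study sites (parks)
DELIVERABLE_PER_SITE = 1000  # expected deliverable mail surveys per site
RESPONSE_RATE = 0.40         # expected completion-and-return rate
MAIL_MINUTES = 20            # burden per completed mail survey
PHONE_PER_SITE = 100         # non-respondent phone interviews per site
PHONE_MINUTES = 5            # burden per phone interview

mail_responses = SITES * DELIVERABLE_PER_SITE * RESPONSE_RATE  # 2,000 returns
mail_hours = round(mail_responses * MAIL_MINUTES / 60)         # 667 hours
phone_hours = round(SITES * PHONE_PER_SITE * PHONE_MINUTES / 60)  # 42 hours
total_hours = mail_hours + phone_hours                         # 709 hours

print(mail_responses, mail_hours, phone_hours, total_hours)
```

Running the check reproduces the 667-hour mail-survey estimate, the 42-hour telephone follow-up estimate, and the 709-hour study total.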


13. Provide an estimate of the total annual [non-hour] cost burden to respondents or recordkeepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14).

* The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information [including filing fees paid]. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.

* If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collection services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.

* Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.


The total annual cost burden to respondents resulting from the collection of information is $0.00 (zero dollars).


14. Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies also may aggregate cost estimates from Items 12, 13, and 14 in a single table.


The NPS estimates that the agency contribution to the study will total $50,911. The NPS contribution to the project is support from a cooperative agreement with the Biological Resource Management Division. The costs include: non-labor costs of survey production, postage, and telephone charges; labor costs of staff time to implement the mail survey and non-respondent telephone interviews and to enter the resulting data; and indirect costs.



15. Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-I.


This is a new, one-time collection, accounting for a program change of 709 hours. No adjustments are involved.


16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.


Design of the survey instrument began in August 2005, aided by the qualitative interviews conducted as part of the larger project, as those interviews were being transcribed and analyzed. In the spring of 2006, the survey instrument was reviewed by survey design specialists in the HDRU and pre-tested for readability and burden estimate with fewer than 10 respondents. In the fall of 2006, the instrument was refined based on comments from the 60-day Notice of Intent published in the Federal Register on September 18 and comments from the NPS Social Science Program. Work on the survey component of the larger project now awaits OMB approval to field the survey. The target date to begin survey implementation is February 1, 2007, if approval is granted by then. Assuming approval, the four waves of mailing will be completed by February 28. Non-response follow-up telephone interviews will be conducted during the month of April. Data analysis and preparation of the draft report will continue until November 30. Following feedback on the draft from the sponsoring agencies, the final report will be submitted by December 31, 2007.


The time schedule for the larger project, including the survey component covered here, is summarized below.

Task 1. Qualitative Interviews: Summer (Aug-Sep) - Fall (Oct-Dec) 2005

Task 2. Interview Transcription: Fall (Oct-Dec) 2005 - Spring (Apr-Jun) 2006

Task 3. Interview Analysis: Winter (Jan-Mar) - Fall (Oct-Dec) 2006

Task 4. Survey Design: Winter (Jan-Mar) - Fall (Oct-Dec) 2006

Task 5. Mail Survey Implementation and Analysis: Winter (Jan-Mar) - Fall (Oct-Dec) 2007

Task 6. Telephone Non-Response Follow-up Interviews and Analysis: Spring (Apr-Jun) - Fall (Oct-Dec) 2007

Task 7. Final Report: Due 12/07


17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.


The expiration date for OMB approval of the information collection will be displayed.


18. Explain each exception to the certification statement identified in Item 19, "Certification for Paperwork Reduction Act Submissions," of OMB Form 83-I.


There are no exceptions to the certification statement.


File Type: application/msword
File Title: Supporting Statement for a New Collection RE: Capacity for Local Community Participation in Deer Management Planning
Author: Megan McBride
Last Modified By: Megan McBride
File Modified: 2007-01-25
File Created: 2007-01-25
