
Supporting Statement for Programmatic Clearance for NPS-sponsored Public Surveys


OMB Control Number 1024-0224 (renewal)


B. Collections of Information Employing Statistical Methods


1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


The potential respondent universe for studies likely to be conducted under the programmatic approval includes visitors, potential visitors, and residents of communities near parks. All study proposals must include a description of a survey’s particular respondent universe. For qualitative studies employing purposeful sampling methods, the size of the respondent universe is not an important consideration because statistical inferences to the population will not be made.


The NPS requires that each submission under the programmatic approval include an estimate of the expected response rate. This estimate must be based on response rates obtained in previous research similar to the proposed study.


2. Describe the procedures for the collection of information including:

* Statistical methodology for stratification and sample selection,

* Estimation procedure,

* Degree of accuracy needed for the purpose described in the justification,

* Unusual problems requiring specialized sampling procedures, and

* Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


All submissions under this program employing representative sampling methods and reporting statistical data must include a specific description of: 1) the sampling plan and sampling procedure (including stratification and selection methods for individual respondents); 2) how the instrument will be administered to respondents; 3) the planned analysis; and 4) expected confidence intervals. Experimental studies designed to estimate treatment effects must demonstrate that the power of statistical tests is sufficient to detect hypothesized outcomes. Qualitative studies must show that the method of respondent selection is appropriate for the purposes of the research. The details of these specifications will vary depending on the populations being sampled, the reasons for the surveys, and the questions being investigated. In its technical and administrative review, the NPS Social Science Program will work with researchers to ensure that information-collection procedures are appropriate for the intended uses of the data.
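For illustration only, the sketch below (in Python) shows the kind of precision and power calculations a submission might contain: the number of completed questionnaires needed to estimate a proportion within a stated margin of error, and the per-group sample size needed for an experimental study to detect a medium-sized treatment effect. All numeric inputs are hypothetical.

    import math
    from statsmodels.stats.power import tt_ind_solve_power

    def sample_size_proportion(margin, p=0.5, z=1.96):
        """Completed questionnaires needed to estimate a proportion
        within +/- margin at 95% confidence (z = 1.96); p = 0.5 is the
        conservative, largest-sample assumption."""
        return math.ceil(z**2 * p * (1 - p) / margin**2)

    # A +/-5% margin of error at 95% confidence requires 385 completes.
    print(sample_size_proportion(0.05))

    # Per-group n for a two-sample t-test to detect a medium effect
    # (Cohen's d = 0.5) with 80% power at alpha = 0.05 -> about 64.
    print(math.ceil(tt_ind_solve_power(effect_size=0.5, alpha=0.05, power=0.8)))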


In general, periodic (less frequent than annual) data collections are not part of the surveys likely to be submitted under this program. In rare cases, follow-up surveys may be planned, as when participants in a park’s interpretive programs are later contacted at home to determine long-term effects of their participation. In these cases, investigators must show through the script used in the initial survey that respondents have consented to the follow-up survey and have voluntarily provided contact information.


3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


For surveys designed to infer from a sample to a population, the NPS requires that proposals address issues of potential non-response. Surveys must incorporate best practices to maximize initial response rates (e.g., multiple follow-ups or call-backs, minimal hour burden). Further, specific strategies for detecting and analyzing non-response bias are to be included in the submission form accompanying survey instruments. These may involve the use of survey logs, in which observable characteristics of all those initially contacted on-site are recorded, and/or a short interview posing a small number of questions to both respondents and non-respondents. Investigators conducting telephone surveys may use their most experienced interviewers to convert “soft refusals” to completed interviews in order to maximize response rates.
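As one concrete example of such a strategy, the sketch below (in Python; all counts are hypothetical) compares an observable characteristic recorded in a survey log, group size at first contact, between those who completed the questionnaire and those who did not, using a chi-square test of independence.

    from scipy.stats import chi2_contingency

    # Hypothetical survey-log counts: group size at first contact,
    # split by whether the questionnaire was completed and returned.
    #                 alone  small group  large group
    respondents    = [120,   340,          90]
    nonrespondents = [ 60,   130,          70]

    chi2, p, dof, expected = chi2_contingency([respondents, nonrespondents])
    # A small p-value suggests respondents and non-respondents differ
    # on this observable characteristic, flagging possible bias.
    print(f"chi2 = {chi2:.2f}, p = {p:.4f}")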


The NPS Social Science Program requires that the results of non-response bias analyses be included in technical reports, and that the likely effects of this bias (if any) on the interpretation of the data be made clear to managers. In some cases, it may be feasible to balance or post-weight a sample to align important sample statistics (e.g., demographic or zip code characteristics) with known population parameters. However, post-weighting does not eliminate the possibility of non-response bias in attitude, knowledge, or belief variables.
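A minimal post-weighting sketch (in Python; the strata and all counts are hypothetical) illustrates the idea: each stratum's weight is the ratio of its known population share to its share of the completed sample.

    # Known population shares (e.g., from park entry records) and the
    # completed-sample counts for two hypothetical strata.
    population_share = {"local": 0.30, "non_local": 0.70}
    sample_counts    = {"local": 180,  "non_local": 220}

    n = sum(sample_counts.values())
    weights = {stratum: population_share[stratum] / (sample_counts[stratum] / n)
               for stratum in sample_counts}
    # Weighted estimates then down-weight the over-represented stratum.
    print(weights)  # {'local': 0.67, 'non_local': 1.27} (rounded)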


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


Pre-testing and peer review of methods, procedures, and data-collection instruments are strongly recommended by the NPS as a means to reduce respondent burden and maximize the validity of NPS-sponsored surveys. In its review of submissions under the programmatic approval, the Social Science Program considers the need for pre-testing and other developmental work prior to fielding a full survey. In some cases, as when novel questions are employed or a new topic is investigated, NPS recommends that developmental work or pre-testing be conducted, possibly under a separate clearance. However, for many surveys likely to be submitted under this program, the methods and questions have a long history of successful application in the field. In problematic cases, NPS will ask for documentation in the peer-reviewed literature of a question’s previous use. In all cases, NPS strongly encourages pre-testing on nine or fewer respondents prior to proposal submission to verify respondent comprehension, identify sources of measurement error, and refine estimates of hour burden. Ideally, participants in pre-tests should be drawn from the same respondent universe as the full sample. However, if this is not feasible, a similar respondent universe (e.g., visitors to a nearby park comparable to the one where the survey will be conducted) should be used.


Review of NPS Visitor Services Project Questionnaires by Don A. Dillman

At the request of the NPS Social Science Program, Dr. Don A. Dillman, Thomas S. Foley Distinguished Professor of Government and Public Policy at Washington State University, reviewed two Visitor Services Project (VSP) questionnaires. Dr. Dillman serves as Deputy Director for Research and Development of the Social and Economic Sciences Research Center at Washington State University, is a former president of the American Association of Public Opinion Research, and authored Mail and Internet Surveys: The Tailored Design Method. The reviewed questionnaires were employed in visitor surveys in 2007 at Agate Fossil Beds National Monument and Independence National Historical Park.


The VSP conducts 10-15 surveys in parks annually, which is 20-25% of all NPS social science research reviewed by OMB in a typical year. The Social Science Program often suggests to investigators that VSP wording be used for certain common questions (e.g., individual or trip characteristics) when this is consistent with the purpose of the study. Thus, the review by Dillman is pertinent not only to the VSP, but to other social science research in the National Park System.


Dr. Dillman’s review is largely positive. The complete text is appended to the supporting statement as Attachment J.


In his review, Dr. Dillman considered several factors related to the design of the VSP questionnaires. These included:

  • Effect of increased length on response rates;

  • Ease of understanding by respondents;

  • Booklet layout, including white space, navigation through the questionnaire, use of maps, and skip instructions;

  • Question formatting, including level of complexity, ordering of questions, and use of multi-part questions;

  • Instructions for the survey as a whole and for individual questions;

  • Completion time.


Dillman’s comments are summarized below.

  • Length: Dillman notes that VSP response rates have fallen only slightly over an 18-year period, even as the number of questions and pages in the survey booklets has increased. According to Dillman, one reason for this could be the addition of more follow-ups to maintain response rates, even as the general culture of surveying has led to lower rates across the board. The switch to a scanning process for data entry will reduce the number of items in each VSP booklet, because scanning requires more space between individual questions. To the extent that the number of questions determines response rate, this change should have a favorable impact on the VSP’s response rates.

  • Ease of Understanding by Respondents: Dillman notes that, from the standpoint of general clarity and communication, the VSP surveys are well done compared to most other surveys he reviews. He made the following additional comments:

    • Visual design: Although Dillman does not believe that the visual appearance of questionnaires is a major determinant of response rates, he notes that the VSP’s simple and less expensive design is capable of obtaining data of excellent quality. He further observes that respondents are likely to perceive VSP questions as interesting and to believe that other people will benefit from their responses.

    • Booklet layout: Dillman states that VSP’s use of a booklet format which does not require a return envelope is a good decision. He does not suggest any changes to this format.

    • Directions for branching instructions: While Dillman notes that the VSP branching instructions are different from those he uses, he concludes that the VSP’s method may not only conserve space but also be effective. He describes some branching instructions as “particularly creative,” and states that he might consider trying them at some time in the future.

    • White space: Dillman describes the pages of the VSP booklet as “fairly full.” However, he would not put much priority on increasing white space because it would not benefit the small VSP format as much as it would larger-size paper formats.

  • Question formatting:

    • Order of questions: Dillman states that he is not inclined to change the order of the questions in the VSP surveys in any significant way. He notes that the items are in logical order and that there are few attitude and opinion questions in which answers to earlier questions would be changed by answers to later ones.

    • Check-all response format: Based on recent research, Dillman recommends against a “check-all-that-apply” response format in mail surveys, advising the use of a “yes/no” format in its place. It is unclear whether the issues that Dillman describes with check-all responses (i.e., checks biased toward earlier categories and fewer checks overall) apply only to long lists of responses or to short lists as well. The VSP uses both long lists (information sources and facilities used, activities participated in) and short lists (disabilities, race). Switching to the yes/no format for these questions is not a trivial decision, because yes/no questions take more space and response time than check-all-that-apply formats. The VSP proposes to conduct a comparison of the “yes/no” and “check-all” options in 2008 for both long and short lists. The results of this analysis will inform recommendations that the Social Science Program makes to other investigators submitting survey proposals for review. (One possible analysis of such a split-sample comparison is sketched after the completion-time discussion below.)


    • Double-response requests: Dillman cautions against using double-response requests (i.e., responding to the same listing twice) and recommends reviewing the practice for possible negative consequences, such as higher non-response to the second question. The VSP plans to test alternatives to this practice in its 2008 surveys to determine the effects of double-response requests on item non-response.

  • Instructions: Dillman describes the instructions for answering questions as reasonably clear and sees nothing obvious to suggest a dramatic change that would improve response. He does suggest switching from blank lines for response spaces to boxes, which are more conventional. The VSP is adopting this recommendation as part of its switch to optical scanning for data entry.

  • Completion time: OMB recommended timing visitors while they completed questionnaires onsite to determine if the VSP’s estimated burden of 20 minutes per questionnaire was accurate. In his review, Dillman makes the same recommendation. The VSP tested completion times for ten respondents at two parks during the summer of 2007. The parks were Ebey’s Landing National Historical Reserve in Washington state and Independence National Historical Park (NHP) in Pennsylvania. The average completion time for the surveys was 17 minutes, with a range from 12 minutes to 32 minutes:

    • 7/16: 15 minutes

    • 7/19: 12 minutes

    • 7/20: 12 minutes

    • 7/21: 32 minutes

    • 7/22: 14 minutes

    • 7/22: 21 minutes

    • 7/22: 22 minutes

    • 7/27: 12 minutes

    • 7/27: 19 minutes

    • 7/28: 13 minutes


On-site observation suggested that completion times were affected by several factors besides questionnaire length. These included participation by respondents in other tasks while answering the questions and familiarity with the park. The longest completion time (32 minutes) was observed for a respondent at Ebey’s Landing who was eating lunch while filling out the questionnaire. Two of the shortest times (12 and 13 minutes) were recorded for respondents who lived across the street from the Washington Square unit of Independence NHP in Philadelphia and regularly used that location to exercise their dogs. Many of the questions in that park’s survey would not apply to these users and could be skipped. Dillman comments in his review that items that seem complex to a survey designer and questionnaire reviewer who have not gone through a “park” experience may be much simpler for respondents who have. Given the results of the timed tests and Dillman’s observation, the VSP’s estimate of 20 minutes per response seems reasonable.
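Returning to the check-all versus yes/no comparison planned for 2008, one straightforward analysis of such a split-sample experiment is a two-proportion z-test on the endorsement rate for each list item. The sketch below (in Python; all counts are hypothetical) tests whether the yes/no format yields a different endorsement rate than the check-all format for a single item.

    from statsmodels.stats.proportion import proportions_ztest

    # Hypothetical split-sample results for one list item: endorsements
    # under the check-all format vs. "yes" answers under yes/no.
    count = [210, 260]   # endorsements in each format
    nobs  = [500, 500]   # respondents assigned to each format

    stat, p = proportions_ztest(count, nobs)
    # A small p-value indicates the two formats elicit different
    # endorsement rates for this item.
    print(f"z = {stat:.2f}, p = {p:.4f}")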


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


The names and contact information of the responsible NPS liaison and the principal investigator(s) who will collect and analyze the data are included on all submission forms received under the programmatic approval. In addition, the following individuals were consulted on statistical and other design aspects of this program.


Dr. James H. Gramann

Visiting Chief Social Scientist

National Park Service

(979) 845-4920 and (202) 513-7189


Dr. Don A. Dillman

Thomas S. Foley Distinguished Professor of Government and Public Policy

Washington State University

(509) 335-1511


Dr. Lena Le

Assistant Visitor Services Project Coordinator

University of Idaho

(208) 885-2585





