
Pretest Report of Findings



OMB Control Number 1024-0254



National Park Service


Centennial National Household Survey Pretest


Technical Memo

Draft
Introduction


In 2000, the National Park Service (NPS), in association with Northern Arizona University’s Social Research Laboratory, conducted the first Comprehensive Survey of the American Public (CSAP1) on a representative sample of all U.S. adults, including non-visitors to NPS units. More than 3,500 households nationwide were interviewed by telephone on subjects including the frequency of visits to national park units, reasons for not visiting, the cost of traveling to a park, and attitudes toward various park management policies. In 2008, the NPS, in cooperation with the University of Wyoming’s Wyoming Survey & Analysis Center (WYSAC), conducted the second Comprehensive Survey of the American Public (CSAP2) to address new questions and issues, such as public attitudes and behaviors related to NPS programs and services, that were not covered in the 2000 survey. The information from both surveys was combined to provide input used to develop strategic plans and programs system-wide.


Specific to the NPS Centennial goal of achieving relevance in its second century by connecting with and creating the next generation of visitors, the NPS Centennial National Household Survey (NPS CNHS) will provide the information needed to understand the American public’s relationship with the NPS. Of foremost interest is how the National Park System is perceived by visitors and non-visitors, including youth around the country. This survey will be instrumental in the agency’s quest to develop programs and missions that will make the Park System more relevant for generations to come.


The information collection discussed in this memo was the second step in preparation for the NPS CNHS.


The first step, completed in May 2017, consisted of 30 cognitive interviews conducted to assess the adequacy of the wording of new questions (questions not included in CSAP1 and CSAP2, in particular new questions based on goals set forth in the NPS Director's Call to Action for 2016). The analysis of the findings from the cognitive interviews was used to prepare for the second step of the pretesting process: the pretest of the entire questionnaire. Question wording was further refined, and no questions were eliminated from the survey instrument.


The main purpose of the survey pretest (step two in the pretesting process) was to measure interview duration. The goal is to finalize the questionnaire script for the NPS CNHS so that interviews will average about 20 minutes, in order to keep the burden on the public at a reasonable level and to control cost.


In consultation with the NPS SSB, it was decided to conduct 90 survey pretest interviews.


Method


The script for the survey pretest interviews, approved by the Office of Management and Budget (OMB) and updated based on the analysis of the results of the cognitive interviews, was tested as follows.


  1. The script was programmed for WYSAC’s CATI (computer-assisted telephone interviewing) system.

  2. A probability sample (simple random sample) of telephone numbers representative of the population of all 50 U.S. states was purchased from Marketing Systems Group, a leading national vendor specializing in the generation of scientific samples. The sample included both landline and cell phone numbers in the proportion of 21% landline (n=776) to 79% cell phone (n=2930), for a total sample size of 3706 phone numbers.

  3. Experienced telephone interviewers were instructed on the purpose of the survey pretest interviews. The selected interviewers were trained in three steps: (1) in a lecture format, (2) hands-on, going through the script in the CATI system on their own, and (3) by completing mock interviews with each other.

  4. Interviews were conducted between July 19 and July 31, 2017.

  5. The non-response bias interview script was implemented, as was the Youth Engagement module.

  6. For this pretest we did not use a split sample design, since this would have limited the number of responses to each question and thus reduced our ability to assess the value of each question for the NPS CNHS by analyzing the percentage distribution of responses. All respondents were asked all questions, as intended by the skip logic embedded in the questionnaire script. Also, knowing the average duration of the interview when no split sample design is used provides more flexibility when deciding on the design of the split sample approach, should the need to shorten the interview become evident.

  7. Phone numbers were called up to 15 times if previous attempts did not result in a completed interview, an irate refusal, or an otherwise ineligible number (disconnected, etc.). This is higher than the customary 12-attempt cap and the intended attempt effort for the NPS CNHS. The average number of attempts was 6.2, a level rarely reached in general population surveys. This extensive call attempt protocol will help produce fairly accurate estimates of the required size of the initial sample of phone numbers and will reduce the number of iterations in the sample size estimation process.

  8. A limited number of refusal conversion attempts were made during the pretest. The protocol for the NPS CNHS will include a significant level of refusal conversion effort, with every soft refusal attempted at least once for conversion.
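As a quick check on the sample composition described in step 2, the landline and cell counts imply the stated shares. This is a minimal arithmetic sketch; the two counts are taken directly from this memo:

```python
# Arithmetic check of the pretest sample composition (step 2):
# landline vs. cell phone shares implied by the purchased counts.
landline = 776   # landline numbers in the purchased sample
cell = 2930      # cell phone numbers in the purchased sample
total = landline + cell  # the two components sum to 3706 numbers

print(f"landline share: {landline / total:.0%}")  # 21%
print(f"cell share: {cell / total:.0%}")          # 79%
```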


Summary of Findings and Recommendations


Composition of Final Sample:

  • A total of 92 interviews were successfully conducted with respondents representing five of the seven NPS administrative regions (the Intermountain Region, the Midwest Region, the Northeast Region, the Pacific West Region and the Southeast Region). We were not able to successfully reach respondents from the Alaska Region or the National Capital Region.

  • Two of the 92 completions were a result of respondent conversion from the non-response bias script to the full length survey.

  • In addition, 128 non-response bias interviews were completed.

  • Three interviews were completed with children aged 12 to 17.

  • All age groups are represented in the final sample.

  • Every ethnic/racial group is represented in the final sample.

  • Both males and females were interviewed.

  • Both households with children and households without children were interviewed.

  • Full time employed and part time employed people, full-time students and retirees, as well as unemployed people are represented in the final sample.

  • All income groups are represented in the final sample.

  • Interviews were completed with both non-visitors (defined as those who had not visited in the past two years) and visitors (defined as those who had visited in the past two years). The group of visitors was representative of both infrequent and very frequent visitors.


Accuracy of CATI system programming

Although the purpose of the step-two pretesting was not to identify programming errors (there are ways of identifying such errors that do not impose a burden on the public), being able to assess the accuracy of the programming was a useful side benefit of the pretesting.


Initial analysis of the results did not reveal any programming errors. Further analysis may identify areas where the program would benefit from some edits.


It became evident that the actual number of questions asked of those not agreeing to the full-length survey, but only to the non-response bias script, is not five but six questions of substance, plus the age screener and the informed consent script.


Duration of Interviews:

The average duration of the telephone interviews was 27.5 minutes. This finding dictates the need to find ways to shorten the duration of the interviews to get closer to the intended target length of 19 to 20 minutes, on average.


Recommendations:

We recommend that the introductory screen for the non-response bias script read "a few questions" rather than "five questions," a change that would avoid a false promise.


The pretest determined that, with the current questionnaire, interview duration goes far beyond the intended average length of 19 to 20 minutes.


After careful analysis of the distribution of responses to all questions included in the survey script, and given the intent to go beyond the visitation paradigm and assess the relevance of the NPS in areas beyond the management of the National Parks, the survey research team proposes to rely predominantly on a split sample approach rather than on eliminating questions.


We propose the following split sample design.



The five series of visitation questions (PV6 to PV19, PV20a to PV20J, PV21 to PV30, PV31 to PV46, and PV47 to PV51) include about 60 items, a little over one-third of the total number of items currently included in the script for adults. These series will be asked of roughly 30% of all respondents. This estimate is based on the proportion of respondents who were classified as visitors (visited in the past two years and able to successfully name a park from the official list of National Parks available to the interviewer). Thus, splitting the sample on these series of questions would have a significant effect on the statistical power of the regional-level estimates (the final sample will have only about 150 responses from visitors in any region). We therefore propose NOT to split the sample on these series of questions, with one exception. In our opinion, it is not essential to obtain regional estimates for the PV47 to PV50 series. We propose to apply at least a split sample design for those four questions (half of eligible respondents get PV47 and PV48, the other half get PV49 and PV50). This series (PV47 to PV50) could also be considered for elimination from the questionnaire script.
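The visitor definition used above (visited in the past two years and able to name a park from the official list) can be sketched as a simple classification rule. The function and the stand-in park names below are ours for illustration; they are not part of the actual CATI script:

```python
def is_visitor(visited_past_two_years: bool,
               named_park: str,
               official_parks: set) -> bool:
    """Classify a respondent for the PV series: a visitor both
    visited in the past two years and successfully named a park
    from the official list available to the interviewer."""
    return visited_past_two_years and named_park in official_parks

# Stand-in list; the interviewer works from the official list of National Parks.
parks = {"Yellowstone", "Grand Canyon", "Acadia"}

print(is_visitor(True, "Acadia", parks))        # True  -> PV series
print(is_visitor(True, "Central Park", parks))  # False -> NV series
print(is_visitor(False, "", parks))             # False -> NV series
```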


We propose to apply a split sample design for the following questions.


Question PV5 would otherwise be asked of all respondents. We propose that it instead be asked of a random half. This will result in approximately 1750 responses overall, 250 per region.


The following series of questions would otherwise be asked of all respondents:


PA1 to PA8 (8)

EP1 to EP11 (7)

CP1 to CP7 (7)

RP1 to RP5 (5)

NNL1 to NNL5 (5)


These series include a total of 32 questions. We propose that 16 be asked of a random half of the respondents and 16 of the other half. This will result in approximately 1750 responses per question overall, 250 per region.


The NV1 to NV31 series of questions would be asked of non-visitors (those who had not visited in the past two years, or had visited in the past two years but were not able to successfully name a National Park from the list available to the interviewer).


Non-visitors represent approximately 70% of the respondents. We propose that a random half of the respondents get 15 of the questions in the series, and the other half get the remaining 16 questions. This will result in approximately 1225 responses per question overall, 175 per region.
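The per-question response counts quoted in this section follow from a target of roughly 3,500 completed NPS CNHS interviews, about 500 per region across the seven regions. That target is our inference from the 1,750 overall and 250 per-region figures above, not a number stated in this memo. A back-of-the-envelope check:

```python
total_completes = 3500                       # assumed national target (inferred)
regions = 7                                  # NPS administrative regions
per_region = total_completes // regions      # 500 completes per region
visitor_pct = 30                             # visitor share observed in the pretest

# Series split over random halves of ALL respondents (PV5; the 32-item block):
print(total_completes // 2, per_region // 2)      # 1750 250

# Visitor series (no split): regional base for visitor estimates
print(per_region * visitor_pct // 100)            # 150

# Non-visitor series (NV1 to NV31), split over halves of non-visitors:
nonvisitors = total_completes * (100 - visitor_pct) // 100   # 2450
print(nonvisitors // 2)                                      # 1225
print(per_region * (100 - visitor_pct) // 100 // 2)          # 175
```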




Author: Fred Solop