
Supporting Statement B


National Park Service Centennial National Household Survey


OMB Control Number 1024-0254



Terms of Clearance: None


Collections of Information Employing Statistical Methods


1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


This is a one-time collection that will consist of three distinct telephone surveys (Household, Youth Engagement, and Non-response Bias). The Youth Engagement and the Non-response Bias surveys will be conducted within the context of the Household survey.


Household Survey:

Respondent universe: All English- or Spanish-speaking adults, aged 18 and over, living in the 50 states and the District of Columbia.

Sampling method: A probability sample of phone numbers will be drawn from the totality of all phone numbers believed to belong to U.S. residents. The sample will be disproportionately stratified by the seven NPS administrative regions to secure approximately equal numbers of completed surveys from each region. A dual sampling frame will be used, including both landline and cellphone numbers in proportions that reflect the prevalence of cellphone-only and cellphone-mostly households in the respective strata, based on the most recent estimates available at the time of the sample draw. In no case will the proportion of cellphone numbers in a sample stratum be less than 60%. For the landline sample, there will be in-house random selection of the respondent within the household.

Response rates and sample sizes: The response rate achieved in the 2008-2009 NPS survey (CSAP2), using the RR3 formula defined by the American Association for Public Opinion Research, was 12.5% overall: 15.4% for the landline sample and 5.7% for the cell phone sample. As a result of the general decline in survey response rates, we expect a lower overall response rate for this data collection (CSAP3) than the one achieved ten years ago.

The decline in overall response rates was paralleled by a convergence of landline and cellphone response rates: the absolute decline in participation for landline samples was accompanied by a relative increase in participation for cellular samples. As a result, the two sampling frames now yield very similar response rates. For this reason, we do not propose using separate raw response rates to estimate the sizes of the initial landline and cellphone samples needed to reach the desired number of completions.

Based on the pretest and on WYSAC’s most recent experience with national surveys1, we expect an overall raw response rate of 2.5%. This is the estimate used to determine the size of the initial sample of telephone numbers needed to reach the desired number of completed surveys (500 per region; 3,500 total), which we estimate will require 140,000 telephone numbers in total. This initial sample will be disproportionately stratified by the seven NPS administrative regions and will include about 20,000 phone numbers per region believed to belong to households in that particular region (Table 1).


Table 1. Respondent Universe, Sample Sizes, and Estimated Number of Completed Interviews

Stratum (NPS administrative region) | Respondent universe | Respondent universe size (estimated) | Initial sample size | Estimated number of completed interviews
National Capital Region | Adults speaking English or Spanish | 543,588 | 20,000 | 500
Northeast Region | Same as above | 57,468,582 | 20,000 | 500
Southeast Region | Same as above | 52,844,391 | 20,000 | 500
Midwest Region | Same as above | 59,412,954 | 20,000 | 500
Intermountain Region | Same as above | 31,742,985 | 20,000 | 500
Pacific West Region | Same as above | 42,710,749 | 20,000 | 500
Alaska Region | Same as above | 550,189 | 20,000 | 500
Total | | 245,273,438 | 140,000 | 3,500
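
As a rough check of the sample-size arithmetic above, the short Python sketch below reproduces the figures used in Table 1 from the expected 2.5% raw response rate and the target of 500 completions per region; it is illustrative only, with all figures taken from the text.

# Rough check of the initial sample-size arithmetic described above.
# Figures come from the text: 500 completions per region, 7 regions,
# and an expected overall raw response rate of 2.5%.

target_per_region = 500        # desired completed surveys per NPS region
n_regions = 7
raw_response_rate = 0.025      # expected overall raw response rate (2.5%)

numbers_per_region = target_per_region / raw_response_rate    # 20,000
total_numbers = numbers_per_region * n_regions                 # 140,000

print(f"Phone numbers needed per region: {numbers_per_region:,.0f}")
print(f"Total initial sample of phone numbers: {total_numbers:,.0f}")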



Youth Engagement Survey:

Respondent universe: Individuals aged 12 to 17 who live in the same household as the adult completing the household survey and who receive permission to participate.

Sampling method: There will be no further sampling within the household once eligibility is established.

Sample size and effective response rate: Based on the results of the pretest, we estimate that the initial sample will consist of about 550 eligible households, with an expected response rate of about 30% (Table 2).


Table 2. Respondent Universe, Sample Size, Response Rate and Completed Interviews

Respondent Universe: Expected Number of Households with Children Aged 12 to 17 | Expected Effective Response Rate | Total Number of Completed Youth Engagement Surveys
550 | 30% | 165


Non-response Bias Survey

Respondent universe: All households in the sample that do not agree to complete the full-length survey.

Sampling method: All non-respondents will be asked the non-response bias questions.

Sample size and raw response rate: Based on the results of the pretest, we estimate that the initial sample for the non-response bias survey will consist of about 136,500 phone numbers (assuming 140,000 phone numbers in the overall survey sample and a 2.5% overall raw response rate to the household survey) and that the raw response rate for the non-response bias survey will be 3.5% (Table 3).


Table 3. Sample Size, Raw Response Rate, and Completed Interviews for the Non-response Bias Survey

Initial Sample Size | Expected Raw Response Rate | Estimated Number of Completed Non-response Bias Surveys
136,500 | 3.5% | 4,778
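
For reference, the figures in Table 3 follow from the assumptions stated above (140,000 numbers in the overall sample, a 2.5% raw response rate to the household survey, and a 3.5% raw response rate to the non-response bias questions); the Python sketch below is an illustrative arithmetic check only.

# Rough check of the figures in Table 3, using the assumptions stated in
# the text: 140,000 numbers in the overall sample, a 2.5% raw response
# rate to the household survey, and a 3.5% raw response rate to the
# non-response bias questions.

total_numbers = 140_000
household_raw_rate = 0.025
bias_raw_rate = 0.035

household_completions = total_numbers * household_raw_rate   # 3,500
bias_sample = total_numbers - household_completions          # 136,500
bias_completions = bias_sample * bias_raw_rate                # 4,777.5, rounds to 4,778

print(f"Non-response bias sample size: {bias_sample:,.0f}")
print(f"Expected completed non-response bias surveys: {bias_completions:,.0f}")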


2. Describe the procedures for the collection of information including:

* Statistical methodology for stratification and sample selection,

* Estimation procedure,

* Degree of accuracy needed for the purpose described in the justification,

* Unusual problems requiring specialized sampling procedures, and

* Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


The proposed data collection will be conducted using the telephone interview mode of survey administration. A dual sampling frame will be used: both landline and cell phone Random Digit Dialing (RDD) generated numbers will be drawn into the sample. The sample of telephone numbers will be purchased from the Marketing Systems Group (M.S.G.), one of the leading national vendors specializing in the generation of scientific samples. M.S.G. will draw probability samples from the following sampling frames.

Landline RDD - this sample will include randomly generated numbers within residential area codes and exchanges. These exchanges are restricted to 100-series banks known to contain households. The landline RDD database contains all residential landline exchanges in the U.S., including the District of Columbia. RDD samples can be defined by a variety of geographies, including states. Additionally, each exchange contains a demographic profile reflective of the area it serves. This auxiliary information may be used in the non-response bias analysis of the results of the final survey.

Cellular RDD - this telephone sample will be randomly drawn from a database containing all cellular-dedicated 1000-series blocks (first seven digits) in the country. Cellular RDD samples contain working, non-working, and unassigned numbers, which ensures that each telephone number has an equal probability of selection. Cellular RDD samples can be defined by a variety of geographies, including state.


As indicated above, the proportion of cell phone numbers in the total number of phone numbers drawn into the seven subsamples will be determined individually for each region, based on the most current estimates of the prevalence of cellphone-only and cellphone-mostly households in the respective region, to account for the variations between states and hence between NPS regions. In no case will the proportion of cell phone numbers in a sample stratum be less than 60%. The telephone numbers will be pre-screened by M.S.G. to eliminate, insofar as possible, disconnects, businesses, and other known ineligibles. Any ineligibles not identified through the pre-screening process will be further screened during the survey calling process.
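
To illustrate how a regional sample might be allocated between the two frames under the 60% minimum described above, a minimal Python sketch follows. The prevalence value used in the example is a hypothetical placeholder; in practice it would come from the most recent estimates of cellphone-only and cellphone-mostly households for that region.

# Illustrative allocation of a 20,000-number regional sample between the
# cellular and landline RDD frames. The prevalence value is a hypothetical
# placeholder, not an actual regional estimate.

def allocate_frames(region_sample_size: int,
                    cell_prevalence: float,
                    cell_floor: float = 0.60) -> dict:
    """Split a regional sample between the cell and landline frames,
    enforcing the minimum 60% cellphone share described in the text."""
    cell_share = max(cell_prevalence, cell_floor)
    cell_n = round(region_sample_size * cell_share)
    return {"cell": cell_n, "landline": region_sample_size - cell_n}

# Example with a hypothetical 72% cellphone-only/mostly prevalence:
print(allocate_frames(20_000, cell_prevalence=0.72))
# {'cell': 14400, 'landline': 5600}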


Adjustments to the regional initial sample sizes may be needed to secure the desired total number of completions for each region, since the number of non-working or otherwise ineligible numbers may vary by region, as could respondent cooperation rates. Other than that, we do not expect unusual problems requiring specialized sampling procedures. To obtain results that can be analyzed not only at the national level but at the regional level as well, and to achieve similar statistical power for each region, the initial sample will be disproportionately stratified to secure about 500 completed interviews in each of the seven NPS administrative regions. Thus, the total target number of completed surveys will be about 3,500. Random samples of 500 and 3,500 yield margins of error of about +/- 4.5 and 1.7 percentage points, respectively, with 95% confidence. This level of statistical power will be applicable to all questions on the survey asked of all respondents.
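
These margins of error follow from the standard large-sample formula for a proportion at p = 0.5. The minimal Python check below reproduces the cited figures to within rounding (about +/- 4.4 and +/- 1.7 percentage points for simple random samples of 500 and 3,500, before any design-effect adjustment).

# Verify the quoted margins of error for simple random samples of 500 and
# 3,500 respondents, using the conservative p = 0.5 assumption and a 95%
# confidence level (z = 1.96).
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    return z * math.sqrt(p * (1 - p) / n)

for n in (500, 3500):
    print(f"n = {n}: +/- {margin_of_error(n) * 100:.1f} percentage points")
# n = 500: +/- 4.4 percentage points
# n = 3500: +/- 1.7 percentage points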


In order to reduce the pretest burden estimate of 27.5 minutes per completed interview, it was decided that a split sample method would be the best approach to retaining all of the new questions. With this method, the pool of respondents is randomly split into subsamples, and each subsample is asked a predetermined subset of the questionnaire rather than every question. This approach was shown to reduce respondent burden by nearly 10 minutes, to about 18 minutes per interview.
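
The split-sample assignment can be illustrated with the Python sketch below. The module contents, question labels, and the 50/50 split are hypothetical examples, not the actual questionnaire design; the real grouping of questions would be implemented in the CATI script.

# Illustrative split-sample assignment: every respondent receives a core set
# of questions plus one of two randomly assigned question modules. Module
# contents and the 50/50 split are hypothetical examples.
import random

CORE_QUESTIONS = ["Q1", "Q2", "Q3"]      # asked of all respondents
MODULES = {
    "A": ["Q4a", "Q5a", "Q6a"],
    "B": ["Q4b", "Q5b", "Q6b"],
}

def questions_for_respondent(rng: random.Random) -> list:
    """Randomly assign a respondent to one module and return the full
    list of questions that respondent will be asked."""
    module = rng.choice(sorted(MODULES))
    return CORE_QUESTIONS + MODULES[module]

rng = random.Random(42)                  # seeded for reproducibility
print(questions_for_respondent(rng))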


3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


This data collection will employ the following protocols in an effort to achieve adequate coverage of the population in the universe studied, to maximize effective response rates, to secure representativeness of the final sample, and to obtain reliable data that can be generalized to the universe studied:

  • Use a combination of landline and cellphone samples. The samples will be drawn in proportions reflective of the prevalence of cellphone-only and cellphone-mostly households at the time the samples are drawn. This sample design will secure adequate coverage of the universe studied.

  • Use probability samples of phone numbers. This methodology will allow the findings for the respondents to be generalizable to the universe studied. The statistical power of the inferences that will be made will vary depending on the effective final sample sizes.

  • Use in-house random selection of the respondent. To achieve better representativeness of the final sample, survey methodologists have employed various methods of within-household random selection of the respondent. In the 2008-2009 survey (CSAP2), a three-fold random method of within-household selection was used. In one subsample (a random one-third of participating households), respondent selection was based on the “last birthday” method, for comparability with the 2000 sample. In a second subsample, respondent selection was based on the “next birthday” method, intended to help identify and offset some of the biases that might exist in the “last birthday” method. In the third subsample, strict randomization (using CATI software to randomly select, for example, the second oldest adult in the household) was used to allow an assessment of possible selection bias under either of the other two methods, and to ensure comparability with future iterations of the survey, as this was expected to be the only method of within-household selection going forward. For the present iteration of the survey, the strict-randomization method of within-household selection will be used with the landline sample (see the illustrative sketch following this list). Current survey industry standards do not prescribe attempting within-household random selection of the respondent with cellphone samples.

  • Use multiple callbacks. WYSAC’s protocols for conducting telephone surveys with RDD samples call for dialing phone numbers up to 12 times if previous attempts did not result in a completed survey, an irate refusal, or an otherwise ineligible number. Soft refusals are attempted again by specially trained interviewers. This extensive effort is intended to increase response rates, reduce non-response bias, reduce early-response bias, and improve the demographic distribution of the final sample.

  • Use a calling schedule that includes both weekday and weekend calling, and both evening and daytime calling. Calls will be made Sunday through Thursday from 5 to 9 PM respondent time and on Friday and Saturday afternoons. Additional daytime calling sessions may be added on an as-needed basis.

  • Conduct interviews in Spanish. Whenever the first successful contact with a household identifies the need, that record will be transferred to a bilingual interviewer.

  • Use a split sample method. The method will be implemented to reduce interview length and respondent burden. This in turn will lead to fewer breakoffs (incomplete surveys) and a higher cooperation rate, which will ultimately increase the effective response rate.

  • Monitor completions for each region individually, so that any lag in completion rates can be identified early and additional effort and resources can be employed to compensate for that lag.
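
To illustrate the strict-randomization method of within-household respondent selection referenced in the list above, a simplified Python sketch follows. The household roster logic and wording are hypothetical stand-ins; in practice the selection is made by the CATI software once the number of adults in the household is known.

# Illustrative sketch of strict-randomization within-household selection,
# as used with the landline sample: one adult is selected at random by age
# rank (e.g., "the second oldest adult"). Simplified stand-in for the CATI
# system's selection logic.
import random

def select_adult(n_adults: int, rng: random.Random) -> str:
    """Randomly select one adult by age rank within the household."""
    rank = rng.randint(1, n_adults)
    ordinals = {1: "oldest", 2: "second oldest", 3: "third oldest"}
    label = ordinals.get(rank, f"{rank}th oldest")
    return f"May I please speak with the {label} adult in your household?"

rng = random.Random(7)                   # seeded for reproducibility
print(select_adult(3, rng))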


In most general population surveys, younger people, people in lower education groups, people in the lowest and highest income groups, and males are underrepresented in the final sample, regardless of how comprehensive the sampling frame is and how well the samples are drawn; this presents serious challenges to survey research. Including cellphone frames among those from which samples are drawn has helped counter that trend. Nevertheless, the potential for biased results exists when certain demographic groups are underrepresented in the final samples. To compensate for that potential bias, as a standard procedure, known population demographic benchmarks (for sex, age, education, race, etc.) obtained from the U.S. Census Bureau will be used to weight (post-stratify) the final survey sample, bringing the distribution of key demographic variables among respondents in line with their true distribution in the population within each region and in the country as a whole. Insofar as demographic characteristics are correlated with behaviors and attitudes, this post-weighting should adjust for that type of non-response bias.
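
A minimal sketch of the post-stratification step described above is given below. All cell shares are hypothetical placeholders; in practice the benchmarks would come from the U.S. Census Bureau and the weighting would typically be carried out with dedicated survey software.

# Minimal post-stratification sketch: the weight for each demographic cell is
# the ratio of that cell's population share (Census benchmark) to its share
# of the completed sample. All figures are hypothetical placeholders.

population_share = {"male 18-34": 0.15, "female 18-34": 0.15,
                    "male 35+": 0.33, "female 35+": 0.37}
sample_share = {"male 18-34": 0.08, "female 18-34": 0.12,
                "male 35+": 0.35, "female 35+": 0.45}

weights = {cell: population_share[cell] / sample_share[cell]
           for cell in population_share}

for cell, w in weights.items():
    print(f"{cell}: weight = {w:.2f}")
# Underrepresented cells (e.g., younger males) receive weights above 1;
# overrepresented cells receive weights below 1.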

In addition to post-weighting, we will conduct a non-response bias analysis using the findings from the Non-response Bias survey to be conducted within the context of the household survey. Potential respondents who do not agree to complete the full-length survey will be asked to answer a few demographic questions as well as two substantive questions. The information collected will be used to perform the non-response bias analysis. Specifically, respondents will be compared to non-respondents on their answers to the survey questions that correspond to the non-response bias questions answered by non-respondents. In addition, respondents’ demographic characteristics will be compared to U.S. Census Bureau statistics, where applicable. Based on the pretest, we expect a sizable number of non-response bias surveys to be completed, which will provide a reliable basis for the non-response bias analysis performed once data collection is closed and the data sets are available for analysis.
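
One common way to compare respondents and non-respondents on a shared item is a chi-square test of independence; the Python sketch below illustrates the idea. The counts are hypothetical placeholders, not survey results, and the actual analysis plan is not limited to this test.

# Illustrative non-response bias check: compare respondents and
# non-respondents on one shared question with a chi-square test.
# The counts are hypothetical placeholders.
from scipy.stats import chi2_contingency

# Rows: respondents, non-respondents; columns: answer categories to a
# shared question (hypothetical yes/no item).
observed = [[300, 200],    # respondents (hypothetical)
            [250, 350]]    # non-respondents (hypothetical)

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.4f}")
# A small p-value would suggest that respondents and non-respondents differ
# on this item, flagging potential non-response bias on related estimates.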


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


In preparation for the proposed data collection, we conducted a total of 30 cognitive interviews to assess the clarity of wording and the adequacy of response choices for new questions that were not used in CSAP1 and CSAP2. The analysis of the findings from the cognitive interviews was used to prepare for the second step of the pretesting process: the pretest of the entire questionnaire. Question wording was further refined in the instrument used in the full-length survey pretest.


The main purpose of the full-length survey pretest (step two in the pretesting process) was to measure interview duration. The goal was to finalize the questionnaire script so that interview length would average about 18 minutes. We also used this effort to identify any methodological issues that needed to be addressed prior to the final survey. Based on the results of the pretest, the average time to complete a telephone interview was 27.5 minutes, which indicated the need to find ways to reduce respondent burden. After careful analysis of the distribution of all responses, the survey research team proposed using the split sample approach described above, rather than eliminating questions.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


The design of the samples for the cognitive interviews, the survey pretest, and the full-scale survey was developed by WYSAC. The telephone samples will be purchased from the Marketing Systems Group, a leading national vendor specializing in the generation of scientific samples.


Individuals consulted on the appropriateness of the approach toward the sample design and survey methodology:


Stephen Bieber, Professor of Statistics

Executive Director

Wyoming Survey & Analysis Center (WYSAC)

University of Wyoming

307-766-2989


Bistra Anatchkova, Ph.D

Survey Research Manager, WYSAC

University of Wyoming

307-760-3459


Brian Harnisch

Senior Research Scientist, WYSAC

University of Wyoming

307-766-6103


Steve Lawson

Senior Director

Resource Systems Group, Inc.

802-295-4999


The following individuals will be responsible for the proposed information collection:


Bistra Anatchkova, Ph.D

Survey Research Manager, WYSAC

University of Wyoming

307-760-3459


Brian Harnisch

Senior Research Scientist, WYSAC

University of Wyoming

307-766-6103

1 National Adult Tobacco Survey (n = 1,500; average duration 24 minutes), conducted in summer 2017.


