NASS Peer Review with Response


Arapaho-Roosevelt National Forest Transportation System Alternatives Study


OMB: 0596-0232


NASS peer review of:

Arapaho-Roosevelt National Forest (ARNF) Transportation System Alternatives Study, 2013

Purpose of Survey

The purpose of this project is to collect information that will help the USFS improve transportation conditions, and recreation and resource management, on the ARNF. In particular, the survey instruments in this study are designed to collect information about visitors’ perceptions, experiences, and expectations with respect to transportation conditions and services, recreation opportunities, and visitor experience quality at BLRA (including the IPW), GP, and MERA. The information collection is also designed to help identify transportation-related issues experienced by visitors at each of the three recreation sites, and to assess visitors’ opinions about potential changes in operations to modify and improve transportation services and facilities.

Survey Administration Time Frame

The primary data collection period is mid-May through Labor Day, 2013. The supplemental data collection period is mid-May through Labor Day, 2014. The supplemental period will only be used if the number of respondents obtained during the primary period is insufficient.

Notes: If supplemental data collection is required in 2014, will the data collected on traffic conditions be comparable across years?

Response: Text was added to Supporting Statement B to note that traffic data will be collected during the 2013 study period, and again during the 2014 sampling period (if the extended study period is required) in order to report and interpret study results in the context of corresponding traffic conditions.

Populations of Interest

All recreational users of each of the four sites of interest (BLRA, GP, MERA, and IPW) during the survey period.

Notes:

There could be considerable overlap in these populations. There does not appear to be a mechanism in place to determine population size, so the reviewer assumes that population sizes are unknown. The reviewer supposes that some patrons will visit more than one of these sites during the survey period, which means those patrons will belong to multiple populations. There also does not appear to be a mechanism in place to positively identify individual respondents, so it is possible that a single person could respond to more than one of the site-specific surveys. Pooling data across the site surveys might therefore be problematic. As long as conclusions are site specific, this should not present any problems.

Response: The comment is noted; analyses will not involve pooling data across survey locations; rather, analysis, results, and conclusions will be site specific.

Survey Design



Questionnaires-

Paper questionnaires will be used. There are 4 versions, administered to patrons at 4 different recreational sites within the ARNF. Each version is tailored to the characteristics of its site.

Data Collection-

Notes:

It is not clear to the reviewer whether a survey administrator asks the questions of the respondent and records the responses on the questionnaire, or whether the questionnaire is handed to the respondent and the respondent records his/her own responses. This affects the logistics of the sampling procedure. If an administrator is heavily involved with recording responses, then only a very limited number of sampled patrons can be engaged simultaneously. If the selected patron records his/her own responses, many can be engaged in supplying data simultaneously.

Response: Text in Supporting Statement B states “If the contacted visitor agrees to participate and is 18 years of age or older, the information collection staff person will administer a survey instrument to the respondent and instruct him/her to complete it onsite.” As noted, this will allow the survey administrators to manage multiple respondents simultaneously.

Sampling Procedure

Sample Size-

For each site survey, 425 visitor groups are to be contacted, with one respondent per group. The estimated response rate is about 70% for each site, yielding about 300 completed questionnaires per site. Partial non-response is not mentioned.

Notes:

Sample sizes and response rates are based on the literature and past experience.

Response: In very similar surveys conducted by the principal investigator at Mount Rainier, Rocky Mountain, and Yosemite National Parks, partial or item non-response occurred in fewer than 1% of the completed questionnaires. Thus, the sample size estimates reported are not expected to deviate substantively due to partial or item non-response.
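As a quick arithmetic check of the figures above, the sketch below (Python, with all values taken from the study documents) shows how 425 contacts per site at an estimated 70% response rate yields roughly 300 completed questionnaires, and how little the expected count changes under the reported sub-1% item non-response. The calculation is purely illustrative.

```python
# Minimal sketch of the sample-size arithmetic described above. The figures
# (425 contacts per site, ~70% response rate, <1% item non-response) come from
# the study documents; the calculation itself is purely illustrative.

contacts_per_site = 425
response_rate = 0.70            # estimated from the literature and past surveys
item_nonresponse_rate = 0.01    # upper bound reported for similar past surveys

completed = contacts_per_site * response_rate
usable = completed * (1 - item_nonresponse_rate)

print(f"Expected completed questionnaires per site: {completed:.0f}")  # ~298
print(f"Expected usable questionnaires per site:    {usable:.0f}")     # ~295
```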

Sample Design-

The supporting documents state that the sampling procedure is based on past successes and the literature (Vaske, 2008, is cited). The reviewer admits not being familiar with the literature regarding these types of survey/sample designs.

Notes:

It is not exactly clear how patrons are actually selected for the sample. It appears that as patrons pass a specific location (checkpoint) on the site, they are approached by a survey administrator and asked if they will participate in the survey. The reviewer is not clear on whether every group that reaches the checkpoint(s) will be asked to participate, or whether selection is somewhat “hit-and-miss”. Regardless of the specific procedure used, care should be taken to eliminate as much selection bias as possible. Tendencies by administrators to avoid selecting visitor groups that are “too big”, consist of harried parents with a lot of kids, or otherwise do not look very “approachable” might result in data that do not reflect the population of interest very well. The reviewer assumes that survey administrators are experienced in approaching and working with the general public.

Response: The comment is noted; survey administrators will undergo training, pre-testing, and onsite supervision procedures to ensure that biases of this nature are minimized.

It appears that there will be an attempt to enforce a limit of one sampled individual per group using the introductory set of screener questions. It might be a little difficult to ensure that multiple individuals who arrived in the same “carload or group” are not selected. To the extent that this occurs, some of the data could be duplicated.

Response: The comment is noted; survey administrators will be trained to follow procedures to minimize the extent to which this occurs.

The term “group” used in the screener dialogue does not appear to be well defined.

Response: Text was added to Supporting Statement B to specify the term “group” as referring to friends and/or family with whom the visitor is at the study location that day.

One might also view the sample design as a two-stage cluster design in which a group is selected first and one individual is subsequently sub-sampled from the cluster. This might be an important way to consider the sample because different individuals in the group might not do all the same activities, nor respond the same way to the more “opinion” oriented questions on the questionnaire. There would likely be some variation in response across members of the same “group” for some questions, and very little variation within the group for others.

Has consideration ever been given to interviewing more than one person in a “group”?

Response: Consideration has been given to interviewing more than one person in each group. However, the decision was made to limit the interviews to just one person per group to avoid possible “false replication” on responses to questions about activities, destinations, and other questions that have a high likelihood of being the same for all members of a group. It is noted that this approach does assume, however, that the one group member who completes the interview responds to questions in a manner that is representative of the group generally (e.g., “opinion” oriented questions). This approach balances the relative advantages and disadvantages noted according to convention for visitor surveys in public lands recreation areas, though a case could be made for either approach. Reporting will note the sampling procedures, and corresponding assumptions and limitations.
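To illustrate the two-stage view and the “false replication” concern discussed above, the following Python sketch simulates hypothetical visitor groups; the group sizes, question names, and response values are invented solely for illustration and are not part of the study design.

```python
# Illustrative sketch of the two-stage view of the sample design described above:
# stage 1 selects a visitor group at the contact point, stage 2 sub-samples one
# member of that group. All group sizes, question names, and response values are
# hypothetical and are used only to illustrate the "false replication" point.

import random

random.seed(1)

def simulate_group(size):
    """A hypothetical visitor group: one shared, group-level answer
    (e.g., destination) plus individual opinion ratings that vary by member."""
    destination = random.choice(["trailhead A", "trailhead B", "picnic area"])
    return {
        "destination": destination,                               # identical for all members
        "opinions": [random.randint(1, 5) for _ in range(size)],  # varies by member
    }

groups = [simulate_group(random.randint(1, 6)) for _ in range(10)]

# Stage 2: one randomly selected member per group (the chosen design).
one_per_group = [(g["destination"], random.choice(g["opinions"])) for g in groups]
print(f"Respondents with one-per-group sampling: {len(one_per_group)}")

# Interviewing every member would repeat the same group-level answers
# ("false replication") without adding independent observations for them.
all_members = [(g["destination"], op) for g in groups for op in g["opinions"]]
print(f"Records if every group member were interviewed: {len(all_members)}")
```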

Analysis-

Key [quantitative] estimates from the data will be descriptive in nature, primarily measures of central tendency (mean and median), dispersion (standard deviation), and frequency distributions. Some tests for differences in means and proportions by various sub-groups are expected.

Notes:

The sampling procedure as described will result in a “non-probability” sample. This means that it will be difficult or impossible to assign a “probability of selection” to any member of the population. The sampling procedure is also unlikely to produce a true “random sample”, due to various factors that the survey administrators cannot fully control under the conditions that exist in the field. Although descriptive statistics such as means, modes, and medians can be calculated and should prove useful, estimating the actual precision (standard errors) of such statistics is problematic using design-based methods unless one appeals to an assumption that the sample can be regarded as “random”. The more precautions that are taken to prevent selection bias, the more tenable this assumption becomes. (Refer to the earlier paragraph concerning selection bias.)

Response: The comment is noted; survey administrators will undergo training, pre-testing, and onsite supervision procedures to ensure that selection bias is minimized.
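The sketch below illustrates, on hypothetical data, the kind of descriptive estimates named in the analysis plan (mean, median, standard deviation, frequency distribution), together with the conventional simple-random-sample standard-error formulas whose use, as noted above, rests on treating the realized sample as if it were random. The ratings and the cut-off defining a “favorable” response are invented for illustration.

```python
# Hedged sketch of the planned descriptive analysis on hypothetical responses
# from one site. The standard-error formulas shown are the usual design-based
# ones for a simple random sample; reporting them rests on treating the
# realized sample as if it were random.

import math
import statistics
from collections import Counter

ratings = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5, 1, 4]    # hypothetical 1-5 opinion ratings
n = len(ratings)

mean = statistics.mean(ratings)
median = statistics.median(ratings)
sd = statistics.stdev(ratings)                    # sample standard deviation
freq = Counter(ratings)                           # frequency distribution

se_mean = sd / math.sqrt(n)                       # SE of the mean under the SRS assumption

favorable = sum(r >= 4 for r in ratings)          # hypothetical "favorable" cut-off
p = favorable / n
se_prop = math.sqrt(p * (1 - p) / n)              # SE of a proportion under the SRS assumption

print(f"n={n}  mean={mean:.2f}  median={median}  sd={sd:.2f}  se(mean)={se_mean:.2f}")
print(f"Proportion rating 4 or 5: {p:.2f}  se={se_prop:.2f}")
print("Frequency distribution:", dict(sorted(freq.items())))
```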

Non-response Bias-

Non-response bias is being taken into consideration. The number of non-respondents will be tallied, and observable characteristics of the non-respondents will be noted. In addition, an attempt will be made to obtain answers to a few general, low-burden questions from non-respondents.

Notes:

This will assist in ascertaining some aspects of possible non-response bias. This bias, if determined to exist, will be considered in the analysis to the extent possible.
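As one illustration of the kind of non-response bias check described above, the sketch below compares respondents and refusals on a single hypothetical low-burden item using a two-proportion z-test; the variable and counts are invented, and the test is only one conventional way to make such a comparison.

```python
# Illustrative sketch of a simple non-response bias check of the kind described
# above: compare respondents and refusals on an observable or low-burden item.
# The "day visitor" variable and the counts are hypothetical.

import math

def two_proportion_z(successes1, n1, successes2, n2):
    """Two-sample z statistic for a difference in proportions (pooled variance)."""
    p1, p2 = successes1 / n1, successes2 / n2
    p_pool = (successes1 + successes2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical counts: share of "day visitors" among respondents vs. refusals.
z = two_proportion_z(successes1=180, n1=300, successes2=70, n2=125)
print(f"z = {z:.2f}  (|z| > 1.96 would suggest respondents differ from refusals)")
```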

Data Confidentiality

Respondents will be assured that their response is voluntary and anonymous, and that their data will be kept secure and used only for the purpose of the study.

Notes:

This appears to adequately address this concern.

Conclusion

Notes:

The reviewer has attempted to bring to light some aspects of this particular survey that can be challenging from a design-based sampling perspective, and to highlight a few considerations that are important for a survey of this nature. The comments should be taken as things to keep in mind when conducting the survey and analyzing the resulting data; they should not necessarily be taken as pointing out faults. The methods outlined in the supporting documents appear to the reviewer to be satisfactory given the situational constraints encountered when conducting this type of survey.

Response: The comment is noted; thank you for the careful review. It will help to ensure robust study procedures and results.

Matt Fetter- USDA/NASS

202-720-7986







