NASS Comments and Forest Service Response

Public Support for Fuel Reduction Policies: Multimedia versus Printed Materials

OMB: 0596-0203

Comments from the National Agricultural Statistical Service

Response from the USDA Forest Service

July 2006

NASS Comment: One of the purposes of this study is to determine what effect the method of data collection has on the responses and the response rate. The two methods of data collection to be compared are phone/self-administered videotape and phone/mail/phone. It appears from the report that the phone/mail/phone portion of the survey has already been completed, although the report does not specify how long ago. If much time has passed, it will be difficult to determine whether differences in responses and response rates are due to the different data collection modes or to other differences, such as the passage of time. Ideally, the two data collections would occur simultaneously to minimize other factors that might affect the responses. Are the same questions, worded in the same way, going to be asked in the self-administered survey as were asked in the mail survey?


FS Response: Yes, the wording of the questions in the self-administered survey is the same as in the paper mail survey. The paper mail survey was completed several years ago; to keep the willingness-to-pay (WTP) estimates comparable, we will adjust the WTP estimates obtained in the mail survey with the consumer price index (CPI) to the date the new self-administered survey is completed.
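
As an illustration, the CPI adjustment is a simple rescaling; the sketch below uses placeholder index values and a hypothetical WTP figure, not the actual numbers that will be applied.

    # Adjust a willingness-to-pay (WTP) estimate from the year of the mail
    # survey to the year of the new self-administered survey using the
    # consumer price index. The CPI values below are placeholders.

    def adjust_wtp(wtp_old, cpi_old, cpi_new):
        """Re-express an old-dollar WTP estimate in new-survey-year dollars."""
        return wtp_old * (cpi_new / cpi_old)

    # Hypothetical example: a $45 household WTP estimated when the CPI was
    # 172.2, restated for a survey year in which the CPI is 201.6.
    print(round(adjust_wtp(45.00, 172.2, 201.6), 2))   # 52.68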


NASS Comment: The method of non-response adjustment is unclear. The report describes how the questionnaire asks the respondent to explain the reasons why he/she chose a particular response, but it does not explain how the study will handle item or unit non-response. What methodology will be used to adjust for item non-response when a person refuses to answer a question? How will the data be adjusted if a person agrees to complete the self-administered survey but never returns it, or if the respondent from the screener is different from the respondent on the answer sheet?


FS Response: To deal with nonresponse once a participant has agreed to complete the self-administered survey but does not return it, we propose to use the bivariate probit model with sample selection. This model incorporates Heckman's treatment of sample selection bias (Heckman, J. 1979. Sample selection bias as a specification error. Econometrica 47(1): 153-161) into the standard bivariate probit, a model with two simultaneously estimated equations that allows for correlation between the error terms of the two equations. The premise of Heckman's sample selection model is that “using non-randomly selected samples to estimate behavioral relationships” results in biased and inconsistent parameter estimates (1979, p. 153). Because self-selected respondents may differ in some significant way from non-respondents, it is important to correct for this bias. Ignoring this issue could lead to inconsistent parameter and WTP estimates, making them unfit for generalization to the population.
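
As an illustration of the estimation approach, the sketch below writes out the likelihood of a probit outcome equation (the dichotomous-choice WTP vote) paired with a probit selection equation (whether the answer sheet is returned) and maximizes it on simulated data. The covariates, coefficients, and sample size are made up for the example and are not taken from the survey; a packaged routine in standard statistical software could be used instead.

    import numpy as np
    from scipy import optimize, stats

    def neg_loglik(params, X_out, W_sel, y, s):
        """Negative log-likelihood of a probit with sample selection.
        s = 1 if the household returned the answer sheet (selection);
        y = 1 if a returned answer sheet says yes to the WTP question."""
        k = X_out.shape[1]
        beta, gamma = params[:k], params[k:-1]
        rho = np.tanh(params[-1])                  # keep correlation in (-1, 1)
        xb, wg = X_out @ beta, W_sel @ gamma

        def biv_cdf(a, b, r):
            pts = np.column_stack([a, b])
            p = stats.multivariate_normal.cdf(pts, mean=[0.0, 0.0],
                                              cov=[[1.0, r], [r, 1.0]])
            return np.clip(p, 1e-12, 1.0)          # guard against log(0)

        ll = np.empty(len(s))
        ll[s == 0] = stats.norm.logcdf(-wg[s == 0])            # not returned
        yes = (s == 1) & (y == 1)
        no = (s == 1) & (y == 0)
        ll[yes] = np.log(biv_cdf(xb[yes], wg[yes], rho))       # returned, voted yes
        ll[no] = np.log(biv_cdf(-xb[no], wg[no], -rho))        # returned, voted no
        return -ll.sum()

    # Simulated data with one made-up covariate in each equation.
    rng = np.random.default_rng(0)
    n = 500
    w = rng.normal(size=n)                                     # drives survey return
    x = rng.normal(size=n)                                     # drives the WTP vote
    u, e = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]], size=n).T
    s = (0.5 + 1.0 * w + u > 0).astype(int)
    y = ((0.2 + 0.8 * x + e > 0) & (s == 1)).astype(int)
    X = np.column_stack([np.ones(n), x])
    W = np.column_stack([np.ones(n), w])

    start = np.zeros(X.shape[1] + W.shape[1] + 1)
    fit = optimize.minimize(neg_loglik, start, args=(X, W, y, s), method="BFGS")
    print("outcome coefficients:  ", fit.x[:2])
    print("selection coefficients:", fit.x[2:4])
    print("error correlation rho: ", np.tanh(fit.x[-1]))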


To reduce the possibility of a different person completing the answer sheet, we will ask for gender and age during the pre-screener (gender is already asked) and then crosscheck the answer sheet to make sure both match. This is unlikely to occur in any case, because the transmittal letter will state that the survey is to be answered by the person to whom it is addressed. Moreover, because the unit of analysis is the household, such a mismatch would make little difference.


NASS Comment: The report refers to the use of a stratified random sampling technique, but does not describe what stratification variables will be utilized.


FS Response: We will use stratified random digit dialing along a fire risk gradient across several counties in California and Montana, contacting 1,400 heads of households to obtain 1,000 completed surveys. The sample will be divided into thirds as follows: 333 Spanish-speaking and 333 English-speaking households in California, and 334 English-speaking households in Montana. The risk gradient runs from counties with a large number of wildland fires to counties with very few; the counties with few to zero fires will serve as controls. These counties are the same as those used in the paper mail survey. We propose to sample three times as many households in the high-fire counties as in the control (low- to zero-fire) counties, and twice as many households in the medium-fire counties as in the control counties (the approximate distribution is 168 households in the high-fire counties, 110 in the medium-fire counties, and 55 in the control counties).
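
As a small illustration of how the 3:1 and 2:1 oversampling ratios translate into the per-third counts quoted above, the sketch below allocates one 333-household subsample across the three fire-risk strata; the rounding rule here is arbitrary, and the 168/110/55 figures in the response reflect the study's own rounding.

    # Split one 333-household subsample across the high / medium / control
    # fire-risk strata in a 3:2:1 ratio. Rounding is illustrative only.

    def allocate(total, weights):
        raw = [total * w / sum(weights) for w in weights]
        counts = [round(r) for r in raw]
        counts[0] += total - sum(counts)   # absorb rounding slack in the largest stratum
        return counts

    high, medium, control = allocate(333, [3, 2, 1])
    print(high, medium, control)           # 166 111 56 with round-half-to-even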


NASS Comment: The report does not specify the desired level of accuracy. Is the proposed sample size large enough to achieve reliable results for the intended population?


FS Response: Yes; we will be within a 5.7 percent margin of error for all three populations on our dichotomous-choice contingent valuation (CVM) question. (Babbie, Earl. 1991. The practice of research, 6th ed. Belmont, CA: Wadsworth Publishing Co. 493 p.)
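
For reference, a margin of error in this range follows from the conventional conservative calculation for a proportion at 95 percent confidence (p = 0.5); the sketch below is illustrative, and the exact figure depends on the completed sample size assumed for each population.

    # Conservative margin of error for a proportion at 95% confidence.
    import math

    def margin_of_error(n, p=0.5, z=1.96):
        return z * math.sqrt(p * (1 - p) / n)

    # Roughly 300-333 completed surveys per population gives a margin of
    # error of about 5.4-5.7 percent.
    for n in (333, 300):
        print(n, round(100 * margin_of_error(n), 1), "percent")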


NASS Comment: The questions that were read and displayed on the video did not match the questions on the answer sheet. This happened on Q4 (different wording) and Q8 (the last question on the video did not appear on my answer sheet). Also, the speaker in the video kept holding up a booklet entitled “Expanded California Fire Management Program”, but it did not contain the answer sheet (the answer sheet was a separate packet). There were additional questions in the booklet that were not numbered, and I was confused as to whether I was supposed to answer them or not. There were also a few grammatical errors in the answer sheet. (For example, in Q3: “Do you agree of disagree”…should read “…agree or disagree…”).


FS Response: Question 8 was inadvertently left off the answer sheet; the answer sheet has been corrected to include Q8, and the grammatical errors have been corrected. Thanks for catching this! The additional, unnumbered questions in the booklet are demographic questions used to test the representativeness of the sample.


NASS Comment: In order to play the video tape, I had to wipe the dust off my VCR and actually hook it up to my TV because I never use my VCR. I wonder if survey response will suffer because many people don’t use that technology anymore. Could the respondent choose to be sent a DVD instead?


FS Response: The screener survey will now include a question asking potential participants if they would like a VHS videotape or a DVD.


NASS Reviewer: Alexandra Riley

Statistics Division

Statistical Methods Branch

NASS
