Response to Comments from Van Johnson, NASS
GENERAL
Comment 1A: Were the individuals cited consulted on the specifics of the sample design and questionnaire design?
Response: Yes. While the sample and questionnaire design process has been led by Dr. Daniel McCollum (USDA Forest Service, Rocky Mountain Research Station, Fort Collins, CO), the process has had extensive input and collaboration from the social scientists listed in the OMB Supplementary Information (Item A-8). Of note, Dr. Robert Berrens and Dr. Jennifer Thacher, University of New Mexico, are integral parts of the research team and helped develop the sample design and questionnaire design through all stages. Other researchers listed were consulted on specific question modules and for overall review of the instrument. Several rounds of detailed review comments on both the sample and questionnaire design from Southwestern Region staff (USDA Forest Service) were summarized and provided by Dr. Richard Periman, Social Science Coordinator. Dr. Deborah Shields, Economist, USDA Forest Service, Rocky Mountain Research Station, Fort Collins, CO, provided input on questionnaire design, including current information on a select set of questions from the survey on values, objectives, beliefs and attitudes (VOBA), which has previously been implemented as a module of the National Survey on Recreation and the Environment (see Shields et al., 2002; and Haefele et al., 2005).
The OMB Supplementary Information (Item A-8) has been revised to reflect these involvements.
Comment 1B: The reference for the note was unclear and the note did little to clarify the situation when the referenced section was located. The supplied survey doesn’t have section designations.
Response: The Note was originally inserted to clarify a particular type of question. As illustrated by Mr. Johnson, the Note only confused things. We have deleted the Note referred to and incorporated the relevant information into our response to Item #6 below.
QUESTIONNAIRE
Comment 1: “What do you circle? The number, the statement, or both? How will the data be captured?”
Response: We changed the question text (Q1, page 2, and other places) to read: ‘Circle the number . . .’ See the response to the “Final Comment” (below) for a discussion of the data capture process.
Comment 2: “How will this be recorded, there is no correspondence between opinion and question part. Line numbers needed.”
Response: In the revised survey, we have changed the question text (Q1, page 5) to read: ‘Each of the following statements is an objective for managing National Forests and Grasslands in the Southwestern Region. Indicate how important you think each of the objectives is by circling the appropriate number for each statement.’ For the TEST version, the text (TEST Q1, page 5) was changed to read: ‘Indicate your level of agreement with each of the following statements by circling the appropriate number for each statement.’ No line numbers or letters were added because they would not aid data capture and are of little use to respondents.
See the response to the Final Comment (below) for a general discussion of the data capture process. Specific to this question, the database entry form is designed to detect and correct data entry errors. Although this question appears as a large table in the paper survey, the database entry form will show about five of the statements per screen, with a data entry field for each statement (as does the web-based survey option). The number circled by the respondent will be entered into the corresponding field. Once all five statements on a data entry screen have been entered, the data will be saved to the database and the data entry program will continue to the next set of statements.
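As a rough illustration of the screen-level entry logic just described, consider the following sketch. The field names and the 1-to-5 response scale are assumptions for illustration only; the actual entry form is a database application, not a script.

    # Sketch: validate one data-entry screen of five statements.
    # Assumes a 1-5 response scale, with -9 coding a missing response.
    VALID_CODES = {1, 2, 3, 4, 5, -9}

    def invalid_entries(screen):
        """Return the statement IDs whose entered codes are out of range."""
        return [stmt_id for stmt_id, code in screen.items()
                if code not in VALID_CODES]

    screen = {"Q1_s1": 4, "Q1_s2": 7, "Q1_s3": -9, "Q1_s4": 2, "Q1_s5": 1}
    errors = invalid_entries(screen)
    if errors:
        print("Re-enter:", errors)  # flags Q1_s2, which is out of range
    else:
        pass  # save the five responses and advance to the next screen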
Comment 3: “What issues are being referred to?”
Response: This question is meant to identify how much an individual cares about national forest management. As such, in the revised survey the question text (Q2, page 6; TEST Q3, page 7) is changed to read: “How important to you is National Forest and Grassland management?”
Comment 4: “If you are showing total breakdowns, you need to show what the remainder [is] and relate that to the information you are trying to gather.”
Response: The revised survey includes an additional line in the breakdown (page 13) that reads: ‘28% to directly addressing the three threats.’ The sentence after the breakdown has been deleted. This, in conjunction with the specific response characteristics described immediately afterwards, gives the respondent some context about the status quo response to the three threats. That is, they will be able to refer to the breakdown to know which programs will see the largest cuts if a reallocation of the budget is chosen. We have also taken out the “research” spending item and allocated that spending among the line items, and we added spending on “state and private forestry.” This should give respondents an aggregated but relatively complete picture.
Comment 5: “The explanation for Wildland Fires percent of High Priority Acres Treated is not clear.”
Response: In the revised survey, this paragraph has been re-worded to read: ‘The Forest Service treats land for hazardous fuels based on priority. About 8 million acres of the 22 million acres of the region’s Forest Service land are considered a high priority for hazardous fuels treatment, including land near towns and homes. About 2% of the 8 million high-priority acres is now treated for hazardous fuels.’ We expect this will make the explanation clearer.
Comment 6: “If the alternatives are chosen randomly how will they be captured and summarized. (I believe the note refers to this section but still does not make the methodology clear)”
Response: Q4 (page 15) is an example of a choice question. There is an extensive literature on the theory and application of choice questions in marketing, transportation, and economics.1
In a choice question, respondents are presented with different alternatives and asked to choose the one they prefer. For example, in this question individuals are asked which of three types of management plans they prefer for dealing with the three threats: Option A, Option B, or the Status Quo. A management plan is described through four “attributes”: the percent of high-priority acres treated for the wildland fire threat, the percent of acres treated for invasive species, the number of law enforcement officers dedicated to trail enforcement, and the amount of reallocation from other forest spending. Each attribute can take on different “levels.” By choosing which of the three management plans he/she prefers, a respondent provides information on the relative importance of each attribute. This information can be aggregated over many individual respondents and statistically analyzed in a probability model.
The number of scenarios and the combinations of attribute levels that appear in them were selected using optimal design software specifically developed for conjoint-type questions, of which these choice questions are one. Such software provides for statistically efficient design and is widely used in the profession. In this case, analyses indicated 15 scenario combinations to be statistically optimal. These scenario combinations are reflected in 15 versions of the survey. Each individual in the sample is assigned to one of the 15 possible survey versions; each survey differs in terms of the numbers (or attribute levels) the respondent would see in Option A and Option B. (That is the only difference between the 15 survey versions—the numbers that fill in the table for Options A and B on page 15.) By varying these numbers through the 15 versions, we are able to statistically determine the relative importance of the different attributes of a management plan and estimate the value that respondents place on a change from the status quo to a particular management plan. Preferences across the different management attributes and relative values of different alternatives are what will be summarized from this question. Because individuals are (randomly) assigned to a particular survey version which is then linked to their unique survey ID number, the attribute levels to which they are responding are known and preserved in the database.
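For illustration only, the following sketch shows how choices aggregated over respondents map into a conditional logit model, one common probability model for such data. The attribute levels and coefficients shown are hypothetical, not values from the actual design or any estimation.

    import math

    # Each option: [% high-priority fire acres treated, % invasive acres
    # treated, trail enforcement officers, share of budget reallocated]
    options = {
        "Option A":   [10.0, 6.0, 12, 0.20],
        "Option B":   [ 6.0, 9.0,  8, 0.10],
        "Status Quo": [ 2.0, 2.0,  4, 0.00],
    }
    beta = [0.05, 0.04, 0.02, -1.5]  # hypothetical coefficients

    def utility(x):
        """Linear utility index V = sum of beta_k * attribute_k."""
        return sum(b * xk for b, xk in zip(beta, x))

    # Conditional logit choice probability: exp(V_j) / sum_k exp(V_k)
    denom = sum(math.exp(utility(x)) for x in options.values())
    for name, x in options.items():
        print(name, round(math.exp(utility(x)) / denom, 3))

In estimation the logic runs in reverse: the coefficients are chosen to best explain the choices actually observed across the 15 survey versions.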
Comment 7: “Where is question 7 (Q7)? Why have same table except for last line. How will this be used?”
Response: The question numbering problem was a typo. This has been resolved in the revised survey version. The duplicate table included a different funding mechanism for addressing the three threats; this is no longer included in the survey, so only one table and its corresponding questions remain.
Comment 7A: “(Q4) What about unemployed individuals?”
Response: The revised survey (Q5, page 16; TEST Q5 page 13) now includes two additional options for occupational status: ‘unemployed – looking for a job’ and ‘unemployed – not looking for a job.’ The distinction between looking and not looking for a job is an important one, as most employment data collected (for example, by the Bureau of Labor Statistics) makes this distinction.
Comment 8: “How will this be captured? Maybe a few prominent natural resource jobs should be listed.”
Response: The data for this question (Q9, page 16; TEST Q9, page 13) will be entered in two stages. First, a binary response will be recorded for “yes” or “no” to whether the respondent works in a natural resource job. This data will be of most use for analysis. Second, written-in responses accompanying a “yes” answer will be coded into one of several general job categories. The actual responses (the specific job listed) will also be preserved for additional analyses that may be desirable in the future.
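A minimal sketch of this two-stage coding follows; the keyword-to-code mapping is illustrative only, as the actual codebook will be developed from the responses themselves.

    # Sketch: two-stage coding of the natural resource job question.
    # The keyword and code assignments below are hypothetical.
    JOB_CODES = {"forest": 1, "ranch": 2, "mine": 3, "fish": 4}
    OTHER, MISSING = 8, -9

    def code_job(works_in_nr, job_text):
        """Return (binary yes/no code, general job category code)."""
        if works_in_nr is None:
            return MISSING, MISSING          # item left blank
        if not works_in_nr:
            return 0, 0                      # "no": no category applies
        text = (job_text or "").lower()
        for keyword, code in JOB_CODES.items():
            if keyword in text:
                return 1, code
        return 1, OTHER                      # "yes" with an unlisted job

    print(code_job(True, "Forest Service trail crew"))  # -> (1, 1)
    print(code_job(False, ""))                          # -> (0, 0)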
Comment 9: “Aren’t questions 7 and 8 both attempting to capture racial groupings. Why is Hispanic singled out?”
Response: These are now Q11 and Q12. Separate questions regarding Hispanic ethnicity and race are needed for several reasons. First, the Office of Management and Budget (OMB) requires that these questions be asked in this particular format. Second, Hispanic is an ethnicity, and a Hispanic person can be of any race. Singling out Hispanic ethnicity may be debatable along several lines, but it is accepted practice for federal agencies. Third, the two questions correspond exactly to how they are asked by the Census Bureau. Since Hispanics are a minority population to which the federal government pays particular attention, it is necessary to ask the questions in a consistent way to ensure that the survey sample is representative and useful to agencies.
Final Comment: “Overall the survey instrument as is would probably be a data capture nightmare…”
Response: We appreciate the reviewer’s comments and have taken them seriously. As noted above, some relatively minor wording and formatting corrections should help to mitigate some of these concerns. More generally, because the reviewer does not specify his concern, we interpret “data capture” to mean converting survey responses into data in a format that can be used for statistical analysis and reporting, and we respond in those general terms.

Our research team, led by Dr. Daniel McCollum (USDA Forest Service, Rocky Mountain Research Station), will follow standard social science protocols for data coding and entry. The research team has extensive experience designing, implementing, and analyzing large and complex social science surveys using a variety of survey modes, including both mail and web-based surveys (e.g., Berrens et al., 2002, 2003, 2004; Talberth et al., 2006; McCollum, 2003; McCollum and Boyle, 2005; McCollum et al., 1999; and numerous other published citations). In addition to descriptive statistics, this experience includes coding complex variables for use in a wide variety of econometric and regression-based statistical models. Portions of the survey are designed based on the latest research on choice-based modeling (see references). Further, portions of the survey are taken from the Forest Service’s Values, Objectives, Beliefs and Attitudes survey, which comprises a module of the National Survey on Recreation and the Environment (NSRE). The NSRE module has been rigorously field-tested (see Shields et al., 2002; and Haefele et al., 2005).

With only a few exceptions (e.g., the opportunity for comments), all responses to survey questions will be numerically coded, including codes identifying missing and “don’t know” responses (e.g., -9 and -99). This will facilitate simple statistical analyses as well as identification of non-response patterns. Further, address-based contact information is matched with available census tract and geographic information system (GIS) data to help evaluate sample representativeness and to test for sample selection effects. Response dates will also be coded in the database, allowing for additional analyses of response patterns.
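As a simple sketch of the numeric coding convention just described (only the -9/-99 codes come from the protocol above; the handling of raw values is illustrative):

    # Sketch: map raw survey answers to numeric database codes.
    def code_answer(raw):
        """Code a raw answer: -9 = missing, -99 = "don't know"."""
        if raw is None or raw.strip() == "":
            return -9                 # item left blank (missing)
        if raw.strip().lower() == "don't know":
            return -99                # explicit "don't know" response
        return int(raw)               # substantive responses are numeric

    print([code_answer(r) for r in ["3", "", "don't know", "1"]])
    # -> [3, -9, -99, 1]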
As a web option to the mail survey, a web-based version has already been created to match the paper survey. Responses to the web version are recorded directly into a secure database. Coding and response entries in that database will match those used for the mail survey.
Under the direction of the survey manager (Michael Hand, University of New Mexico), a double-entry electronic database form will be used to enter paper survey responses into the same electronic database used for web-based respondents. Two experienced data entry research assistants from the University of New Mexico will separately enter survey responses into a user-interface database form similar to the internet survey. The form is designed to detect, and then reconcile, any discrepancies between the two entries for each unique respondent. Once both entries are consistent, the data will be recorded into the master database for that respondent.
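A minimal sketch of the double-entry reconciliation check, assuming hypothetical question fields:

    # Sketch: compare the two independent entries for one respondent
    # and flag any fields that disagree before committing the record.
    entry_1 = {"Q1": 4, "Q2": 2, "Q5": 1}
    entry_2 = {"Q1": 4, "Q2": 3, "Q5": 1}

    discrepancies = {q: (entry_1[q], entry_2[q])
                     for q in entry_1 if entry_1[q] != entry_2[q]}
    if discrepancies:
        print("Reconcile before saving:", discrepancies)  # Q2 differs
    else:
        pass  # entries agree; record to the master database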
Partial and fully completed survey observations are all individually tracked and matched to any experimental design characteristics (e.g., parameters for the choice experiment or language version). Data entry performance checks have already been conducted on both the mail (paper) and web versions of the survey (in both English and Spanish). All steps in database design and management will be handled by a Microsoft-certified database manager (Jeff Bjarke, University of New Mexico).
In closing, the research team anticipates no problems with data capture. These surveys are consistent with surveys routinely conducted and analyzed in social science and economic research.
Works Cited
Adamowicz, W., Louviere, J. and Swait, J., 1998. Introduction to attribute-based stated choice methods. Technical report, NOAA.
Batsell, R. and Louviere, J., 1991. An experimental analysis of choice. Marketing Letters, 2:199-214.
Berrens, R., A. Bohara, H. Jenkins-Smith, C. Silva and D. Weimer. 2003. “The Advent of Internet Surveys for Political Research: A Comparison of Telephone and Internet Samples.” Political Analysis, 11(1):1-23.
Berrens, R., A. Bohara, H. Jenkins-Smith, C. Silva and D. Weimer. 2004. “Information and Effort in Contingent Valuation Surveys: Application to Global Climate Change using National Internet Samples.” Journal of Environmental Economics and Management, 47(2):331-363.
Berrens, R., H. Jenkins-Smith, A. Bohara, and C. Silva. 2002. “Further Investigations of Voluntary Contribution Contingent Valuation: Fair Share, Time of Contribution, and Respondent Uncertainty.” Journal of Environmental Economics and Management, 44(1):144-168.
Green, P. and Srinivasan, V., 1990. Conjoint analysis in marketing: new developments with implications for research and practice. Journal of Marketing, 54(4): 3-19.
Haefele, Michelle; Shields, Deborah J.; Lybecker, Donna L. 2005. Survey responses from Region 3: Are we achieving the public’s objectives for forests and rangelands? Gen. Tech. Rep. RMRS-GTR-156. Fort Collins, CO: U.S. Department of Agriculture, Forest Service, Rocky Mountain Research Station. 27 p.
Louviere, J., 1988. Conjoint analysis: introduction and overview. Journal of Transport Economics and Policy, 10: 93-119.
McCollum, D. W. 2003. Nonmarket Valuation in Action. Chapter 13 in: Champ, P. A., K. J. Boyle, T. C. Brown (eds.), A Primer on Nonmarket Valuation. Kluwer Academic Publishers.
McCollum, D. W. and K. J. Boyle. 2005. The effect of respondent experience/knowledge in the elicitation of contingent values: An investigation of convergent validity, procedural invariance and reliability. Environmental and Resource Economics, 30:23-33.
McCollum, D.W., M.A. Haefele, R.S. Rosenberger. 1999. A Survey of 1997 Colorado Anglers and Their Willingness to Pay Increased License Fees. Project Report No. 39, for the Colorado Division of Wildlife. Fort Collins, CO: Colorado State University, Human Dimensions in Natural Resources Unit, and USDA Forest Service, Rocky Mountain Research Station.
Shields, Deborah J.; Martin, Ingrid M.; Martin, Wade E.; Haefele, Michelle A. 2002. Survey results of the American public's values, objectives, beliefs, and attitudes regarding forests and grasslands: A technical document supporting the 2000 USDA Forest Service RPA Assessment. Gen. Tech. Rep. RMRS-GTR-95. Fort Collins, CO: U.S. Department of Agriculture, Forest Service, Rocky Mountain Research Station. 111 p.
Talberth, J., R. Berrens, M. McKee, M. Jones. 2006. “Averting and Insurance Decisions in the Wildland Urban Interface: Implications of Survey and Experimental Data for Wildfire Risk Policy.” Contemporary Economic Policy, 24(2):203-223.
Wittink, D. and Cattin, P., 1989. Commercial use of conjoint analysis - an update. Journal of Marketing, 53(3): 91-96.
1 For survey articles see Louviere [1988], Wittink and Cattin [1989], Green and Srinivasan [1990], Batsell and Louviere [1991], and Adamowicz et al. [1998].