NORE Appendix B

National Ocean Recreational Expenditure (NORE) Survey

OMB: 0648-0637

Appendix B: Focus groups & interviews


Reports from focus groups and cognitive interviews

for the National Ocean Recreation Expenditures Survey (NORES)


Qualitative testing objectives


The objective of qualitative testing was to evaluate the content and flow of draft versions of the survey instrument. Members of the general public were recruited to participate voluntarily in focus groups and cognitive (one-on-one) interviews. The qualitative testing period for this data collection was from April to August 2010.


Qualitative testing provided NMFS researchers with information related to:

  • how information in the survey was understood and perceived;

  • whether the list of ocean recreation activities was complete and whether the categories they were grouped in were understood as distinct;

  • the ability of participants to recall ocean recreation activities over different lengths of time, ranging from two months to 12 months;

  • how well the expenditure tables were understood and whether the expenditure categories contained in the tables reflected actual expenses incurred when participating in ocean recreation; and

  • other elements of the survey instrument noted during the focus group discussions and interviews.


The information collected as a result of these focus groups and interviews helped to shape iterations of the survey instrument over the course of the qualitative testing period. Specific objectives for each focus group and cognitive interview group are detailed below.


Focus group overview


  • Focus group participants were recruited by a focus group facility contracted by the NMFS.

    • Twelve individuals were recruited per focus group to ensure that at least nine participants showed up per group.

    • Local area residents were recruited using random recruitment methods. The specific method was left to the discretion of the facility. NMFS researchers requested that the contracted facility not use their existing standing panel of possible participants as a sample frame for recruitment.

    • A recruitment screener was provided to the contracted facility by NMFS researchers. Participation in ocean or coastal recreation within the last 12 months was a critical screening criterion. Other characteristics, such as how often an individual participated in ocean recreation, age, employment level, and gender, were also used to screen participants. These characteristics were noted to ensure that each group consisted of a diverse mix of participants. Focus group screeners are available upon request.


  • No more than nine participants per focus group

    • One moderator (NMFS researcher) per group

    • One to four observers (NMFS researchers) per group

    • One focus group per night to allow time for changes to be made before the second focus group.

    • Each group lasted approximately 1.5 hours.

    • All groups were recorded (audio and/or video), with consent of participants.

    • Draft survey instruments and moderator guides are available upon request.


Cognitive interview overview

  • Cognitive interview participants were recruited by the focus group facility contracted by NMFS.

    • Four individuals were recruited for each time slot to ensure that three participants showed up.

    • Local area residents were recruited using random recruitment methods. The specific method was left to the discretion of the facility. NMFS researchers requested that the facility not use their existing standing panel of participants as a sample frame for recruitment.

    • A recruitment screener was provided to the contracted facility by NMFS researchers. Participation in ocean or coastal recreation within the last 12 months was a critical screening criterion. Other characteristics, such as how often an individual participated in ocean recreation, age, employment level, and gender, were also used to screen participants. These characteristics were noted to ensure that each group consisted of a diverse mix of participants. Cognitive interview screeners are available upon request.


  • No more than nine cognitive interviews per night

    • Three groups of interviews were scheduled throughout an evening. Each group consisted of three interviews held in separate rooms and occurring concurrently. Each interview consisted of an interviewer (NMFS researcher) and an interviewee (recruited participant).

    • Each interviewer conducted three interviews per night.

    • Each interview lasted approximately 1 to 1.5 hours, depending on whether the interview was self-administered or followed a “verbal protocol” procedure.

    • At least two interviews per night followed verbal protocol procedures. That is, we asked participants to read the survey information and questions out loud and think out loud as they answered these questions.

    • All interviews were recorded (audio and/or video), with consent of participants.

    • Self-administered and verbal protocol moderator guides are available upon request.




Focus groups, Charleston, SC

4/27 - 4/28/10


What we did


Charleston was the location of the first set of focus groups conducted by NMFS researchers. These groups were an opportunity to test the first drafts of the survey instrument. For each focus group, the survey instrument was broken up into four handouts. There was one moderator and two observers per group.


Handout A introduced the survey and the survey’s sponsor (NOAA), defined ocean recreation for the survey, and listed activities we thought encompassed the suite of possible ocean recreation activities. Twelve ocean recreation categories were tested for these focus groups. Respondents’ recall of their participation in these activities over the last 12 months was also tested (i.e., yes or no), as well as their participation in non-ocean recreation activities.


Handout B was intended to elicit feedback about questions related to how often (in days) respondents participated in ocean recreation over the last four months and whether respondents could allocate each of those days to specific ocean activities. Questions related to durable expenditures over the last four months that were associated with ocean recreation such as boat, vehicle, and second home expenses were also evaluated.


Handout C1 was intended to elicit feedback about questions related to respondents’ “most recent ocean recreational trip.” “Trip” was defined in the introductory portion of this section, and this definition was evaluated. Questions that were evaluated included ones related to trip duration, the primary reason for the trip (i.e., recreation, business, other), the ocean recreation activities engaged in and the “most important” activity on that trip, the location of the recreation activity, and an expenditure table.


Handout C2 was intended to elicit feedback about questions related to factors that may have contributed to the respondents’ choice of location for their most recent ocean recreation activity, including whether “outdoor temperatures” was a factor determining location choice. There were two versions of Handout C2, one used in each of the two focus groups conducted in Charleston. The first version asked respondents to indicate which factors contributed to their location choice (i.e., yes or no), and the second version asked respondents to rank their top five factors (i.e., “1” as the most important).


What we learned


Regarding Handout A, the definition of ocean recreation seemed to make sense to people, and the activities we were excluding (e.g., freshwater activities) were understood by participants. We received suggestions on how we might break one of our ocean recreation categories into two categories. We also received feedback on recreation activities we had missed. We started to become aware that four of our categories (e.g., “Social activities”) might be problematic because it was unclear whether the ocean or coast, or some other factor, was the primary reason participants engaged in the activity. For example, if a wedding occurred at a beach, we were asked whether that was considered an ocean activity. Similarly, we were asked whether attending a game at a baseball stadium that happened to have a view of the waterfront was considered an ocean activity. In later versions of the survey, we removed these categories due to these issues.


In Charleston, we started to understand that in Handout B, when participants were asked how often they recreated at the ocean or coast, the recall period was important. For this handout, we asked them to tell us how many days they recreated in the last four months, and this was difficult for high-activity months. Some respondents mentioned that they felt less confident about their estimate of days spent recreating for time periods longer than one or two months. The flow of the survey was also questioned in terms of the time periods we were interested in. At different points in this handout, respondents were asked about their activities in the last 12 months, the last four months, and then their most recent trip. Some respondents did not pick up on the time period changes right away, and the flow of this handout seemed confusing to some. Formatting changes, such as bolding or underlining text, were suggested by participants as a way to highlight the time period changes, as well as other questions and information throughout the survey.


Regarding the questions about durable goods expenditures, participants asked that items such as a “boat” or “vehicle” be clarified. For a boat, some participants asked whether kayaks and canoes should be included; for vehicles, some asked whether camper trailers and motorcycles should be included. Also in Handout B, participants indicated that the questions about the amount of time they spent using these durable goods for ocean recreation were confusing and unclear. In subsequent versions of this survey, the durable goods were defined more clearly, and the time question was revised over the course of several focus groups. The recall period for these durable expenditures in this handout was four months; however, participants indicated that even with a 12-month recall period they would not have trouble remembering these purchases, because boats, vehicles, and second homes are big-ticket items not often purchased.


In Handouts C1 and C2, the word “trip” made some participants believe that they were being asked about longer trips to the ocean or coast. As a result, some participants indicated that they had not visited the ocean or coast when, in fact, they had walked down to the beach within the past week. Thus, “trip” was changed to “visit” in subsequent versions of this survey. Participants were also asked to identify the location of their last visit to the ocean or coast. These questions asked about the state where the activity occurred as well as the county. Participants indicated that if their last ocean recreation activity did not occur in their county of residence, they would likely have difficulty identifying the county. In subsequent versions of this survey, we changed this question to ask respondents about the city or town they recreated in, rather than the county.


Also in Handouts C1 and C2, participants indicated that the expenditure table was too long; it spanned three pages. Because we were testing paper-based versions of the survey, the expenditure table included all possible ocean recreation-related trip expenditures. In the online survey, a respondent would see only the rows directly related to the activities they engaged in, so the size of the table would decrease. We nonetheless noted these comments from the focus groups.
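The dynamic-display behavior described for the online survey can be sketched as follows. This is an illustrative sketch only; the category and expense names are invented for illustration and do not come from the actual survey instrument.

```python
# Hypothetical mapping from ocean recreation categories to expenditure rows.
# In the online survey, only rows tied to a respondent's reported activities
# would be displayed, shrinking the three-page paper table considerably.
EXPENDITURE_ROWS = {
    "Beach activities": ["Beach parking", "Beach gear rental"],
    "Recreational fishing": ["Bait and tackle", "Charter fees"],
    "Boating": ["Boat fuel", "Launch fees"],
}

def visible_rows(activities_reported):
    """Return only the expenditure rows relevant to the reported activities."""
    rows = []
    for activity in activities_reported:
        rows.extend(EXPENDITURE_ROWS.get(activity, []))
    return rows

# A fishing-only respondent would see two rows instead of the full table.
print(visible_rows(["Recreational fishing"]))
```

A respondent reporting no activities would see no expenditure rows at all, which is why the paper version, lacking this conditional display, had to print every row.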


Handouts C1 and C2 differed only in the last question, regarding factors contributing to the choice of location for the recent trip. Handout C1 asked respondents to indicate how important each factor was, from “very important” to “not important.” In Handout C2, respondents were asked to rank their top five factors (“1” being the most important). This change, from rating the importance of each factor to ranking only the top five, was made due to comments we heard during our first focus group in Charleston. For the second focus group, we changed this question to test the suggestion made the previous night. Regarding the temperatures participants experienced on their most recent trip, respondents indicated that other conditions, such as wind, humidity, and rain, were also important factors in choosing a location and/or an ocean activity. These additional factors were added in subsequent surveys.


Overall, the biggest changes made as a result of our Charleston focus groups were related to wording, the flow of questions within and between handouts, and the ocean recreation categories. For example, “trip” was changed to “visit” for subsequent groups. The flow of questions within and between handouts was also questioned, particularly in terms of the period of time we were referring to (e.g., 12 months in Handout A versus four months in Handout B). Other comments led to changes to Handout B; we separated this handout into two handouts for subsequent groups, increasing the number of handouts to be tested from four to five.











Focus groups, Seattle, WA

5/11 – 5/12/10


What we did


Seattle was the location of our second set of focus groups. These groups were an opportunity to test the changes we made since our groups in Charleston. For each focus group, the survey instrument was broken up into five handouts. There was one moderator and two observers per group.


Handout A introduced the survey, the survey’s sponsor (NOAA), defined ocean recreation for the survey, and listed activities we thought would encompass the suite of possible ocean recreation. In these groups, 13 ocean recreation categories were tested. Respondents’ recall of their participation in these activities over the last 12 months was also tested (i.e., yes or no), as well as their participation in non-ocean recreation activities within the last 12 months.


Handout B1 was focused on durable equipment expenditures such as for a boat, vehicle, or second home. We had very few focus group participants who had made one of these purchases so we did not receive much feedback on this handout.


In Handout B2, we were interested in learning how often focus group participants engaged in ocean recreation activities within the last four months. We were also interested in whether they could allocate each day they participated to one of the 13 ocean recreation categories of interest.


Handout C1 intended to elicit feedback about questions related to their most recent “visit” to the ocean or coast. We used the term, “visit” rather than “trip,” based on feedback from our Charleston focus groups. This section also collected information about the types of activities participants engaged in, which activity was the “most important” for that visit, the location of the visit, and expenditures associated with the activities they engaged in.


Handout C2 was focused on what factors contributed to a respondent’s decision to recreate at a particular location. One focus group was asked to rank their top five factors; the second focus group was asked to check all factors that were important to them. Some questions were also focused on how weather conditions might influence the choice of recreation location.


What we learned


For Handout A, we received feedback that the list of recreation categories was very long and that focus group participants were skimming the categories rather than reading them carefully. Also, five of the categories, which involved visiting venues or attending events near the ocean or coast, were difficult to classify as “ocean activities” in many cases. For example, if a participant attended a baseball game in Seattle at a stadium on the waterfront, we were asked whether that activity was considered an ocean activity. Due to these concerns, we decided to drop these categories for our next focus groups in New Orleans. Additionally, we received comments that it was not clear whether we were interested in ocean recreation that occurred within the U.S. or in other countries. Some wording changes were made in this handout and in Handout C1 to emphasize that we were interested in U.S. ocean recreation only. Lastly, we received questions about the goals of the survey from both groups and, as a result, re-worded the introduction to the survey on the very first page.


We also received comments that, overall, this and other handouts were “text heavy” and visually hard to get through. In addition, we heard that some of the wording in questions changed throughout the handouts; more consistent terminology was suggested to reduce confusion for respondents.


We received very little feedback on Handout B1 because very few focus group participants had purchased a boat, vehicle, or second home within the last 12 months. However, one participant in each group had purchased a boat, and from these two individuals it seemed that this handout was not difficult to understand or fill out.


For Handout B2, we asked participants to tell us how many days they spent engaging in an ocean recreation activity over the last four months. This was difficult for participants who engaged in several different activities during this time period or who participated in activities frequently. Respondents felt more confident about their responses for the last one or two months but less confident about responses for the last three or four months. As a result of feedback from these focus groups and those in Charleston, we changed the recall period from four months to two months for subsequent groups.


As in Charleston, we received comments about the length of the expenditure table (about three pages) in Handout C1. In the online version of the survey, respondents would see only the rows relevant to the activity engaged in on a recent visit to the ocean, so this complaint is a product of the paper-based version and will be addressed. No other comments were made about this handout. We also received several questions about why we were focusing on recent visits to the ocean or coast rather than typical or favorite visits. No changes were made as a result of these questions.


In Handout C2, we learned that ranking the top five factors for choosing a particular ocean or coastal location was not a problem, so for subsequent groups we continued to ask respondents to rank their top five factors related to location choice. We also learned that a 5 degree change in outdoor temperatures was probably too small to make much difference in whether respondents would engage in a particular activity. As a result, we increased the temperature change to 20 degrees in our next set of focus groups to see how participants might respond.




Focus groups, New Orleans, LA

5/25 – 5/26/10


What we did


New Orleans was the location for our last set of focus groups. These groups were an opportunity to test the changes made as a result of our previous groups. As in Seattle, the survey instrument was broken up into five handouts. There was one moderator and up to four observers per group.


For the first night, Handout A included a longer introduction than what was presented in previous focus groups. This was done due to requests in past groups for more information about why this data collection was important and what the information might be used for. For the second night, we removed this additional information. Also, we tested eight ocean recreation categories rather than up to 13 as in past groups. As in previous groups, we were interested in learning whether the ocean categories presented were inclusive of all ocean activities participants engaged in or were familiar with.


In Handout B1, participants were asked questions about expenditures on durable items used for ocean recreation activities. As in previous groups, these items were boats, vehicles, and second homes. No more than one participant in either group had incurred these types of expenses within the last 12 months.


Handout B2 was focused on the number of days participants engaged in ocean recreation within the last two months and the activities they engaged in. Participants were also asked to allocate each day spent recreating to one of the eight ocean categories.


For Handout C1, participants were asked about their most recent visit to the ocean or coast to engage in recreation. Except for some wording and formatting changes, these handouts were fairly similar to the handouts used in Seattle.


In Handout C2, participants were asked to rank the top five factors that contributed to their decision to recreate at the ocean location visited on their most recent trip. Participants in both focus groups were asked to rank these factors. This differed from Seattle and Charleston, where participants in one group were asked to rank factors while the other group was asked to indicate the importance of each factor to their decision. For the temperature-related questions, we asked participants in one group whether a 10 degree change would influence their decision to engage in an activity; in the second group, this change was increased to 20 degrees.


What we learned


In Handout A for the first focus group, we included a longer introduction to the survey (on the first page) than in previous groups. This introduction added information about why the survey was collecting ocean recreation information and how this information might be used. We found that this information elicited more questions than intended, so for the second focus group we removed it. As in previous groups in Seattle and Charleston, the introduction for the second group focused on the information being collected and did not include ways in which this information might be used in the future. This more concise, focused introduction seemed more understandable to participants and did not elicit any comments or questions.


Also, only eight ocean recreation categories were tested for these focus groups. This seemed to be an improvement over previous groups, where up to 13 categories were included. Participants indicated that the eight categories presented were understandable and inclusive of activities they or their friends engaged in.


Also, for the definition of “ocean recreation” used in the introduction, participants in New Orleans in both groups suggested we add the term, “bayou,” a local term for coastal wetlands and estuaries. We incorporated this change in subsequent versions of this survey.


Handout B1 was focused on expenses on durable items. The few participants who had incurred these expenses within the last 12 months indicated that the questions and tables were understandable and covered the expenses they had made. However, a few typos were pointed out and corrected.


Based on feedback from previous focus groups, Handout B2 focused on ocean activities engaged in within the last two months rather than four. We found that participants were more confident about their responses regarding the number of days spent recreating within the last two months. However, some of the instructions in this handout were confusing to participants. For example, participants were asked to allocate each day to one ocean activity; for days when they engaged in more than one activity, they were asked to attribute that day to the one activity “most important” to them. Some participants found this confusing, and some wording changes were made to clarify what we were asking respondents to do for that question.


In Handout C1, related to a recent visit to the ocean, participants indicated, as in previous focus groups, that the expenditure table was too long. However, participants also indicated that it was easy to skip sections of the table not related to their activity. The only issue with this handout involved one column of the expenditure table: some participants were unclear whether to include themselves in the “number of people you paid for” column for a particular expense. This question came up in both groups, and the column was labeled more clearly in subsequent versions of the survey.


For Handout C2, some participants indicated that the list of factors contributing to their ocean recreation location choice was too long. Some suggested that these factors be grouped somehow, perhaps by activity. The list was reduced in subsequent versions of the survey by aggregating factors into broader categories. In addition, participants were asked whether a change in temperature of 10 or 20 degrees might change their decision to engage in their chosen ocean activity. For New Orleans focus group participants, a 10 degree change did not seem to be as influential as a 20 degree change. They also indicated that changing their choice of activity would depend on the base temperature and which activity they were engaging in.




Cognitive interviews, San Diego, CA

6/30/10


What we did


For our first set of cognitive interviews, the version of the survey that emerged from our New Orleans focus groups was used to produce a web-based version to test in one-on-one interviews. Three sets of interviews were conducted concurrently by three different interviewers; each interviewer conducted three interviews, for a total of nine interviews in one night. Each interviewer followed an interview guide to elicit feedback on how well participants understood the survey and whether the survey itself was operating correctly (i.e., whether the skip patterns were working). Seven of the interviews were “self-administered” by the interviewee (participant); that is, the participant filled out the survey by themselves and was interviewed after completing it.


Two of the interviews followed a “verbal protocol” procedure in which the interviewer sat with the interviewee as they completed the survey, and the interviewee was asked to read the survey out loud and talk through their thought processes out loud. The interviewer was asked not to interact with the interviewee while he or she was talking; that is, the interviewee was asked to pretend that the interviewer was not there, as if talking to him or herself. This was helpful for getting a sense of how participants were thinking about the questions, how they were answering them, and the flow of the survey in general.


What we learned


The time it took participants to complete the survey ranged from 10 minutes to roughly 20 minutes. As in the focus groups, very few participants had purchased durable goods in the last 12 months; the two who had indicated that the questions were easy to answer. Unlike in our focus groups, we received no comments about the length of the expenditure table for the most recent visit to the ocean or coast. In the web-based survey, only rows relevant to the activity a participant indicated were visible, and this seemed to work for respondents.


Several participants indicated that the wording in the survey was “repetitive” and “verbose”. For our next set of one-on-one interviews, much of the survey’s text was updated to remove the repetition. We also found that some of the skip patterns in the web-based survey were not working properly; for example, some participants did not see the weather-related temperature questions at the end of the survey. Lastly, some participants indicated that they would like a “back” button so that they could navigate to previous pages in the survey to review responses or to be reminded of definitions (e.g., “water contact sports”).
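The skip-pattern problem found in testing can be illustrated with a short sketch of the kind of branching logic involved: pages are assembled from earlier answers, and the weather page is meant to appear for every respondent who reported any ocean recreation. All page and field names here are hypothetical, not taken from the actual survey software.

```python
def pages_to_show(responses):
    """Assemble the page sequence for one respondent from earlier answers.

    `responses` is a dict of earlier answers; hypothetical keys are used here.
    """
    pages = ["intro", "activities"]
    if responses.get("recreated_last_12_months"):
        pages.append("effort")
        # Only respondents reporting durable purchases see the durables page.
        if responses.get("bought_durables"):
            pages.append("durables")
        pages.append("recent_visit")
        # The weather page should be appended unconditionally on this path;
        # the bug observed in testing skipped it for some respondents.
        pages.append("weather")
    return pages
```

Writing the routing as a single function like this makes it straightforward to unit-test that every branch still ends at the weather page, which is how a skip-pattern bug of this kind would typically be caught before fielding.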




Cognitive interviews, Boston, MA

8/26/10


What we did


As in San Diego, three sets of interviews were conducted concurrently by three different interviewers. Each interviewer conducted three interviews. A total of nine interviews occurred in one night. Interview guides for both self-administered interviews and verbal protocols were provided to all interviewers. Changes based on what we learned in San Diego were incorporated prior to these cognitive interviews. As in San Diego, seven self-administered interviews and two verbal protocol interviews were conducted.


What we learned


The time it took participants to complete the survey ranged from 14 minutes to roughly 41 minutes. On average, it took respondents about 20-22 minutes. The participant who completed the survey in 41 minutes was dissimilar to other participants; all other respondents completed the survey in under 30 minutes. It was unclear why this participant required this length of time to complete the survey but the interviewer indicated that the participant did not seem comfortable using a computer.


Overall, we again received some comments regarding repetitive language and clumsy wording in some sections. Many of the suggestions made by these participants were incorporated after the interviews. Generally, participants seemed confident in their responses to questions. One exception related to allocating the total number of ocean recreation days to individual ocean categories: a couple of participants indicated that this was not easy to do when several activities were engaged in during one day. Suggestions to clarify this question included reducing the number of rows in the effort (number of days) table by limiting it to just the activities the participant indicated in the first few questions of the survey. That is, if an individual indicated that they participated in three of the eight categories in the last 12 months, then the effort table should include only those three categories. This suggestion was incorporated. Lastly, many participants indicated that in the expenditure tables, if they answered “no” (i.e., they did not incur a particular expense), a “0” should be filled in automatically for that row; they found it cumbersome to fill in “0”s manually. This change was also incorporated.
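The auto-zero behavior participants requested can be illustrated with a minimal sketch; the expense names and data layout are invented for illustration and do not reflect the actual survey software.

```python
def fill_expenditure_amounts(answers):
    """Map each expense's (incurred?, amount) answer to a final amount.

    When the respondent answers "no" (incurred is False), the amount is
    recorded as 0 automatically instead of requiring a typed "0".
    """
    amounts = {}
    for expense, (incurred, amount) in answers.items():
        amounts[expense] = amount if incurred else 0
    return amounts

filled = fill_expenditure_amounts({
    "Lodging": (True, 120),   # respondent incurred this expense
    "Boat fuel": (False, None),  # "no" answer: auto-filled with 0
})
```

Defaulting "no" answers to zero also removes a source of item nonresponse, since blank amounts no longer need to be distinguished from true zeros.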









File type: application/vnd.openxmlformats-officedocument.wordprocessingml.document
Author: sarah.brabson
File created: 2021-02-01
