SUPPORTING STATEMENT

EVALUATION OF INTERPRETIVE SIGNS LOCATED ALONG THE CALIFORNIA COASTLINE PART OF THE CALIFORNIA SIGNAGE PLAN INITIATIVE

OMB CONTROL NO. 0648-xxxx



B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g. establishments, State and local governmental units, households, or persons) in the universe and the corresponding sample are to be provided in tabular form. The tabulation must also include expected response rates for the collection as a whole. If the collection has been conducted before, provide the actual response rate achieved.



Two survey sites located along the California coastline:

Annual average total visitor attendance by the general public at the Monterey Bay Recreational Trail (MB) and the Landing Emporium for the Channel Islands (CI): 500,000

Estimated number of adult visitors (age 18+) in the MB and CI general public visitor audience: 350,000

Desired sample size of general public adult visitors in the MB and CI audience: 400 (664 adults will be approached to obtain a sample of 400)

Respondent selection method: one adult per randomly selected visitor group, intercepted when exiting the exhibit areas of MB and CI

Estimated rate of cooperation of randomly selected adult visitors: 66% of the 664 visitors approached, or fewer, for a final sample of 400


Note:  In the nearly 30 years of experience of the evaluator who will direct this study, the actual rate of onsite cooperation at similar facilities (aquariums, museums) has averaged about 80%; the rate across about 20 projects in the last two years has ranged from 72% to 98%. We estimate a lower response rate here because of the online option, as online options typically have lower response rates; the online survey is intended as a convenience to potential respondents. The 66% figure is therefore a blended estimate across the onsite and online response rates.
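
As a quick arithmetic check (our illustration in Python, not part of the approved plan), the figures above can be related as follows: the break-even cooperation rate implied by approaching 664 visitors to obtain 400 completed surveys, the number of approaches needed under the blended 66% estimate, and the 95% margin of error for a simple random sample of 400 (the finite population correction for a universe of 350,000 is negligible at this sample size).

    import math

    target_n   = 400   # desired completed surveys (from the plan)
    approached = 664   # visitors to be approached (from the plan)

    # Lowest cooperation rate at which 664 approaches still yield 400 completes
    print(f"break-even cooperation rate: {target_n / approached:.1%}")  # ~60.2%

    # Approaches needed under the plan's blended 66% estimate
    print("approaches needed at 66%:", math.ceil(target_n / 0.66))      # 607

    # 95% margin of error for a proportion, worst case p = 0.5, n = 400
    moe = 1.96 * math.sqrt(0.5 * 0.5 / target_n)
    print(f"margin of error: +/-{moe:.1%}")                             # +/-4.9%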


2. Describe the procedures for the collection, including: the statistical methodology for stratification and sample selection; the estimation procedure; the degree of accuracy needed for the purpose described in the justification; any unusual problems requiring specialized sampling procedures; and any use of periodic (less frequent than annual) data collection cycles to reduce burden.


The characteristics of visitor populations at visitor centers and museums vary considerably and randomly (e.g., a local family may be followed by a tourist couple, who may be followed by a single adult tourist, and so on). In places with relatively low visitor volumes (such as the Channel Islands National Marine Sanctuary), compared to high-volume places such as the Smithsonian, a representative random sample of visitor groups can be obtained by using a “next available” protocol, as follows:


The interviewer is positioned near the exit from the exhibit space (e.g., at the Channel Islands National Marine Sanctuary area where boats pick up and drop off visitors).  As a visitor group (usually 1-4 people) begins exiting, the interviewer approaches, makes eye contact with the ‘first adult’ (in practice, the one physically closest to the interviewer), and requests their participation in giving feedback about the exhibits.  The cooperation rate for this type of intercept interview (using a brief, one-sentence introduction that explains the purpose) typically averages about 80%.  If the adult visitor agrees, the interview is completed.  Upon completion, the interviewer steps aside to finish work on the interview form: documenting the date and time of the interview, adding his or her initials, reviewing the form for completeness and readable handwriting, filing the completed form, and readying a new blank one.  This process usually takes 2-5 minutes.  When the interviewer is again prepared with a blank interview form and related materials, he or she looks up and selects the “next available” visitor group moving toward the exit.
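
To make the selection mechanics concrete, here is a minimal toy simulation of the “next available” rule (our illustration only: the 5-minute interview length is hypothetical, and the 3.5-minute reset is an assumption within the 2-5 minute range described above).

    import random

    def simulate_next_available(exit_times, interview_min=5.0, reset_min=3.5):
        # exit_times: sorted minutes at which visitor groups exit.
        # Returns the indices of the groups the interviewer would approach.
        approached = []
        free_at = 0.0  # time at which the interviewer is next free
        for i, t in enumerate(exit_times):
            if t >= free_at:  # interviewer is free: approach this group
                approached.append(i)
                free_at = t + interview_min + reset_min
            # otherwise the group exits during an interview/reset and is skipped
        return approached

    random.seed(1)
    # Hypothetical day: 120 groups exit at random times over 8 open hours.
    exits = sorted(random.uniform(0, 480) for _ in range(120))
    picked = simulate_next_available(exits)
    print(f"{len(picked)} of {len(exits)} exiting groups approached")

As in the protocol, whichever group happens to exit first once the interviewer is free is selected; no appearance-based choice enters the rule.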


For the online survey, visitors at the two sites will be intercepted, given a postcard with the survey URL, and asked for their e-mail addresses so that NOAA can send a reminder e-mail with the survey link. The postcard will be given to visitors who decline to complete the survey onsite; visitors who decline to provide an e-mail address will still be given a postcard. By soliciting in person, we hope to increase the response rate.


PRA information will be read to onsite respondents, and a handout (attached) with the information will be given to visitors who agree to complete the Web-based survey.


The principle of this and similar sampling methods is that the interviewer does not choose whom to interview based on appearance, facial expressions that might signal enjoyment (or its absence), or whether there are children in the group; in essence, visitor groups select themselves by deciding when to exit (although they do not know the sampling parameters).  If one group is being interviewed when another group leaves, the departing group is simply not selected.  Depending on visitor flow, the next visitor group might be leaving at that very moment, or the interviewer might wait 5-10 minutes for the next group to leave.  This characteristic of ‘low volume’ visitor facilities makes it impractical to use other methods, such as selecting every 4th visitor group or using a random number chart (for example, from 1 to 5) to decide which visitor group to select.


In addition, data will be collected on weekdays and weekends and at different times of day to yield a representative sample. Data collection will take place over all open hours of the sites; no weighting will be necessary, as the sample will be representative of all visitors to the sites.

A fixed-interval rule (intercepting every 4th visitor group, for example) would capture more or fewer visitors on weekends, for example, depending on the flow.  Our systematic procedure takes the flow of visitors into account: when more visitors come through, we collect data from more people (see the sketch following this paragraph). Because of this systematic procedure, we will obtain a representative sample.
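
Extending the same toy model (again with hypothetical arrival counts and an assumed per-interview time), the sketch below illustrates the flow-tracking property claimed above: under the “next available” rule, a busy period yields more completed interviews than a quiet period of equal length.

    import random

    def interviews(exit_times, busy_min=8.5):
        # Count completed interviews, assuming the interviewer is occupied
        # ~8.5 minutes per interview (interview plus paperwork); assumed figure.
        done, free_at = 0, 0.0
        for t in exit_times:
            if t >= free_at:
                done, free_at = done + 1, t + busy_min
        return done

    random.seed(2)
    quiet = sorted(random.uniform(0, 240) for _ in range(20))  # quiet 4 hours
    busy  = sorted(random.uniform(0, 240) for _ in range(80))  # busy 4 hours
    print(interviews(quiet), interviews(busy))  # busier period: more interviews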


3. Describe the methods used to maximize response rates and to deal with nonresponse. The accuracy and reliability of the information collected must be shown to be adequate for the intended uses. For collections based on sampling, a special justification must be provided if they will not yield "reliable" data that can be generalized to the universe studied.


ONSITE:

Prior experience with intercept interviewing of visitors in museums and interpretive centers leads our evaluator to expect a response rate of 80%-90%; therefore, nonresponse is not likely to be a factor during onsite data collection.  Experience has shown that inviting visitors to give their opinions is a positive motivator, and that the way the invitation is delivered can enhance or detract from visitors’ willingness to cooperate (e.g., the interviewer’s neat appearance, clear voice, and pleasant demeanor, and, in a small proportion of interactions, assurances that the interview won’t take too long or be too hard).


When the survey instrument and procedures are approved for implementation, we will begin monitoring the onsite cooperation rate and maintain a log of those who decline to participate, noting their gender and approximate age. We will compare respondents with nonrespondents (those who declined) to determine whether the two groups differ on these variables.  If the response rate falls below 80% early in the collection (our estimated response rate for onsite surveys, excluding the typically lower response rate of online surveys), we will fine-tune the survey logistics (where the interviewer stands, which sentence of the explanation comes first, offering tokens of appreciation before rather than after interviewing) to improve the onsite cooperation rate.  In the unlikely event of an ongoing onsite response rate below 80%, we can increase the number of hours volunteers collect data until the desired sample size is reached, while noting the lower overall response rate.
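
To compare respondents with decliners on the logged variables, one conventional choice is a chi-square test of independence. A minimal sketch, assuming scipy is available and using hypothetical tallies from the refusal log:

    from scipy.stats import chi2_contingency

    # Hypothetical counts: rows are respondents vs. decliners,
    # columns are the age bands used on the refusal log form.
    table = [
        [60, 95, 80, 70, 55, 40],  # respondents, ages 18-24 ... 65+
        [25, 30, 35, 30, 25, 20],  # decliners
    ]
    chi2, p, dof, _ = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
    # A small p-value would flag an age-profile difference between those
    # who agreed and those who declined, i.e., potential nonresponse bias.

The same test applies to the gender tallies and to the online postcard comparison described below.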


If respondents decline participation onsite, volunteers will give them a postcard with a URL for responding online. As with the onsite interviewees, gender and apparent age will be noted, using the attached form. The postcard will note an incentive for completing the survey online: a $50 Amazon.com gift card, four of which will be given away via a random drawing.

 

We are consciously choosing not to advise visitors when they enter the site that we will be seeking their opinions and feedback, since doing so tends to cue people in ways that can change their behavior and use of the exhibits (e.g., staying longer, or feeling that they will be “tested” later), and we are seeking to generalize to the normally occurring pattern of the visitor experience.


ONLINE:

By offering the chance to win one of four $50 Amazon.com gift cards, we hope to encourage visitors who declined to be interviewed onsite to complete the survey online. We are striving for a 52% online response rate (online response rates are higher when there has been a personal invitation to participate). Offering several prizes presents a greater chance of winning and thus more incentive to participate; participants will not be able to enter the drawing unless they complete the entire survey. In addition, sending a reminder with a link to the survey to those visitors who provided an e-mail address onsite will also encourage participation.
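
The drawing itself can be as simple as sampling four winners without replacement from the completed surveys; a minimal sketch with hypothetical respondent IDs:

    import random

    # Hypothetical IDs of visitors who completed the entire online survey;
    # incomplete surveys are excluded, per the plan.
    eligible = [f"resp_{i:03d}" for i in range(1, 209)]

    winners = random.sample(eligible, k=4)  # four $50 gift cards
    print(winners)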



We will report nonresponse rates based on the number of postcards distributed versus the number of surveys completed online. We will note the gender and age of all persons receiving a postcard, compare those data for the sample that declined with the sample that participated, and note any differences. In addition, we will include the following questions in the reminder e-mail and request a reply, to help identify additional nonresponse biases:



Did you visit that site alone or with others? [Mark one response.]

Alone

With friends/family

Organized group


What effect, if any, did the sign or signs have on your visit to this site?

None, the sign(s) neither enhanced nor detracted from my experience at this site.

The sign(s) detracted from my experience at this site.

The sign(s) enhanced my experience at this site.

Other (please describe):___________________________________________________


NOAA will be responsible for collecting the responses and forwarding them for statistical analysis.


4. Describe any tests of procedures or methods to be undertaken.  Tests are encouraged as effective means to refine collections, but if ten or more test respondents are involved OMB must give prior approval under the Paperwork Reduction Act.  


The evaluator will conduct onsite testing of the survey prior to data collection, with fewer than 10 individuals, and will revise the instrument as necessary. The revised versions will be tested during the staff and volunteer training sessions, and any final changes will be made before the paper surveys are printed. A change request will be submitted once the two survey instruments are finalized.


If, after the survey is approved and the formal data collection begins, survey completion rates seem low, strategies for maintaining the scientific quality of the research while increasing cooperation will be considered (e.g., as described in the response to the previous question: location of the interviewer, offer of incentives, etc.).


5. Provide the name and telephone number of individuals consulted on the statistical aspects of the design, and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


The evaluator, who developed the research design and composed the survey instrument, is:

Randi Korn, Founding Director, Randi Korn & Associates, Inc. (www.randikorn.com); 703-548-4078.


Ms. Korn will supervise the beginning of the survey’s implementation and oversee the Research Associates who will conduct training (in random selection, techniques for conducting intercept interviews, and maintaining rapport with visitors). She will also coach and support the ONMS staff coordinator, Seaberry Nachbar, on monitoring the quality of the interviewers’ work; Ms. Nachbar will organize and manage the data collection process.


Randi Korn & Associates, Inc. has 23 years of experience in museum evaluation and visitor studies, (with 30 years of work in the field of visitor studies) and the experienced staff (with 2 to 15 years of experience) will analyze and interpret the data.


Seaberry Nachbar will be NOAA’s principal representative in interpreting the data and articulating the possible implications for exhibits, programs and related ways of educating the public about the ONMS.


Interview Refusal Log

#    Data Collector    Date    M/F    Age    Reason for Refusal    E-mail Address

Age Categories: 18-24, 25-34, 35-44, 45-54, 55-64, 65+



