
Garrett Lee Smith Campus Case Studies


OMB: 0930-0305


GLS Campus Case Study

OMB Response

October 9, 2009


Question 1

To clarify, you mention that each campus participating in the SPEAKS is allotted $1,000 for student incentives. Please describe the lottery procedures that were previously approved and are being used for this study (e.g., what amount of money is awarded ($1,000 to one person?), how many people receive this amount, how selection is done, etc.).


Response

The cross-site evaluation team budgeted $1,000 for student incentives at each campus during each administration of the SPEAKS. Prior to administering the SPEAKS with Cohort 1 and Cohort 2 campus grantees, the cross-site team collaborated with each campus to determine how incentives would be awarded. Campus staff were given the choice of implementing a lottery-style incentive or providing $5 money orders to up to 200 respondents (200 × $5 = $1,000). Campuses that chose a lottery-style incentive determined the incentive structure they felt best fit the students on their campus. For example, several campuses chose to award money orders in varying denominations totaling $1,000. Other common incentives included iPods; gift cards to Best Buy, Target, Apple, and Barnes & Noble; and gift cards to local restaurants and campus bookstores. Some campuses chose a mix of incentive types.


Once the SPEAKS administration ended, the cross-site team produced a list of respondents for each campus. For campuses using a lottery-style incentive, cross-site staff randomly selected the appropriate number of winners from that list. Once the winners were chosen, the cross-site team sent each winner the appropriate incentive with a letter thanking them for completing the survey.
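
For illustration only, the random drawing described above can be expressed as a short script (Python is used here purely as an example). The respondent identifiers and prize structure below are hypothetical; in practice, each campus set its own prize amounts and the respondent list came from the cross-site team's records.

  import random

  # Hypothetical respondent list compiled after a SPEAKS administration.
  respondents = ["student_001", "student_002", "student_003",
                 "student_004", "student_005"]

  # One assumed prize structure summing to the $1,000 incentive budget.
  prizes = [500, 250, 150, 100]

  # Draw one winner per prize at random, without replacement, so no
  # respondent receives more than one prize.
  winners = random.sample(respondents, k=len(prizes))

  for winner, prize in zip(winners, prizes):
      print(f"{winner}: ${prize} money order")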


Question 2

Also, what “proven methods” are you referring to in your response to 4(b)? (“Because we intend to work closely with each campus to increase response rates by proven methods, we anticipate a response rate of 20-30%.”) Please include a brief description and, if possible, documentation of how these response rates were determined. We also suggest changing the language from “proven” to “previously successful.”


Response

The cross-site evaluation team has implemented, and will continue to implement, established and previously successful methods to achieve response rates of at least 20 percent on the SPEAKS. The cross-site team used a tailored design method to achieve the desired response rates across campuses (Dillman, 2000). When administering the SPEAKS, the cross-site team implemented several strategies from the tailored design method, including:


  1. Delivering multiple contacts: a pre-notification of the upcoming survey, an introductory message providing a link and password to complete it, and reminder emails to non-respondents (an illustrative schedule appears after this list)

  2. Personalizing all correspondence with the company name and contact information for the government project officer at SAMHSA and the principal investigator at ICF Macro

  3. Providing an incentive for participation and describing the incentive in correspondence
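
The multiple-contact strategy in item 1 can be sketched schematically as follows. The launch date, day offsets, and message names are assumptions for illustration, not the cross-site team's actual timeline.

  from datetime import date, timedelta

  # Hypothetical contact schedule following the tailored design method.
  launch = date(2009, 10, 19)  # assumed survey launch date
  contacts = [
      ("pre-notification email", launch - timedelta(days=7)),
      ("invitation with survey link and password", launch),
      ("first reminder to non-respondents", launch + timedelta(days=7)),
      ("second reminder to non-respondents", launch + timedelta(days=14)),
  ]

  for message, send_date in contacts:
      print(f"{send_date:%Y-%m-%d}: {message}")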


Additionally, the cross-site team’s goal was to establish trust with respondents (Dillman, 2000). By describing the GLSMA in correspondence, the cross-site team aimed to indicate to respondents that the survey was sponsored by a legitimate authority, SAMHSA, and that the survey was of great value and importance to the initiative.


The cross-site team also provided a help email address where respondents could send questions, concerns, and requests. Two team members were tasked with responding to help emails within 24 hours. Responses were structured to thank respondents for their email, describe the importance of the survey, answer questions or provide technical assistance, and provide contact information in the event that further communication was warranted.


Finally, as described above, campuses were given the opportunity to choose the incentive method with which they had experienced the most success. Based on that experience, campuses decided which incentive structure to implement (i.e., a reward to each respondent or a lottery-style incentive) and which incentives to provide based on their knowledge of the student body (e.g., iPods, money orders, or gift cards). In addition, campuses were given the opportunity to tailor email notifications about the SPEAKS with their campus name and local contact information, which helped to further legitimize the survey and gave respondents a local resource for questions about the survey. Campuses with higher response rates also appeared to advertise the survey more widely on their campuses, for example through posters and announcements in the student newspaper. We will provide technical assistance to the campuses to ensure the successful implementation of the Dillman (2000) methods and to help them advertise the survey in advance.




References

Dillman, D. A. (2000). Mail and Internet Surveys: The Tailored Design Method. New York, NY: John Wiley & Sons, Inc.

