Assessment of the Communities Talk: Town Hall Meetings
to Prevent Underage Drinking
Supporting Statement
B. Statistical Methods
B.1 Respondent Universe and Sampling Methods
For the Organizer Survey, the respondent universe consists of community-based organizations (CBOs) that received a stipend to conduct at least one biennial Communities Talk event in their community. For the Participant Form, the Communities Talk event attendees make up the respondent universe. The sampling method for both instruments is described below.
Organizer Survey (Initial and 6-month follow-up)
The initial Organizer Survey will be administered to a random sample of the CBOs that registered and received a stipend to conduct a Communities Talk event (N=1,500). For the past six rounds of events (2006-2016), SAMHSA has collaborated with National Prevention Network (NPN) members in each state, the U.S. territories, and the District of Columbia to identify CBOs to organize at least one biennial event in their respective states. For the 2018 and 2020 rounds, SAMHSA will also invite other entities, including institutions of higher education, to identify CBOs to organize events. Once SAMHSA receives these recommendations, it will send the identified CBOs an invitation to host a Communities Talk event (Attachment 6). The CBOs then register online and confirm their participation. At registration, CBOs receive information about obtaining a stipend from SAMHSA to conduct an event in their community and resources to assist in planning the event. They are also notified that a random sample of CBOs will be selected to complete the web-based Organizer Survey following their event, and that a sample of CBOs will be selected to take part in the Participant Form data collection. Finally, CBOs are informed that, at the end of the initial Organizer Survey, event hosts may opt in to a 6-month follow-up survey about any actions taken in the community as a result of their Communities Talk event. Like the initial Organizer Survey, the 6-month follow-up survey will be web-based.
A stratified random sample of 500 CBOs will be selected for the Organizer Survey, with the expectation of achieving 400 completed surveys. The strata will be the 10 HHS regions, defined as groups of states. The first-stage (CBO) sampling frame will be compiled from the event details submitted online by every stipend-recipient CBO in the target population. These sample sizes will ensure that survey estimates are within +/- 5 percentage points at the 95% confidence level. The precision of the Organizer Survey estimates will in fact be somewhat better because of the relatively large sampling fraction: about one-third (500) of the 1,500 eligible CBOs will be selected. The CBO sample will also serve as the base, first-phase sample for the participant survey described in the next section.
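The precision figures above can be checked directly. The sketch below uses the values stated in the text (n = 400 expected completes, N = 1,500 eligible CBOs) and a worst-case proportion of 0.5:

```python
import math

N = 1500   # eligible stipend-recipient CBOs (sampling frame)
n = 400    # expected completed Organizer Surveys
p = 0.5    # worst-case proportion (largest standard error)

se_srs = math.sqrt(p * (1 - p) / n)        # standard error under simple random sampling
fpc = math.sqrt(1 - n / N)                 # finite population correction factor
half_width = 1.96 * se_srs                 # 95% CI half-width, ignoring the fpc
half_width_fpc = 1.96 * se_srs * fpc       # 95% CI half-width with the fpc

print(f"95% half-width (SRS):      {half_width:.1%}")      # about 4.9%
print(f"95% half-width (with fpc): {half_width_fpc:.1%}")  # about 4.2%
```

The unadjusted half-width of about 4.9 percentage points is what supports the "+/- 5 percentage points" claim; the large sampling fraction (400/1,500) shrinks it further, as the text notes.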
The CBOs agree to conduct at least one Communities Talk event in their respective communities, and it is solely up to the CBO as to whether it will conduct more than one event, since this is often based upon resources and funding. The information collected on the Organizer Surveys will be used to document the nationwide events, determine if the events lead to additional activities within the community that are aimed at preventing and reducing underage drinking (UAD), identify what these activities may include, and help to plan for future rounds of Communities Talk events.
Participant Form
To determine the Participant Form respondents, we will design and select a two-phase, two-stage stratified random sample of CBOs (events) and participants in the Communities Talk events. The sample is designed to address the main objectives of the study: assessing whether the events raise public awareness of underage drinking and high-risk drinking, and capturing the intent of community members to engage in the prevention of UAD following the Communities Talk events.
At the first stage, we will select a random subsample of eligible CBOs with approximately equal probabilities from the first-phase CBO sample selected for the Organizer Survey (i.e., the CBO sample for the participant survey will be drawn from the approximately 400 CBOs expected to complete the Organizer Survey). We will identify whether each CBO is holding an online or an in-person event and restrict the subsample to in-person events. At the second stage, we will select all participants from each sample CBO to simplify operations, anticipating an average of n=40 participants per CBO. At an anticipated response rate of approximately 75-80 percent, the number of responding participants will average about 30 per CBO.
The expected precision depends on the anticipated intracluster correlation within CBOs (the homogeneity of clusters), i.e., on how similar participants within the same CBO are on the key survey variables. As discussed in the brief precision analysis below, we anticipate substantial clustering effects. For 150 sample CBOs and an average of 30 responding participants per sample CBO, the anticipated total number of completed surveys is 4,500. Effective sample sizes that account for design effects (DEFFs), however, range from 4,500 (DEFF=1.0) down to 1,500 (DEFF=3.0). As discussed next, these sample sizes are large enough to protect against design effects as large as 3.0 or even 4.0. We also discuss below the precision gains due to the finite population correction (fpc).
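The link between the intracluster correlation and the design effect can be sketched with the standard Kish approximation, DEFF = 1 + (m - 1) * rho, for clusters of equal size m. The rho values below are illustrative assumptions, not figures from the text:

```python
# Effective sample size under clustering (Kish design-effect approximation).
n_total = 4500   # anticipated completes: 150 CBOs x 30 respondents
m = 30           # average completes per CBO (cluster size)

for rho in (0.0, 0.035, 0.07, 0.10):   # illustrative intracluster correlations
    deff = 1 + (m - 1) * rho           # DEFF = 1 + (m-1)*rho
    n_eff = n_total / deff             # effective sample size
    print(f"rho={rho:.3f}: DEFF={deff:.2f}, effective n={n_eff:.0f}")
```

Under this approximation, a DEFF of about 3.0 corresponds to an intracluster correlation of roughly 0.07 at 30 respondents per CBO, which illustrates why even modest within-event homogeneity produces the substantial design effects anticipated above.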
The precision calculations are focused on estimated percentages, or proportions, that apply to the key survey variables of dichotomous form. Examples include:
Did you learn anything new about UAD and its associated problems that you didn’t know before attending the Communities Talk event?
Do you plan to share any material(s) or lessons learned from the event with others?
Did you learn of specific ways in which you, as an individual, can help to prevent UAD?
The table below shows the standard error of estimated percentages for a range of design effects (DEFFs). The design effect is defined as the variance under the actual design divided by the variance under simple random sampling. To be conservative, we use percentages of 50 percent, where standard errors are the largest possible. The design effect reflects clustering effects that tend to inflate variances.
Standard error of estimated percentages based on a sample of 150 CBOs (4,500 participants)
Design Effect (DEFF) | Standard Error
1.0 | 0.7%
2.0 | 1.1%
3.0 | 1.3%
4.0 | 1.5%
The table shows that estimated percentages will be within approximately +/- 3 percentage points for 95 percent confidence intervals even if design effects are as large as 4.0, and within +/- 2.5 points for design effects of 3.0 or less. If design effects are expected to be smaller (for example, 2.0 or less), we could consider CBO sample sizes as small as 100, which yields about 3,000 completes and an effective sample size of 1,500 for DEFF=2. To be conservative, we plan to select 150 sample CBOs, providing the precision described in the table above. The random sample selected for the Participant Form will also reflect precision gains due to the finite population correction (fpc) associated with a CBO sampling fraction of 150/1,500, or 0.10. Variances will be 90% of those shown above, and standard errors will be about 95% of those shown in the table.
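The standard errors in the table, and the fpc adjustment, can be reproduced with a few lines. The sketch below uses the figures from the text: 4,500 completes, a 0.10 CBO sampling fraction, and a worst-case proportion of 0.5:

```python
import math

n = 4500          # completed Participant Forms (150 CBOs x 30 respondents)
p = 0.5           # worst-case proportion (largest standard error)
f = 150 / 1500    # CBO sampling fraction, used for the fpc
fpc = math.sqrt(1 - f)   # standard errors shrink by this factor (~0.95)

for deff in (1.0, 2.0, 3.0, 4.0):
    se = math.sqrt(deff * p * (1 - p) / n)   # SE inflated by the design effect
    print(f"DEFF {deff:.1f}: SE {se:.1%}, with fpc {se * fpc:.1%}")
```

Rounded to one decimal, the unadjusted standard errors match the table (0.7%, 1.1%, 1.3%, 1.5%), and the fpc factor of sqrt(0.90), about 0.95, matches the "standard errors will be about 95% of those shown" statement.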
B.2 Information Collection Procedures
Organizer Survey (Initial and 6-month follow-up)
Within one week following the Communities Talk event, the point of contact at each CBO that hosted an event will be e-mailed instructions on how to access the Organizer Survey (see Attachment 7) and a deadline by which to complete the survey. The coded survey will be provided through a web-based survey system, such as Survey Monkey. As respondents complete each page of the survey and click Next, data entered will be automatically saved. The respondents will be allowed to return to the survey until the last question is answered. Once the survey is completed (i.e., the last question is answered), CBOs will not be allowed to go back into the survey to make changes. Organizers who do not complete the Organizer Survey by the deadline will be sent a reminder e-mail requesting them to complete the survey by another defined date (see Attachment 8).
The last question on the Organizer Survey gives respondents the opportunity to opt in to being contacted about 6 months after the initial survey is completed to follow up on any actions taken as a result of the Communities Talk event. All coded surveys with recorded "yes" responses will be eligible to receive the Organizer Survey – 6-month follow-up instrument. About 6 months after the survey completion date, the request to complete the follow-up survey will be sent to the e-mail address of the CBO point of contact. The same procedures used to distribute the Organizer Survey will be used for the 6-month follow-up instrument (see Attachments 9 and 10).
Participant Form
Sampled CBOs will be asked to obtain feedback from event attendees using the Participant Form at the conclusion of their Communities Talk event. At least 2 weeks prior to the scheduled event, CBOs will be provided copies of the Participant Form and instructions for handling the form. The instructions will cover, among other topics, distributing the forms to attendees; performing visual quality control checks on the forms (e.g., reviewing each form to ensure that the date and location of the event are recorded); and how and where to send the completed forms.
CBOs that already have a mechanism in place to collect feedback from event attendees will be given the option to incorporate questions from the Participant Form into their instrument. Organizers will be provided a preaddressed postage-paid envelope to return Participant Forms to the Communities Talk assessment team within 30 days of their event. CBOs may, at their own discretion, also submit the data electronically to a designated e-mail address. The assessment team will perform data entry of the Participant Forms.
B.3 Methods to Maximize Response Rates
Several methods will be used to maximize response rates:
Contacts at participating CBOs will be sent an invitation e-mail that highlights key expectations for agreeing to host Communities Talk events. One key expectation is that CBOs will document their events and planned follow-up activities by completing the Organizer Survey. CBOs will be informed that they may be selected to obtain feedback from event attendees at the conclusion of their event using the Participant Form. They will also be informed of the opportunity, at the end of the initial Organizer Survey, for event hosts to opt in to a 6-month follow-up survey about any actions taken in the community as a result of their Communities Talk event.
Organizers will be sent an e-mail within one week following their Communities Talk event containing information on how to access the Organizer Survey. A customized code will be used to track the submission of each survey.
Organizers who do not complete the Organizer Survey by an established deadline will be sent a reminder e-mail requesting them to complete the survey by another defined date.
Organizers who are unable to access the Organizer Survey online may be sent an offline version of the survey through e-mail. These organizers will be asked to return the survey by e-mail or toll-free fax to the assessment team. The same methods to maximize response rates for the Organizer Survey will be used for the 6-month follow-up Organizer Survey.
CBOs selected to solicit feedback from event attendees will be sent a reminder e-mail or contacted by phone if Participant Forms have not been returned within 30 days following their Communities Talk event.
The data collection for the 2016 Communities Talk cycle is underway. SAMHSA anticipates that the response rate to all instruments will be at least 80 percent.
B.4 Tests of Procedures
The updated Organizer Survey was sent to individuals at three CBOs that organized a 2016 Communities Talk event. These individuals were asked to identify any question they did not understand or thought they would be unable to answer, and to report how long it took them to complete the instrument. All respondents easily understood the questions, and none were identified as difficult to answer. Based on this feedback, no changes were made to the instrument, and no further pretesting is considered necessary.
B.5 Statistical Consultants
The following individuals provided statistical consultation in development of the surveys and data collection methodology:
M. Cornelius Pierce
Public Health Analyst
HHS/SAMHSA/CSAP/DSD
5600 Fishers Lane, Room 16E85A
Rockville, MD 20857
Phone: (240) 276-2551

Jane Tobler
Vice President/Project Director
Vanguard Communications
2121 L Street NW, Suite 650
Washington, DC 20037
Phone: (202) 248-5452

Christina H. Zurla
Senior Manager
ICF
530 Gaither Road, 6th Floor
Rockville, MD 20850
Phone: (301) 572-0869

Shelby Giles
Account Supervisor/Deputy Project Director
Vanguard Communications
2121 L Street NW, Suite 650
Washington, DC 20037
Phone: (202) 248-5458

Ronaldo Iachan, Ph.D.
Statistician
ICF
530 Gaither Road, 7th Floor
Rockville, MD 20850
Phone: (301) 572-0538

Hope Cummings
Manager
ICF
530 Gaither Road, 6th Floor
Rockville, MD 20850
Phone: (301) 572-0517

Rená A. Agee
Manager
ICF
530 Gaither Road, 7th Floor
Rockville, MD 20850
Phone: (301) 572-0400

Lisa Swanberg
Vice President/Corporate Monitor
Vanguard Communications
2121 L Street NW, Suite 650
Washington, DC 20037
Phone: (202) 248-5489

Linda Sabelhaus
Account Supervisor/Project Coordinator
Vanguard Communications
2121 L Street NW, Suite 650
Washington, DC 20037
Phone: (202) 248-5459
List of Attachments
1. Organizer Survey
2. Participant Form (English version)
3. Participant Form (Spanish version)
4. Organizer Survey – 6-month Follow-up
5. ICF IRB Review Findings Form
6. Invitation to a Communities Talk event
7. Initial E-Mail to Organizers: Organizer Survey
8. Reminder E-Mail to Organizers: Organizer Survey
9. Initial E-Mail to Organizers: Organizer Survey – 6-month Follow-up
10. Reminder E-Mail to Organizers: Organizer Survey – 6-month Follow-up