Youth Knowledge, Attitudes, and Feedback to
Inform Choose Respect Implementation
formerly known as
Youth Advice and Feedback to Inform Choose Respect Implementation
Supporting Statement B
OMB# 0920-0816
Department of Health and Human Services
Centers for Disease Control and Prevention
National Center for Injury Prevention and Control
Division of Violence Prevention
Technical Monitor:
Marie Boyle, M.S.
Phone: 770-488-2040
Email: [email protected]
Fax: 770-488-0701
June 2010
B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
The proposed research will not employ statistical methods to sample respondents for either the focus groups or the online surveys. This section justifies the decision not to use statistical sampling and analysis for the data collection methods that the project will employ.
The project’s proposed data collection procedures are described below.
The methods are:
Focus groups composed of 8 or fewer respondents per group.
Online surveys of 200 respondents per survey.
The anticipated data collections will be small in scale because they are intended to inform an iterative process of developing a health communication campaign, not to be generalized to a specified respondent universe. These audience-specific methods rely not on statistical power, but on the theoretical premise that language is interpreted through shared cultural knowledge and frameworks (Glaser and Strauss, 1967). To increase the likelihood that a message will be noticed, to avoid miscommunication, and to guard against insensitivity in specialized communication to sub-cultural groups, the proposed data-gathering techniques provide “…a ‘window’ on a particular worldview” (Priest, 1996).
By incorporating qualitative and quantitative elements in various mixtures, these methods allow the flexibility for in-depth probing; can be feasible in time-sensitive situations; and have worked well in campaign development for commercial advertising (Glaser and Strauss, 1967).
B. 1. Respondent Universe and Sampling Methods
The respondents who will inform the Choose Respect campaign are youth ages 11 to 18, particularly youth living in high-risk inner-city communities (YHRICC).
For the in-person focus groups, the sample will be drawn from youth in or near three target markets selected based on geographic diversity and access to the target audience. The markets have not yet been determined, but for budgeting purposes we have assumed that the focus groups will be held in Atlanta, Georgia; Detroit, Michigan; and Baltimore, Maryland. These markets were tentatively selected because they are geographically dispersed urban centers in which it is feasible to recruit high-risk urban youth. Participants for the in-person focus groups will be recruited through research vendors’ community-level contacts. Research vendors will contact leaders of community organizations, who will help identify youth who fit the study criteria and may be interested in participating in the focus group discussions. All groups will be recruited from high-risk urban areas and composed based on age, gender, race/ethnicity, and family income (e.g., Hispanic males ages 13 to 14; African American females ages 17 to 18). Recruitment for each focus group will continue until the targeted number of participants has been reached.
For the online survey research, the sample will be drawn from youth whose parents are members of Harris Interactive’s existing database of over 6 million adults (the Harris Poll Online panel) who have expressed interest in participating in online research. Harris Interactive rigorously recruits for and maintains this database of participants to represent demographic characteristics comparable to the U.S. population. One hundred percent of the database participants have confirmed through a two-step process that they want to be part of the database and to be offered opportunities to participate in online research (Harris, 2008).
For online surveys, the project will use convenience sampling to select participants, and all youth will be recruited through their parents. For the in-person focus groups, the project will identify potential participants through community leaders. Because we will not be using a statistical method for sampling the respondents, participants and their responses to study questions will not necessarily be representative of the full universe of youth ages 11 to 18. However, because these data will be collected for program improvement purposes only, a non-sampling approach is appropriate for our needs.
Based on past experience conducting online research, we expect that the response rate for the online surveys will be somewhere between 5 percent and 50 percent. Because we will not be using a statistical method to sample and recruit participants, this relatively low response rate will not affect the accuracy or usefulness of the data obtained. An examination of response rates for the in-person focus group research is not relevant given that, as a research method, focus groups are not intended to function as a representative sample of a larger population, but rather to provide insights about the range of ways the target audience perceives a situation or tactic (Krueger, 1988; Stewart and Shamdasani, 1990). That said, based on past experience, we expect at least six of the eight participants we recruit for each focus group to show up for the group.
B. 2. Procedures for the Collection of Information
Focus Groups
For the in-person focus groups, Ogilvy will work with specialized research vendors to recruit participants using in-person recruitment and a screener instrument (See Attachment M, “Focus Group Screening Instrument for Youth and Script for Obtaining Verbal Consent of Parent”). All participants will be recruited through personal contacts, and permission to participate will be obtained from parents/guardians. For more information on recruiting, see section A.1.
A screener instrument is a questionnaire designed for recruiters to use to identify qualifying participants during a brief conversation or, in the case of online surveys, through completion of a brief online questionnaire. “Screeners” are carefully structured so that the questioning process is short, easy to understand, friendly, and efficient (See Attachment M). Screeners for the in-person groups will be administered to youth during a short conversation, during which recruiters will explain that the youth will be compensated for participation in the focus group. As described in Section A.9, incentives will be required to ensure acceptable response rates and high-quality data. Incentives will vary slightly across groups, based on local cost-of-living differences. The amount of compensation will be roughly $50 to $60 per person.
Upon successful recruitment, the parent or guardian of each participating youth will receive a confirmation letter from the research vendor. The letter will contain logistical information (e.g., address and directions to the focus group location, a reminder about the date and time) and an informed permission form for the parent (See Attachment K, “Focus Group Parental Permission Form”). Each parent or guardian will be required to return the signed permission form prior to their youth’s participation in the focus group. Upon arriving at the focus group location, the youth also will be asked to read and sign an assent form prior to the start of the focus group (See Attachment L, “Focus Group Youth Assent Form”).
After assenting to participate, respondents will be asked to complete a brief written survey while they wait for the focus group to begin (See Attachment N, “Focus Group Survey”). Once all surveys have been completed, the respondents in each group will meet in a room with a trained moderator and closed-circuit television, which will allow CDC and Ogilvy staff to listen and observe. (Note: One-way mirrors will not be used because the focus groups will not be held in traditional focus group facilities.) The moderator will explain the study, inform the group of taping and observation, and lead a discussion using one of the moderator’s guides (See Attachments D-1 and D-2). Responses will be collected by audiotape, and the observers will take notes. Following each focus group, the tapes will be transcribed for qualitative analysis by Ogilvy staff. The tapes and observer notes will be destroyed once the final focus group report has been submitted to CDC.
Online Surveys
The youth online survey respondents will be recruited through their parents, allowing the project to collect parental permission in addition to youth assent and to protect the privacy of the youth participants.
Harris Interactive has an existing database of adults (the Harris Poll Online panel) who have expressed an interest in participating in online research. Harris maintains basic demographic information about the members of the panel, including the presence and age of children in the household. Harris will select a random sample of adults who have children in the household. Each parent will receive an email from Harris Interactive (see Appendix E) explaining the general topic of the survey and containing a password-protected link to a secure Web site for the survey. The password-protected link will be uniquely assigned to the parent’s email address to ensure that a respondent only completes the survey one time.
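For illustration only, the following minimal sketch, written in Python, shows one way a single-use token could be tied to each parent’s email address so that the survey can be completed only once. The names, URL, and logic are hypothetical and do not describe Harris Interactive’s actual system.

import secrets

BASE_URL = "https://survey.example.org/choose-respect"  # placeholder address, not a real site

tokens_by_email = {}      # parent email -> unique token embedded in the invitation link
completed_tokens = set()  # tokens that have already been used to submit a survey

def create_invitation_link(parent_email):
    # Assign one random token per email address and return the personalized link.
    token = tokens_by_email.setdefault(parent_email, secrets.token_urlsafe(16))
    return BASE_URL + "?token=" + token

def accept_submission(token):
    # Accept a submission only if the token was issued and has not already been used.
    if token not in tokens_by_email.values() or token in completed_tokens:
        return False
    completed_tokens.add(token)
    return True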
After clicking on the link, parents will be directed to the parent screener (Appendix F), which will be used to determine whether the adults have children living in their households who qualify for the study. Parents also will be provided with information about the purpose of the survey and an opportunity to either provide or decline parental permission for their children to participate.
If a parent gives permission for their child to participate, and if the child is determined to be eligible based on the parent’s responses to the screener questions, the parent will be directed either to bring the child to the computer at that time to complete the youth screener and survey or to follow instructions for having the child complete them at a later time.
The child will then complete a short screener requesting their grade, age, and gender to confirm that they qualify and that they are the child for whom the parent provided permission. The child then will be provided with a brief description of the project and asked for their assent to participate. Upon obtaining assent, the Web site will direct the youth to another page within the secure site to complete the survey. The child will not be able to return to the parent portion of the survey. Once youth have completed the survey, they will be able to see how their responses to selected questions compare with the aggregated responses of other participants. Please see Appendix H for the online youth screener and Appendix I for the youth assent script. Appendix C contains the sample survey.
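The screening-and-assent sequence described above can be summarized in the following illustrative Python sketch. The data fields and the eligibility check (ages 11 to 18) are hypothetical placeholders, not the actual survey instrument or the vendor’s implementation.

def run_online_flow(parent):
    # Parent screener: is there a qualifying child (here, ages 11 to 18) in the household?
    child = next((c for c in parent["children"] if 11 <= c["age"] <= 18), None)
    if child is None:
        return "not eligible"
    if not parent["grants_permission"]:
        return "permission declined"
    # Child screener: the child's self-reported age and gender must match the child
    # for whom the parent gave permission.
    if (child["self_reported_age"], child["self_reported_gender"]) != (child["age"], child["gender"]):
        return "screener mismatch"
    if not child["assents"]:
        return "assent declined"
    # The youth completes the survey; the parent portion is no longer reachable.
    return "survey administered"

example_parent = {
    "grants_permission": True,
    "children": [{"age": 14, "gender": "F", "self_reported_age": 14,
                  "self_reported_gender": "F", "assents": True}],
}
print(run_online_flow(example_parent))  # prints "survey administered"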
B. 3. Methods to Maximize Response Rates and Deal with Nonresponse
To encourage participation, focus group meetings will be held in locations that are convenient, easily accessible by public transportation, and served by safe, easy parking. The group discussions will be held in clean, safe, and comfortable environments. In addition, the letter that parents will receive a few days after the post-recruitment telephone call during which parental permission is obtained will serve as a reminder about the focus group. Based on past experience, we expect at least 6 of the 8 youth recruited for each group to attend, providing at least a 75 percent response rate.
For the online surveys, we expect a response rate of between 5 percent and 50 percent, based on past experience administering similar surveys. Because these data will be collected for program improvement purposes only, and the project is not using a statistical method for selecting participants, this relatively low response rate will not affect the usefulness or the accuracy of the data collected. To improve the response rate, one reminder invitation will be emailed two days after the initial invitation to those parents whose children have not yet completed the survey.
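As an illustration only, the following Python sketch shows the reminder rule described above: a single reminder, sent two days after the initial invitation, only to parents whose children have not yet completed the survey. The field names and sample data are hypothetical.

from datetime import date, timedelta

def parents_due_reminder(invitations, today):
    # invitations: list of dicts with "email", "sent_on" (date), "completed" (bool),
    # and "reminded" (bool). Returns the addresses due their single reminder.
    due = []
    for inv in invitations:
        if (not inv["completed"] and not inv["reminded"]
                and today - inv["sent_on"] >= timedelta(days=2)):
            due.append(inv["email"])
    return due

sample = [{"email": "parent@example.org", "sent_on": date(2010, 6, 1),
           "completed": False, "reminded": False}]
print(parents_due_reminder(sample, date(2010, 6, 3)))  # prints ['parent@example.org']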
B. 4. Tests of Procedures or Methods to be Undertaken
The online survey (See Attachment C, “Online Survey”) and the focus group moderator guides (See Attachments D-1 and D-2 for the two types of moderator’s guides) were developed using standard focus group discussion and online survey design procedures. The nature and framing of the questions are consistent with questions that have been posed successfully to youth audiences on behalf of other national health communication initiatives, including CDC’s VERB campaign, a national, multicultural, social marketing initiative to increase and maintain physical activity among youth ages 9 to 13, and the Office of National Drug Control Policy’s National Youth Anti-Drug Media Campaign, a national initiative to keep youth drug-free that targets 9- to 18-year-olds. In addition, several of the online survey questions (questions 1 and 6-13) were used in the 1997 household survey of U.S. youth, YouthStyles. These questions had an average response rate of 96 percent, indicating that youth did not have difficulty understanding or answering them.
The moderator guides and survey questions have been thoroughly reviewed by CDC and Ogilvy staff, as well as by our research partners. In addition, the questions will be pretested internally, using no more than nine individuals to estimate the length of time it will take to complete the questions, as well as to identify any questions that are confusing or difficult to answer.
B. 5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
CDC Staff:
Marie Boyle, MS
770-488-2040
The persons who assisted with designing the data collection and who will analyze the data are:
Ogilvy Staff:
Jennifer Wayman, M.H.S.
202-729-4161
Michael Briggs
202-729-4198
Patricia Taylor, Ph.D.
202-729-4271
Nancy Accetta, M.H.S., CHES
202-729-4167
Jennifer Scott, Ph.D.
212-880-5260
Harris Interactive:
Annette Abell, M.B.A.
585-214-7386
Dana Markow, Ph.D.
212-212-9676
REFERENCES
Andreasen A. Marketing social change: changing behavior to promote health, social development, and the environment. San Francisco: Jossey-Bass; 1995.
Asbury LD, Wong FL, Price SM, Nolin MJ. 2008. The VERB™ campaign: applying a branding strategy in public health. American Journal of Preventive Medicine 34 (6): S183-S187.
Bergman L. (1992). Dating violence among high school students. Social Work, 37:21-27.
Berlin, M., Mohadjer, L., Waksberg, J., Kolstad, A., Kirsch, I., Rock, D., & Yamamoto, K. (1992). An experiment in monetary incentives. In the American Statistical Association (ed.), Proceedings of the American Statistical Association Section on Survey Research Methods (pp. 393-398). Alexandria, VA: American Statistical Association.
Black, MC, Noonan, R, Legg, M, Eaton, D, & Breiding, MJ. (2006). Physical dating violence among high school students--United States, 2003. MMWR Weekly, 55, 532-535.
Bowman, RL, & Morgan, HM. (1998). A comparison of rates of verbal and physical abuse on campus by gender and sexual orientation. College Student Journal, 32, 43-52.
Campbell, J. (2002). Health consequences of intimate partner violence. The Lancet, 359, 1331-1336.
Cano, A, Avery-Leaf, S, Cascardi, M, & O’Leary, DK. (1998). Dating violence in two high school samples: Discriminating variables. Journal of Primary Prevention, 18, 431-446.
Carlin, DB, & McKinney, MS. (Eds.). (1994). The 1992 presidential debates in focus. Westport, CT: Praeger.
Carver, K, Joyner, K, & Udry, JR. (2003). National estimates of adolescent romantic relationships. In P. Florsheim (Ed.), Adolescent romantic relations and sexual behavior: Theory, research, and practical implications. Mahwah, NJ: Erlbaum.
Centers for Disease Control and Prevention. 2003. Costs of intimate partner violence against women in the United States. Atlanta, GA. Available online at http://www.cdc.gov/ncipc/pub-res/ipv_cost/ipv.htm [Accessed on February 13, 2008].
Centers for Disease Control and Prevention. Surveillance Summaries, June 6, 2008. MMWR:57(No.SS-4). Table 11.
Centers for Disease Control and Prevention. 2010a. Rationale for Prevention of TDV in High-Risk [Economically Disadvantaged], Inner City Populations. Atlanta, GA. (Unpublished)
Centers for Disease Control and Prevention. 2010b. Review of the research: planning for the Teen Dating Violence Prevention Initiative. (Unpublished)
Coreil J, Bryant CA, Henderson JN. 2000. Social and Behavioral Foundations of Public Health. Thousand Oaks, CA: Sage
Church, A.H. (1993). Estimating the Effect of Incentives on Mail Survey Response Rates: A Meta-Analysis. Public Opinion Quarterly, 57, 62-79.
Creswell, JW. (2003). Research Design. Qualitative, Quantitative, and Mixed Methods Approaches. Second Edition. Thousand Oaks: Sage Publications.
Creswell, JW, Plano Clark, VL, Gutmann, ML, & Hanson, WE. (2003). Advanced mixed methods research designs. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social & behavioral research (pp. 209-240). Thousand Oaks, CA: Sage.
Delli Carpini, MX, & Williams, BA. (1994). Methods, metaphors, and media research: The uses of television in political conversation. Communication Research, 21, 782-812.
Feiring, C. (1996). Lovers as friends: Developing conscious views of romance in adolescence. Journal of Research on Adolescence, 7, 214-224.
Findlay, J.S., & Shaible, W. L. (1980). A Study of the Effect of Increased Remuneration on Response in a Health and Nutrition Examination Survey. In the American Statistical Association (ed.), Proceedings of the American Statistical Association Section on Survey Research Method. (pp. 590-594). Washington, D.C.: American Statistical Association.
Foshee, V, et al. 1996. Gender differences in adolescent dating abuse: prevalence, types and injuries. Health Education Research: Theory & Practice 11(3): 275-286.
Glaser, BG, & Strauss, AL. (1967). The discovery of grounded theory: Strategies for qualitative research. Chicago: Aldine Publishing Company.
Grier, S, & Bryant, CA. (2005). Social marketing in public health. Annual Review of Public Health, 26, 319-339.
Harris Interactive. (2008). What To Look for When Considering Online Research: The Fundamentals. New York: Harris Interactive.
Heckathorn, DD. (2002). "Respondent-Driven Sampling II: Deriving Valid Estimates from Chain-Referral Samples of Hidden Populations". Social Problems 49: 11-34.
Kotler P, Roberto N, Lee N. 2002. Social marketing: improving the quality of life. Thousand Oaks (CA): Sage.
Krueger, RA. (1988). Focus Groups: A Practical Guide for Applied Research. Thousand Oaks, CA: Sage Publications.
Kulka, R. A. (1994) The Use of Incentives to Survey “Hard-to-Reach” Respondents: A Brief Review of Empirical Research and Current Practice. Paper prepared for the Council of Professional Associations on Federal Statistics’ Seminar on New Directions in Statistical Methodology. Bethesda, MD.
Lane K, Gwartney-Gibbs P. Violence in the context of dating and sex. Journal of Family Issues 1985;6:45-59.
Lenhart A, Madden M. 2007. Social Networking Websites and Teens: An Overview. Washington. Pew Internet & American Life Project. Available online at http://www.pewinternet.org/pdfs/PIP_SNS_Data_Memo_Jan_2007.pdf [Accessed on February 25, 2008].
Magdol, L, Moffitt, TE, Caspi, A, & Silva, P. (1998). Developmental antecedents of partner abuse: A prospective-longitudinal study. Journal of Abnormal Psychology, 107, 375-389.
Makepeace JM. Social factors and victim-offender differences in courtship violence. Family Relations 1987;36:87-91.
Morgan, DL. (1988). Focus Groups As Qualitative Research. Newbury Park: Sage Publications.
MRI. (2008). MRI Teenmark Study.
Olsen S. 2007. Kids say email is like sooo dead. CNET.com. Available online at http://www.news.com/Kids-say-email-is,-like,-soooo-dead/2009-1032_3-6197242.html?tag=nefd.lede [Accessed May 1, 2008].
O’Keefe, M. 2005. Teen dating violence: a review of risk factors and prevention efforts. National Resource Center on Domestic Violence (2).
O’Keefe M. (1998). Factors mediating the relationship between witnessing interparental violence and dating violence. Journal of Family Violence, 13:39-57.
O’Leary, KD, & Slep, AS. (2003). A dyadic longitudinal model of adolescent dating aggression. Journal of Clinical Child and Adolescent Psychology, 32, 314-327.
Priest, SH. (1996) Doing Media Research: An Introduction. Thousand Oaks California: Sage Press
Rennison, CM, Welchans, S. (2000). Intimate partner violence. U.S. Department of Justice, Bureau of Justice Statistics Special Report. Retrieved September 16, 2006 from http://www.ojp.usdoj.gov/bjs/pub/pdf/ipv.pdf
Salganik, MJ & Heckathorn, DD. (2004). "Sampling and Estimation in Hidden Populations Using Respondent-Driven Sampling". Sociological Methodology 34: 193-239.
Sigelman CK, Berry CJ, Wiles KA. (1984). Violence in college students’ dating relationships. Journal of Applied Psychology, 5(6): 530-548.
Silverman, JG & Williamson, GM. (1997) Social Ecology and Entitlements Involved in Battering by Heterosexual College Males: Contributions of Family and Peers. Violence and Victims, 12(2): 147-164.
Singer, E., Gebler, N., Raghunathan, T., VanHoewyk, J., & McGonagle, K. (in press). The Effect of Incentives on Response Rates in Face-to-Face, Telephone, and Mixed Mode Surveys. Journal of Official Statistics.
Smith, PH, White, JW, Holland, LJ. (2003). A longitudinal perspective on dating violence among adolescent and college-age women. American Journal of Public Health, 93(7): 1104-1109.
Stets J, Henderson D. Contextual factors surrounding conflict resolution while dating: Results from a national study. Family Relations 1991;40:29-36.
Stewart, DW, Shamdasani, PN. (1990). Focus Groups: Theory and Practice. Volume 20. Newbury Park: Sage Publications.
Sugarman, DB & Hotaling, GT. (1989) Dating violence: Prevalence, context, and risk markers. In M. Pirog-Good and J. Stets (Eds.), Violence in dating relationships: Emerging social issues (pp. 3-32). New York: Praeger.
Plichta, SB. Violence and abuse: implications for women’s health. In: Falik MM, Collins KS, editors. Women’s health: the commonwealth survey. Baltimore (MD): Johns Hopkins University Press; 1996.
Taylor, H. (2007). The case for publishing (some) online polls. The Polling Report, 23, 1.
Teen Research Unlimited (TRU). 2008. The TRU Study 2008: U.S. Teen Edition.
U.S. Department of Health and Human Services, National Institutes of Health, National Cancer Institute. 1989. Making Health Communication Programs Work. Washington (DC): Available online at http://www.cancer.gov/pinkbook/page1 [Accessed on July 28, 2008].
U.S. Department of Labor, Bureau of Labor Statistics. 2009. May 2009 National occupational and wage estimates (United States). Washington, DC. Available online at www.bls.gov/oes/current/oes_nat.htm [Accessed on May 19, 2010].
Wolfe, DA, Wekerle, C, Scott, K. 1997. Alternatives to violence: empowering youth to develop healthy relationships. Thousand Oaks (CA): Sage.
YPulse. July 14, 2006. Ten Biggest Themes of ‘What Teens Want.’ Available online at http://ypulse.com/archives/2006/07/the_ten_biggest.php [Accessed on May 1, 2008].
www.Bloomberg.com