
Youth Knowledge, Attitudes, and Feedback to

Inform Choose Respect Implementation


formerly known as


Youth Advice and Feedback to Inform Choose Respect Implementation




Supporting Statement A



OMB# 0920-0816







Department of Health and Human Services

Centers for Disease Control and Prevention

National Center for Injury Prevention and Control

Division of Violence Prevention



Technical Monitor:

Marie Boyle, M.S.

Phone: 770-488-2040

Email: [email protected]

Fax: 770-488-0701




June 2010


Table of Contents


A. Justification

A1. Circumstances Making the Collection of Information Necessary

A2. Purpose and Use of the Information Collection

A3. Use of Improved Information Technology and Burden Reduction

A4. Efforts to Identify Duplication and Use of Similar Information

A5. Impact on Small Businesses or Other Small Entities

A6. Consequences of Collecting the Information Less Frequently

A7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

A8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

A9. Explanation of Any Payment or Gift to Respondents

A10. Assurance of Confidentiality Provided to Respondents

A11. Justification for Sensitive Questions

A12. Estimates of Annualized Burden Hours and Costs

A13. Estimates of Other Total Annual Cost Burden to Respondents or Recordkeepers

A14. Annualized Cost to the Government

A15. Explanation for Program Changes or Adjustments

A16. Plans for Tabulation and Publication and Project Time Schedule

A17. Reason(s) Display of OMB Expiration Date Is Inappropriate

A18. Exemptions to Certification for Paperwork Reduction Act Submissions


B. Collections of Information Employing Statistical Methods

B1. Respondent Universe and Sampling Methods

B2. Procedures for the Collection of Information

B3. Methods to Maximize Response Rates and Deal with Nonresponse

B4. Tests of Procedures or Methods to be Undertaken

B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


References

List of Tables

Table 1: Estimated Annualized Burden Hours

Table 2: Estimated Annualized Burden Costs

Table 3: Estimated Annualized Cost to the Government


List of Attachments

A. Section 301 of the Public Health Services Act (42 USC 241)

B. 60-Day Federal Register Notice

C. Online Survey

D-1. Attitudinal Exploratory Focus Group Moderator’s Guide

D-2. Strategic Exploratory Focus Group Moderator’s Guide

E. Online Survey Email Invitation

F. Online Survey Screening Instrument for Parents

G. Online Survey Parental Permission Form

H. Online Survey Screening Instrument for Youth

I. Online Survey Youth Assent Form

J-1. IRB Approval Letter

J-2. IRB Approval of Continuation of Protocol

K. Focus Group Parental Permission Form

L. Focus Group Youth Assent Form

M. Focus Group Screening Instrument for Youth and Script for Obtaining Verbal Consent of Parent

N. Focus Group Survey

O. Response to Public Comment

P. Public Comment

A. JUSTIFICATION

Please note that this justification statement has been revised to:


1) Reflect minor adjustments in the ages of youth who will participate in the focus group studies and surveys (from 11 to 14 to 11 to 18). Rationale: Youth ages 11 to 14 will remain the core audience for CDC’s communications campaign. However, we are adding youth ages 15 to 18 to the research because of the influence these “near peers” have on the younger audience. We need to conduct research to better understand 15- to 18-year-olds and the influence they have regarding healthy dating relationships. Note: This revised submission does not request an increase in the overall burden to the public.


2) Obtain permission to ask knowledge and attitude questions at some of the focus groups. Rationale: CDC must understand youths’ knowledge and attitudes to inform the development of messages and materials that will resonate with youth.


A. 1. Circumstances Making the Collection of Information Necessary


Background


This Information Collection Request (ICR) is a revision of the previously approved OMB# 0920-0816. This is a three-year study. The revision changes the study name from “Youth Advice and Feedback to Inform Choose Respect Implementation” to “Youth Knowledge, Attitudes, and Feedback to Inform Choose Respect Implementation” and adds youth ages 15 to 18 to the research. The proposed participants will be youth ages 11 to 18, specifically youth living in high-risk, inner-city communities (YHRICC).


This research will be conducted because CDC’s Division of Violence Prevention is developing an initiative to promote respectful, nonviolent dating relationships for 11- to 14-year-old YHRICC—a group that may be at increased risk of teen dating violence (TDV) due to concentrated poverty, lack of resources, and exposure to community violence (CDC, 2010a). In addition, few, if any, TDV prevention programs exist for this audience. As part of the initiative, CDC plans to engage 15- to 18-year-olds, who have significant influence on their younger peers (CDC, 2010b), to help reinforce messages about healthy dating relationships. For this reason, 15- to 18-year-olds will also participate in the proposed information collections.


Dating abuse is defined as physical, sexual, or psychological/emotional violence within a dating relationship, and it often starts at an early age. One in 11 high school students reports experiencing physical dating violence within the past 12 months, and psychological and physical abuse occurs in as many as one in three adolescent relationships (O’Keefe, 2005). In addition to its mental and physical consequences, dating abuse is a problem because of its association with other risky adolescent behaviors (e.g., fighting, binge drinking, sexual activity, and suicide attempts) (Black, Noonan, Legg, Eaton, & Breiding, 2006) and negative long-term health outcomes for women (e.g., chronic pain, gastrointestinal disorders, poorer pregnancy outcomes, depression, and post-traumatic stress disorder) (Campbell, 2002; Plichta, 1996).


All too often, dating abuse becomes a pattern. Evidence indicates an association between a history of dating abuse and the likelihood of current dating violence (Cano, Avery-Leaf, Cascardi, & O’Leary, 1998). Youth who report perpetrating physical violence against their partners are likely to perpetrate violence with the same partner again (O’Leary & Slep, 2003). In addition, abused teens often carry the patterns of violence into future relationships. Physically abused teens are three times more likely than their non-abused peers to experience violence during college. In adulthood, they are more likely to be involved in intimate partner violence (Smith, 2003). An estimated 5.3 million incidents of intimate partner violence occur among adult women in the U.S. each year, resulting in approximately 2 million injuries and 1,300 deaths (CDC, 2003).


Focusing on youth who are beginning to initiate dating may be warranted to prevent the establishment of abusive beliefs, attitudes, and behavioral patterns (Magdol, Moffitt, Caspi, & Silva, 1998). Evidence suggests that adolescents who develop skills such as negotiation, compromise, and conflict resolution may be better prepared to establish healthy, nonviolent relationships with others (Wolfe & Wekerle, 1997). The National Crime Victimization Survey found that females ages 16 to 24 experience the highest rates of intimate partner violence (IPV) (Rennison & Welchans, 2000), and evidence suggests that dating relationships have been initiated by about 25 percent of 12-year-olds (Carver, Joyner, & Udry, 2003), up to 50 percent of 15-year-olds (Feiring, 1996), and nearly 75 percent of 8th and 9th graders (Foshee et al., 1996). Early adolescents therefore appear to be an appropriate and strategic audience for prevention efforts.

Focusing on the subpopulation of YHRICC is further justified by research indicating that these youth are at greater risk of becoming involved in unhealthy dating relationships. For example, both inflicting and being a victim of TDV have been found to be significantly related to low socioeconomic status (Makepeace, 1987; O’Keefe, 1998; Sigelman, Berry, & Wiles, 1984; Stets & Henderson, 1991), and TDV is more common among students from urban than rural areas (Bergman, 1992; Lane & Gwartney-Gibbs, 1985; Makepeace, 1987). Furthermore, analysis of CDC’s state-level surveys demonstrates that in 2007, African American students were the most likely to be victims of dating violence (14 percent), followed by Hispanic students (11 percent) and white non-Hispanic students (8 percent) (CDC, 2008).



As the nation’s premier prevention agency, the Centers for Disease Control and Prevention (CDC) has included among its top research priorities efforts aimed at preventing IPV, sexual violence, and child maltreatment. Section 301 of the Public Health Services Act (42 USC 241) authorizes CDC to conduct research relating to the prevention and control of disease. A copy of this legislation is provided in Attachment A, “Section 301 of the Public Health Services Act (42 USC 241).”


CDC priorities in the area of violence prevention include the following:

  • Evaluating programs and policies that intervene with perpetrators and potential perpetrators before violence occurs, focusing on programs and policies that address multiple types of violence, and

  • Identifying attitudes that support intimate partner violence, sexual violence, and child maltreatment, and evaluating strategies to change them, as has been done with such problems as smoking and risky behaviors.


Congruent with these priorities, in May 2006, the CDC’s National Center for Injury Prevention and Control (NCIPC) introduced the Choose Respect initiative to help adolescents (ages 11 to 14) form healthy relationships to prevent dating abuse before it starts. The initiative is designed to do the following: provide effective messages for adolescents, parents, caregivers, and teachers that encourage them to treat themselves and others with respect; create opportunities for adolescents and parents to learn about positive relationship behaviors; increase adolescents' ability to recognize and prevent unhealthy, violent relationships; and promote ways for a variety of audiences to get information and other tools to prevent dating abuse. To date, Choose Respect has focused on strategies and tactics to reach and engage influencer audiences, such as parents, in messages about dating abuse prevention. Moving forward, CDC’s priority is to target and engage youth directly in the campaign, particularly YHRICC.


Research conducted in 2002 (which included formative research with youth, an omnibus survey, information obtained from experts in the field of IPV, and a literature review) indicated that the Choose Respect initiative and its associated materials should: target adolescents early, before unhealthy habits are formed; speak to the settings where youth spend time; contain content that addresses realistic situations and scenarios; use phrases and words that real teens would use; feature adolescents from diverse backgrounds; and contain messages to change social norms around healthy/unhealthy relationship behaviors. This research also recommended that Choose Respect messages be credible and thought-provoking, and that the executions of the campaign be relevant, age-appropriate, meaningful, and appealing.


However, since the original Choose Respect research was conducted, adolescents have changed. The cohort of youth with whom CDC conducted testing and research in 2002 has aged out of the campaign’s target audience of 11- to 14-year-olds, and an entirely new cohort of youth has taken their place. Furthermore, the ways adolescents today spend their time, access information, and interpret messages have changed. Fueled by new technologies, Web sites, and social networking domains, such as Facebook and MySpace, large numbers of adolescents today share and create materials online (Olsen, 2007). For example, 64 percent of online adolescents ages 12 to 17 engage in at least one type of content creation for Web sites or blogs, up from 57 percent of online teens in 2004 (Lenhart and Madden, 2007). Forty-one percent of the adolescents who use MySpace, Facebook, or other social networking sites say they send messages to friends via those sites every day (TRU, 2008). In addition, more than 16 million 13- to 17-year-olds (approximately 60 percent) use mobile phones in the U.S. today, and the numbers continue to grow (YPulse, 2006). Notably, research suggests that general-population youth and those living in high-risk urban communities have similar technology and media consumption habits (MRI, 2008).


These new communication tools, techniques, and practices affect teens’ lives in ways beyond providing an outlet for communication. For many teens, these mechanisms are now an integral part of the communication system they use to carry out daily activities. Adolescents use them to get information on everything from health topics to homework assignments. Adolescents are also increasingly using their social networks to learn about new products and ideas, and unlike older generations, young people see the digital space as just another place to interact with their friends (TRU, 2008). Their interactions online are characterized by an expanded sense of “community” and a desire to make their online interactions reflect personal feelings, thoughts, and desires (Olsen, 2007).



While youth endorse and engage in more interactive communication, their attitudes toward traditionally used health message dissemination channels, such as public service announcements, educational videos, and advertisements, are deteriorating. Only 6 percent of adolescents think advertisements tell the truth, 11 percent believe what famous people say about products is true, and 3 percent trust what someone in a chat room says about a product. The majority think it is inappropriate to be contacted for advertising purposes through instant messaging (97 percent), text messaging (94 percent), and social networking sites (91 percent) (TRU, 2008). To remain effective and reach today’s adolescents, including YHRICC, Choose Respect must better understand current adolescent motivations, habits, communication tools, and preferences. This information can then be used to update communication strategies, subsequent messages, and delivery channels.


The development of effective materials and communication strategies depends on a deep understanding of the audience. Supported by Ogilvy Public Relations Worldwide (Ogilvy), the contractor awarded the communications task order, CDC is seeking to conduct additional audience research with 11- to 14-year-olds (CDC’s target audience) and 15- to 18-year-olds (those who will help reinforce healthy dating messages for younger peers) to inform the development and testing of Choose Respect creative concepts, messages, materials, and planned communication strategies and tactics. The information and feedback gained through this process will be used to develop, revise, and enhance content, materials, and communication approaches to ensure that the campaign effectively reaches 11- to 14-year-olds, particularly YHRICC.


Ogilvy has conducted limited informal background research to inform initial campaign planning, including a series of interviews with experts in the field of youth risk behavior change (see Section A.4). All of the experts emphasized the importance of gathering feedback and input directly from adolescents themselves in order to create an authentic youth voice for the campaign. They also stressed that the youth communications environment is rapidly changing, making it critical for the campaign to be nimble and able to implement changes quickly with respect to communication channels and campaign messages.


In order to reflect and express an authentic voice through relevant channels and realistic messages, it is essential for the Choose Respect initiative to tap the audience at frequent time points to understand youths’ knowledge and attitudes and gather feedback on possible channels, messages, and materials. For example, the experts suggested that youth ages 11 to 12 may require different messages than those ages 13 to 14 in order for the campaign to appeal to them. If this recommendation is supported by the campaign’s other audience research, individual youth will age out of their audience segment every one to two years. Given the rapidly changing communication landscape in which today’s youth operate, what tested well one year will likely be irrelevant or downright “uncool” the next. The proposed research approach will allow Choose Respect to integrate current, relevant information and youth feedback into campaign planning and decision-making on an ongoing basis. To base planning for the Choose Respect initiative on a deep understanding of YHRICC from the earliest stages of program development, it is essential to conduct this research as soon as possible.


Privacy Impact Assessment


The proposed research will be conducted through up to four online surveys per year (with up to 200 respondents per fielded survey) and two rounds of in-person focus groups with 36 groups per round, for a total of up to 72 focus groups per year (with no more than 8 youth per group). At these maximums, the research would involve up to 800 survey respondents (4 × 200) and 576 focus group participants (72 × 8) annually.


This data collection will be limited to: 1) knowledge and attitudes regarding healthy and unhealthy dating relationships; 2) feedback on draft strategic and creative concepts, messages, and materials the campaign is developing; and 3) feedback on possible communication channels the campaign is considering using to reach CDC’s target audience. We will not collect last names or any other personally identifiable information.



Please see below for additional information related to the Privacy Impact Assessment.


Overview of the Data Collection System


The process by which audiences endorse, engage with, and relate to a message or ideal can be understood through social marketing theory and a branding approach (Andreasen, 1995). Social marketing, which uses commercial marketing techniques and principles to influence an audience to voluntarily change their attitudes and behaviors for the sake of improving health and preventing injury (Kotler, Roberto, and Lee, 2002), has been applied to a number of public health issues, including increasing fruit and vegetable consumption, promoting breastfeeding, decreasing fat consumption, promoting physical activity, and influencing a wide variety of other preventive health behaviors (Coreil, Bryant, and Henderson, 2000). As in commercial marketing, the primary focus is on the consumer: on learning what people want and need rather than trying to persuade them to buy what marketers happen to be producing. The widespread adoption of social marketing in public health has garnered important successes. Among these is VERB, a national, multicultural social marketing program coordinated by CDC (Asbury, Wong, Price, and Nolin, 2008).


The defining features of social marketing emanate from marketing’s conceptual framework and include exchange theory, audience segmentation, competition, “the marketing mix,” consumer orientation, and continuous monitoring. Although social marketing shares many features with other related public health planning processes, it is distinguished by the systematic emphasis marketers place on the strategic integration of the elements in marketing’s conceptual framework (Grier and Bryant, 2005). The following research questions will be addressed:


  • What terms do members of this audience use to describe themselves and their “dating relationships” (e.g., “teens,” “young adults,” “dating”)?

  • What are the perceptions and opinions of dating relationships among this audience?

  • What messages and materials can be used to facilitate behavior change?

  • Where and when will the target market acquire these messages and materials?

  • What communication channels have the greatest credibility and use among the target market?


To answer these multilayered and theory-based questions, a type of mixed methods research design known as concurrent nested strategy of inquiry will be employed (Creswell, 2003; Creswell, Plano Clark, Gutmann, & Hanson, 2003). According to Creswell’s typology of mixed methods, concurrent studies have simultaneous qualitative and quantitative data collections and the findings from each method provide elaboration and confirmation for the findings from the other method. Concurrent studies can be contrasted with sequential studies, which are two-stage procedures in which the second method provides elaboration of the first.


Using a concurrent nested strategy, both qualitative and quantitative data are collected during a single phase of data collection and, unlike in some other concurrent approaches, there is a predominant method (either qualitative or quantitative). The method with lower priority is embedded within the dominant method (Creswell, 2003).


For the proposed study, we will conduct both qualitative-dominant and quantitative-dominant phases of research. Focus group research will employ a social marketing theory-based qualitative exploration of knowledge and attitudes about healthy dating relationships and how potential campaign strategies, materials, and communication channels are interpreted by YHRICC ages 11 to 18, but will also include some quantitative questions to enrich our description of participants’ reactions to proposed channels. These quantitative questions will be administered via a waiting room survey that participants complete once they arrive at the focus group location, while they are waiting for their group to begin. In contrast, online survey research will rapidly provide predominantly quantitative data to assess audience feedback and reactions to proposed communication channels, draft strategies and materials, and sample content and messaging, but will also include some qualitative questions to yield data on questions that cannot be answered quantitatively.


Both qualitative- and quantitative-dominant approaches are necessary for the proposed study because different types of data will be needed to answer different questions about audience reactions to proposed messages, strategies, channels, and tactics. For example, quantitative surveys will be useful for quickly gathering feedback on what types of online widgets or materials would be useful to youth who want to communicate campaign-related information to their friends and peers. Research demonstrates that online survey research is as valid as research collected via other widely accepted methods, including telephone surveys (Taylor, 2007). Online surveys, however, cannot reveal how youth make meaning from the sample materials. Focus groups, on the other hand, while more time-consuming to plan and implement, are ideal for understanding audience perceptions because they can explore “the fluid and dialogic aspects of opinion formation” (Delli Carpini & Williams, 1994). The open-ended nature of focus group interaction helps to “provide insights into why people believe what they do, how they perceive verbal and nonverbal messages..., and what they consider important information and why” (Carlin & McKinney, 1994). Qualitative focus group research also allows researchers to note group dynamics and interpersonal factors that play a role in how materials and channels are received, which is particularly relevant for this campaign since early research suggests that social media will be important for reaching the youth audience.


As described by Creswell (2003), this mixed-methods model has several strengths. It provides the advantages of both quantitative and qualitative data, and by using the two methods in this fashion, researchers can gain perspectives from the different types of data (Creswell, 2003). In addition, collecting two types of data in a single data collection effort is time-efficient.


CDC will work with the Ogilvy-contracted research firm Harris Interactive to conduct the online surveys. To organize and facilitate the in-person focus groups, CDC will work with the Ogilvy-contracted independent research vendors that specialize in working with YHRICC. While CDC and Ogilvy cannot contract with a qualitative research vendor until this information collection is approved, preliminary research has identified Why-Q (www.why-q.com/welcome.html) as an example of a vendor that is experienced in recruiting for and conducting focus groups with YHRICC.


Harris Interactive will store the survey data for four weeks following the data collection process, to allow sufficient time for the data to be tabulated and reviewed.


The research vendors will destroy the records collected during the screening process once the in-person focus groups have been held, approximately two weeks following the start of the recruitment process. Focus group data and participant responses to questions in the moderator’s guide will be recorded only on audiotapes and the corresponding transcripts. The transcripts will exclude participant names and any other identifying participant information. The audiotapes will be destroyed once Ogilvy delivers the final focus group report to CDC.


Items of Information to be Collected


All data collection activities will be conducted in full compliance with the CDC and OMB regulations to maintain the privacy of data obtained on persons and to protect the rights and welfare of human subjects, as contained in Title 28 of the Code of Federal Regulations, Parts 22 and 46.


No individually identifiable information is being collected. The data collection will be limited to: 1) knowledge and attitudes about healthy dating relationships; 2) feedback on possible communication channels the campaign is considering using to reach YHRICC; and 3) feedback on draft strategies, creative concepts, messages, and materials the campaign is developing. Examples of possible topics that will be covered during the data collection include:


  • Youths’ knowledge and attitudes toward healthy and unhealthy dating relationships

  • Perceptions around the prevalence of healthy vs. unhealthy relationships

  • Feedback on specific events where Choose Respect materials could be displayed (e.g., music concerts for particular bands or musicians)

  • Which potential business/organization partners are highly used/respected/recognized by youth ages 11 to 14

  • Where Choose Respect should distribute information (e.g., Boys & Girls Clubs, specific social networking sites)

  • What types of materials would be useful to youth who wanted to communicate campaign-related materials to their friends and peers

  • Feedback on specific ways in which the campaign could engage the target audience to promote messages (e.g., poster contests, T-shirt design contests)

  • Draft content developed for Choose Respect campaign materials

  • Draft designs developed for Choose Respect campaign materials


A sample online survey (Attachment C – “Online Survey”) and two moderator guides (Attachment D-1 – “Attitudinal Exploratory Focus Group Moderator’s Guide” and Attachment D-2 – “Strategic Exploratory Focus Group Moderator’s Guide”) are provided as attachments.


As stated above, the source of information collected through the proposed research will be 11- to 18-year-old YHRICC who are recruited to participate in either online surveys or in-person focus groups. While our initiative will focus on 11- to 14-year-olds as the target audience, we will also conduct research with 15- to 18-year-old “near peers” for the following reasons: 1) 15- to 18-year-olds are likely to have some influence over the dating-related attitudes and behaviors of younger teens and tweens; 2) 15- to 18-year-olds can provide youth “wisdom” that may eventually be imparted to youth through CDC messaging; and 3) 15- to 18-year-olds can offer insights about channels for communicating messages to younger teens and tweens.


Online Survey

As a matter of policy, Harris Interactive, the research vendor for the online survey research, does not store any Information in Identifiable Form (IIF) on youth participants. Youth participants in the online research will be recruited through their parents or legal guardians. Harris Interactive maintains an existing database of participants rigorously recruited and maintained to represent demographic characteristics comparable to the U.S. population. One hundred percent of the database participants have confirmed through a two-step process that they want to be part of the database and to be offered opportunities to participate in any/all online research conducted by Harris (Harris, 2008). Note: This confirmation process conducted by Harris does not constitute an additional burden placed on the public by the proposed research.


  • Parents who are part of Harris Interactive’s existing database of participants will receive an email explaining the general topic of the survey (See Attachment E, “Online Survey Email Invitation”) and containing a link to a secure Web site where they will complete a short screener (See Attachment F, “Online Survey Screening Instrument for Parents”) (Harris, 2008).

    • The screener will be used to determine whether there are any children in the household who qualify for the survey, and to collect demographic information to confirm youth participant eligibility.

    • Parents also will receive information about the purpose of the survey and will be asked to indicate whether they provide parental permission for their child to participate in the survey (see Attachment G, “Online Survey Parental Permission Form”).

  • If a parent gives permission for their child to participate, and if the child is determined to be eligible based on the parent’s responses to the screener questions, the parent either will be asked to bring their child to the computer at that time to complete the youth screener and the survey, or will be given instructions for having their child resume at a later time.

  • Children then will complete a short screener (see Attachment H, “Online Survey Screening Instrument for Youth”) that requests their grade, age, and gender to confirm their qualification and that this is the child for whom the parent provided permission. As part of the screening, the child will be provided with a brief description of the project and asked for their assent to participate (See Attachment I, “Online Survey Youth Assent Form”).

    • Upon obtaining assent, the Web site will direct the youth to another page within the secure site to complete the actual survey (See Attachment C, “Online Survey,” for sample survey). The child will not be able to return to the parent portion of the survey.

  • All data collected will be submitted to CDC in the aggregate, without any names or other identifiers associated with the respondents or their parents. Please see Section B.2 for a more detailed description of procedures for recruitment and obtaining permission and assent. (The sketch following this list illustrates the gating logic of these steps.)
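
Taken together, these steps form a strict gating sequence: a youth reaches the survey only if the parent screener finds a qualifying child, the parent grants permission, the youth screener matches the permitted child, and the youth assents. The Python sketch below is a minimal illustration of that logic only; it is not Harris Interactive’s system, and every type, field name, and eligibility rule shown is hypothetical.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class ParentScreener:           # cf. Attachment F (fields hypothetical)
        child_age: int
        child_grade: int
        child_gender: str
        permission_given: bool      # cf. Attachment G

    @dataclass
    class YouthScreener:            # cf. Attachment H (fields hypothetical)
        age: int
        grade: int
        gender: str
        assent_given: bool          # cf. Attachment I

    def child_is_eligible(p: ParentScreener) -> bool:
        # Hypothetical eligibility rule: the study recruits youth ages 11 to 18.
        return 11 <= p.child_age <= 18

    def screeners_match(p: ParentScreener, y: YouthScreener) -> bool:
        # Confirm this is the child for whom the parent provided permission.
        return (p.child_age, p.child_grade, p.child_gender) == (y.age, y.grade, y.gender)

    def route(p: ParentScreener, y: YouthScreener) -> Optional[str]:
        """Return the survey page to serve, or None if the flow stops early."""
        if not child_is_eligible(p):
            return None             # no qualifying child in the household
        if not p.permission_given:
            return None             # parental permission withheld
        if not screeners_match(p, y):
            return None             # a different child than the one permitted
        if not y.assent_given:
            return None             # youth declined to assent
        return "survey_page"        # separate secure page; no back-navigation

    # Example: an eligible 13-year-old with permission and assent reaches the survey.
    parent = ParentScreener(child_age=13, child_grade=8, child_gender="F", permission_given=True)
    youth = YouthScreener(age=13, grade=8, gender="F", assent_given=True)
    assert route(parent, youth) == "survey_page"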


Focus Groups

Because YHRICC are not typically found in standard market research databases, the project’s research vendor will rely on personal contacts (e.g., directors of community organizations) in various urban communities to help recruit youth who meet the participation criteria. Often, the actual recruiting will take place in person rather than over the phone.

  • When potential participants are identified, they will be screened through conversation guided by the study screener (see Attachment M, “Focus Group Screening Instrument for Youth and Script for Obtaining Verbal Consent of Parent”).

  • When a candidate meets the recruiting criteria (e.g., gender, age, race) and agrees to participate, his/her parent or guardian will be contacted to provide information about the study, date, and location, and to obtain consent (see the second part of Attachment M, “Focus Group Screening Instrument for Youth and Script for Obtaining Verbal Consent of Parent”).

After the focus groups are completed, all transcripts will be submitted to CDC “anonymously” (i.e., no names will be associated with quotes). All data will be reported in the aggregate. The final transcripts, aggregate findings, and conclusions will be reported collectively. Upon completion of the larger project, the transcripts will be destroyed. Please see Section A.10 for a more detailed description of the process for de-identifying data.


Identification of Web site(s) and Web site Content Directed at Children Under 13 Years of Age


The in-person focus groups will not involve the moderator or participants showing or viewing any Web sites or Web site content directed at children under the age of 13.


The online surveys will be conducted through a secure Web site that will be viewed by youth ages 11 to 18 after their parents have given permission for their child to do so. The site can only be accessed by authorized Harris Interactive personnel. No cookies or other persistent identifiers, such as respondent ID, will be used. As indicated in section A.10, Harris will use controls to minimize the possibility of unauthorized access, use, or dissemination of Identifiable Information used during recruitment, as well as the information collected. Controls include passwords, firewalls, encryption, and an intrusion detection system.



A. 2. Purpose and Use of Information Collection


The information gathered under the proposed data collection will be used to:

  • Ensure quality and prevent waste in the dissemination of health information by CDC to the public

  • Develop messages that will resonate with youth audiences

  • Determine the most effective and efficient outreach tactics, channels, and dissemination strategies

  • Refine message concepts and test draft materials for clarity, salience, appeal, and persuasiveness to target audiences

  • Determine whether particular types of materials should be developed


The results will be a critical component of the campaign’s development, driving its messaging, tactics, communication channels, and implementation. The findings from this information collection will be used to develop, revise, and improve Choose Respect messages and materials before they are distributed to target audiences and to select communication outreach tactics and channels. Research and evaluation of ongoing health communications programs have affirmed the value of developing and pretesting communication concepts, messages, materials, and approaches with representatives of the target audiences (U.S. Department of Health and Human Services, 1989). Pretesting methods can help determine which of several alternative executions of an item may be most effective, or identify strengths and weaknesses in a single execution (U.S. Department of Health and Human Services, 1980). This testing and refinement process is one of the essential elements of a social marketing program. Without this formative research, CDC and Ogilvy would risk expending scarce resources on a campaign likely to be ineffective for lack of data on the types of messages, materials, and channels that truly resonate with the target audience.


The proposed data collection methods are focus groups and online surveys.


Focus Groups


Focus group data will be collected in person, working closely with our research vendors. The planned approach calls for up to 36 focus groups up to two times per year. Please note that the broader study population will be segmented into groups by age, gender, and race (e.g., African American boys ages 11 to 14, Hispanic girls ages 15 to 18).


Online Surveys


Survey data will be collected online, working closely with our research vendor, Harris Interactive. The planned approach calls for up to four online surveys per year. Please see Section A.12 for a table detailing the audience segments for the proposed research.


Participants will be recruited for the surveys using convenience-driven sampling techniques (Salganik & Heckathorn, 2004; Heckathorn, 2002), a method that has been shown to successfully recruit difficult-to-identify populations for survey research.


Privacy Impact Assessment Information


No IIF is being collected.


The proposed data collection will have no effect on respondents’ privacy as we do not plan to ask questions about personal behavior.


Data obtained from this program improvement research will inform CDC of critical content for messaging and appropriate elements of the draft messages and materials to either include or not include in the final versions. It will also provide CDC with information about the types of proposed communication outreach tactics and channels that offer the greatest chance of success in communicating with target audiences about healthy relationships (e.g., music concert vs. community organization signage). CDC and Ogilvy will use the research findings to make decisions about messaging and about how draft materials and planned communication strategies should be enhanced. Multiple data collection points allow the project to test draft materials and communication strategies at multiple points during campaign planning and implementation, providing a critical assessment that helps prevent Choose Respect from expending considerable resources on an approach before gathering feedback on whether it is likely to be an effective communication method. This approach also allows the campaign to be nimble and to incorporate new directions and channels quickly, which will be critical for reaching the youth audience.


A. 3. Use of Improved Information Technology and Burden Reduction


The in-person focus groups will not involve any automated, electronic, or technological collection techniques. Participant responses to focus group questions will be recorded on audiotape and in observer notes. To minimize respondent burden, the moderator’s guides (Attachments D-1, “Attitudinal Exploratory Focus Group Moderator’s Guide” and D-2, “Strategic Exploratory Focus Group Moderator’s Guide”) are structured to ensure that the discussion is limited to 90 minutes in length, and that the questions are well-organized, flow well together, and are easy to understand and address.

In order to maximize efficiency and reduce burden, an online survey design is proposed for a portion of the data collection (see Attachment C for a sample survey). Completed at a secure Web site, the survey will be structured for easy respondent use, allowing the automatic administration of skip patterns while maintaining simple, seamless navigation (a sketch of such skip logic follows the list below). The use of a Web-based survey offers many advantages, including:

  • All responses are automatically recorded, allowing for rapid tabulation and analysis of findings.

  • Online surveys create time and cost efficiencies: respondents complete them within a much shorter window of time than other survey methods, and at a substantially reduced cost (e.g., less labor is involved than in telephone or in-person surveys, and no postage is required as with mail-based surveys).

  • Online surveys allow for a great deal of geographic and regional diversity.

  • Respondents potentially have the option of answering questions in a setting where they feel comfortable and at ease (e.g., at home).

  • In many cases, respondents do not have to travel or make an extra trip to a specific location, such as a focus group facility, in order to participate in the research.

  • Preliminary campaign research suggests that providing information in an online format is convenient and consistent with the way the target audience communicates and spends leisure time.
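
As a concrete illustration of the skip patterns mentioned above, the short Python sketch below shows one way a survey engine can automatically route respondents past questions that do not apply. The questions and routing rules are invented for illustration; this is not the Attachment C instrument or the vendor’s actual survey software.

    # Hypothetical three-question instrument; each question may name a skip
    # target keyed by the response that triggers it.
    QUESTIONS = {
        "q1": {"text": "Have you seen the Choose Respect logo before?",
               "skip": {"no": "q3"}},    # answering "no" skips the follow-up q2
        "q2": {"text": "Where did you see it?", "skip": {}},
        "q3": {"text": "Which Web sites do you visit most often?", "skip": {}},
    }
    ORDER = ["q1", "q2", "q3"]

    def administer(answers):
        """Walk the question order, applying skip patterns automatically.

        `answers` maps question id to the response a respondent would give;
        in a live instrument these would arrive from the Web form one at a time.
        """
        asked = []
        i = 0
        while i < len(ORDER):
            qid = ORDER[i]
            asked.append(qid)
            target = QUESTIONS[qid]["skip"].get(answers[qid])
            i = ORDER.index(target) if target else i + 1
        return asked

    # A respondent answering "no" to q1 is routed past q2 directly to q3.
    assert administer({"q1": "no", "q3": "facebook"}) == ["q1", "q3"]

Because the routing is applied as each answer is submitted, respondents never see questions that do not apply to them, which shortens completion time and reduces burden.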


While online surveys will be essential for providing rapid feedback from a geographically diverse sample of our target audience, traditional in-person focus groups will complement the online data collection by generating additional qualitative input. In-person focus groups provide the opportunity to note body language, facial expressions, and other non-verbal reactions to draft materials and proposed outreach strategies, as well as to observe how group dynamics influence individuals’ preferences and reactions. Together, these two methodologies will provide Choose Respect with the rapid, rich, and detailed feedback from a geographically diverse sample of participants that the initiative needs to implement effective outreach.


A. 4. Efforts to Identify Duplication and Use of Similar Information


In designing the proposed data collection activities, we have taken several steps to ensure that the proposed data collection effort does not duplicate ongoing efforts and that no existing data sets would address the proposed study questions.

We conducted an extensive review of the literature by examining several large periodical and journal databases. We identified published articles and books containing the keywords “adolescent or youth,” “dating violence,” and “prevention or intervention.” Findings from this literature review confirmed the importance of focusing prevention efforts on younger youth and adolescents, rather than on young adults, as well as the powerful influence of peer groups on young people’s attitudes, beliefs, and behaviors related to intimate partner relationships. Pressure to conform to peer group norms is particularly strong during adolescence, and peers may exert even more influence than the adolescent’s family (Silverman & Williamson, 1997). Peers are a key influence on YHRICC, the subgroup that will be the target audience for CDC’s initiative (CDC, 2010a). Based on these findings, researchers recommend that IPV communication efforts tap the powerful influence of peer groups and seek to diminish the peer group’s support for violent behavior.

Researchers also recommend addressing the social processes through which young people learn that aggressive behavior is an acceptable, even expected part of intimate partner relationships. Specifically, communication efforts may aim to support the positive role that peer groups can play by withholding support for dating violence (Sugarman & Hotaling, 1989). The literature review also showed consistent findings across studies that males and females perpetrate intimate partner violence at similar rates. Based on these findings, researchers recommend that interventions target couples rather than just males (or just females) (Bowman & Morgan, 1998; Sugarman & Hotaling, 1989). At the same time, however, communication efforts should reflect gender differences in underlying motivations. For example, females tend to perpetrate IPV out of anger, jealousy, or self-defense, while males tend to perpetrate IPV to dominate and intimidate.

In addition to the published information we reviewed, we conducted additional research to inform program planning and implementation. This additional research included one-on-one interviews with the following four well-known youth culture experts in the areas of academic research, communications, and marketing: Susannah Stern, Ph.D., from the University of San Diego; Nicole Dorrler and Mary Sullivan from the American Legacy Foundation; and Peter Picard from Teen Research Unlimited (TRU).

We also completed an environmental scan to identify existing programs and campaigns that promote messages related to healthy relationships and teen dating violence awareness and prevention. The environmental scan encompassed initiatives targeting youth as a primary audience, as well as those that primarily target influencer audiences, and included efforts at the international, national, and state levels.

We are also currently finalizing a comprehensive audience analysis of youth ages 11 to 18, including YHRICC. This analysis uses publicly available data, research, and trend reports to profile the youth audience in terms of: demographics; dating violence risk and protective factors; knowledge, attitudes, and behaviors; lifestyle and psychographic factors; and communications channels and media use.

Internet searches were also performed using several search engines and databases, including Google, Yahoo, AltaVista, Medline, and Science Direct, using the search terms “adolescent,” “dating violence,” “prevention,” “tween/teen,” “teen dating,” “tween/teen relationships,” and “tween/teen friendships.” We have also reviewed program announcements, requests for applications (RFAs), and requests for proposals (RFPs) from other federal agencies. To date, no duplication of effort has been identified.

The results of the literature search, expert interviews, environmental scan, audience analysis, and consultation with experts in the field revealed that although a small amount of research has been conducted on adolescents as an audience segment, no such research exists in relation to promoting healthy relationships and preventing dating violence. In addition, the previous research and data collection efforts have been formative/exploratory in nature, rather than tactical. Thus, there are no similar or duplicate data available to use or adapt for the purposes of this research.

The proposed research will gather data to inform CDC’s tactical planning, message and materials development, and implementation of a youth-focused initiative to promote healthy relationships and prevent dating violence. Given that this is a new direction for the Choose Respect campaign, slated to be launched during winter/spring 2011, the data collected will be critical for ensuring the campaign’s success.


A. 5. Impact on Small Businesses or Other Small Entities


No small businesses will be involved in this data collection.



A. 6. Consequences of Collecting the Information Less Frequently


There are no legal obstacles to reducing the burden.


The research design and nature of the objectives are such that implementation of these data collection methods will be required to gather feedback on tactical campaign elements from the target audience of males and females who are 11 to 18 years of age, during the campaign’s implementation phase. Without conducting the data collection at the stated frequency, we will be unable to gather feedback on the various components that will be developed and implemented on an ongoing basis throughout the life of the campaign. In addition, the data collection is structured to ensure that feedback is collected from all members of the target audience, including groups segmented by age, gender, geography, language, and culture/race/ethnicity. Without the stated frequency of research, the tactical components may fail to effectively communicate the campaign message, or fail to appeal to the entire target audience. The campaign’s overall success likely will suffer if data are collected less frequently. Again, as described above, no individual respondent will participate in more than one focus group or more than one survey.



A. 7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5


There are no special circumstances involved in this data collection. This request fully complies with the regulation 5 CFR 1320.5.



A. 8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency


A. A 60-day Notice (Attachment B) was published in the Federal Register, Vol. 75, No. 76 (Wednesday, April 21, 2010), pp. 20849-20850.


B. In 2008 and 2010, several individuals outside CDC were consulted regarding the proposed information collection. They include the following Choose Respect initiative staff, research vendor contacts, and youth culture experts:


  • Jennifer Wayman, M.H.S., Ogilvy Public Relations Worldwide

202-729-4161; [email protected]


  • Michael Briggs, Ogilvy Public Relations Worldwide

202-729-4198; [email protected]


  • Patricia Taylor, Ogilvy Public Relations Worldwide

202-729-4271; [email protected]


  • Nancy Accetta, M.H.S., CHES, Ogilvy Public Relations Worldwide

202-729-4167; [email protected]

  • Jennifer Scott, Ph.D., Ogilvy Public Relations Worldwide

212-880-5260; [email protected]


  • Annette Abell, M.B.A., Harris Interactive

585-214-7386; [email protected]


  • Dana Markow, Ph.D., Harris Interactive

212-212-9676; [email protected]


  • Susannah Stern, Ph.D., University of San Diego


  • Nicole Dorrler, American Legacy Foundation


  • Mary Sullivan, American Legacy Foundation


  • Peter Picard, Teen Research Unlimited (TRU)


These individuals will be consulted, as needed, during the study period.



A. 9. Explanation of Any Payment or Gift to Respondents


Participants in the focus groups and online surveys will receive incentives as described in detail below.


Our conversations with several moderators who regularly conduct focus groups across the country indicate that the proposed incentives are consistent with current rates for participation in focus group research studies. Incentives will take the form of cash, gift certificates, and information. As described at the end of this section, online panel surveys use an incentive program to improve response rates and maintain membership.


A review of the literature revealed that payment of incentives can provide significant advantages to the government in terms of direct cost savings and improved data quality. It also should be noted that message testing is a marketing technique, and it is standard practice among commercial market researchers to offer incentives as part of respondent recruitment.


Jennifer Scott, Ph.D., [212-880-5260], Managing Director of Insights and Research at Ogilvy and an expert in health communication research and the methodologies proposed for this study, explained, “Social and behavioral science studies inevitably compete with commercial marketing research for study participants. Because standard practice among commercial research is to provide incentives, health communication research projects have little choice but to provide incentives as well in order to successfully recruit participants. I also recommend incentives for all studies because it generally results in respondents being more involved in the research as well as respondents feeling respected and appreciated.”

Background on the Use of Response Incentives

A review by survey methodologists and practitioners in October 1992 recommended that OMB “seriously consider the use of incentives” for surveys that: target difficult-to-engage respondent populations; are long or time-consuming; have items that are potentially sensitive or require detailed record keeping; have relatives serve as gatekeepers to respondent access; and are part of longitudinal panels.


In fact, as Kulka (1994) noted, “The greatest potential effectiveness of monetary incentives appears to be in surveys that place unusual demands upon the respondent [or] require continued cooperation over an extended period of time.”


Other studies agree with Kulka’s assessment of the effectiveness of incentives. Singer and her colleagues expanded Kulka’s argument to include other groups, noting, “...paying an incentive is effective in increasing response rates in telephone and face-to-face surveys, as has been demonstrated consistently in mail surveys. This is true in all types of surveys, not merely those involving high burden for the respondent...it appears to be true for panel respondents, fresh respondents, and those who have refused to respond” (Singer, Gebler, Raghunathan, Van Hoewyk, & McGonagle, in press).


Payments vs. Non-monetary Incentives

Cash incentives have been shown to be most effective in increasing survey response rates for one-time surveys of panel members. For example, Singer and her colleagues noted that, following a series of experiments on the impacts of incentives on various types of survey data collection, “...gifts in this study were less effective in increasing response rates than cash, even with the value of the incentive controlled.”


Research on participation in consumer research indicates that, without minimal levels of compensation, insufficient numbers of individuals will participate and results will not be useful (Krueger, 1994; Berlin, 1992).


This finding replicates previous research on the effectiveness of incentives, including a meta-analysis of 38 experiments and quasi-experiments conducted by Church (1993), in which gifts were significantly less effective than cash in generating survey response and prepaid monetary incentives yielded an average increase of 19.1 percentage points over comparison groups. Moreover, the impacts of monetary incentives appear greater than the impacts of promised charitable donations, lotteries for cash prizes, and other non-monetary rewards.


Level of Incentive Payment

Despite its apparent logic, simply increasing the size of cash incentives to non-respondents does not always result in proportional increases in response rates; in fact, there is some evidence of diminishing returns as incentive levels increase. However, Findlay and Schaible (1980) found that increasing incentive payments from $10 to $20 was successful in increasing overall response rates, and this incentive level is often supported in the literature. Meta-analyses conducted by Church noted that incentives provided with initial mailings (i.e., not conditionally linked to completion of the survey) were the most effective in encouraging increased response.


Reduced Bias

The most important aspect of an incentive plan may be its potential for reducing response bias, underreporting bias, and similar sources of error. Findings from the National Survey of Family Growth (a study in which highly sensitive and personal information is collected from young adults) demonstrated that incentives not only had positive effects on response rates but also increased the accuracy of reporting. Incentives are necessary for message testing in order to ensure that those who are willing to participate are as representative as possible of the wider public. Failure to provide a basic incentive is likely to bias samples in the direction of well-educated individuals, who are generally predisposed to be helpful (Findlay and Schaible, 1980).


Incentives

Incentive amounts for the proposed in-person focus groups will vary slightly across groups, based on local cost-of-living differences, ranging from roughly $50 to $60 per person.


Online Survey

For the online surveys, youth participants and the parents through whom they are recruited will receive different incentives. All parents who will be contacted about the online surveys are existing members of Harris Poll Online℠, an online panel of over 6 million cooperative respondents. Since parents are already members of the panel, they will be awarded the standard HIpoints℠ they would normally receive for completing a survey of similar length. HIpoints are incentive points that are issued and tracked by Harris Interactive. They cannot be redeemed for cash; instead, once a critical mass of points has been earned, the panelist can redeem the points for a variety of rewards. The number of HIpoints a respondent receives for participating in online research is determined by the length of the survey completed. For example, a two-minute survey typically awards 30 HIpoints, while a 10-minute survey typically awards 100 HIpoints. To put the 30 HIpoints earned by a parent who completes the online screener for the Choose Respect project in perspective, a $5 Pizza Hut gift card requires 800 HIpoints, so 30 HIpoints are worth roughly $0.19. In addition, all panelists who qualify for and complete a survey in a given month are automatically entered into Harris Interactive’s bi-monthly (i.e., six times per year) $10,000 sweepstakes, HIstakes℠.


Youth participants in the online surveys, on the other hand, will not receive HIpoints. Instead, their only incentive will be that, upon completing the survey, they will be able to view a Web page showing how their responses to three to five nonproprietary questions compare to those of the aggregate respondent base.


These incentive levels were recommended by independent consultants and senior analysts employed by our research vendors.


A. 10. Assurance of Confidentiality Provided to Respondents


All data collection activities will be conducted in full compliance with the CDC regulations to maintain the privacy of data obtained on persons and to protect the rights and welfare of human research subjects, as contained in Title 28 of the Code of Federal Regulations, Parts 22 and 46. Data will be treated in a secure manner, unless otherwise compelled by law.


Although the research vendors (Harris Interactive and the focus group research vendor) will use identifiable information, such as phone numbers, to facilitate the collection of response data, procedures will be followed to limit the linkage of this information to response data. Furthermore, no identifiable information about participants will be included in the data provided to Ogilvy or CDC.


As a matter of policy, Harris Interactive, the research vendor for the online survey research, does not store any IIF on youth participants. Youth participants in the online research will be recruited through their parents. Harris Interactive maintains an existing database of participants, rigorously recruited and managed to represent demographic characteristics comparable to the U.S. population. One hundred percent of the database participants have confirmed through a two-step process that they want to be part of the database and to be offered opportunities to participate in online research (Harris, 2008). Parents who are part of Harris Interactive's existing database of participants (Harris Poll Online) will receive an invitation email (Attachment E) explaining the general topic of the survey and containing a link to a secure Web site where they will complete a short screener.


The screener (Attachment F) will be used to determine whether there are any children in the household who would qualify for the survey and to collect demographic information about the potential youth participants to confirm eligibility. After completing the screener, parents will be provided with additional information about the purpose of the survey and given the opportunity to indicate that they are providing parental permission for their child to participate in the survey (Attachment G). While the parents' IIF will be retained as part of the parental permission process, it will be stored on a separate server from both the parents' and youths' responses, and only authorized staff will be able to access the information. If a parent gives permission for their child to participate, and if the child is determined to be eligible based on the parent's responses to the screener questions, the parent will either be directed to bring their child to the computer at that time or be given instructions on how to have their child resume the survey at a later time. Children will then complete a short screener (Attachment H) asking grade, age, and gender to confirm that they qualify and that this is the child for whom the parent provided permission. The child will then be provided with a brief description of the project and given the opportunity to assent to the project (Attachment I). Upon providing assent, the youth will be directed to another page in the secure site to complete the survey (see Attachment C for a sample survey). The child will not be able to return to the parent portion of the survey. No cookies or other persistent identifiers, such as respondent IDs, will be used.


With respect to the in-person focus groups, the focus group research vendors will destroy the records collected during the screening process once the focus groups have been conducted, approximately three weeks following the start of the recruitment process. Focus group data and participant responses to questions in the moderator's guide will be recorded only on audiotapes, the corresponding transcripts, and observer notes. Transcripts will be stripped of any names or other identifying information before they are delivered to Ogilvy and CDC. The audiotapes and observer notes will be destroyed once the final focus group report is delivered by Ogilvy to CDC, approximately four weeks following the completion of the focus groups.


IRB Approval


This (original) data collection request was reviewed and approved by CDC's IRB through July 22, 2009 (see Attachment J-1, "IRB Approval Letter") and, through an extension, to July 22, 2010 (see Attachment J-2, "IRB Approval of Continuation of Protocol"). CDC plans to submit an updated research protocol for continuation review and approval immediately following the submission of this package.



Privacy Impact Assessment


This submission has been reviewed by ICRO, which determined that the Privacy Act does not apply.


  1. Project paperwork maintained by each focus group vendor and by Harris Interactive will never be submitted to CDC and will remain in a locked, secure location, available only to a minimum number of local project staff. It will not be reused or disclosed to any other person or entity except as required by law for authorized oversight of the research project. Research vendors will destroy identifiers at the earliest opportunity, unless there is a public health or research justification for retaining the identifiers or retention is required by law.


No IIF will be included in the data collected, although use of some IIF will be necessary during the recruitment process to screen and schedule potential participants. The following safeguards (controls) will be in place to minimize the possibility of unauthorized access, use, or dissemination of IIF used during recruitment, as well as of the information that is collected:


Technical Controls

  • User Identification

  • Passwords

  • Firewall

  • Encryption

  • Intrusion Detection System (IDS)


Physical Controls

  • Identification Badges

  • Key Cards


Administrative Controls

  • File Back-up

  • Least Privilege Access to the Data (access is “role based” on a “need to know” basis)


  2. For both online and in-person research, parental permission and youth assent will be collected.


Youth participants in the online research will be recruited through their parents. As described above, parents will be emailed a link to a secure Web site where they will read the parental permission form (see Attachment G) and indicate whether they provide parental permission for their child to participate in the survey. While the parents' IIF will be retained as part of the parental permission process, it will be stored on a separate server from both the parents' and youths' responses, and only authorized staff will be able to access the information. The permission form will detail the purpose of the data collection, the expected length of time to complete the survey, the security of the information provided, and contact information for a project staff person who can address any questions about the data collection. Once parental permission has been provided, the parent will either be asked to bring their child to the computer at that time or be given instructions on how to have their child resume the survey at a later time. Via the secure Web site, youth participants will read the youth assent form (see Attachment I) and indicate whether they assent to participate in the study, prior to being directed to another page in the secure site to complete the survey.


For the in-person focus groups, upon successful recruitment, the parent or guardian will receive a call during which he/she will learn about the study and the child's expressed interest in participating, and will be asked for verbal consent (see the second part of Attachment M, "Focus Group Screening Instrument for Youth and Script for Obtaining Verbal Consent of Parent"). After giving verbal consent, the parent/guardian will receive a letter/form that contains information about the purpose of the data collection effort, the expected length of time to complete the data collection, the security of the information provided, and contact information for a CDC representative who can address questions about the data collection effort.


Additionally, the letter will include logistics information (e.g., the address of and directions to the focus group location, and a reminder about the date and time) and an informed permission form for the parent (see Attachment K, "Focus Group Parental Permission Form"). Each parent or guardian will be required to return the signed permission form prior to their child's participation in the focus group. In addition, the youth will read and be asked to sign an assent form prior to the beginning of the focus group (see Attachment L, "Focus Group Youth Assent Form"). Both the permission and assent forms have been tested using the Fry Readability Scale; the permission form registers at approximately the 10th-grade reading level, while the assent form registers at approximately the 7th-grade reading level.


Through the permission process, respondents will be informed that their responses will be treated in a secure manner and will not be disclosed, unless otherwise compelled by law. Respondents also will be informed that CDC plans to release all project results in aggregate report formats that do not identify individual respondents. Information describing the provisions for safeguarding privacy will be provided in writing on the permission form, and also will be reviewed verbally by the moderator prior to initiating the focus groups.


All participants will be given notice about the study, including information on the purpose of the data collection effort, the expected length of time to complete the data collection, the security of the information provided, and the name and telephone number of a project staff person whom respondents may contact if they have questions about the data collection effort. The CDC will be provided with non-aggregated data; however, only summary data will be published.


Immediately prior to the conduct of each survey, the following points will be made regarding privacy of the respondent’s answers:


  • Participation in the survey is voluntary and will have no effect on any benefits for which the adolescent would otherwise be eligible.

  • Identifying information such as respondent’s name will not be collected on the surveys.

  • The participant may choose not to respond to any question.

  • All data collected by Ogilvy are for CDC use and will be kept in a secure manner, unless compelled by law. Neither the CDC nor Ogilvy will release or publish non-aggregated data directly to the public.

  • CDC will retain ownership of all data collected. When these data are submitted to the CDC, no identifying information will appear.

  3. Respondents will be advised of the nature of the activity, the length of time it will require, and the fact that participation is purely voluntary. Respondents will be assured that failure to participate, lack of response to any specific question (probe), or withdrawal of permission will not result in any penalty or loss of benefits to which the subjects are otherwise entitled. This information will be reflected in both the parental permission and the youth assent forms.



A. 11. Justification for Sensitive Questions


Despite the sensitive nature of the campaign's topics of healthy relationships and dating violence prevention, none of the data collection instruments (the online surveys, the focus group moderator guides, and the focus group waiting room survey) will contain sensitive questions. For example, no questions will ask about personal experiences related to dating abuse. Rather, the instruments will focus on attitudes and beliefs related to healthy and unhealthy dating relationships. Because the research is intended to inform the development and implementation of the campaign, questions will be limited to those focusing on knowledge, attitudes, and preferred communication messages, vehicles, partnerships, and material formats.


In addition to avoiding sensitive questions, the surveys and moderator guides will not ask respondents for any identifying information, such as Social Security numbers. The recruitment process will collect race and ethnicity data to ensure that we are recruiting and collecting feedback from the agreed-upon segments of research participants.


For the in-person focus groups, the recruiters will need to collect some personal information, such as telephone numbers and mailing addresses. However, this information will be used only for recruitment and scheduling purposes, and Ogilvy and CDC will not have access to it. For the online surveys, Harris Interactive will collect some personal information, such as email addresses, from parents only. However, this information will be kept separate from the response data. No personal information will be collected from the youth respondents.


Lastly, the parents of all youth respondents will be required to complete a permission form, and all youth respondents will be required to complete an assent form, before participating in data collection. The permission and assent forms will outline the extent of the data collected.



A. 12. Estimates of Annualized Burden Hours and Costs


A. 12. A. Estimates of Annualized Burden Hours


Table 1 (see below) presents burden hour estimates for the data collection methods to be utilized in this information collection. These estimates encompass data collection in up to 36 in-person focus groups of 8 respondents each, conducted up to twice annually (288 youth x 2 rounds), with youth ages 11 to 18, as well as up to four online surveys with 200 respondents per survey annually (200 x 4 = 800 youth ages 11 to 18, as seen in row seven of Table 1).


Focus Groups

Each in-person focus group recruiting conversation (in person and/or by phone) using the screener (Attachment M, "Focus Group Screening Instrument for Youth and Script for Obtaining Verbal Consent of Parent") is expected to last five minutes with a prospective youth participant and a parent/guardian, as seen in row two. (Note: Because not every youth will agree to participate, we estimate that the research vendor's in-community recruiter will successfully recruit one participant for every two youth he or she initially speaks with. Therefore, 576 youth/parent pairs will be screened to successfully recruit 288 youth, up to twice per study year, as seen in row two.) Each in-person focus group will last 90 minutes, as seen in row four. No child will participate in more than one focus group; however, the table displays "2" for the number of responses per respondent to reflect the number of times we expect to carry out the screening/focus group process with parents and youth. In other words, two is the number of times we may conduct focus groups in a given year.


Online Survey

To recruit youth ages 11 to 18 for the online surveys, parents will receive information and permission forms via email (see Attachments E and F). Upon providing permission for a child to participate in the study, a follow-up email with the invitation and child assent form will be sent for completion. We estimate that the recruitment process for the online surveys will take five minutes for parents and three minutes for youth, as seen in rows five and six. (Note: Again, we estimate that two Harris panel households will participate in the screening process for every participant successfully recruited.) Each online survey will last 10 minutes, as seen in row seven. It is unlikely that any child will participate in more than one online survey; however, the table displays "4" as the number of responses per respondent to reflect the number of times we may field an online survey in a given year.


The estimated burden for the proposed information collection is based on Ogilvy and CDC staff experience and expertise, as well as on internal instrument pretests conducted by Ogilvy with fewer than nine individuals.


The total annual burden hour request for this data collection is 1,354 hours.



TABLE 1: ESTIMATED ANNUALIZED BURDEN HOURS



Type of Respondents | Form Name | No. of Respondents | No. of Responses per Respondent | Average Burden per Response (in hours) | Total Burden Hours
Youths ages 11 to 18 and parents of boys and girls ages 11 to 18 | Focus Group Screening Instrument for Youth and Script for Obtaining Verbal Consent of Parent (Attachment M) | 576 | 2 | 5/60 | 96
Youths ages 11 to 18 | Focus Group Survey (Attachment N) | 288 | 2 | 5/60 | 48
Youths ages 11 to 18 | Focus Group Moderator's Guide (Attachments D-1 and D-2) | 288 | 2 | 1.5 | 864
Parents of boys and girls ages 11 to 18 | Online Survey Email Invitation and Online Survey Screening Instrument for Parents (Attachments E and F) | 400 | 4 | 5/60 | 133
Youths ages 11 to 18 | Online Survey Screening Instrument for Youth (Attachment H) | 400 | 4 | 3/60 | 80
Youths ages 11 to 18 | Online Survey (Attachment C) | 200 | 4 | 10/60 | 133
Total | | | | | 1,354
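
For illustration only, the following sketch recomputes the Table 1 total by multiplying each row's respondent count, responses per respondent, and per-response burden, rounding each row to whole hours as in the table. The code is a reader aid, not part of the information collection itself.

```python
# Recomputing the Table 1 burden-hour totals from the estimates above.
# Figures are taken directly from Table 1; fractions such as 5/60 express
# minutes as hours.
rows = [
    # (respondents, responses per respondent, burden per response in hours)
    (576, 2, 5 / 60),   # focus group screening (youth/parent pairs)
    (288, 2, 5 / 60),   # focus group waiting room survey
    (288, 2, 1.5),      # focus group discussion (90 minutes)
    (400, 4, 5 / 60),   # online survey parent invitation and screener
    (400, 4, 3 / 60),   # online survey youth screener
    (200, 4, 10 / 60),  # online survey itself
]

total_hours = sum(round(n * k * b) for n, k, b in rows)
print(total_hours)  # 1354, matching the total annual burden hour request
```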



A. 12. B. Estimates of Annualized Burden Costs


Table 2 presents the annualized cost to respondents based on the most current available average U.S. hourly wage rate of $20.90, as published by the Bureau of Labor Statistics (DOL, 2010; http://www.bls.gov/oes/current/oes_nat.htm). To minimize the likelihood that youth participation in the in-person focus groups will entail a loss of regular parent income, recruitment/permission calls will be made during evening hours. For the screening row, which involves both a parent and a youth, an average wage of $14.08 was calculated by averaging the adult wage ($20.90) and the youth wage ($7.25; DOL, 2010). Each row's cost was calculated by multiplying the burden hours from Table 1 by the applicable hourly rate, yielding $2,779.70 for parents (133 hours at $20.90), $8,156.25 for youths (1,125 hours at $7.25), and $1,351.68 for the combined parent/youth screening (96 hours at $14.08). The estimated burden cost to parents and youths, per study year, is therefore $12,287.63.



TABLE 2: ESTIMATED ANNUALIZED BURDEN COSTS


Type of Respondents | No. of Respondents | No. of Responses per Respondent | Average Burden per Response (in hours) | Total Burden Hours | Hourly Wage Rate | Total Respondent Cost
Youths ages 11 to 18 and parents of boys and girls ages 11 to 18 | 576 | 2 | 5/60 | 96 | $14.08 | $1,351.68
Youths ages 11 to 18 | 288 | 2 | 5/60 | 48 | $7.25 | $348.00
Youths ages 11 to 18 | 288 | 2 | 1.5 | 864 | $7.25 | $6,264.00
Parents of boys and girls ages 11 to 18 | 400 | 4 | 5/60 | 133 | $20.90 | $2,779.70
Youths ages 11 to 18 | 400 | 4 | 3/60 | 80 | $7.25 | $580.00
Youths ages 11 to 18 | 200 | 4 | 10/60 | 133 | $7.25 | $964.25
Total | | | | 1,354 | | $12,287.63
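
A companion sketch, again purely illustrative, reproduces the Table 2 computation by multiplying each row's burden hours by the applicable wage rate.

```python
# Recomputing Table 2 burden costs. Hours are the rounded Table 1 values;
# wage rates are $20.90 (adult), $7.25 (youth), and $14.08 (the adult/youth
# average used for the mixed screening row).
rows = [
    # (total burden hours, hourly wage rate)
    (96, 14.08),   # youth/parent focus group screening (mixed rate)
    (48, 7.25),    # focus group waiting room survey (youth)
    (864, 7.25),   # focus group discussion (youth)
    (133, 20.90),  # online survey parent invitation and screener
    (80, 7.25),    # online survey youth screener
    (133, 7.25),   # online survey (youth)
]

total_cost = sum(hours * wage for hours, wage in rows)
print(f"${total_cost:,.2f}")  # $12,287.63
```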




A. 13. Estimates of Other Total Annual Cost Burden to Respondents or Record Keepers


There are no capital or start-up costs associated with this project for respondents.



A. 14. Annualized Cost to the Government


The Choose Respect campaign is seeking approval to conduct up to four online surveys and up to two sets of 36 in-person focus groups each year for three years. Table 3 presents the estimated annualized cost to the Federal government for each of these years. As this information collection will be conducted under a contract awarded to Ogilvy, the estimated costs reflect Ogilvy's costs plus 10% of a CDC FTE's (Grade 13) time for oversight and supervision of the data collection. While the scope of work will remain the same each year, Ogilvy's estimated costs for Years 2 and 3 reflect an average annual escalation of 3.5 percent to account for increases in the cost of living, inflation, and wage increases. Estimated Ogilvy labor costs are $163,720 for Year 1. These labor costs were budgeted by estimating the number of staff hours required at the various wage levels, multiplying by the applicable wage rates, and multiplying the resulting subtotals by factors to cover fringe benefits, overhead, and fee. Wage levels for the labor categories expected to contribute to this project range from $20.19 per hour for Account Executive labor to $96.01 per hour for Senior Management labor. The basis for estimating other direct costs varies with the type of cost being estimated. During Year 1, direct costs associated with the in-person focus groups are estimated to be $271,000, while direct costs for the online surveys are estimated to be $91,000. Annual Ogilvy telephone costs are estimated to be $500 per year. Additionally, Ogilvy travel costs related to this data collection are estimated to be $12,250 for Year 1, covering travel to two markets twice per year. All direct costs were multiplied by a factor to cover fee, as well as by the average annual escalation of 3.5 percent. The 10% of a CDC FTE's time for oversight and supervision is estimated to be $10,991 per year.


TABLE 3: ESTIMATED ANNUALIZED COST TO THE GOVERNMENT




Year | Ogilvy Direct Costs | Ogilvy Labor | Ogilvy Travel | Ogilvy Total | CDC | Total
Year 1 | $362,500 | $163,720 | $12,250 | $538,470 | $10,991 | $549,461
Year 2 | $375,188 | $169,450 | $12,680 | $557,318 | $10,991 | $568,309
Year 3 | $388,319 | $175,380 | $13,125 | $576,824 | $10,991 | $587,815
Total | $1,126,007 | $508,550 | $38,055 | $1,672,612 | $32,973 | $1,705,585



The annual cost to the federal government, calculated by dividing the total cost of the project by the project period (3 years), is estimated to be $568,528.
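
For illustration, the following sketch applies the 3.5 percent annual escalation to the Year 1 figures and annualizes the three-year total; because Table 3 rounds each year's values to whole dollars (travel appears to be rounded to the nearest $5), the sketch agrees with the table to within a few dollars.

```python
# Applying the 3.5 percent annual escalation to Ogilvy's Year 1 estimates
# and annualizing the three-year total. Year-by-year values are rounded to
# whole dollars, so results agree with Table 3 to within a few dollars.
direct, labor, travel = 362_500, 163_720, 12_250  # Year 1 (from Table 3)
cdc_oversight = 10_991  # 10% of a CDC FTE (Grade 13), each year

total = 0
for year in range(3):
    factor = 1.035 ** year
    ogilvy = round(direct * factor) + round(labor * factor) + round(travel * factor)
    total += ogilvy + cdc_oversight

# Prints 1705582 568527, within a few dollars of Table 3's $1,705,585
# total and $568,528 annualized figure.
print(total, round(total / 3))
```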


A. 15. Explanation for Program Changes or Adjustments


There is no change in the overall burden time requested; CDC simply seeks to “exchange” some of the previously approved burden time to conduct research with older teens in addition to younger tweens/teens. CDC is revising this ICR for the following reasons:

  • Research indicates that 11- to 14-year-old youth in high-risk, inner-city communities (YHRICC) may be at increased risk for teen dating violence (TDV) due to concentrated poverty, lack of resources, and exposure to community violence (CDC, 2010a).

  • Older peers are known to be influential in the lives of younger teens (CDC, 2010b). As such, CDC's future communication campaign will likely engage older youth to help reinforce messages about healthy dating relationships. It is necessary to understand their thoughts on healthy dating, as well as their perceptions of themselves as influencers, and this older 15- to 18-year-old audience was not included in the previously approved submission.

  • Youth knowledge and attitudes regarding healthy and unhealthy dating relationships must be understood to inform the development of messages that will resonate with YHRICC. Knowledge and attitude questions were not in the previously approved submission.

A. 16. Plans for Tabulation and Publication and Project Time Schedule

The table below (Table 4) outlines the project time schedule for the data collection. CDC and Ogilvy envision collecting data over three years through online surveys up to four times per year and through traditional in-person focus groups twice per year (up to 72 groups per year). The research will be limited to data collection for program improvement purposes, including: 1) feedback on possible campaign creative concepts, messages, and communication channels for reaching 11- to 14-year-olds; and 2) feedback on draft campaign materials.


TABLE 4: PROJECT TIME SCHEDULE



Activity | Time Schedule

Year 1
Round 1 online survey questionnaire design | 2 weeks after OMB approval
Round 1 online survey programming, quality assurance review, and testing | 3 weeks after OMB approval
Round 1 online survey data collection | 1 month & 1 week after OMB approval
Round 1 online survey analysis | 1 month & 3 weeks after OMB approval
Round 1 online survey report to CDC | 2 months after OMB approval
Round 2 online survey questionnaire design | 3 months after OMB approval
Round 2 online survey programming, quality assurance review, and testing | 3 months & 3 weeks after OMB approval
Round 2 online survey data collection | 4 months & 1 week after OMB approval
Round 2 online survey analysis | 4 months & 3 weeks after OMB approval
Round 2 online survey report to CDC | 5 months after OMB approval
Round 1 focus group screening | 5 months after OMB approval
Round 1 focus group testing | 5 months & 2 weeks after OMB approval
Round 1 focus group analysis | 6 months after OMB approval
Round 1 focus group report to CDC | 6 months & 2 weeks after OMB approval
Round 3 online survey questionnaire design | 6 months after OMB approval
Round 3 online survey programming, quality assurance review, and testing | 6 months & 3 weeks after OMB approval
Round 3 online survey data collection | 7 months & 1 week after OMB approval
Round 3 online survey analysis | 7 months & 3 weeks after OMB approval
Round 3 online survey report to CDC | 8 months after OMB approval
Round 4 online survey questionnaire design | 9 months after OMB approval
Round 4 online survey programming, quality assurance review, and testing | 9 months & 3 weeks after OMB approval
Round 4 online survey data collection | 10 months & 1 week after OMB approval
Round 4 online survey analysis | 10 months & 3 weeks after OMB approval
Round 4 online survey report to CDC | 11 months after OMB approval
Round 2 focus group screening | 10 months after OMB approval
Round 2 focus group testing | 10 months & 2 weeks after OMB approval
Round 2 focus group analysis | 11 months after OMB approval
Round 2 focus group report to CDC | 11 months & 2 weeks after OMB approval

Year 2
Round 1 online survey questionnaire design | 12 months & 2 weeks after OMB approval
Round 1 online survey programming, quality assurance review, and testing | 12 months & 3 weeks after OMB approval
Round 1 online survey data collection | 13 months & 1 week after OMB approval
Round 1 online survey analysis | 13 months & 3 weeks after OMB approval
Round 1 online survey report to CDC | 14 months after OMB approval
Round 2 online survey questionnaire design | 15 months after OMB approval
Round 2 online survey programming, quality assurance review, and testing | 15 months & 3 weeks after OMB approval
Round 2 online survey data collection | 16 months & 1 week after OMB approval
Round 2 online survey analysis | 16 months & 3 weeks after OMB approval
Round 2 online survey report to CDC | 17 months after OMB approval
Round 1 focus group screening | 17 months after OMB approval
Round 1 focus group testing | 17 months & 2 weeks after OMB approval
Round 1 focus group analysis | 18 months after OMB approval
Round 1 focus group report to CDC | 18 months & 2 weeks after OMB approval
Round 3 online survey questionnaire design | 18 months after OMB approval
Round 3 online survey programming, quality assurance review, and testing | 18 months & 3 weeks after OMB approval
Round 3 online survey data collection | 19 months & 1 week after OMB approval
Round 3 online survey analysis | 19 months & 3 weeks after OMB approval
Round 3 online survey report to CDC | 20 months after OMB approval
Round 4 online survey questionnaire design | 21 months after OMB approval
Round 4 online survey programming, quality assurance review, and testing | 21 months & 3 weeks after OMB approval
Round 4 online survey data collection | 22 months & 1 week after OMB approval
Round 4 online survey analysis | 22 months & 3 weeks after OMB approval
Round 4 online survey report to CDC | 23 months after OMB approval
Round 2 focus group screening | 22 months after OMB approval
Round 2 focus group testing | 22 months & 2 weeks after OMB approval
Round 2 focus group analysis | 23 months after OMB approval
Round 2 focus group report to CDC | 23 months & 2 weeks after OMB approval

Year 3
Round 1 online survey questionnaire design | 24 months & 2 weeks after OMB approval
Round 1 online survey programming, quality assurance review, and testing | 24 months & 3 weeks after OMB approval
Round 1 online survey data collection | 25 months & 1 week after OMB approval
Round 1 online survey analysis | 25 months & 3 weeks after OMB approval
Round 1 online survey report to CDC | 26 months after OMB approval
Round 2 online survey questionnaire design | 27 months after OMB approval
Round 2 online survey programming, quality assurance review, and testing | 27 months & 3 weeks after OMB approval
Round 2 online survey data collection | 28 months & 1 week after OMB approval
Round 2 online survey analysis | 28 months & 3 weeks after OMB approval
Round 2 online survey report to CDC | 29 months after OMB approval
Round 1 focus group screening | 29 months after OMB approval
Round 1 focus group testing | 29 months & 2 weeks after OMB approval
Round 1 focus group analysis | 30 months after OMB approval
Round 1 focus group report to CDC | 30 months & 2 weeks after OMB approval
Round 3 online survey questionnaire design | 30 months after OMB approval
Round 3 online survey programming, quality assurance review, and testing | 30 months & 3 weeks after OMB approval
Round 3 online survey data collection | 31 months & 1 week after OMB approval
Round 3 online survey analysis | 31 months & 3 weeks after OMB approval
Round 3 online survey report to CDC | 32 months after OMB approval
Round 4 online survey questionnaire design | 33 months after OMB approval
Round 4 online survey programming, quality assurance review, and testing | 33 months & 3 weeks after OMB approval
Round 4 online survey data collection | 34 months & 1 week after OMB approval
Round 4 online survey analysis | 34 months & 3 weeks after OMB approval
Round 4 online survey report to CDC | 35 months after OMB approval
Round 2 focus group screening | 34 months after OMB approval
Round 2 focus group testing | 34 months & 2 weeks after OMB approval
Round 2 focus group analysis | 35 months after OMB approval
Round 2 focus group report to CDC | 35 months & 2 weeks after OMB approval


The CDC also will prepare a paper for publication in a scholarly journal. A potential title is “Using focus groups and online surveys to develop a healthy relationships campaign targeting youth in high-risk, inner city communities.” Table 5 outlines publication dates and other activities. Please see the “Analysis Plan” section below for a description of data analysis procedures.


TABLE 5: PUBLICATION TIME SCHEDULE



Activity | Time Schedule

Year 2
Manuscript finalized for submission to journal | 16 months after OMB approval
Receive reviewers' feedback/comments and questions | 17 months & 2 weeks after OMB approval
Respond to reviewer questions and revise manuscript per feedback/comments | 21 months after OMB approval
Resubmit revised manuscript to journal | 21 months after OMB approval
Receive additional feedback/comments from reviewers | 25 months after OMB approval
Revise manuscript per feedback/comments | 27 months after OMB approval
Resubmit revised manuscript to journal | 27 months after OMB approval
Receive notification of acceptance from journal | 29 months after OMB approval
Review of article proof | 30 months after OMB approval
Publication | 32 months after OMB approval



Analysis Plan


Under the guidance and direction of CDC, Ogilvy will conduct quantitative and qualitative analyses of the various data collected. Given that the purpose of the proposed research is to gather feedback to quickly assess the value of implementing tactical campaign elements, only top-line reports outlining major themes will be prepared as the data are collected.


Focus Groups

The in-person focus groups will provide opportunities to explore in detail youth participants' knowledge and attitudes, as well as their reactions and responses to possible campaign creative concepts, messages, materials, and channels, and to observe the effect of group dynamics on participant perceptions and reactions to draft materials and approaches (Krueger, 1988). Ogilvy and its research vendors will use a variety of documentation and assessment methods to analyze and summarize findings.


For this effort, the focus groups will be audiotaped and transcribed. In addition, at least one observer of each focus group will take notes. Transcription and note-taking are described by numerous researchers as common focus group research practices (Stewart and Shamdasani, 1990; Krueger, 1988; Morgan, 1988). Data from the waiting room surveys will be tallied.


Ogilvy and CDC will determine the final focus group findings through the following steps:


Part 1: Analysis of Qualitative Data:

  • Pre-Analysis: Debriefings with observers who attended focus groups will be conducted immediately after each focus group session. As described by Krueger (1988) and Morgan (1988), these debriefing sessions will include the moderator walking through the guide to review the trends, questions, and comments for each topic.

  • Analysis: Systematic review of each transcript by at least two to three people—independently from one another—to identify common themes and unusual perceptions and comments relevant to each topic. Each reviewer will code responses within a qualitative framework that follows the research questions as guides, coding responses for relevant themes (Stewart and Shamdasani, 1990; Morgan, 1988). As themes are developed, the researcher will assign a working definition to each code. This process, called constant comparison (Glaser and Strauss, 1967), will be continually used to compare the categories and codes of the transcript with existing categories and codes in order to more fully develop the properties of the overarching categories for the individual codes. This process will continue until saturation is reached.

  • Subsequent discussion about areas of agreement and conflict with respect to themes and perceptions, followed by additional transcript review until a general consensus is achieved. This "notes-based" analysis is a commonly accepted process for qualitative research assessment. Morgan (1988), for example, describes this analysis procedure and states that “there is likely to be a cycling back and forth between the raw material in the transcripts and the more abstract determination of what topics will go into the ultimate report.”


Part 2: Analysis of Quantitative Data:

As described by Creswell (2003), when using a concurrent nested mixed methods research strategy, data from the embedded research approach must undergo a process of data transformation before they can be integrated in the analysis phase of the research. In the case of the proposed focus groups, the dominant method is qualitative and the embedded method is quantitative. As a result, the quantitative data collected by the waiting room surveys must be "qualified," that is, transformed into qualitative themes. Unfortunately, as Creswell (2003) points out, little has been written to date to guide researchers in the process of transforming data and integrating findings from two different research methods into a single phase of analysis. CDC and Ogilvy will conduct data transformation and integration of findings through the following steps, which are based on Creswell's outlined approach:

  • Development of Descriptive Statistics: Computation of simple descriptive statistics for the data collected from each question on the survey (e.g., mean, median, mode, frequency of each response); a brief illustrative sketch follows this list.

  • Development of Qualitative Themes: Systematic review of the descriptive statistics by at least two to three people—independently from one another—to identify potential patterns and themes based on the numerical data. Each reviewer will develop their own set of findings and possible themes.

  • Subsequent discussion about areas of agreement and conflict with respect to themes, followed by additional review of the numeric data until a general consensus is achieved on a single set of survey themes.
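
As a minimal illustration of the first step above, the following sketch computes the named descriptive statistics using Python's standard library; the ratings shown are hypothetical examples, not study data.

```python
# Simple descriptive statistics for one hypothetical waiting room survey
# question, answered on a 1-5 scale (example data only).
from collections import Counter
from statistics import mean, median, mode

responses = [4, 5, 3, 4, 4, 2, 5, 4, 3, 4]  # hypothetical ratings

print("mean:", mean(responses))            # 3.8
print("median:", median(responses))        # 4.0
print("mode:", mode(responses))            # 4
print("frequencies:", Counter(responses))  # frequency of each response
```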


Part 3: Integration of Qualitative and Quantitative Themes and Findings:

  • At least two to three people will independently review all of the themes identified through examination of the quantitative survey data and the themes identified through the qualitative analysis of the focus group discussions. Based on their reviews, they will identify areas of agreement and conflict between the two sets of themes.

  • Subsequent discussion about areas of agreement and conflict with respect to themes, followed by additional review of the numeric data and transcripts until a general consensus is achieved. As Creswell (2003) emphasizes, however, the purpose of the quantitative data in a concurrent nested strategy such as this is to supplement and enhance findings from the dominant approach (qualitative). Therefore, our focus will be on identifying how the themes developed from the quantitative data can be used to enrich the understanding of audience perceptions that we gain from the qualitative analysis.


Online Surveys

As described earlier, the purpose of the online surveys is to gather directional, largely descriptive data to aid planners in making decisions about whether or not to implement tested communication tactics. The project will not analyze the data using any tests of statistical significance.


For this effort, responses to each survey question will be tallied. Ogilvy, CDC, and Harris Interactive will then determine the final survey research findings through the following steps:


Step 1: Analysis of Quantitative Data

  • Examination of Data: Researchers will examine the data collected in response to the quantitative (close-ended) questions in the surveys. We will compute simple descriptive statistics for the data collected from each question on the survey (e.g., mean, median, mode, frequency of each response), reviewing the total number of positive and negative responses received by each tested concept, message, and channel to determine which approaches were most favored by members of the target audience. Researchers also will assess whether the top-rated approaches received a substantially greater number of positive responses than other approaches, or whether other approaches were also highly rated.


Step 2: Analysis of Qualitative Data

While the majority of data collected via the online surveys will be quantitative, some open-ended questions will be included as well to collect qualitative data. As described above, Creswell (2003) explains that data from the embedded strategy must undergo a process of data transformation before it can be integrated into the analysis. In the case of the proposed online surveys, the dominant method is quantitative (since most questions will be close-ended), and the embedded method is qualitative (represented by the open-ended questions). As a result, the qualitative data that are collected using open-ended questions in the survey must be quantified. Unfortunately, little has been written at this point to guide researchers in the process of transforming data and integrating findings from two different research methods into a single phase of analysis (Creswell, 2003). However, CDC and Ogilvy will follow an approach outlined by Creswell to conduct data transformation and integration of findings:

  • Qualitative Analysis: Systematic review of all written comments received in response to open-ended questions by at least two to three people—independently from one another—to identify common themes and unusual perceptions and comments relevant to each topic. Each reviewer will code responses within a qualitative framework that follows the research questions as guides, coding responses for relevant themes (Stewart and Shamdasani, 1990; Morgan, 1988). As themes are developed, the researcher will assign a working definition to each code. This process, called constant comparison (Glaser and Strauss, 1967), will be continually used to compare the categories and codes of the written comments with existing categories and codes in order to more fully develop the properties of the overarching categories for the individual codes. This process will continue until saturation is reached.

  • Identification of Qualitative Themes: Subsequent discussion about areas of agreement and conflict with respect to themes and perceptions, followed by additional review of the written comments until a general consensus is achieved. This "notes-based" analysis is a commonly accepted process for qualitative research assessment (Morgan, 1988; Krueger, 1988).

  • Quantification of the Qualitative Data: Once a final set of themes is developed, researchers will count the number of times that each theme occurs to arrive at a numeric value for the frequency of each theme (Creswell, 2003); a brief sketch of this counting step follows this list.
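
As a minimal illustration of the counting step above, the sketch below tallies hypothetical theme codes assigned to open-ended comments; the theme names are invented for demonstration only.

```python
# Counting theme-code occurrences to quantify qualitative data. Each entry
# represents the final theme code a reviewer assigned to one open-ended
# comment (hypothetical codes, not actual study themes).
from collections import Counter

coded_comments = [
    "respect", "communication", "respect", "trust",
    "communication", "respect", "peer influence",
]

theme_frequencies = Counter(coded_comments)
for theme, count in theme_frequencies.most_common():
    print(theme, count)  # respect 3, communication 2, trust 1, peer influence 1
```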


Step 3: Integration of Quantitative and Qualitative Findings and Themes

  • Researchers will review the quantitative findings alongside the quantified qualitative themes to identify areas of agreement and conflict between the two sets. As Creswell (2003) emphasizes, however, the purpose of the qualitative data in a concurrent nested strategy such as this is to supplement and enhance findings from the dominant approach (quantitative). Therefore, our focus will be on identifying how the qualitative data can be used to add depth and richness to the quantitative findings, such as by providing detail about aspects of the target audiences' reactions to tested messages and channels that cannot be measured quantitatively, or by providing possible explanations for why particular channels and approaches tested better than others.


A. 17. Reason(s) Display of OMB Expiration Date is Inappropriate


CDC does not seek an exemption from displaying the expiration date of OMB approval for this information collection.


A. 18. Exceptions to Certification for Paperwork Reduction Act Submissions


There are no exceptions to the certification.

1 Please note: Throughout the information collection request, all references to parents refer to parents or legal guardians.

2 The “Symposium on Providing Incentives to Survey Respondents,” sponsored jointly by OMB and the Council of Professional Associations on Federal Statistics (COPAFS), considered a number of incentive-related issues, including the impacts on response rates, biases, and incentive types.

