

HHS/CDC/NCIPC

SUPPORTING STATEMENT FOR

OMB INFORMATION COLLECTION REQUEST


OMB# 0920-0941


Part A


1/30/2013


Evaluation of Dating Matters: Strategies to Promote Healthy Teen Relationships


Supported by:


Department of Health and Human Services

Centers for Disease Control and Prevention

National Center for Injury Prevention and Control

Division of Violence Prevention


Government Project Officers:

Point of Contact for OMB

Andra Tharp, PhD (Project Lead)

[email protected]

770-488-3936



Table of Contents



Abstract

A. JUSTIFICATION

1. Circumstances Making the Collection of Information Necessary

2. Purpose and Use of Information Collection

3. Use of Information Technology and Burden Reduction

4. Efforts to Identify Duplication and Use of Similar Information

5. Impact on Small Businesses or Other Small Entities

6. Consequences of Collecting the Information Less Frequently

7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

9. Explanation of Any Payments or Gifts to Respondents

10. Assurance of Confidentiality Provided to Respondents

11. Justification for Sensitive Questions

12. Estimates of Annualized Burden Hours and Costs

13. Estimates of Other Total Annual Cost Burden to Respondents or Recordkeepers

14. Annualized Costs to the Federal Government

15. Explanation for Program Changes

16. Plans for Tabulation, Publication, and Project Time Schedule

17. Reason(s) Display of OMB Expiration Date is Inappropriate

18. Exemptions to Certification for Paperwork Reduction Act Submissions


LIST OF ATTACHMENTS


Attachment A:

Public Health Service Act: Sections 301 (42 U.S.C. 241)

Attachment B:


B.1 - Published 60-Day Federal Register Notice

B.2 - Public Comment

Attachment C:

IRB Approval

Attachment D:

Student Outcome Survey Baseline

Attachment E:

Student Outcome Survey Follow-Up



Attachment G:

School Indicators

Attachment H:

Parent Outcome Survey Baseline

Attachment I:

Educator Outcome Survey

Attachment J:

Brand Ambassador Implementation Survey

Attachment K:

School Leadership Capacity and Readiness Survey

Attachment L:

Parent Program Fidelity 6th Grade Session 1

Attachment M:

Parent Program Fidelity 6th Grade Session 2

Attachment N:

Parent Program Fidelity 6th Grade Session 3

Attachment O:

Parent Program Fidelity 6th Grade Session 4

Attachment P:

Parent Program Fidelity 6th Grade Session 5

Attachment Q:

Parent Program Fidelity 6th Grade Session 6

Attachment R:

Parent Program Fidelity 7th Grade Session 1

Attachment S:

Parent Program Fidelity 7th Grade Session 3

Attachment T:

Parent Program Fidelity 7th Grade Session 5

Attachment U:

Student Program Fidelity 6th Grade Session 1

Attachment V:

Student Program Fidelity 6th Grade Session 2

Attachment W:

Student Program Fidelity 6th Grade Session 3

Attachment X:

Student Program Fidelity 6th Grade Session 4

Attachment Y:

Student Program Fidelity 6th Grade Session 5

Attachment Z:

Student Program Fidelity 6th Grade Session 6

Attachment AA:

Student Program Fidelity 7th Grade Session 1

Attachment BB:

Student Program Fidelity 7th Grade Session 2

Attachment CC:

Student Program Fidelity 7th Grade Session 3

Attachment DD:

Student Program Fidelity 7th Grade Session 4

Attachment EE:

Student Program Fidelity 7th Grade Session 5

Attachment FF:

Student Program Fidelity 7th Grade Session 6

Attachment GG:

Student Program Fidelity 7th Grade Session 7

Attachment HH:

Student Program Fidelity 8th Grade Session 1 (comprehensive)

Attachment II:

Student Program Fidelity 8th Grade Session 2 (comprehensive)

Attachment JJ:

Student Program Fidelity 8th Grade Session 3 (comprehensive)

Attachment KK:

Student Program Fidelity 8th Grade Session 4 (comprehensive)

Attachment LL:

Student Program Fidelity 8th Grade Session 5 (comprehensive)

Attachment MM:

Student Program Fidelity 8th Grade Session 6 (comprehensive)

Attachment NN:

Student Program Fidelity 8th Grade Session 7 (comprehensive)

Attachment OO:

Student Program Fidelity 8th Grade Session 8 (comprehensive)

Attachment PP:

Student Program Fidelity 8th Grade Session 9 (comprehensive)

Attachment QQ:

Student Program Fidelity 8th Grade Session 10 (comprehensive)

Attachment RR:

Communications Campaign Tracking

Attachment SS:

Local Health Department Capacity and Readiness

Attachment TT:

Student Program Participant Assent Form

Attachment UU:

Parent of Student Program Participant Consent Form

Attachment VV:

Parent Program Participant Consent Form

Attachment WW:

Educator Consent Form

Attachment XX:

Student Brand Ambassador Assent Form

Attachment YY:

Parent of Student Brand Ambassador Consent Form

Attachment ZZ:

Student participant focus group guide

Attachment AAA:

Student curricula implementer focus group guide

Attachment BBB:

Parent curricula implementer focus group guide

Attachment CCC:

Student Program Fidelity 8th Grade Session 1 (standard)

Attachment DDD:

Student Program Fidelity 8th Grade Session 2 (standard)

Attachment EEE:

Student Program Fidelity 8th Grade Session 3 (standard)

Attachment FFF:

Student Program Fidelity 8th Grade Session 4 (standard)

Attachment GGG:

Student Program Fidelity 8th Grade Session 5 (standard)

Attachment HHH:

Student Program Fidelity 8th Grade Session 6 (standard)

Attachment III:

Student Program Fidelity 8th Grade Session 7 (standard)

Attachment JJJ:

Student Program Fidelity 8th Grade Session 8 (standard)

Attachment KKK:

Student Program Fidelity 8th Grade Session 9 (standard)

Attachment LLL:

Student Program Fidelity 8th Grade Session 10 (standard)

Attachment MMM:

Screen shots of LHD C/R Assessment (Attachment SS)

Attachment NNN:

Screen shots of School Leadership C/R Assessment (Attachment K)

Attachment OOO:

Student curricula implementer consent for focus group participation

Attachment PPP:

Parent curricula implementer consent for focus group participation

Attachment QQQ:

Student participant assent for focus group participation

Attachment RRR-1:

Screen shots of educator survey baseline (Attachment I)

Attachment RRR-2:

Screen shots of educator survey follow-up (Attachment IIII)

Attachment SSS-1:

Screen shots of parent survey baseline (Attachment H)

Attachment SSS-2:

Screen shots of parent survey follow-up (Attachment EEEE)

Attachment TTT:

Parent consent for student participation in focus groups

Attachment UUU:

Initial Educator Web Survey Invitation Cover Letter

Attachment VVV:

Initial Implementer Focus Group Invitation

Attachment WWW:

Initial Parent Web Survey Invitation Cover Letter

Attachment XXX:

Alternative Contact form

Attachment YYY:

Text of parent follow-up reminder

Attachment ZZZ:

Text of educator follow-up email

Attachment AAAA:

Follow-up Parent Survey Telephone Script

Attachment BBBB:

Script for General Information contact email and voicemail

Attachment CCCC:

List of participating schools that have signed the MOU with the local health department

Attachment DDDD:

Student program master trainer technical assistance log

Attachment EEEE:

Parent Outcome Survey Follow-Up



Attachment GGGG:

Funding Opportunity Announcement for Dating Matters Communities

Attachment HHHH:

Screenshots for Community Capacity/Readiness Assessment (Attachment JJJJ)

Attachment IIII:

Educator Outcome Survey Follow-Up

Attachment JJJJ:

Community Capacity/Readiness Assessment

Attachment KKKK:

Communications Focus Groups Guide

Attachment LLLL:

Parent Program Manager Technical Assistance Tracking Form

Attachment MMMM:

6th Grade Curricula Parent Satisfaction Questionnaire

Attachment NNNN:

7th Grade Curricula Parent Satisfaction Questionnaire

Attachment OOOO:

Student Assent for Communications Focus Group Participation

Attachment PPPP:

Parent Consent for Student Communications Focus Group Participation




ABSTRACT


Dating Matters: Strategies to Promote Healthy Teen Relationships™ is the Centers for Disease Control and Prevention’s teen dating violence prevention initiative, which is based on three important facts:

  1. Dating violence has important negative effects on the mental and physical health of youth, as well as on their school performance.

  2. Violence in an adolescent relationship sets the stage for problems in future relationships, including intimate partner violence and sexual violence perpetration and/or victimization throughout life. Therefore, early help is needed to stop violence in youth relationships before it begins and keep it from continuing into adult relationships.

  3. Evidence suggests that dating violence is a significant problem in economically disadvantaged urban communities, where environmental factors often produce an accumulation of risk factors for violence. To date, there have been few attempts to adapt the developing evidence base for dating violence prevention to address these communities.


Recently, efforts to prevent teen dating violence have grown, particularly in schools and among policymakers and sexual violence and domestic violence prevention groups. Many states and communities are now also working to stop teen dating violence. However, these activities vary greatly in quality and effectiveness. To address these gaps, CDC has developed Dating Matters, a comprehensive teen dating violence prevention program based on the current evidence about what works in prevention. Dating Matters focuses on middle school youth in high-risk, urban communities. It includes preventive strategies for individuals, peers, families, schools, and neighborhoods. The primary goal of the current proposal is to conduct an outcome and implementation evaluation of Dating Matters to determine its feasibility, cost, and effectiveness. This evaluation of Dating Matters is conducted in the following communities: Alameda County, California; Baltimore, Maryland; Broward County, Florida; and Chicago, Illinois.


A. Justification


A.1. Circumstances Making the Collection of Information Necessary


Background


The Centers for Disease Control and Prevention (CDC) is seeking a revision to the currently OMB-approved Information Collection Request entitled “Evaluation of Dating Matters: Strategies to Promote Healthy Teen Relationships” (OMB# 0920-0941, Expiration 6/30/2015). Dating Matters is a comprehensive approach to preventing teen dating violence among youth in high-risk urban communities. The current evaluation takes place in the following communities: Alameda County, California; Baltimore, Maryland; Broward County, Florida; and Chicago, Illinois. Dating Matters consists of evidence-based or evidence-informed prevention strategies implemented at each level of the social ecology. The participants are middle school students, parents of middle school students, student brand ambassadors (i.e., slightly older youth who promote the communications messages with the target group), educators, school leadership, program implementers, student program master trainers, community representatives, and local health department representatives in up to four high-risk urban communities.


In order to address gaps in effective prevention programming for youth in urban communities with high crime and economic disadvantage, who may be at highest risk for teen dating violence (TDV) perpetration and victimization (O’Leary & Slep, in press), Dating Matters employs universal primary prevention focused on middle school youth in order to build a foundation of healthy relationship skills among all youth before dating and/or severe TDV is initiated.


Dating Matters takes a novel approach to TDV prevention that bridges diverse areas of public health by drawing on the best available research in areas such as TDV, youth violence, and sexual risk prevention (e.g., Vivolo, Holland, Teten, Holt, et al., 2010). As such, Dating Matters takes an approach that addresses the co-occurrence of a constellation of adolescent risk behaviors and violence (Whitaker, Morrison, Lindquist, Hawkins, et al., 2006) that may be particularly relevant in urban environments. Risk behaviors may have common influences including fundamental problems with how youth interact in relationships and how parents communicate with youth about healthy relationships.


The programmatic activities to be implemented in Dating Matters are described in Table 1. Because the effectiveness, cost, and sustainability of comprehensive TDV prevention are unclear, the current proposal seeks to evaluate the two models of prevention implemented in Dating Matters: a standard approach, Safe Dates (Foshee, Bauman, Arriaga, Helms, et al., 1998) implemented in 8th grade, and a comprehensive approach, which includes implementation of prevention strategies across levels of the social ecology for youth, parents, and educators in 6th-8th grade, in addition to policy change efforts and communications strategies. Because of its effectiveness in preventing TDV and its widespread use, Safe Dates is considered “standard practice.” Therefore, the current evaluation compares the outcomes and implementation of this standard practice to the outcomes and implementation of the comprehensive approach.


Table 1. Two prevention approaches in Dating Matters: Strategies to Promote Healthy Teen Relationships

Standard Practice

Grade | Youth/Peers | Parent/Guardian | Educators | Communications | Policy
8th   | Safe Dates  | --              | --        | --             | --

Comprehensive Approach

Grade | Youth/Peers                 | Parent/Guardian                 | Educators                      | Communications             | Policy
6th   | Adapted Student Curriculum* | Adapted Parents Matter!*        | Dating Matters online training | Communications Strategies* | Policy Enhancement or Development
7th   | Adapted Student Curriculum* | Adapted Parent Curriculum*      |                                |                            |
8th   | Adapted Safe Dates          | Adapted Families for Safe Dates |                                |                            |

*CDC has developed the curriculum and communications strategies


The data collection involves the evaluation of Dating Matters, which is being implemented in four urban communities (Alameda County, California; Baltimore, Maryland; Broward County, Florida; and Chicago, Illinois) in 2012-2016. The evaluation utilizes a cluster randomized design in which 44 schools in four funded communities are randomized to either Dating Matters or standard practice. Participating schools were selected by local health department applicants in response to a funding opportunity announcement (FOA) (Attachment GGGG). As outlined in the FOA, eligible schools must be considered “high-risk,” as operationalized by having both above-average rates of community or school crime and above-average rates of school or community economic disadvantage. Applicants outlined in their response to the FOA the data that they used to support their school selection (see the Proposed Community section of the FOA). Applications underwent a thorough, competitive, objective review at CDC, resulting in our four funded communities. For more information about the eligibility criteria and the contents of the MOUs with schools, see the FOA in Attachment GGGG.


The details of the data collection design and samples are in Section B. In summary, students from 44 schools across the 4 sites (12 schools in each of 3 sites and 8 schools in the fourth) form the sample population. As shown in Figure 1, in Alameda County, for example, 4 schools were randomized to the comprehensive Dating Matters intervention (inclusive of a 6th-8th grade student intervention, a parent intervention, and a communications campaign, as described in more detail below), and 4 schools receive the Standard Practice condition (Safe Dates in the 8th grade). The distribution of schools to the Dating Matters treatment condition and the Standard Practice control group for the remaining three sites is also illustrated in Figure 1 below.
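
As an illustration of the cluster randomized design just described, the following Python sketch randomizes each site's schools between the two conditions. This is a minimal sketch under stated assumptions: it uses an even split within each site for simplicity, whereas the actual allocation balanced total enrollment across arms (see the note to Figure 1), so the real counts per arm differed by site. The school names and seed are illustrative only.

    import random

    # Schools per site, mirroring Figure 1 (three sites with 12 schools, one with 8).
    SITE_SCHOOL_COUNTS = {"Alameda County": 8, "Baltimore": 12,
                          "Broward County": 12, "Chicago": 12}

    def randomize_schools(site_counts, seed=None):
        # Randomly split each site's schools between the comprehensive Dating
        # Matters condition and the Standard Practice (Safe Dates) control.
        rng = random.Random(seed)
        assignments = {}
        for site, n_schools in site_counts.items():
            schools = [f"{site} School {i + 1}" for i in range(n_schools)]
            rng.shuffle(schools)
            half = n_schools // 2  # simplifying assumption: even split per site
            assignments[site] = {"comprehensive": schools[:half],
                                 "standard": schools[half:]}
        return assignments

    for site, arms in randomize_schools(SITE_SCHOOL_COUNTS, seed=2012).items():
        print(site, arms["comprehensive"], arms["standard"])

Randomizing whole schools (clusters) rather than individual students is what makes this a cluster randomized design: all students within a school receive the same condition.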


This proposal describes data collected by contractors (NORC at the University of Chicago, Ogilvy Public Relations, and RTI) that are used for the outcome evaluation and implementation evaluation of Dating Matters.


The content and direction of Dating Matters and the data collection reflect the input of several federal agencies. Collaboration with other agencies ensures that Dating Matters reflects the best available science and practice on teen dating violence and is distinct from other federal efforts to prevent such violence. For example, representatives from the Department of Education, National Institute of Justice, and National Institute on Drug Abuse participated in the expert panels described in detail below; the federal inter-agency workgroup on teen dating violence was briefed on the initiative; and CDC staff have presented on Dating Matters at a Congressional Briefing (Why Middle School Matters) on February 10, 2011, at the Office of Safe and Drug Free Schools (Department of Education) 2011 Meeting, and on a Teen Dating Violence Prevention and Awareness Month webinar hosted by the Administration for Children and Families. Dating Matters has also been included in briefings to and by multiple federal agencies. Finally, the Dating Matters demonstration communities were announced to the public by Vice President Biden on September 13, 2011.


[Figure 1. Randomization of schools to the comprehensive Dating Matters condition or the Standard Practice condition, by site, with anticipated eligible sample sizes.]

*Please note that Figure 1 includes data on our total anticipated eligible sample size (see Section B for anticipated participation rates). Sample sizes reflect variation in enrollment across schools; standard-practice schools, although more numerous, have approximately the same total enrollment as comprehensive schools.


The data collection fits into the National Center for Injury Prevention and Control (NCIPC) Research Agenda Priorities in Preventing Sexual Violence and Intimate Partner Violence (http://www.cdc.gov/injury/ResearchAgenda/index.html) with regard to Tier 1 Part E to “Evaluate the efficacy and effectiveness of programs, strategies, and policies across all levels of the social ecology to prevent and interrupt development of perpetration of sexual violence and intimate partner violence” and Tier 2 Part H to “Evaluate the economic efficiency of programs, strategies, and policies to prevent perpetration of sexual violence and intimate partner violence” and Tier 2 Part J to “Conduct dissemination and implementation research regarding programs, strategies, and policies used in the primary prevention of sexual violence and intimate partner violence.”


Authority for CDC’s National Center for Injury Prevention and Control to collect these data is granted by Section 301 of the Public Health Service Act (42 U.S.C. 241) (Attachment A). This act gives federal health agencies, such as CDC, broad authority to collect data and conduct other public health activities, including studies of this type.


Privacy Impact Assessment


  1. Overview of the Data Collection System


The evaluation of Dating Matters captures the outcomes and implementation of the initiative. The evaluation determines the effectiveness, feasibility, and sustainability of Dating Matters in order to inform decisions about whether or not Dating Matters should be more widely disseminated. Most information is collected via self-report surveys administered to program participants, implementers, and key stakeholders. Additional information about student participants is extracted from their school records. Contextual information about the implementation of the communications campaign, student and parent curricula is obtained via focus groups.


  2. Items of Information to be Collected


The various survey instruments to be used in the evaluation collect information about the outcomes and implementation of Dating Matters. The outcome evaluation of Dating Matters measures outcomes including but not limited to healthy relationships behaviors, dating violence, school-related disciplinary issues, school climate, and positive parenting, parent-child communication, and parental monitoring/supervision.


For the purposes of this package, the implementation evaluation refers to process evaluation, such as monitoring fidelity, and to tracking and monitoring the context (e.g., organizational capacity and readiness), participant satisfaction with program materials (e.g., focus groups, satisfaction questionnaires), and characteristics of implementation (e.g., cost, technical assistance provided).


Identifiable information is obtained from student participants, curricula implementers (i.e., a list of individuals attending each session), and parents in order to track exposure to program components and to track and link participants and effects over time. All other respondents complete surveys or participate in focus groups anonymously.


Due to the number and labeling of attachments, we have summarized the attachments and their purpose below:


Summary of Attachments


Outcome evaluation


  • Instruments in Attachments D, E, H, I, EEEE, IIII

  • Screen shots of instruments administered electronically in Attachments RRR-1 & 2 and SSS-1 & 2

  • Educator, Implementer, and Parent Contact letters, emails, and telephone scripts in Attachments UUU-WWW and YYY-BBBB

  • Alternative contact form in Attachment XXX


Implementation Evaluation

  • Student and Parent Curricula Fidelity, Satisfaction, Technical Assistance, and Cost:

    • Attachments L-QQ, CCC-LLL, DDDD, MMMM, NNNN


  • Capacity and Readiness and Cost:

    • Instruments in Attachment K, SS, HHHH

    • Screen shots of instruments administered electronically in Attachments MMM, NNN, JJJJ


  • Communications Activities and Cost:

    • Instruments in Attachment J and RR


  • Focus Groups:

    • Materials in Attachments ZZ-BBB, KKKK



Outcome Evaluation:

Students, parents, and educators fill out surveys as part of the outcome evaluation. Students in the 6th, 7th, and 8th grades and parents of middle school students in these grades at both the comprehensive and standard-practice schools complete surveys during the school year. Students and parents participate in surveys at the beginning and end of the school year, while educators participate in surveys only once at the end of each school year (except in the first year, when the educators complete an additional baseline survey at the beginning of the year). In addition, school records with extant administrative data on each student are obtained for use in the outcome evaluation.

We track and assess the same students and parents over time, and therefore we collect personally identifiable data from them. Unique identifier codes are created and assigned to each student and parent participant; one tracking database contains the participants’ personally identifiable information, one contains the participants’ actual survey data identified only by the unique identifier code, and a third database contains the link between the participant identity (name and birthdate) and the unique identifier code. These data are collected and stored by the evaluation contractor (NORC) during the contract. The contractor employs database security measures compliant with CDC information security guidelines, including ITSO Encryption Best Practice (Version 1.00.11 of September 21, 2010) and the NIST Guide to Protecting the Confidentiality of Personally Identifiable Information (NIST Publication 800-122). At a minimum, all databases are encrypted and kept on password-secured platforms, and all hard copy data are kept in locked storage cabinets in locked facilities. At the end of each school year, the evaluation contractor gives CDC the databases as a contract deliverable. All data deliveries to CDC are encrypted in compliance with Federal Information Processing Standard (FIPS) 140-2, Level 2. The contractor will destroy all the data at the end of their 5-year contract, after the data have been safely and successfully handed over to CDC and after CDC has had an opportunity to verify their accuracy and completeness. CDC will destroy the tracking database and the linking database at the end of Dating Matters data collection, but will maintain the survey database with unique identifier codes indefinitely for analysis purposes. Only selected individuals at the contracting firm (e.g., the database manager) and at CDC (Craig Bryant, OMB PI and data manager) will have access to the tracking database and the survey data database. Fewer identified individuals at the contracting firm and CDC (no more than 2 at each workplace) have access to the third database linking participants with their unique identifier codes.

Likewise, the privacy of parent survey data will be ensured through the use of unique identifier codes, with these codes, the links between the codes and individual names/birthdates, and the individual survey data itself maintained in separate databases. The secure maintenance and transfer of these databases will be carried out under the same security guidelines and encrypted transfer protocols as outlined for the student identities and data above. Educator surveys, in contrast, do not collect personally identifiable information. These data are anonymous and identified only by the school at which the educator works. The database containing educator survey data is handled with the same security measures as the other survey data databases, but we do not maintain tracking or linking databases for the educators. These data will be delivered to CDC on the same deliverable schedule as the other survey data databases, and will be maintained and destroyed on the same schedule.
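
The three-database separation just described can be illustrated with a minimal sketch. Everything below is an assumption for illustration: in-memory dictionaries stand in for the contractor's encrypted databases, and the field names are made up.

    import uuid

    # Three logically separate stores, mirroring the design described above.
    tracking_db = {}  # unique code -> PII record (tracking database)
    survey_db = {}    # unique code -> survey responses only (no PII)
    linking_db = {}   # (name, birthdate) -> unique code (most restricted)

    def enroll_participant(name, birthdate, contact_info):
        # Assign a unique identifier code; PII never enters the survey database.
        code = uuid.uuid4().hex
        linking_db[(name, birthdate)] = code
        tracking_db[code] = {"name": name, "birthdate": birthdate,
                             "contact": contact_info}
        return code

    def record_survey(code, responses):
        # Survey data are identified only by the unique identifier code.
        survey_db.setdefault(code, []).append(responses)

    # Analysis uses survey_db alone; re-identifying a respondent requires the
    # separately held linking_db, to which at most two staff per site have access.
    code = enroll_participant("Jane Doe", "2000-01-01", "555-0100")
    record_survey(code, {"wave": "baseline", "q1": 3})

The design choice this illustrates is that no single store holds both identities and responses, so destroying the tracking and linking databases at the end of data collection leaves an analysis dataset that cannot be re-identified.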


Most of the information needed to link the implementation and outcome data is stored in a separate crosswalk database and merged in during data cleaning via the researcher-assigned ID number (a sketch of this merge follows the table below). However, a limited amount of additional information to link implementation and outcome data is collected on the outcome evaluation instruments themselves (examples provided below):


Attachment

Survey Name

Linking Variables to be Collected

Attachments D-E

Student Baseline and Follow-Up Surveys

Researcher-Assigned ID Number, Date of Birth, Program Year, Student Survey ID, School Number, Classroom Number, Survey Iteration

Attachment G

School Indicators

Student Survey ID, School Number, Date, Program Year

Attachments H, EEEE

Parent Baseline and Follow-Up Outcome Surveys

Program year, Parent Survey ID, School Number, Survey Iteration

Attachments I, IIII

Educator Baseline and Follow-Up Outcome Surveys

Site Number, Program Year, Survey Iteration

Attachment TT

Student Assent Forms

Date, Student Name, Student Tracking ID, Initial School, Current School, Student Status, Student Contact Info, Student Assent Information, Parent Consent Information

Attachment UU

Parent Consent for Student Form

Student Name, Student Grade, Date, Parent Name, Parent Address, Phone, and Email

Attachment VV

Parent Consent for Parent Form

Parent Tracking ID, Origin School Number, Current School Number, Parent Name, Status, Parent Class, Parent Consent Information, Contact Info.

Attachment WW

Educator Consent Form

Educator Survey ID, School Number, School Name, Educator ID Status, Contact Info, Educator Consent Date, Educator Consent Withdrawal Date

Attachment XXX

Alternative Contact form

Date, Name, Survey ID Number, Tracking ID Number
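
To make the crosswalk merge mentioned above concrete, the hypothetical sketch below joins implementation context onto outcome records by the researcher-assigned ID. The field names and values are assumptions for illustration, not the actual database schema.

    # Hypothetical records keyed by the researcher-assigned ID; the field
    # names are illustrative, not the actual database schema.
    outcome_rows = [
        {"student_id": "S001", "school": 12, "wave": "baseline", "score": 2.4},
        {"student_id": "S002", "school": 12, "wave": "baseline", "score": 3.1},
    ]
    crosswalk = {  # researcher-assigned ID -> implementation context
        "S001": {"condition": "comprehensive", "sessions_attended": 6},
        "S002": {"condition": "standard", "sessions_attended": 8},
    }

    def merge_implementation(outcomes, xwalk):
        # Join implementation context onto each outcome record by researcher ID.
        merged = []
        for row in outcomes:
            context = xwalk.get(row["student_id"], {})
            merged.append({**row, **context})
        return merged

    for row in merge_implementation(outcome_rows, crosswalk):
        print(row)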


Implementation Evaluation

Process and implementation data are collected from local health departments, program participants, community members, school leadership, and program implementers (e.g., teachers and parents implementing curricula, program trainers, and the communications coordinators and youth brand ambassadors implementing the communications campaign). Implementation data include information about what activities were conducted (and whether they were modified in their delivery); which activities generated active participation and which seemed to generate barriers in terms of student or parent receptivity; the nature of technical assistance provided; the capacity and readiness of schools, the local health department, and community advisory boards to implement Dating Matters; implementation costs; and fidelity of implementation.


The evaluation contractor (NORC) conducts focus group meetings with the program implementers and students. The implementer focus groups cover the implementers’ perceptions of student receptiveness to the material, reports of adherence to the curriculum, and the challenges encountered. The student focus groups help assess whether the intervention was implemented with high fidelity from the students’ perspective and help assess outcomes associated with the interventions (in a way that helps inform the quantitative outcome results). For example, we ask participants whether there have been changes in the incidence of behaviors targeted by the instruction: verbal abuse, inappropriate language, and controlling and violent/harassing behavior. Our team then asks the participants to describe the events upon which their judgment is based. The communications contractor (to be named) conducts focus groups with youth in the Dating Matters communities to inform revision of communications materials or the use of new technologies.


In order to link outcome and implementation evaluation data, and to track fidelity over time, we collect personally identifiable data from student curricula, parent curricula, and Brand Ambassador implementers (on consent and assent forms only), including information about who implemented each session and who attended each session (e.g., see the attendance logs in Attachments L-QQ and CCC-LLL). Unique identifier codes are created and assigned to each implementer, student, and parent participant; one tracking database contains the implementers’ and participants’ personally identifiable information, one contains the implementers’ and participants’ actual survey data identified only by the unique identifier code, and a third database contains the link between the participant identity and the unique identifier code. These data are collected and stored by the evaluation and technical assistance contractors during the contract. The contractors employ database security measures compliant with CDC information security guidelines, including ITSO Encryption Best Practice (Version 1.00.11 of September 21, 2010) and the NIST Guide to Protecting the Confidentiality of Personally Identifiable Information (NIST Publication 800-122). At a minimum, all databases are encrypted and kept on password-secured platforms, and all hard copy data are kept in locked storage cabinets in locked facilities. At the end of each school year, the contractor will give CDC the databases as a contract deliverable. All data deliveries to CDC will be encrypted in compliance with Federal Information Processing Standard (FIPS) 140-2, Level 1. The contractor will destroy all the data at the end of their 4-year contract, after the data have been safely and successfully handed over to CDC and after CDC has had an opportunity to verify their accuracy and completeness. CDC will destroy the tracking database and the linking database at the end of Dating Matters data collection, but will maintain the survey database with unique identifier codes indefinitely for analysis purposes. Only selected individuals at the contracting firm (e.g., the database manager) and at CDC (Craig Bryant, OMB PI and data manager) will have access to the tracking database and the survey data database. Fewer identified individuals at the contracting firm and CDC (no more than 2 at each workplace) will have access to the third database linking participants with their unique identifier codes.


In order to link implementation and outcome data, the following information is collected on implementation evaluation instruments and is included in each attachment (examples provided below):


Attachment

Survey Name

Linking Variables to be Collected

Attachment J

Brand Ambassador Implementation Survey

Survey Date, Site Number, Program Year, Survey Iteration

Attachment K

School Leadership Capacity and Readiness Survey

School Number, Survey Date, Program Year, Survey Iteration

Attachment L-T

Parent Program Fidelity Grade X Session Y

Implementer Name, Implementer Survey ID, Classroom Number, School Number, Session Number, Parent/Guardian Names, Program Year, Response Date/Time

Attachment U-QQ, CCC-LLL

Student Program Fidelity Grade X Session Y

Implementer Name, Implementer Survey ID, Classroom Number, School Number, Session Number, Student Names, Program Year, Response Date/Time, Grade

Attachment RR

Communications Campaign Tracking

Site Number, Program Year, Survey Date, Communications Tracking Number

Attachment SS

Local Health Department Capacity and Readiness and Cost

Site Number, Program Year, Survey Date

Attachment XX

Brand Ambassador Assent Form

Response Date, Site Number, Brand Ambassador Name, Brand Ambassador ID, Program year, Brand Ambassador Tracking ID Number, Brand Ambassador Status

Attachment YY

Brand Ambassador Parent Consent Form

Response Date, Site Number, Brand Ambassador Name, Brand Ambassador ID , Program year, Brand Ambassador Tracking ID Number, Brand Ambassador Status

Attachment ZZ

Student participant focus group guide

School Number, Survey Date

Attachment AAA

Student curricula implementer focus group guide

School Number, Survey Date

Attachment BBB

Parent curricula implementer focus group guide

School Number, Survey Date

Attachment DDDD

Student program master trainer and Parent Program Manager TA forms

Survey Date, Site Number, Trainer ID, Trainer Name, School Number/Name


Contractors direct the collection of all data.


  3. Identification of Website and Website Content Directed at Children Under 13 Years of Age


The information collection does not involve any websites or website content that is directed at children less than 13 years of age.


2. Purpose and Use of the Information Collection


All data are used in the evaluation of the Dating Matters initiative, which includes evaluations of both the program outcomes and its implementation. No comprehensive teen dating violence program has been developed and implemented specifically for high-risk urban communities. Further, no other data source exists to examine these questions. Therefore, this data collection is critical to understanding the effectiveness, feasibility, and cost of Dating Matters and to informing decisions about disseminating the program to other communities.


Privacy Impact Assessment Information


  1. Why Information is being Collected


Outcome Evaluation

The comprehensive approach in Dating Matters is being compared to the existing standard-of-care approach (Safe Dates as published, implemented in 8th grade). It is critical to collect survey data from students, parents, and educators in order to compare the effectiveness of the two models of prevention on key outcomes such as students’ engagement in dating violence behaviors, healthy relationship behaviors, and other known risk factors such as attitudes toward dating violence, substance use, etc. Although theory and empirical evidence suggest that a comprehensive approach will have more impact over time, only this type of comparison will allow us to examine whether the comprehensive approach is more effective at reducing teen dating violence than Safe Dates delivered to 8th graders as the standard of care, and is therefore worth the extra resource investment from communities. Collection of the outcome data informs our understanding of the relative impact of the comprehensive approach. Also associated with the outcome evaluation is the collection of school indicator data, including administrative data regarding individual students.


Implementation Evaluation

The primary purpose of the implementation evaluation is to track and monitor program implementation and fidelity, the utility of program materials, the cost of implementation, technical assistance provided, and organizational and community capacity/readiness to implement Dating Matters. A major component of this evaluation is monitoring fidelity of implementation through implementer session logs and understanding barriers to implementation through the session logs and focus groups. By feeding this information back to the implementers through the TA provider, these data assist us in providing specific and timely technical assistance to the implementers. The data also assist us in interpreting the outcome evaluation results. Not collecting this information would prevent us from understanding how well the program activities were implemented and what barriers existed for implementation, and thus would prevent us from providing appropriate and targeted training and technical assistance (T/TA). Furthermore, without these data it would be unclear which aspects of the initiative program participants received, which would challenge the interpretability of the outcome evaluation findings. Finally, cost estimates would not be possible, which would hinder our ability to make informed recommendations about more widespread implementation of the program.


  2. Intended Use of the Information


Because all data are linked in some way, all data are treated with the strictest security in order to protect the privacy of all participants. For the student and parent outcome surveys, a Certificate of Confidentiality was obtained on June 5, 2012, in order to protect the confidentiality of the data from external requests for or subpoena of the data. All data collection and data management staff are well trained in maintaining information security at all stages of the data collection and data management process. Protocols for data collection at schools and at other data collection sites ensure that names, birthdates, and all other personally identifiable information are kept secure during all stages of data collection. Recruitment lists, consent forms, and all survey data are never left unattended while data collectors are in the field, and all data are kept in locked, secure facilities once safely delivered back to the contracting firm (NORC at the University of Chicago), where the data are stored.


All data are stored in encrypted databases on password-secured data platforms. As mentioned previously, identified data are linked only with a unique identifier code and are kept in a separate database from personally identifiable data, and a third database with extremely limited personnel access (only the CDC and contractor database managers have access) contains the information linking participants with their unique identifier codes. Identified data are initially stored and maintained by the evaluation contractor, but will be handed over to CDC on an annual basis. Only Craig Bryant, CDC Data Manager, has access to identified data. The same information security protocol is followed at both facilities. The contractor will be required to destroy all data at the end of their contract (July 31, 2016; with the provision that the data have been safely and successfully handed over to CDC and CDC has had an opportunity to verify the accuracy and completeness of the data). Once data collection is completed and the data have been checked and cleaned for the final time, CDC will destroy the database containing personally identifiable data and the database linking participants’ identities to their unique identifier codes. The de-identified survey database will be maintained until all analysis has been completed.


Given the scale of this data collection and the federal expenditure required to collect it, restricted-use datasets are created from the outcome evaluation data. These restricted-use datasets are completely de-identified, and any demographic information or other variables with endorsement so low that they might allow the identification of respondents will be removed from the dataset before publication. Restricted-use datasets will be constructed and made available to researchers per CDC and HHS procedures.
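
As one illustration of the kind of disclosure check described above, the sketch below flags variables whose rarest response category falls under a minimum cell size and removes them. The threshold of 5 is a common rule of thumb and an assumption here; the actual restricted-use procedures follow CDC and HHS rules.

    from collections import Counter

    MIN_CELL_SIZE = 5  # illustrative threshold; actual rules are set by CDC/HHS

    def low_endorsement_columns(records, candidate_columns, k=MIN_CELL_SIZE):
        # Flag columns where any response category has fewer than k respondents,
        # since rare categories could allow re-identification.
        flagged = []
        for col in candidate_columns:
            counts = Counter(r[col] for r in records if col in r)
            if counts and min(counts.values()) < k:
                flagged.append(col)
        return flagged

    def suppress(records, columns):
        # Return de-identified copies of the records with flagged columns removed.
        return [{k: v for k, v in r.items() if k not in columns} for r in records]

    # Example with made-up data: a value endorsed by one respondent is suppressed.
    data = [{"id": i, "grade": 7, "lang": "EN"} for i in range(50)]
    data.append({"id": 50, "grade": 7, "lang": "HT"})
    print(suppress(data, low_endorsement_columns(data, ["grade", "lang"])))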


All publication of these data will be in aggregate form. No respondent will be identifiable from the information provided to the public at the aggregate level.


Outcome evaluation

CDC uses the outcome data to compare the relative effectiveness of the comprehensive Dating Matters approach to the existing standard-of-care approach (Safe Dates, as published, implemented in 8th grade). Schools have been selected and randomly assigned to receive one or the other model of prevention. We measure intended outcomes (students’ engagement in healthy relationship behaviors and dating violence, attitudes, delinquency, substance use, peer relationships, etc.; parents’ monitoring of dating, knowledge about teen dating violence, etc.; and educators’ perceptions of school climate and school norms). CDC analyzes and examines the data over time in order to determine the relative effectiveness of the two models and to determine whether the comprehensive approach is worth the extra resource investment required to implement it. CDC will disseminate results to readers of peer-reviewed journals and to professional conference participants, as well as through an executive summary and a full report. The executive report will be written in clear language understandable by a wide range of audiences (local health departments, schools, researchers). If the Dating Matters approach is effective at reducing teen dating violence, it is likely that press releases will be shared with the media in order to communicate the information to the general public. Data on efficacy may lead to appropriate reports and recommendations from CDC for widespread implementation. The longitudinal nature of the student dataset will also allow us to examine risk and protective factors for dating violence over time.


Implementation Evaluation

A critical component of an independent evaluation is the collection of multiple forms of data that can serve both as inputs to an ongoing process evaluation and to the eventual outcomes evaluation. CDC will use the implementation evaluation data to (1) provide appropriate and targeted training and technical assistance, (2) inform future implementations of Dating Matters (e.g., address common barriers, etc), and (3) track and monitor aspects of the implementation that will provide descriptive information about the program (e.g., cost) and will assist us in interpreting the results (e.g., capacity/readiness, student and implementer focus group feedback).


The implementation evaluation results will be used to inform the dissemination of Dating Matters (if effective). CDC will disseminate results to readers of peer-reviewed journals and to professional conference participants, as well as through an executive summary and a full report. The executive report will be written in clear language understandable by a wide range of audiences (local health departments, schools, researchers).


3. Use of Improved Information Technology and Burden Reduction

We utilize advanced technology to collect and process data in order to reduce respondent burden and make data processing and reporting more timely and efficient. In all data collections, the number of questions is held to the absolute minimum required for the intended use of the data. Due to the student respondents’ age and due to practical considerations, we do not collect student surveys via online electronic survey forms; instead, student survey data are collected using scannable paper-and-pencil questionnaires.


Parent and educator surveys and the capacity and readiness assessments for the local health department, community advisory boards, and school leadership take place online using electronic survey forms. In addition, the parent surveys are available in a paper-and-pencil format. All other instruments are administered in a paper-and-pencil format. Screen shots of all questions to be administered electronically are included in Attachments MMM (Local Health Department Capacity and Readiness), NNN (School Leadership Capacity and Readiness), JJJJ (Community Capacity and Readiness), RRR-1 & RRR-2 (Educator Outcome Baseline & Follow-Up Surveys), and SSS-1 & SSS-2 (Parent Outcome Baseline & Follow-Up Surveys).


4. Efforts to Identify Duplication and Use of Similar Information

No other data exist that could be used to evaluate the outcomes and implementation of Dating Matters. Dating Matters has never been implemented before, and no publicly available data exist on this comprehensive approach to teen dating violence prevention; as such, no other existing data may be used to assess the variables of interest in the current proposal. In summary, Dating Matters represents a new approach to preventing TDV that will be implemented for the first time in CDC-funded communities in 2012-2016, and as such a new information collection is required to evaluate its outcomes and implementation.


5. Impact on Small Businesses or Other Small Entities

As required by law, if CDC contractors for the evaluation are not small businesses, they will subcontract with small businesses. NORC at the University of Chicago will employ the services of several small vendors for formatting and scanning the paper and pencil surveys, printing and mailing surveys and other respondent recruitment materials, and programming the online questionnaires. However, beyond this there is no anticipated impact on small businesses related to this data collection.


6. Consequences of Collecting the Information Less Frequently


Outcome evaluation:

The present study will provide the primary outcome data needed for local, state, and federal policymakers to assess the effectiveness of the Dating Matters program on dating violence perpetration and victimization among adolescents. Students and parents will participate in surveys at the beginning and end of the school year, while educators will participate in surveys only once at the end of each school year (except in the first year, when the educators will complete an additional baseline survey at the beginning of the year). Data will be extracted from student school records once a year. Adolescence is a time of enormous growth and developmental change; thus, frequent assessment of the main outcomes and hypothesized mediators is necessary in order to best capture program effects and determine causality. Less frequent outcome evaluation data collection from each of the three respondent groups would not allow for adequate measurement of the relative impact of the two models of prevention on key outcomes.


Implementation evaluation:

The study will provide information on the fidelity, cost, and context of implementation, as well as information about what activities were completed, which were most successful, which need improvement, and which participants were exposed to which intervention. Cost questions are included throughout the different implementation instruments. Capacity and readiness assessments will be completed annually by each local health department and community advisory board, and by school leadership in comprehensive schools only. Annual assessments will allow us to examine capacity and readiness as the sites progress from an early implementation stage to full implementation. Less frequent implementation evaluation data collection from curricula implementers would not allow these data to be used in providing targeted and appropriate training and technical assistance.


Brand Ambassadors will complete surveys two times per year. The first survey will be administered at the midpoint of participation. A follow up survey will be administered at the conclusion of the program, which will occur at the end of the academic year. These data collections are necessary to inform both the quality of the training as well as overall perception of the program. Less frequent data collection would not allow for timely feedback and improvement of the program.


The Communications Coordinators (who will also oversee the Brand Ambassador program and other communications activities) will track communications activities quarterly using the Communications Campaign Tracking Form. Quarterly reporting of such information (e.g., use of social media) is necessary given the rapid changes in such media. Less frequent data collection would not allow for timely feedback and improvement of the program.


The fidelity logs will be completed by the implementers after each session. The purpose of these instruments is to document the program activities that occur, capture which participants were exposed to those activities, and identify any barriers to implementation. This information will be used to provide targeted and appropriate training and technical assistance to implementers. Due to the different lengths and implementation plans for the student curricula (school-based, with multiple implementers per site) and the parent curricula (community-based, with a small number of implementers per site), it is expected that student curricula implementers will conduct one round of the curricula each year and parent curricula implementers will complete up to three rounds of the curricula each year. Less frequent data collection would not allow timely feedback and improvement of program activities, including feedback to implementers about adherence to the curricula, and would deprive us of additional information with which to interpret the outcome evaluation results.


Parent program participants will complete a satisfaction form upon completing the 6th and 7th grade parent curricula. The purpose of these instruments is to gauge parent satisfaction with the parent program, obtain information about how parents have used the information and skills they have learned, and identify areas for improvement. A maximum of 18 parents participate in each group per grade per school, with up to 5 parent groups conducted per year; all participating parents will complete the survey.


The student program master trainer technical assistance form will be completed each time the master trainer receives a technical assistance question from a student program implementer, in order to track the nature of each technical assistance request and the response to it. Each site will have a maximum of 3 master trainers, and it is anticipated that they will receive no more than 50 requests per year. This information will be used in the implementation evaluation to understand how training can be improved and what additional training may be needed.


Similarly, the parent program manager technical assistance form will be completed each time the parent program manager receives a technical assistance question from a parent program implementer, in order to track the nature of each technical assistance request and the response to it. Each site will have a maximum of 1 program manager, who will receive no more than 50 requests per year. This information will be used in the implementation evaluation to understand how training can be improved and what additional training may be needed.


Student focus groups (2 per site per year over 4 years) will be conducted with ten students from each school to assess the student curricula. Guided discussion topics will include what students think about the project, including classroom and other related activities, and whether students think any of the project activities have changed other students’ attitudes or behaviors about violence and harassment in their school. Discussion about things students do, what they think about selected attitudes and behaviors, and their relationships with other people, including boyfriends or girlfriends or people they hang out with, will contribute to the interpretation of the evaluation data collected via the student survey instruments. Students will be made to feel comfortable participating in this discussion whether or not they have a current or past boyfriend or girlfriend.


Additional insight regarding the student curricula and the parent curricula will be collected via focus groups with the implementers of the interventions. In separate focus groups of 10 implementers for the two curricula (2 focus groups per site per year), the evaluation contractor (NORC at the University of Chicago) will guide discussion regarding which components of the curricula worked the best in terms of participant engagement and learning, and which components were difficult to present. Implementers’ feedback about the strengths and weaknesses of the curriculum will be elicited.


Communication focus groups will be conducted annually to inform the revision or updating of communications materials. Conducting fewer focus groups will not provide the rich feedback needed to update and revise communication strategies, including messages and materials.


7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

This request fully complies with the regulation 5 CFR 1320.5.


8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency


A.8.1. A 60-day notice to solicit public comments was published in the Federal Register (volume 77, No. 195, pages 61413-61415) on October 9, 2012; Attachment B contains a copy of the notice. One public comment was received. The standard CDC response was sent.


A.8.2. External input.

A series of expert panels were held to inform the development, implementation, and evaluation of Dating Matters. The following outlines the panels and their participants:


Implementation Panel--Communications (December 8-9, 2009)

Catherine Stayton, PhD, Director, Injury Epidemiology Unit, NYC Dept. of Health & Mental Hygiene

Julia Perilla, PhD, Assistant Research Professor, Georgia State University

Kristin Schubert, MPH, Program Officer, Robert Wood Johnson Foundation

Ivan Juzang, President/Founder, MEE Productions

Heathe Luz McNaughton Reyes, PhD, MPH, Postdoctoral Researcher, Department of Health Behavior and Health Education, Gillings School of Global Public Health, University of North Carolina

Olis Simmons, Executive Director, Youth Uprising

Nneka Norville, Senior Public Affairs Manager, BET Networks

Lisa Witter, Chief Operating Officer, Fenton Communications


Implementation Panel--Policy (May 26-28, 2010)

Eve Birge, Education Program Specialist, U.S. Department of Education

Megan Foreman, Policy Specialist, National Conference of State Legislatures, Health Program

Deborah Gorman-Smith, Research Fellow, University of Chicago

Cheryl Grills, Professor and Chair of Psychology, Loyola Marymount University

Catherine Guerrero, Program Director, Colorado Department of Public Health and Environment

Carrie Mulford, Social Science Analyst, National Institute of Justice

Heather O'Beirne Kelly, Senior Legislative & Federal Affairs Officer, American Psychological Association

AJ Pearlman, State Policy Attorney, Break the Cycle

Brad Perry, Sexual Violence Prevention Coordinator, Virginia Sexual & Domestic Violence Action Alliance

Barri Rosenbluth, Expect Respect Program Director, Safe Place

Sally Schaeffer, Senior Public Policy Advocate, Family Violence Prevention Fund

David Wolfe, RBC Chair in Children's Mental Health, Centre for Addiction and Mental Health

Caroline Ledlie, Program Officer, Centers for Disease Control Foundation

Kristin Schubert, Program Officer, The Robert Wood Johnson Foundation

Elizabeth Zurich, Health Policy Lead, Centers for Disease Control and Prevention

Kathleen Rutherford, Senior Mediator, Meridian Institute

Mark Jacobs, Mediator, Meridian Institute


Evaluation Methodology Panel (October 13-14, 2010)

Laura Leviton, Robert Wood Johnson Foundation

Rhonda BeLue, Penn State Methodology Center

Michael Cleveland, Penn State Methodology Center

Pamela Orpinas, University of Georgia

Leslie Snyder, University of Connecticut

Martie Thompson, Clemson University

Jacqueline Lloyd, National Institute on Drug Abuse


Implementation Panel—Capacity/Readiness (December 13, 2010)

Barbara Blumenthal, PhD, Independent Consultant; Visiting Lecturer, Blumenthal Consulting, LLC

Abigail Fagan, PhD, Assistant Professor, University of South Carolina

Paul Flaspohler, PhD, Assistant Professor, Miami University

Catherine Guerrero, MPA, Rape Prevention and Education Program Manager, North Carolina Division of Public Health

Pamela Jumper-Thurman, PhD, Senior Research Scientist, Colorado State University

Wendi Siebold, Senior Research Associate, Evaluation, Management, and Training Associates


Implementation Panel--Adaptation (January 10, 2011)

Barbara Ball, PhD, Program Evaluation Specialist; Start Strong Austin, Project Director SafePlace

Paul Flaspohler, PhD, Assistant Professor, Miami University

Warren Passin, MPH, MSW, Project Manager, ICF Macro

Hank Tomlinson, PhD, Behavioral Scientist, Centers for Disease Control & Prevention

Donna-Marie Winn, PhD, Research Scientist, University of North Carolina at Chapel Hill, FPG Child Development Center


9. Explanation of Any Payment or Gift to Respondents


In general, no payments or gifts will be given to respondents for participation in the evaluation surveys or focus groups. However, NORC anticipates that parent survey participants will need to be minimally compensated in order to maximize response rates. Theory (Blau 1964; Homans 1961) and experience (Dillman et al. 2009) dictate the provision of nominal incentives (Foster et al. 2010) to ensure adequate participation in the project without coercion. 


Our approach to incentives is also based on NORC's decades of experience in survey research and the need to motivate respondents to participate without offering a coercive sum (i.e., a sum that a low-income individual would find difficult to refuse) (Dillman et al., 2009). As the evaluation contractor, NORC considered alternative approaches but selected a low-cost graduated incentive approach as the most effective design, based on the literature and its experience. NORC will implement a system of graduated parent incentives (Foster et al., 2010) that begins with no incentive at the first contact. Subsequently, NORC will conduct a small incentive experiment in which half of the initial non-responders will be offered a $2 incentive (in cash, enclosed with the recruiting request) and the other half will be promised a $5 donation to the school, on the parent's behalf, for school equipment/resources once the completed survey is received (e.g., 100 completed surveys would yield $500 for a school). Based on the results of the incentive experiment, one of the two incentive protocols will be selected for subsequent years.
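The split of initial non-responders into the two incentive arms amounts to a simple randomization. The following is a minimal illustrative sketch in Python, not NORC's actual fielding system; the parent IDs and variable names are hypothetical.

import random

nonresponders = ["P0001", "P0002", "P0003", "P0004", "P0005", "P0006"]  # hypothetical parent IDs

random.seed(42)  # fixed seed so the assignment is reproducible and auditable
shuffled = random.sample(nonresponders, k=len(nonresponders))
half = len(shuffled) // 2

arm_cash = set(shuffled[:half])      # offered $2 cash with the recruiting request
arm_donation = set(shuffled[half:])  # promised a $5 school donation per completed survey

for pid in nonresponders:
    print(pid, "->", "$2 cash" if pid in arm_cash else "$5 school donation")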


Following the full course of parent survey recruitment, NORC will conduct a Non-Response Bias Experiment with a small sample of 5 non-responding parents from each of the 44 schools. Each sampled parent completing the survey over the telephone, on the web, or in person (up to n = 220) would receive $25, delivered in person by the interviewer (in cash) or by mailed check.
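As a rough sketch of how that sample might be drawn (assuming a hypothetical mapping of schools to non-responding parent IDs; this is illustrative, not the contractor's actual procedure):

import random

random.seed(7)
nonresponders_by_school = {
    "school_01": ["P1", "P2", "P3", "P4", "P5", "P6", "P7"],  # hypothetical IDs
    "school_02": ["P8", "P9", "P10", "P11", "P12"],
    # ... one entry for each of the 44 study schools
}

# Draw 5 non-responding parents per school; 5 parents x 44 schools caps the sample at n = 220.
bias_sample = {
    school: random.sample(parents, k=min(5, len(parents)))
    for school, parents in nonresponders_by_school.items()
}
print(bias_sample)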


To conform to school policies, no incentives will be provided for educator completion of the Educator Surveys, implementer completion of session logs, student completion of surveys, completion of Brand Ambassador surveys, or most focus group participation. However, as is customary in market research, incentives (to be determined based on the market rate, likely between $75 and $100 per 90 minutes) will be provided to communication focus group participants to compensate them for their time. Without such compensation, attendance is anticipated to be too low to sustain robust discussions and draw meaningful conclusions, since the focus groups take place outside of school or other settings with mandatory attendance.


10. Assurance of Confidentiality Provided to Respondents


Outcome and Implementation Evaluations:

Because all data will be linked in some way, all data will be treated with the strictest security in order to protect the privacy of all participants. For the student and parent outcome surveys, a Certificate of Confidentiality was obtained on June 5, 2012 to protect the data from external requests or subpoenas. All data collection and data management staff will be well trained in maintaining information security at all stages of the data collection and data management process. Protocols for data collection at schools and at other data collection sites will ensure that names, birthdates, and all other personally identifiable information are kept secure during all stages of data collection. Recruitment lists, consent forms, and survey data will never be left unattended while data collectors are in the field, and all data will be kept in locked, secure facilities once safely delivered back to the contracting firm, where the data will be stored.


For students and parents, the first time data are collected (parent consent and student assent for participation in the survey), the respondent will give us personally identifiable information in the form of his/her full name. In addition, in the parent consent for student participation, the parent will be asked to provide his/her address, phone number, and email. Attached to the parent and student surveys will be a form to collect alternative contact information for people who will know how to reach the family if NORC cannot, to facilitate follow-up at later iterations of the survey. Unique identification numbers will then be created for each consented parent and student.

The scannable survey forms for each participant will be marked only with the unique identifier code assigned to that individual. The first page of the survey will contain the respondent's name on a removable label, which will allow staff to distribute surveys easily in classrooms. Once the survey is handed to the correct respondent, the respondent can tear off the removable sticker containing his/her name, so that from that point on, only the unique identifier code can be connected with the information provided in the survey. This process will occur at all administrations of the surveys. The ID-to-name crosswalk will be available only to a limited number of evaluation contractor staff (NORC's principal investigator and NORC's project manager). After extraction, all identifiable school data will contain the unique identifier code in lieu of the respondent's name. Programming and server set-up for the online parent surveys will follow strict guidance for online data security: personally identifiable data will be encrypted immediately upon entry, and no one outside this limited staff will be able to link survey data with names or other personally identifiable data. Curricula implementers will also record the names of parents and students who attend each session to assess exposure to the curricula; this information will likewise be protected and coded to allow linking to the student and parent data.
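To make the separation concrete, the sketch below illustrates the intended relationship between the restricted ID-to-name crosswalk and the de-identified survey records. It is a simplified illustration with hypothetical field names, not the production system.

import itertools
import json

id_counter = itertools.count(1)
crosswalk = {}     # ID-to-name store; accessible only to the NORC PI and project manager
survey_data = {}   # keyed by unique ID only; contains no names or contact information

def enroll(name, contact):
    """Assign a unique study ID; PII goes only into the restricted crosswalk."""
    uid = f"DM{next(id_counter):06d}"
    crosswalk[uid] = {"name": name, "contact": contact}
    return uid

uid = enroll("Jane Doe", "555-0100")                            # hypothetical participant
survey_data[uid] = {"wave": "baseline", "responses": [1, 3, 2]}  # de-identified record
print(json.dumps(survey_data, indent=2))                         # no names appear here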


Other process and implementation data, including data that track and monitor implementation, will be identified by the linking variables described in Section A1ii. Data will only be presented and analyzed in aggregate form. The Certificate of Confidentiality protects individual participants from release of personal data, even to students' parents who request the information. With this Certificate, we cannot be compelled to release any information about a child to any court or legal proceeding. CDC might have to provide information to DHHS if DHHS needed to evaluate the overall study, but that is not likely. The only other circumstance in which CDC or NORC at the University of Chicago might have to share information is when researchers, who are required to protect a child under mandated reporting laws, learn from a child that he or she is being hurt by an adult or is planning to hurt him/herself or someone else.


All data will be stored in encrypted databases on password-secured data platforms. As mentioned previously, for students and parents, survey data will be linked only with a unique identifier code and kept in a database separate from personally identifiable data; a third database with extremely limited personnel access (only the CDC and contractor database managers) will contain the information linking participants to their unique identifier codes. Identified data will initially be stored and maintained by the evaluation contractor but will be handed over to CDC on an annual basis. Only Craig Bryant, the CDC Data Manager, will have access to identified data. The same information security protocols will be followed at both facilities. The contractor will be required to destroy all data at the end of its contract (September 12, 2016). Once data collection is completed and the data have been checked and cleaned for the final time, CDC will destroy the database containing personally identifiable data and the database linking participants' identities to their unique identifier codes. The de-identified survey database will be maintained until all analysis has been completed.


Given the scale of this data collection and the federal expenditure required to conduct it, restricted-use datasets will be created from the outcome evaluation data. These restricted-use datasets will be completely de-identified, and any demographic information or other variables with endorsement rates low enough to risk identifying respondents will be removed from the dataset before publication.


All publications based on these data will present results in aggregate form. No respondent will be identifiable from information provided to the public at the aggregate level.


Privacy Impact Assessment Information

  1. This project is subject to the Privacy Act. The applicable System of Records Notice (SORN) is 09-20-0160, "Records of Subjects in Health Promotion and Education Studies."

  2. Data will be stored physically and electronically by the contractors collecting the respective data at their offices. Electronic databases will be transferred to CDC on an annual basis. Hard copies of data will be destroyed after the data have been successfully entered, cleaned, and backed up.

  3. Respondent assent/consent will be obtained prior to data collection. The following describes how assent and consent will be obtained for each of the instruments associated with human subjects research. These instruments have been approved by local IRBs and a Certificate of Confidentiality has been obtained. No other data collection instrument requires consent.


Outcome Evaluation Consent and Assent:


Student Participation:

The purpose of the school-wide surveys is to assess students' engagement in healthy relationship behaviors; perpetration and victimization of physical, sexual, and psychological dating violence; and other mediating variables (e.g., attitudes related to violence, drug/alcohol use). Research assessing such self-reported attitudes and behaviors poses no more than minimal risk to participants, and to provide the best estimates of these factors it is vitally important to maximize the sample size and the representativeness of the samples. Due to the requirements of the review boards of the local school districts and public health departments, active parental consent will be sought for student participation in all aspects of the evaluation data collection.


Parent consent for student surveys. Parents will be informed about the survey, the potential release of school data, the topics covered by the survey, and the voluntary nature of the survey in several ways. First, a letter providing written notice about the survey and containing the parent consent form will be sent or given to all families. Second, a similar announcement about the survey will be included in the school newsletter or other school publication. Third, a flier about the survey will be given to all students to take home. The exact notification mechanism and content of the letters/announcements will be determined by the evaluation contractor (NORC), in consultation with each school in which either the comprehensive initiative or standard practice will be implemented.


Attached are the letter and consent form to be sent to students' parents or other caregivers (see Attachment UU). The letter and consent form contain the following elements: contact information for a school staff member; contact information for a NORC staff member and the IRB administrator; a description of the protections of the Certificate of Confidentiality; the right to request a copy of the student survey; and the right to ask questions about the survey and/or to refuse permission for the child to take part in the survey. Permission will also be sought to obtain the student's school data. Letters and parent consent forms will be distributed at least several weeks before the administration of the first survey. Parents will be able to consent to or refuse their child's participation until the last business day before the start of data collection.


Assent for student surveys. Student assent will involve the following procedures: Several weeks prior to data collection, students will be informed about the upcoming survey and encouraged to discuss the content and their potential participation with their parents. Students will be informed that their participation is voluntary. They will also be given the name and number of a school social worker or counselor with whom they can discuss any concerns prior to, during, or after the survey. All students who are in attendance on the days of data collection, who have been given permission by their parents, and who provide written assent will be eligible to participate in the school survey. The student survey will be administered in English only, meaning that non-English speaking students will not be eligible to participate.


At the first administration of the study survey, students will be verbally and visually presented with information about the research study and the survey, instructed as to their rights as a research participant, given the opportunity to have their questions answered, and asked for their assent. They will be informed that their participation is voluntary. They will also be given the contact information of a school staff member or counselor with whom they can discuss any concerns prior to, during, or after the survey. For those students who choose to participate, the survey will then be administered by the contractor in the regular classroom immediately following assent. Students will respond via forms that can be electronically scanned. Students whose parents prefer that they not participate and students who do not provide assent will be provided with alternative educational activities (e.g., crossword or word search puzzles).


Educator Participation.

Educator consent for surveys. Consent procedures for the educator surveys will include the following: several weeks prior to data collection, educators will be informed about the upcoming survey and encouraged to discuss the content and their potential participation with the evaluation contractor (NORC). All educators, in both the standard practice and comprehensive initiative schools, will be eligible to participate in the school survey. Following an email invitation to participate in the online survey, educators will connect to an electronic information page that provides an opportunity for informed consent. Individual educators confirm their consent by clicking the appropriate button (e.g., "I agree to participate in this survey. I understand that my participation is voluntary and that I can stop participating at any time."). After providing consent, the educator will enter his or her email address and will automatically be sent a unique PIN and password to log into the survey. This email will contain another link to the survey, which the educator will click before entering his or her PIN and password. No link between email addresses and the actual survey data will exist, as all educator survey data will be collected anonymously.


At each data collection time point, educators will be emailed with information about the research study and the survey, instructed as to their rights as a research participant, given the opportunity to have their questions answered, and asked for their consent. Educators will be informed that their participation is voluntary. For those educators who choose to participate, the email will contain a link to the informed consent statement. As described above, when they click on the link, the first screen will contain the informed consent form, and educators will be told that by clicking to continue to the survey they are indicating their consent to participate. They will then enter their unique PIN and password to access the survey. These processes will be followed for each of the survey administrations throughout the project period.


Parent/guardian Participation.

Parent/guardian consent for surveys. Participants in the parent curricula will be recruited for the evaluation at the same time they are recruited for the parent curricula. We expect that they will be consented and will complete the baseline survey at the first parent group, before they begin any curriculum content. In the case of Families for Safe Dates, which will be implemented with 8th grade parents, parents will be mailed the consent forms and the survey before the other materials and asked to complete and mail back the survey before beginning the curriculum. Parents in the standard of care schools will likely be recruited from some other parent class or parent group, in order to obtain a comparison group that is similar to the intervention group in its willingness to participate in a parent class or group. They will either be recruited and consented on site or be mailed invitation letters including a link to the online survey and a copy of the paper survey. Parents will also be asked to participate in a follow-up survey at the end of the school year. At both time points, parents will be informed that their participation is voluntary. The exact notification mechanism and content of the letters/announcements will be determined by the contractor, in consultation with each school in which either the comprehensive initiative or standard practice will be implemented.


Again, although exact consent procedures will be determined by the contractor and the school, procedures will be similar to the following: once the parent sample is identified, parents/guardians will be informed about the upcoming survey and encouraged to discuss the content and their potential participation with the evaluation contractor. Parent surveys will be completed onsite (at the parent curriculum group, or at a similar group in standard of care schools), online through an emailed link (as with the educator surveys), or on paper mailed to the participants.


Our approach to the parent survey begins with the selection of a random sample of parents of the students in the study; to augment the sample of parents participating in the parenting program, a small sample of parents not participating in the parenting program will also be drawn. The parent contact information obtained on the parent consent form for student participation will be used to send an email or letter inviting the parent to participate in the survey online or via hardcopy (which will be included in the mailing). Multiple email invitations will be sent to each available email address. The hard copy package will also contain a link for parents who prefer to complete the survey online. NORC will then send multiple prompts by mail to acquire a completed survey. One parent/guardian (the primary caregiver) will be instructed via the cover letter to complete the survey and mail it to NORC using the postage-paid envelope provided. Approximately 4 weeks after the mailing of the initial survey, NORC will send non-responding parents a replacement copy of the survey packet. The second packet will also include a letter of endorsement from a school official. Upon receipt of the completed parent survey, NORC will update the case management system to indicate that the survey has been returned and prepare the survey for scanning.


Implementation Evaluation Consent and Assent:

Human subjects approval and consent/assent is required for the student, communications, and implementer focus groups, as well as participation in the Brand Ambassador survey.


Student Focus Group Participation

Parent consent for student participation. Student participants will be provided a consent form for their parents to complete and return before students can participate in a student focus group (students will not be expected/invited to participate in more than one focus group over the life of the project). The consent form provides an overview of the purpose and design of a student focus group session. The form clearly notes a student’s participation is voluntary and participants can discontinue their participation at any time. Additionally, the form provides contact information for the Project Director of the evaluation contract and the NORC IRB administrator to address any questions parents may have.


Student assent for focus group participation. Student participants will be provided an assent form to complete and return before they can participate in a student focus group. The assent form provides an overview of the purpose and design of a student focus group session. The form clearly notes the evaluation is voluntary and participants can discontinue their participation at any time. Additionally, the form provides contact information for the student’s school counselor, a NORC staff person and the NORC IRB administrator to address any questions they may have.


Implementer Focus Group Participation

Implementer consent to participate. Implementers will be provided a consent form to complete before they can participate in a focus group (implementers will not be expected/invited to participate in more than one focus group). The consent form provides an overview of the purpose and design of the focus group session. The form clearly notes that the implementer’s participation is voluntary and participants can discontinue their participation at any time. Additionally, the form provides contact information for a NORC staff person and the NORC IRB administrator to address any questions they may have.


Communication Focus Groups

Parent consent. Parents will be provided a consent form by the recruitment agency/focus group facility to complete and return on the day of the focus groups. If a completed consent form is not returned, the youth cannot participate in the focus groups. The consent form provides an overview of the research and the requirements of participation in the focus groups. The form clearly notes that participation is voluntary and that participants can discontinue their participation at any time. Additionally, the form provides contact information for the communications contractor representative leading the research, to address any questions parents may have.


Student assent. Parents will be provided with an assent form for the youth to complete and return on the day of the focus groups before the youth can participate. The assent form provides an overview of the research and the requirements of participation in the focus groups. The form clearly notes that participation is voluntary and that participants can discontinue their participation at any time. Additionally, the form provides contact information for the communications contractor representative leading the research, to address any questions participants may have.


Brand Ambassador Survey Participation

Parent consent for brand ambassadors. Brand ambassador participants will be provided a consent form for their parents to complete and return before brand ambassadors can participate in the brand ambassador implementation evaluation (two data collections). The consent form provides an overview of the brand ambassador program and requirements of participation in the brand ambassador program. The form clearly notes that the evaluation is voluntary and participants can discontinue their participation at any time. Additionally, the form provides contact information of the manager of the brand ambassador program and the IRB administrator to address any questions parents may have.


Student assent for brand ambassadors. Brand ambassador participants will be provided an assent form to complete and return before they can participate in the brand ambassador implementation evaluation. The assent form provides an overview of the brand ambassador program and requirements of participation in the brand ambassador program. The form clearly notes the evaluation is voluntary and participants can discontinue their participation at any time. Additionally, the form provides contact information of the manager of the brand ambassador program to address any questions they may have.


11. Justification for Sensitive Questions

Only the outcome evaluation contains sensitive items.


Outcome Evaluation:

The student surveys, parent surveys, and school data to be collected in this proposal include sensitive questions. The primary outcome on which we expect Dating Matters to have an impact, perpetration and victimization of dating violence behaviors, is a sensitive topic, and in order to measure impact on dating violence we must ask students directly about their perpetration and victimization of dating violence. In addition, many of the other empirically supported risk factors that we expect may change as a result of exposure to the two prevention models (e.g., substance use, risky sexual behaviors, attitudes toward dating violence, engagement in delinquent behaviors, school disciplinary problems) are also sensitive topics, and in order to measure program impact we must ask questions directly about these topics. Parents will be asked some sensitive questions, such as questions about their parenting behaviors and their own resolution of conflict in their relationships. The consent forms disclose to parents and students that some of the survey questions may be sensitive in nature and that they do not have to answer any questions they do not want to answer. We obtained a Certificate of Confidentiality that will further ensure the privacy and security of respondents' answers to such questions. School counselors or other staff will be available during data collections to assist any respondents who feel upset or disturbed by any of the questions. We cannot evaluate the impact of the comprehensive Dating Matters approach on teen dating violence without asking sensitive questions about dating violence and related behaviors.


Implementation Evaluation:

No sensitive questions will be asked. Paper and writing implements will be provided to focus group participants who wish to communicate a thought but feel uncomfortable doing so verbally in front of the group. Thus, sensitive comments can be communicated in writing directly to the focus group moderators.


12. Estimates of Annualized Burden Hours and Costs


Burden estimates were derived from the number and nature of the questions, the administration methods (e.g., scannable forms, open-ended questions), and the age of the respondents. The number of respondents was based on the sampling plan and the power analysis for the main hypotheses.


A.12.A. Burden

Table A.12-1. Estimate of Annual Burden Hours¹

| Type of Respondent | Form Name | Number of Respondents | Number of Responses per Respondent | Average Burden per Response (Hours) | Total Burden (Hours) |
|---|---|---|---|---|---|
| Student Program Participant | Student Outcome Survey Baseline (Attachment D) | 11,286 | 1 | 45/60 | 8,465 |
| Student Program Participant | Student Outcome Survey Follow-up (Attachment E) | 10,692 | 1 | 45/60 | 8,019 |
| School Data Extractor | School Indicators (Attachment G) | 44 | 342 | 15/60 | 3,762 |
| Parent Program Participant | Parent Outcome Baseline Survey (Attachment H) | 1,919 | 1 | 1 | 1,919 |
| Parent Program Participant | Parent Outcome Follow-up Survey (Attachment EEEE) | 1,818 | 1 | 1 | 1,818 |
| Educator | Educator Outcome Survey (baseline) (Attachment I) | 1,672 | 1 | 30/60 | 836 |
| Student Brand Ambassador | Brand Ambassador Implementation Survey (Attachment J) | 80 | 2 | 20/60 | 53 |
| School Leadership | School Leadership Capacity and Readiness Survey (Attachment K) | 42 | 1 | 1 | 42 |
| Parent Curricula Implementer | Parent Program Fidelity, 6th Grade, Sessions 1-6 (Attachments L-Q) | 210 | 3 | 15/60 | 158 |
| Parent Curricula Implementer | Parent Program Fidelity, 7th Grade, Sessions 1, 3, 5 (Attachments R-T) | 126 | 3 | 15/60 | 95 |
| Student Curricula Implementer | Student Program Fidelity, 6th Grade, Sessions 1-6 (Attachments U-Z) | 480 | 1 | 15/60 | 120 |
| Student Curricula Implementer | Student Program Fidelity, 7th Grade, Sessions 1-7 (Attachments AA-GG) | 560 | 1 | 15/60 | 140 |
| Student Curricula Implementer | Student Program Fidelity, 8th Grade, Sessions 1-10 (comprehensive) (Attachments HH-QQ) | 800 | 1 | 15/60 | 200 |
| Communications Coordinator | Communications Campaign Tracking (Attachment RR) | 4 | 4 | 20/60 | 5 |
| Local Health Department Representative | Local Health Department Capacity and Readiness (Attachment SS) | 16 | 1 | 2 | 32 |
| Student Program Participant | Student participant focus group guide (time spent in focus group) (Attachment ZZ) | 80 | 1 | 1.5 | 120 |
| Student Curricula Implementer | Student curricula implementer focus group guide (time spent in focus group) (Attachment AAA) | 80 | 1 | 1 | 80 |
| Parent Curricula Implementer | Parent curricula implementer focus group guide (time spent in focus group) (Attachment BBB) | 80 | 1 | 1 | 80 |
| Student Curricula Implementer | Safe Dates, 8th Grade, Sessions 1-10 (standard) (Attachments CCC-LLL) | 800 | 1 | 15/60 | 200 |
| Student Master Trainer | Student program master trainer TA form (Attachment DDDD) | 12 | 50 | 10/60 | 100 |
| Educator | Educator Outcome Survey (follow-up) (Attachment IIII) | 1,584 | 1 | 30/60 | 792 |
| Community Advisory Board Member | Community Capacity/Readiness Assessment (Attachment JJJJ) | 80 | 1 | 1 | 80 |
| Students | Communications Focus Groups (Attachment KKKK) | 96 | 1 | 1.5 | 144 |
| Parent Program Manager | Parent Program Manager TA Tracking Form (Attachment LLLL) | 4 | 50 | 10/60 | 33 |
| Parent Program Participant | 6th Grade Curricula Parent Satisfaction Questionnaire (Attachment MMMM) | 1,890 | 1 | 10/60 | 315 |
| Parent Program Participant | 7th Grade Curricula Parent Satisfaction Questionnaire (Attachment NNNN) | 1,890 | 1 | 10/60 | 315 |
| Total | | | | | 27,923 |

The respondent burden has been estimated based on the number of respondents enrolled or otherwise involved in a given data collection effort (see sampling frames in SSB), the number of times each of these respondents will be contacted, and the estimated amount of time (expressed in hours or fractions thereof) required for a respondent to provide the requested information. The total estimated burden for this request is 27,923 hours per year.
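As a worked check of the first row of Table A.12-1 (a minimal sketch; the figures come straight from the table):

respondents = 11286                   # Student Outcome Survey Baseline
responses_per_respondent = 1
hours_per_response = 45 / 60          # 45 minutes per response
total_burden = respondents * responses_per_respondent * hours_per_response
print(total_burden)                   # 8464.5, reported in the table as 8,465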

A.12.B. Estimated Annualized Burden Cost

| Type of Respondent | Form Name | Number of Respondents | Number of Responses per Respondent | Average Burden per Response (Hours) | Total Burden (Hours) | Hourly Wage Rate | Total Respondent Cost |
|---|---|---|---|---|---|---|---|
| Student Program Participant | Student Outcome Survey Baseline (Attachment D) | 11,286 | 1 | 45/60 | 8,465 | $7.25 | $61,371.25 |
| Student Program Participant | Student Outcome Survey Follow-up (Attachment E) | 10,692 | 1 | 45/60 | 8,019 | $7.25 | $58,137.75 |
| School Data Extractor | School Indicators (Attachment G) | 44 | 342 | 15/60 | 3,762 | $28.58 | $107,517.96 |
| Parent Program Participant | Parent Outcome Baseline Survey (Attachment H) | 1,919 | 1 | 1 | 1,919 | $15.23 | $29,226.37 |
| Parent Program Participant | Parent Outcome Follow-up Survey (Attachment EEEE) | 1,818 | 1 | 1 | 1,818 | $15.23 | $27,688.14 |
| Educator | Educator Outcome Survey (baseline) (Attachment I) | 1,672 | 1 | 30/60 | 836 | $28.58 | $23,892.88 |
| Student Brand Ambassador | Brand Ambassador Implementation Survey (Attachment J) | 80 | 2 | 20/60 | 53 | $7.25 | $384.25 |
| School Leadership | School Leadership Capacity and Readiness Survey (Attachment K) | 42 | 1 | 1 | 42 | $47.57 | $1,997.94 |
| Parent Curricula Implementer | Parent Program Fidelity, 6th Grade, Sessions 1-6 (Attachments L-Q) | 210 | 3 | 15/60 | 158 | $15.23 | $2,406.34 |
| Parent Curricula Implementer | Parent Program Fidelity, 7th Grade, Sessions 1, 3, 5 (Attachments R-T) | 126 | 3 | 15/60 | 95 | $15.23 | $1,446.85 |
| Student Curricula Implementer | Student Program Fidelity, 6th Grade, Sessions 1-6 (Attachments U-Z) | 480 | 1 | 15/60 | 120 | $28.58 | $3,429.60 |
| Student Curricula Implementer | Student Program Fidelity, 7th Grade, Sessions 1-7 (Attachments AA-GG) | 560 | 1 | 15/60 | 140 | $28.58 | $4,001.20 |
| Student Curricula Implementer | Student Program Fidelity, 8th Grade, Sessions 1-10 (comprehensive) (Attachments HH-QQ) | 800 | 1 | 15/60 | 200 | $28.58 | $5,716.00 |
| Communications Coordinator | Communications Campaign Tracking (Attachment RR) | 4 | 4 | 20/60 | 5 | $15.23 | $76.15 |
| Local Health Department Representative | Local Health Department Capacity and Readiness (Attachment SS) | 16 | 1 | 2 | 32 | $15.23 | $487.36 |
| Student Program Participant | Student participant focus group guide (time spent in focus group) (Attachment ZZ) | 80 | 1 | 1.5 | 120 | $7.25 | $870.00 |
| Student Curricula Implementer | Student curricula implementer focus group guide (time spent in focus group) (Attachment AAA) | 80 | 1 | 1 | 80 | $28.58 | $2,286.40 |
| Parent Curricula Implementer | Parent curricula implementer focus group guide (time spent in focus group) (Attachment BBB) | 80 | 1 | 1 | 80 | $15.23 | $1,218.40 |
| Student Curricula Implementer | Safe Dates, 8th Grade, Sessions 1-10 (standard) (Attachments CCC-LLL) | 800 | 1 | 15/60 | 200 | $28.58 | $5,716.00 |
| Student Master Trainer | Student program master trainer TA form (Attachment DDDD) | 12 | 50 | 10/60 | 100 | $15.23 | $1,523.00 |
| Educator | Educator Outcome Survey (follow-up) (Attachment IIII) | 1,584 | 1 | 30/60 | 792 | $28.58 | $22,635.36 |
| Community Advisory Board Member | Community Capacity/Readiness Assessment (Attachment JJJJ) | 80 | 1 | 1 | 80 | $15.23 | $1,218.40 |
| Students | Communications Focus Groups (Attachment KKKK) | 96 | 1 | 1.5 | 144 | $7.25 | $1,044.00 |
| Parent Program Manager | Parent Program Manager TA Tracking Form (Attachment LLLL) | 4 | 50 | 10/60 | 33 | $15.23 | $502.59 |
| Parent Program Participant | 6th Grade Curricula Parent Satisfaction Questionnaire (Attachment MMMM) | 1,890 | 1 | 10/60 | 315 | $15.23 | $4,797.45 |
| Parent Program Participant | 7th Grade Curricula Parent Satisfaction Questionnaire (Attachment NNNN) | 1,890 | 1 | 10/60 | 315 | $15.23 | $4,797.45 |
| Total | | | | | | | $374,389.09 |


The respondent burden has been estimated based on the number of respondents enrolled or otherwise involved in a given data collection effort (see sampling frames in SSB), the number of times each of these respondents will be contacted, and the estimated amount of time (expressed in hours or fractions thereof) required for a respondent to provide the requested information. The total amount of time required of respondents is then multiplied by an estimated hourly wage for the respondent population affected by the particular data collection instrument/process. The product of the total time required and the applicable estimated hourly wage yields an estimate of the total respondent cost across the data collection instruments/processes and the four-year data collection period of the project. The hourly wages used to calculate respondent costs are based on professions of comparable experience, using the Department of Labor wage tables (www.dol.gov). The total respondent cost for this evaluation is $374,389.09 per year.
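As a worked check of the first row above, the row total is simply the annual burden hours multiplied by the hourly wage:

burden_hours = 8465                   # Student Outcome Survey Baseline
hourly_wage = 7.25                    # wage rate applied to student respondents
print(f"${burden_hours * hourly_wage:,.2f}")   # $61,371.25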

13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

Respondents will incur no capital or maintenance costs.

14. Annualized Cost to the Federal Government


Contract costs for evaluation

Outcome Evaluation: $640,000 per year

Implementation Evaluation (non-communications): $300,000 per year

Implementation Evaluation (communications): $75,000 per year


Federal employee costs (OMB point of contact, PIs, and co-PIs):

Salaries: 8 federal employees @ $88,000/year = $704,000 per year


Total: $1,719,000 per year


15. Explanation for Program Changes or Adjustments


The current revision request has two aims:

1) Request to revise follow-up outcome evaluation instruments and drop mid-year outcome evaluation student survey, and

2) Request to add process evaluation instruments to enhance implementation.


1) Regarding the first aim, Dating Matters grantees and contractors had one year of planning prior to implementation. During the planning year, contractors made edits to the evaluation instruments to strengthen their alignment with the final program and to eliminate redundant or unnecessary items. As described in detail below, these changes are minor and fully consistent with the focus and scope of what was already approved.


In working with our grantees and schools over the planning year, it also became clear that our original plan to collect baseline, mid-year, and follow-up surveys from students would create too much burden for schools. To be responsive to the schools' needs, we propose to drop the mid-year student survey. Given the variation in implementation of the student curricula across schools (some classrooms will implement the curricula in the fall, some in the spring) and the added burden to schools, we believe we can remove the mid-year survey without losing critical information about program effects. The project is therefore requesting approval for changes that resulted, during the planning year, from finalizing the program materials, clarifying instructions, responding to school concerns, and shortening several instruments.


2) Regarding the second aim, grantees' planning processes identified additional needs for the process evaluation. For example, it became clear that because the 6th and 7th grade parent curricula are new, we should assess not only implementers' fidelity to the sessions (covered by previously approved instruments) but also parents' satisfaction with the program. Therefore, we are requesting to add five new process evaluation instruments (with accompanying screenshots, consent, and assent forms as appropriate) and one new version of an outcome evaluation instrument that will assist CDC, our contractors, and grantees in monitoring and improving program implementation.


In addition, the development of our capacity and readiness assessment was delayed. Although two of the three parts of the assessment were completed in time for our initial submission (school leadership and local health department capacity and readiness assessments), our contractor did not complete the third component of the assessment (community advisory board capacity and readiness assessment) until this summer. Therefore, we are requesting to add this third and final component to the capacity and readiness assessment.


We want to emphasize that even with the revision of follow-up instruments and addition of process evaluation instruments, the methods and design of the Dating Matters evaluation are unchanged. There has been no change related to individually identifiable information.



16. Plans for Tabulation and Publication and Project Time Schedule


A.16.A. Tabulation and Analysis Plan Outcome Evaluation:


The data collection involves the evaluation of Dating Matters, which will be implemented in four urban communities in 2012-2016. The evaluation will utilize a cluster randomized design in which 8-12 middle schools in each city (44 schools total) are randomized to either Dating Matters or standard practice.


Outcome Evaluation.

The final analysis plan will be determined once preliminary analysis of the data indicates the most appropriate approach. Intervention condition (Dating Matters or standard of care) will be randomly assigned at the school level, so all four sites will contain both prevention models. Random assignment is expected to produce relatively similar groups at baseline, but any initial differences between groups will be statistically controlled. We will analyze the data using an intent-to-treat approach, but we will likely also examine student and parent data with respect to exposure to the relevant curriculum. Analysis is expected to use Hierarchical Linear Modeling (HLM) to test for intervention effectiveness, given that individuals are nested within schools, which are nested within sites. HLM provides a conceptual framework and a flexible set of analytic tools suited to our data, which emerge from a multi-stage sample with multiple sources (e.g., students, parents, schools). Nesting occurs when a unit of measurement is a subset of a larger unit and the units clustered within the larger unit may be correlated. Some of our variables will be measured at the school level (e.g., school size and location), others at the class level (e.g., grade level and treatment group), and others at the student level (e.g., survey reports of behavior). In our models, student data will be included at level 1, and classroom and school data at levels 2 and 3, respectively, with site location included as a covariate. For example, one of our tests would use the following level-2 fixed-effect equation:

$B_{0k} = \gamma_{00} + \gamma_{01}\,\mathrm{INTERVENTION}_k + \sum_{s} \gamma_{0s} W_{sk} + u_{0k}$

where $\gamma_{01}$ represents the fixed effect of the school-level intervention on the outcome $B_{0k}$, the $W_{sk}$ are $s$ classroom-level confounding variables included for control purposes, and $u_{0k}$ is the level-2 classroom random effect. The coefficient $\gamma_{01}$ represents the difference the intervention makes relative to the control group across grade levels. We will also estimate the reduction in residual classroom-level variance left unexplained by the intervention predictor, to gauge the proportion of variation explained by the intervention and to assess its impact.
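For illustration only, the sketch below shows how such a nested model could be fit in Python with statsmodels; the data file and column names (outcome, intervention, site, school, classroom) are hypothetical, and this is not the contractor's analysis code.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical de-identified analysis file: one row per student, with the
# school-level treatment indicator, site covariate, and nesting identifiers
# (classroom IDs assumed unique within each school).
df = pd.read_csv("student_survey.csv")

# Students within classrooms within schools: schools are the top-level groups,
# and classrooms enter as an additional variance component.
model = smf.mixedlm(
    "outcome ~ intervention + C(site)",
    data=df,
    groups="school",
    re_formula="1",                                # random intercept per school
    vc_formula={"classroom": "0 + C(classroom)"},  # random intercept per classroom
)
result = model.fit()
print(result.summary())  # the 'intervention' coefficient plays the role of gamma_01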


Implementation Evaluation.

Analyses for the implementation evaluation component will focus primarily on describing program implementation in order to enhance the program materials, training, and technical assistance. Measures of fidelity will be used to compute a fidelity score, which will be included as a moderator of change in the modeling of program effects described above.


In addition, a cost-effectiveness analysis will be applied to both the standard and comprehensive approaches. Cost-effectiveness analysis results are expressed as a cost-effectiveness ratio, interpreted as the cost per teen dating violence case prevented, which facilitates comparison between the standard and comprehensive approaches from an economic perspective. Costs will be classified by cost type into two broad categories: program costs and participant costs.
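In symbols, and assuming the comparison takes the standard incremental form (the incremental notation below is our gloss; the text above specifies only the per-case ratio):

$\mathrm{CER} = \dfrac{C_{\text{program}} + C_{\text{participant}}}{\text{cases prevented}}, \qquad \mathrm{ICER} = \dfrac{C_{\text{comprehensive}} - C_{\text{standard}}}{E_{\text{comprehensive}} - E_{\text{standard}}}$

where $E$ denotes the number of teen dating violence cases prevented under each approach.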


A.16.B. Publications


Table A.16-1. Time Schedule

Activity

Time Schedule

Award contracts for data collection

Contracts for data collection will be awarded prior to obtaining OMB approval, such that contractors may prepare for data collection and so that data collection may be initiated as soon as approval is obtained. Due to the complex nature of the Dating Matters evaluation, contracts awarded in FY11 and FY12 will support the evaluation.

Administer outcome and implementation evaluation.

Evaluation activities will begin within 30 days of OMB approval and will continue until implementation is complete in 2016.

Apply for OMB approval

In anticipation of the expiration of our original OMB approval (expected in 2015), we will prepare and submit an application for an extension in 2014.

Analyze evaluation results.

Analysis will begin within 60 days of receiving data from contractors. Data will be analyzed annually to monitor effects; the ultimate analysis, addressing the study hypotheses with sufficient power, will be initiated within 60 days of receiving the fourth year of evaluation data in 2016.

Develop products and publications based on the results of the evaluation.

Within a year of receiving the complete evaluation data (with four years of data collection) it is anticipated that the main publications examining the outcome and implementation of Dating Matters will be submitted for publication. In addition to scientific publications, research-in-briefs and other non-technical reports of the evaluation results will be prepared and disseminated to key stakeholders.


17. Reason(s) Display of OMB Expiration Date is Inappropriate

Not applicable.


18. Exceptions to Certification for Paperwork Reduction Act Submissions

Not applicable.




References


Anderman, C., Cheadle, A., Curry, S., & Diehr, P. (1995). Selection bias related to parental consent in school-based survey research. Evaluation Review, 19, 663-674.

Blau, P. M. (1964). Exchange and power in social life. New York: John Wiley and Sons.

Dent, C.W., Galaif, J., Sussman, S., Stacy, A., Burton, D., & Flay, B.R. (1993). Demographic, psychosocial, and behavioral differences in samples of actively and passively consented adolescents. Addictive Behaviors, 18, 51-56.

Dillman, D. A., Smyth, J. D., & Christian, L. M. (2009). Internet, mail and mixed-mode surveys: The tailored design method (3rd ed.). New Jersey: Wiley.

Foshee, V. A., Bauman, K. E., Arriaga, X. B., Helms, R. W., Koch, G. G., & Linder, G. F. (1998). An evaluation of Safe Dates, an adolescent violence prevention program. American Journal of Public Health, 88, 45-50.

Foster, E., Frasier, A., Morrison, H., O'Connor, K., & Blumberg, S. (2010, May 14). All things incentive: Exploring the best combination of incentive conditions. Presentation to the American Association for Public Opinion Research.

Henry, K.L., Smith, E.A., & Hopkins, A.M. (2002). The effect of active parental consent on the ability to generalize the results of an alcohol, tobacco, and other drug prevention trial to rural adolescents. Evaluation Review, 26, 645-655.

Homans, G. C. (1961). Social behavior: Its elementary forms. New York: Harcourt, Brace and World.

Morrison, H. (2010). The 2009-2010 NS-CSHCN incentive experiment (internal report). Chicago, IL: NORC at the University of Chicago.

O’Leary, K. D., & Slep, A. M. S. (in press). Prevention of partner abuse by focusing on males and females. Prevention Science.

Unger, J.B., Gallaher, P., Palmer, P.H., Baezconde-Garbanati, L., Trinidad, D.R., Cen, S., & Johnson, C.A. (2004). No news is bad news: Characteristics of adolescents who provide neither parental consent nor refusal for participation in school-based survey research. Evaluation Review, 28, 52-63.

Vivolo, A., Holland, K., Teten, A. L., Holt, M., & Sexual Violence Review Team (2010). Report from CDC: Developing sexual violence prevention strategies by bridging spheres of public health. Journal of Women’s Health, 19, 1811-1814.

Whitaker, D. J., Morrison, S., Lindquist, C., Hawkins, S. R., O'Neil, J. A., & Nesius, A. M. (2006). A critical review of interventions for the primary prevention of perpetration of dating violence. Aggression and Violent Behavior, 11, 151-166.




¹ The column labeled Number of Respondents reflects a 95% response rate at baseline and a 90% response rate at follow-up.

