
SUPPORTING STATEMENT: PART A


OMB# 0920-0941


December 9, 2015


Evaluation of Dating Matters: Strategies to Promote Healthy Teen Relationships



Point of Contact

Sarah DeGue, PhD (Project Lead)

Centers for Disease Control and Prevention

National Center for Injury Prevention and Control

4770 Buford Highway NE MS F-64

Atlanta, GA 30341-3724

Phone: 770-488-3899

Email: [email protected]



CONTENTS

A. Justification


A.1. Circumstances Making the Collection of Information Necessary

A.2. Purpose and Use of Information Collection

A.3. Use of Improved Information Technology and Burden Reduction

A.4. Efforts to Identify Duplication and Use of Similar Information

A.5. Impact on Small Businesses or Other Small Entities

A.6. Consequences of Collecting the Information Less Frequently

A.7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5(d)(2)

A.8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

A.9. Explanation of Any Payment or Gift to Respondents

A.10. Protection of the Privacy and Confidentiality of Information Provided by Respondents

A.11. Institutional Review Board (IRB) and Justification for Sensitive Questions

A.12. Estimates of Annualized Burden Hours and Costs

A.13. Estimates of Other Total Annual Cost Burden to Respondents or Record Keepers

A.14. Annualized Cost to the Government

A.15. Explanation for Program Changes or Adjustments

A.16. Plans for Tabulation and Publication and Project Time Schedule

A.17. Reason(s) Display of OMB Expiration Date is Inappropriate

A.18. Exceptions to Certification for Paperwork Reduction Act Submissions



LIST OF ATTACHMENTS


Attachment A:

Public Health Service Act: Section 301 (42 U.S.C. 241)

Attachment B:

Published 60-Day Federal Register Notice

Attachment B 1-2:

Public comment

Attachment C:

IRB Approval

Attachment D:

Student Outcome Survey Follow-Up

Attachment E:

Screen shots of web-based student outcome survey follow-up

Attachment F 1-8:

Student Program Participant Assent/Consent Form

Attachment G:

Parent of Student Program Participant Consent Form

Attachment H:

Privacy Impact Assessment (PIA)





Summary Table


  • Goal of the study: The primary goal of the current proposal is to continue longitudinal follow-up for CDC’s teen dating violence (TDV) prevention initiative, Dating Matters®: Strategies to Promote Healthy Teen Relationships. This initiative tests the effectiveness of a comprehensive approach to prevent TDV among youth in high-risk urban communities. In order to address gaps in effective prevention programming for youth in urban communities with high crime and economic disadvantage, who may be at highest risk for TDV perpetration and victimization, Dating Matters® focuses on middle school youth with universal primary prevention strategies aimed at building a foundation of healthy relationship skills before dating.



  • Intended use of the resulting data: To determine both short- and long-term effectiveness of Dating Matters®. The data collection involved in this revision request will allow the evaluation of the long-term impacts of Dating Matters® as the students in the study age and move on to high school, a developmental stage in which students become increasingly engaged in intimate dating relationships.



  • Methods to be used to collect data: Cluster randomized controlled trial in which 46 schools in four funded communities were randomized to either Dating Matters® or standard practice.



  • The subpopulation to be studied: High school-aged youth from four high-risk, urban communities (Alameda County, California; Baltimore, Maryland; Broward County, Florida; and Chicago, Illinois).



  • How data will be analyzed: Hierarchical modeling accounting for nested data.



A. Justification


A.1. Circumstances Making the Collection of Information Necessary


This is a revision request for 3 years for the currently approved "Evaluation of Dating Matters®: Strategies to Promote Healthy Teen Relationships" (OMB# 0920-0941, expiration date 5/30/2016).


Background


The Centers for Disease Control and Prevention (CDC) is submitting a revision request that will enable continued longitudinal follow-up for CDC's teen dating violence (TDV) prevention initiative, Dating Matters®: Strategies to Promote Healthy Teen Relationships. The initial evaluation of this project, a cluster randomized controlled trial (RCT), is covered under the currently approved Information Collection Request entitled "Evaluation of Dating Matters®: Strategies to Promote Healthy Teen Relationships" (OMB# 0920-0941, Expiration 5/30/2016). Approval of this revision request will allow us to continue to assess the effectiveness of the CDC-developed comprehensive approach to TDV prevention over a longer-term follow-up period, as the students in our sample age and their engagement in dating relationships increases. The current evaluation of Dating Matters® tests a comprehensive approach to preventing TDV among youth in high-risk urban communities. As outlined in the current OMB package, "high-risk" is operationalized as having both above-average rates of community or school crime and above-average rates of school or community economic disadvantage. The current evaluation takes place in the following communities: Alameda County, California; Baltimore, Maryland; Broward County, Florida; and Chicago, Illinois. The Dating Matters® comprehensive approach consists of evidence-based or evidence-informed prevention strategies implemented at each level of the social ecology.


In order to address gaps in effective prevention programming for youth in urban communities with high crime and economic disadvantage, who may be at highest risk for TDV perpetration and victimization (Niolon et al., 2015), Dating Matters® focuses on middle school youth with universal primary prevention strategies aimed at building a foundation of healthy relationship skills before dating and/or TDV is initiated. The Dating Matters® comprehensive approach, which includes implementation of prevention strategies across levels of the social ecology for youth, parents, and educators in 6th-8th grade, in addition to policy change efforts and communications strategies, is being compared in this randomized controlled trial (RCT) to a comparison condition in which the standard approach to teen dating violence prevention, Safe Dates (Foshee, Bauman, Arriaga, Helms, et al., 1998), is implemented in 8th grade alone. Programmatic activities implemented as part of this Dating Matters® evaluation, covered by OMB# 0920-0941, were initiated in 2011 and will conclude in 2016. These activities are described in Table 1.


Phase 1 collected short-term outcome data on cohorts of student participants in the Dating Matters® evaluation through the end of middle school (8th grade); this data collection will be completed by August 2016. In 2013, Phase 2 of the evaluation was initiated to collect additional follow-up data on student participants into high school to assess the long-term effects of Dating Matters®. Data collection for Phase 2 is also approved under Information Collection Request OMB# 0920-0941 through 5/30/2016. The current request seeks to extend the existing approval through May 31, 2019 to continue collecting high school follow-up data on the remaining cohorts of middle school intervention participants.


Table 1. Two school-based prevention approaches evaluated in Dating Matters®: Strategies to Promote Healthy Teen Relationships

Standard Practice

Grade | Youth/Peers | Parent/Guardian | Educators | Communications | Policy
8th | Safe Dates | -- | -- | -- | --

Dating Matters® Comprehensive Approach

Grade | Youth/Peers | Parent/Guardian | Educators (6th-8th) | Communications (6th-8th) | Policy (6th-8th)
6th | Adapted Student Curriculum* | Adapted Parents Matter!* | Dating Matters® online training | Communications Strategies* | Policy Enhancement or Development
7th | Adapted Student Curriculum* | Adapted Parent Curriculum* | | |
8th | Adapted Safe Dates | Adapted Families for Safe Dates | | |

*CDC-developed curriculum and communications strategies


The evaluation utilizes a cluster randomized design in which 46 schools in four funded communities (Alameda County, California; Baltimore, Maryland; Broward County, Florida; and Chicago, Illinois) were randomized to either Dating Matters® or standard practice (see OMB# 0920-0941, Expiration 5/30/2016 for a detailed description of the original evaluation design). We seek to continue evaluation activities in these four communities.


The data collection involved in this revision request is necessary because it will allow us to evaluate the long-term impacts of Dating Matters® as the students in our study age and move on to high school, a developmental stage in which students become increasingly engaged in intimate dating relationships. Therefore, we are more likely to see effects of the program on these primary outcomes during this period.


The details of the data collection design and sample are provided in SSB. In summary, students originally recruited from the four sites under OMB# 0920-0941 form the sample population. These students, recruited during the middle school years, will be followed under the current information collection request as they continue to age and matriculate into high school. The data collection described in this proposal will be conducted by the contractor.


Authority for CDC's National Center for Injury Prevention and Control to collect these data is granted by Section 301 of the Public Health Service Act (42 U.S.C. 241) (Attachment A).


A.2 Purpose and Use of Information Collection


All data collected as part of this request will be used in the longitudinal outcome evaluation of the Dating Matters® initiative. No comprehensive teen dating violence prevention program has been developed and implemented specifically for high-risk urban communities. Further, no other data source exists to examine the effectiveness of the Dating Matters® initiative for preventing dating violence. Therefore, this data collection is critical to understanding the effectiveness, feasibility, and cost of Dating Matters® and to informing decisions about disseminating the program to other communities.


A.3. Use of Improved Information Technology and Burden Reduction


We utilize improved information technology to collect and process data in order to reduce respondent burden and make data processing and reporting more timely and efficient whenever possible. In all data collections, the number of questions is held to the absolute minimum required for the intended use of the data. The high school follow-up surveys are administered through multiple modes, including in person at school (in online or paper-and-pencil format), in person at the student's home, by phone, or online using electronic survey forms. Screen shots of all questions to be administered electronically are included in Attachments D and E (survey and screen shots of the web-based student outcome survey follow-up).


A.4. Efforts to Identify Duplication and Use of Similar Information


Dating Matters® represents a new approach to preventing TDV. No other data exist that could be used to evaluate the long-term effectiveness of Dating Matters®. Dating Matters® has never been implemented prior to the current implementation described in OMB# 0920-0941. No publicly available data on teen dating violence exist and, as such, no other existing data could be used to assess the variables of interest in the original proposal or in this extension of that proposal to continue data collection on the existing participants through high school. Members of the Dating Matters® research team at CDC participate in an interagency workgroup on teen dating violence prevention led by the National Institute of Justice, and this evaluation has been discussed with that group to ensure efforts are not duplicated across agencies. No similar work that accomplishes the same objectives is being conducted by any other federal agency.


A.5. Impact on Small Businesses or Other Small Entities


There is no anticipated impact on small businesses related to this data collection.



A.6. Consequences of Collecting the Information Less Frequently


The present study will provide the primary outcome data needed for local, state, and federal policy makers to assess the long-term effectiveness of the Dating Matters® initiative on TDV perpetration and victimization among adolescents. Under the prior approval ("Evaluation of Dating Matters®: Strategies to Promote Healthy Teen Relationships," OMB# 0920-0941, Expiration 5/30/2016), students participated in surveys at the beginning and end of the school year while in middle school and once per year while in high school. Under this proposal, we will continue only the high school portion and survey high school-aged youth once per year. Adolescence is a time of enormous growth and developmental change; thus, frequent assessment of main outcomes and hypothesized mediators is necessary in order to best capture program effects and determine causality. Less frequent outcome evaluation data collection would not allow for adequate measurement of the relative impact of the two models of prevention on key outcomes.


A.7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5(d)(2)


This request fully complies with the regulation 5 CFR 1320.5.


A.8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency


A.8.1. Federal Register Notice


A 60-day Federal Register Notice was published in the Federal Register on September 22, 2015, Vol. 80, No. 183, pp. 57185-57186 (Attachment B). CDC received two non-substantive comments (Attachments B1 and B2). No replies with the standard CDC response were sent because the commenters did not provide contact information.

A.8.2. Efforts to Consult Outside the Agency


A series of expert panels were held to inform the development, implementation, and evaluation of Dating Matters®. The following outlines the panels and their participants:


Implementation Panel--Communications (December 8-9, 2009)

Catherine Stayton, PhD, Director, Injury Epidemiology Unit, NYC Dept. of Health & Mental Hygiene

Julia Perilla, PhD, Assistant Research Professor, Georgia State University

Kristin Schubert, MPH, Program Officer, Robert Wood Johnson Foundation

Ivan Juzang, President/Founder, MEE Productions

Heather Luz McNaughton Reyes, PhD, MPH, Postdoctoral Researcher, Department of Health Behavior and Health Education, Gillings School of Global Public Health, University of North Carolina

Olis Simmons, Executive Director, Youth Uprising

Nneka Norville, Senior Public Affairs Manager, BET Networks

Lisa Witter, Chief Operating Officer, Fenton Communications


Implementation Panel--Policy (May 26-28, 2010)

Eve Birge, Education Program Specialist, U.S. Department of Education

Megan Foreman, Policy Specialist, National Conference of State Legislatures, Health Program

Deborah Gorman-Smith, Research Fellow, University of Chicago

Cheryl Grills, Professor and Chair of Psychology, Loyola Marymount University

Catherine Guerrero, Program Director, Colorado Department of Public Health and Environment

Carrie Mulford, Social Science Analyst, National Institute of Justice

Heather O'Beirne Kelly, Senior Legislative & Federal Affairs Officer, American Psychological Association

AJ Pearlman, State Policy Attorney, Break the Cycle

Brad Perry, Sexual Violence Prevention Coordinator, Virginia Sexual & Domestic Violence Action Alliance

Barri Rosenbluth, Expect Respect Program Director, Safe Place

Sally Schaeffer, Senior Public Policy Advocate, Family Violence Prevention Fund

David Wolfe, RBC Chair in Children's Mental Health, Centre for Addiction and Mental Health

Caroline Ledlie, Program Officer, Centers for Disease Control Foundation

Kristin Schubert, Program Officer, The Robert Wood Johnson Foundation

Elizabeth Zurich, Health Policy Lead, Centers for Disease Control and Prevention

Kathleen Rutherford, Senior Mediator, Meridian Institute

Mark Jacobs, Mediator, Meridian Institute


Evaluation Methodology Panel (October 13-14, 2010)

Laura Leviton, Robert Wood Johnson Foundation

Rhonda BeLue, Penn State Methodology Center

Michael Cleveland, Penn State Methodology Center

Pamela Orpinas, University of Georgia

Leslie Snyder, University of Connecticut

Martie Thompson, Clemson University

Jacqueline Lloyd, National Institute on Drug Abuse


Implementation Panel—Capacity/Readiness (December 13, 2010)

Barbara Blumenthal, PhD, Independent Consultant; Visiting Lecturer, Blumenthal Consulting, LLC

Abigail Fagan, PhD, Assistant Professor, University of South Carolina

Paul Flaspohler, PhD, Assistant Professor, Miami University

Catherine Guerrero, MPA, Rape Prevention and Education Program Manager, North Carolina Division of Public Health

Pamela Jumper-Thurman, PhD, Senior Research Scientist, Colorado State University

Wendi Siebold, Senior Research Associate, Evaluation, Management, and Training Associates


Implementation Panel--Adaptation (January 10, 2011)

Barbara Ball, PhD, Program Evaluation Specialist and Start Strong Austin Project Director, SafePlace

Paul Flaspohler, PhD, Assistant Professor, Miami University

Warren Passin, MPH, MSW, Project Manager, ICF Macro

Hank Tomlinson, PhD, Behavioral Scientist, Centers for Disease Control & Prevention

Donna-Marie Winn, PhD, Research Scientist, University of North Carolina at Chapel Hill, FPG Child Development Center


A.9. Explanation of Any Payment or Gift to Respondents


We propose continuing to provide small gifts to high school aged youth who participate in the follow-up survey. When youth matriculate into high school, they are no longer in the Dating Matters® participating middle schools, making follow-up potentially challenging. Therefore, to maintain robust response rates and data quality, for high school youth only, we were approved by OMB (“Evaluation of Dating Matters®: Strategies to Promote Healthy Teen Relationships,” OMB# 0920-0941, Expiration 5/30/2016) to provide (and have been providing) a nominal non-monetary gift to participants in an amount up to $15. We seek to continue this approach. The use of this non-monetary gift is critical to maintain a high response rate of this high-risk and highly mobile sample. Response rates for the follow-up survey have been consistently around 70%.


Theory (Blau, 1964; Homans, 1961) and experience (Dillman et al., 2009) dictate the provision of nominal gifts (Foster et al., 2010) to ensure adequate participation in the project without coercion. Our approach to gifts is also based on the contractor's decades of experience in survey research and the need to balance motivating respondents to participate without offering a coercive sum (i.e., a sum that a low-income individual would find difficult to refuse; Dillman et al., 2009). The evaluation contractor considered alternative approaches but selected a low-cost graduated gift approach as the most effective design, based on the literature and its experience.


A.10. Protection of the Privacy and Confidentiality of Information Provided by Respondents


This submission has been reviewed by the NCIPC IRB/OMB officer, who has determined that the Privacy Act does apply. The applicable System of Records Notice (SORN) is 09-20-0160, published in the Federal Register on November 24, 1986, Vol. 51, p. 42484. The Privacy Impact Assessment (PIA) is attached (Attachment H).


Because all data will be linked in some way, all data will be treated with the strictest security in order to protect the privacy of all participants. For the student outcome surveys, a Certificate of Confidentiality was obtained on June 5, 2012 in order to protect the confidentiality of the data from external requests or subpoenas. All data collection and data management staff will continue to be well trained in maintaining information security at all stages of the data collection and data management process. Protocols for data collection at schools and at other data collection sites will ensure that names, birthdates, and all other personally identifiable information are kept secure during all stages of data collection. Recruitment lists, consent forms, and all survey data will never be left unattended while data collectors are in the field, and all data will be kept in locked, secure facilities once safely delivered back to the contracting firm, where the data will be stored.

The first time data was collected, the respondents gave us personally identifiable information in the form of full names. In addition, in the parent consents for student participation, the parent was asked to provide his/her address, phone, and email. Attached to the survey will be a form to collect updated alternative contact information for people who will know how to contact the family if the contractor cannot contact them directly to facilitate follow-up at later iterations of the survey. Unique identification numbers were created for each consented student. The scannable survey forms for each participant will be marked only by the unique identifier code assigned to that individual. The first page of the survey will contain information (on a removable label) with the respondent’s name. This will allow staff to distribute surveys easily in classrooms. Once the survey is handed to the correct respondent, the respondent can tear off the removable sticker containing her/his name, so that from that point on, only the unique identifier code can be connected with the information provided in the survey. This process will occur at all administrations of the surveys. The ID-to-name crosswalk will only be available to a limited number of evaluation contractor staff (the contractor’s principal investigator and project manager). After being extracted, all identifiable school data will contain the unique identifier code in lieu of the respondent’s name. Programming and server set-up for the online surveys will follow strict guidance for online data security. Personally identifiable data will be immediately encrypted upon entry, and there will be no way for anyone else to link the survey data with names or other personally identifiable data.


Data will only be presented and analyzed in aggregate form. The Certificate of Confidentiality protects the individual participants from release of personal data, even to students’ parents who request the information. CDC might have to give information to DHHS if they needed to evaluate the overall study, but that is not likely. The only other time that CDC or the contractor might have to share information is when researchers, who are required to protect a child through mandated reporting laws, learn from a child that he or she is being hurt by an adult or planning to hurt him/herself or someone else.


All data will be stored in encrypted databases on password-secured data platforms. As mentioned previously, survey data will be linked only to a unique identifier code and kept in a separate database from personally identifiable data; a third database with extremely limited personnel access (only the CDC and contractor database managers will have access) will contain the information linking participants to their unique identifier codes. Identified data will initially be stored and maintained by the evaluation contractor but will be handed over to CDC on an annual basis. Only Linda Johnson and Alana Vivolo, CDC Data Managers (or their replacements in this role should they discontinue work with this project), will have access to identified data. The same information security protocols will be followed at both facilities. The contractor will be required to destroy all data six months after the end of their contract (currently March 30, 2019, although we expect this to change to March 30, 2020 pending approvals to collect data through May 2019). CDC will maintain the database containing personally identifiable data, the database linking participants' identities to their unique identifier codes, and the de-identified survey database until all analyses have been completed and it is decided that no further follow-ups with the sample will be conducted.

Current CDC IRB approval and consent forms are provided in Attachments C, F, and G. Please note that the only parental consent forms included in this Request for Revision are those for the Chicago high schools. Parental consent forms are not included for the other three sites; those sites have IRB approval to continue to follow the sample over time using the original consent form (approved as part of the original 0920-0941 package), as the original form gives consent for participation in a longitudinal study. At this point, all participants have been recruited into the study; therefore, this submission covers a period during which we are only following our current sample over time and are no longer consenting and recruiting participants. The exception is the Chicago high school form: the Chicago Public Schools RRB requires that we re-consent our sample once they reach high school if we want to continue to survey students during the school day.


Due to the scale of this data collection and the federal expenditure required to collect it, restricted-use datasets will be created from the outcome evaluation data. These restricted-use datasets will be completely de-identified, and any demographic information or other variables with endorsement so low that they might allow the identification of respondents will be removed from the dataset before publication.


All publications based on these data will present results in aggregate form. No respondent could be identified from the information provided to the public at the aggregate level.



A.11. Institutional Review Board (IRB) and Justification for Sensitive Questions


A.11.1 IRB Approval


The study protocol has been reviewed and approved by CDC IRB. Approval letter is attached (Attachment C).

A.11.2 Justification for Sensitive Questions


The student survey to be administered under this proposal includes sensitive questions. The primary outcome on which we expect Dating Matters® to have an impact, perpetration and victimization of dating violence behaviors, is a sensitive topic, and in order to measure impact on dating violence, we must ask students directly about their perpetration and victimization of dating violence. In addition, many of the other empirically supported risk factors that we expect may change as a result of exposure to the two models of prevention (e.g., substance use, risky sexual behaviors, attitudes toward dating violence, engagement in delinquent behaviors, school disciplinary problems) are also sensitive topics, and in order to measure program impact, we must ask questions directly about these topics. The consent forms disclose to parents and students that some of the survey questions may be sensitive in nature and that they do not have to answer any questions that they do not want to answer. We obtained a Certificate of Confidentiality that will further ensure the privacy and security of the respondents' answers to such questions. School counselors or other staff will be available during data collections to assist any respondents who feel upset or disturbed by any of the questions. We cannot evaluate the impact of the comprehensive Dating Matters® approach on TDV without asking sensitive questions about dating violence and other related behaviors.


A.12. Estimates of Annualized Burden Hours and Costs


Burden estimates were derived based on the number and nature of the questions, the administration methods (e.g., using scantrons, open-ended questions) and the age of the respondents. The number of respondents was based on the sampling plan and power analysis for the main hypotheses.


A.12.A. Estimated Annualized Burden Hours


The respondent burden has been estimated based on the number of respondents enrolled or otherwise involved in a given data collection effort (see sampling frames in SSB), the number of times each of these respondents needed to be contacted, and the estimated amount of time (expressed in hours or fractions thereof) required for a respondent to provide the requested information. This calculation of the total amount of time required of the respondents is then multiplied by an estimated hourly wage for the respondent population affected by the particular data collection instrument/process. The product of the total amount of time required and the applicable estimated hourly cost to each respondent yields an estimate of the total respondent cost across the data collection instruments/processes and the three-year data collection period of the project. The total estimated burden for this request is 3,299 hours per year (Table A.12).
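For reference, the annual burden-hour total in Table A.12 follows directly from the table's own figures; shown here as a worked computation:

\[
4{,}399 \text{ respondents} \times 1 \text{ response per respondent} \times 0.75 \text{ hours per response} = 3{,}299.25 \approx 3{,}299 \text{ hours per year}
\]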



Table A.12. Estimate of Annual Burden Hours1

Type of Respondent | Form Name | Number of Respondents | Number of Responses per Respondent | Average Burden per Response (Hours) | Total Burden (Hours)
Student Program Participant | Student Outcome Survey Follow-up – Att. D | 4,399 | 1 | 0.75 | 3,299
Total | | | | | 3,299





The hourly wage used to calculate respondent costs is based on professions of comparable experience, using the Department of Labor wage tables (www.dol.gov). The total respondent cost for this evaluation is $23,917.75 per year (Table A.12.B).
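The annualized respondent cost in Table A.12.B is the product of the total burden hours and the hourly wage rate:

\[
3{,}299 \text{ hours} \times \$7.25 \text{ per hour} = \$23{,}917.75 \text{ per year}
\]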



A.12.B. Estimated Annualized Burden Cost


Type of Respondent | Form Name | Number of Respondents | Number of Responses per Respondent | Average Burden per Response (Hours) | Total Burden (Hours) | Hourly Wage Rate | Total Respondent Cost
Student Program Participant | Student Outcome Survey Follow-up – Att. D | 4,399 | 1 | 0.75 | 3,299 | $7.25 | $23,917.75
Total | | | | | | | $23,917.75


A.13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers


Respondents will incur no capital or maintenance costs.

A.14. Annualized Cost to the Federal Government


LABOR | COST
Contract costs for evaluation (outcome evaluation) | $650,000 per year
Federal employee costs (OMB point of contact, PIs, and co-PIs): salaries for 3 federal employees @ $88,000/year | $264,000 per year
Total per year | $914,000 per year


A.15. Explanation for Program Changes or Adjustments


This revision request extends a currently OMB-approved data collection (OMB# 0920-0941). As noted above, the current request seeks to extend this existing approval through May 31, 2019 to continue collecting high school follow-up data on the remaining cohorts of middle school intervention participants. Because fewer students will be followed and only data from high school cohorts will be collected under this revision request, there is a reduction in the estimated number of participants from 10,692 to 4,399 per year and, as such, a reduction in the total burden hours and total costs. No changes have been made to survey instruments or any procedures included in the prior ICR.


A.16. Plans for Tabulation and Publication and Project Time Schedule


Tabulation and Analysis Plan: Outcome Evaluation

The data collection involves the evaluation of Dating Matters®; specifically, the collection of survey data from high school-aged youth until 2019. As previously noted, the evaluation utilizes a cluster randomized design in which 10-12 middle schools in each of four cities were randomized to either Dating Matters® or standard practice. Students are followed throughout middle school to assess short-term impact and throughout high school to assess longer-term impact.


The final analysis plan will be determined once preliminary analyses of the data indicate the most appropriate analytic approach. Intervention condition (Dating Matters® or standard practice) was randomly assigned at the school level, so all four sites contain both prevention models. Random assignment ensured relatively similar groups at baseline, but any initial differences between groups will be statistically controlled. We will analyze based on an intent-to-treat approach, but it is likely that we will also examine student data with respect to exposure to the relevant curriculum.

It is expected that analysis will include Hierarchical Linear Modeling (HLM) to test for intervention effectiveness, given that individuals are nested within schools, which are nested within sites. HLM provides a conceptual framework and a flexible set of analytic tools to address the special requirements of our data, which arise from a multi-stage sample with multiple sources (e.g., students, parents, schools). Classes are nested within schools, and variables may be defined at any level of this hierarchy. Nesting occurs when a unit of measurement is a subset of a larger unit and the units clustered in the larger unit may be correlated. Some of our variables will be measured at the school level (e.g., school size and location), others will be derived from the class level (e.g., grade level and treatment group), and others at the student level (e.g., survey results on behavior). In our models, student data will be included at level 1, and classroom and school data will be included at levels 2 and 3, respectively, with site location included as a covariate. For example, for one of our tests we would use the following level-2 fixed-effect equation:

\[ \beta_{0k} = \gamma_{00} + \gamma_{01}\,\mathrm{INTERVENTION}_{k} + \sum_{s}\gamma_{0s}W_{sk} + u_{0k} \]

in which \(\gamma_{01}\) represents the fixed effect of the intervention at the school level on the outcome \(\beta_{0k}\), \(W\) represents the s classroom-level confounding variables included for control purposes, and \(u_{0k}\) represents the level-2 classroom random effect. The coefficient \(\gamma_{01}\) represents the difference the intervention makes relative to the control group, by grade level. We will also estimate the reduction of the residual classroom effect unexplained by the Intervention predictor, to gauge the proportion of variation explained by the Intervention and to assess its impact. Our primary outcome for these analyses is participant experience of teen dating violence in both the middle and high school-aged years.
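For illustration only, the sketch below shows how a simplified two-level version of such a model (students nested within schools, with intervention status and site as fixed effects) might be fit in Python using the statsmodels package. The variable names (tdv_score, intervention, site, school) and the simulated data are hypothetical placeholders; this is not the evaluation's actual analysis code or dataset.

# Illustrative sketch only: a simplified multilevel (mixed-effects) model of the
# general form described above. Variable names and data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_schools, students_per_school = 46, 40

# Simulate students nested within schools; assign half of the schools to the
# intervention condition and give each school its own random intercept.
school = np.repeat(np.arange(n_schools), students_per_school)
intervention = (school % 2).astype(float)   # school-level treatment indicator
site = school % 4                            # four sites, entered as a covariate
school_effect = rng.normal(0.0, 0.5, n_schools)[school]
tdv_score = (2.0 - 0.3 * intervention + 0.1 * site
             + school_effect + rng.normal(0.0, 1.0, school.size))

df = pd.DataFrame({"tdv_score": tdv_score, "intervention": intervention,
                   "site": site, "school": school})

# Random intercept for school; intervention and site enter as fixed effects,
# mirroring the level-2 fixed-effect structure sketched in the text above.
model = smf.mixedlm("tdv_score ~ intervention + C(site)", data=df,
                    groups=df["school"])
result = model.fit()
print(result.summary())

A full three-level specification (students within classrooms within schools) could be expressed with additional variance components, but the two-level sketch is sufficient to show the general structure.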


Table A.16. Time Schedule

Activity: Administer outcome and implementation evaluation. Evaluation activities will continue until implementation is complete in 2016.
Time schedule: Ongoing, to begin immediately upon OMB approval.

Activity: Analyze evaluation results. Data will be analyzed annually to monitor effects, with the ultimate analysis (to address study hypotheses with sufficient power) initiated within 60 days of receiving the fourth year of evaluation data in 2016.
Time schedule: Ongoing, to begin immediately upon OMB approval.

Activity: Develop products and publications based on the results of the evaluation. Within a year of receiving the complete evaluation data (with four years of data collection), it is anticipated that the main publications examining the outcome and implementation of Dating Matters® will be submitted for publication. In addition to scientific publications, research-in-briefs and other non-technical reports of the evaluation results will be prepared and disseminated to key stakeholders.
Time schedule: Beginning one year after the evaluation analysis and ongoing thereafter.



A.17. Reason(s) Display of OMB Expiration Date is Inappropriate

Not applicable.


A.18. Exceptions to Certification for Paperwork Reduction Act Submissions

Not applicable.


References



Blau PM. Exchange and Power in Social Life. New York: John Wiley and Sons; 1964.


Dillman DA, Smyth JD, Christian LM. Internet, Mail and Mixed-Mode Surveys: The Tailored Design Method. 3rd ed. New Jersey: Wiley; 2009.


Foshee, V. A., Bauman, K. E., Arriaga, X. B., Helms, R. W., Koch, G. G., & Linder, G. F. (1998). An evaluation of Safe Dates, an adolescent violence prevention program. American Journal of Public Health, 88, 45-50.


Foster E, Frasier A, Morrison H, O'Connor K, Blumberg S. All Things Incentive: Exploring the Best Combination of Incentive Conditions. In: American Association for Public Opinion Research; May 14, 2010.


Homans GC. Social Behavior: Its Elementary Forms. New York: Harcourt, Brace and World; 1961.


Niolon, P. H., Vivolo-Kantor, A. M., Latzman, N. E., Valle, L. A., Kuoh, H., Burton, T., Taylor, B., & Tharp, A. T. (2015). Prevalence of teen dating violence and co-occurring risk factors among middle school youth in high risk urban communities. Journal of Adolescent Health, published online.


1 The column labeled Number of Respondents reflects a 70% response rate for the entire sample to be surveyed each year.
