SUPPORTING STATEMENT: PART B
November 25, 2015
Evaluation of Dating Matters®: Strategies to Promote Healthy Teen Relationships
OMB# 0920-0941
Point of Contact
Sarah DeGue, PhD (Project Lead)
Centers for Disease Control and Prevention
National Center for Injury Prevention and Control
4770 Buford Highway NE MS F-64
Atlanta, GA 30341-3724
Phone: 770-488-3899
Email: [email protected]
Table of Contents
B. Collections of Information Employing Statistical Methods
1. Respondent Universe and Sampling Methods
2. Procedures for the Collection of Information
3. Methods to Maximize Response Rates and Deal with Non-Response
4. Test of Procedures or Methods to be Undertaken
5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Information
B. Collections of Information Employing Statistical Methods
B.1. Respondent Universe and Sampling Methods
Design. As described in the original protocol (OMB #0920-0941), CDC and an evaluation contractor (NORC at the University of Chicago) worked with CDC-funded local health departments to implement two models of a teen dating violence prevention program. Middle schools across four communities (Alameda County, CA; Baltimore, MD; Broward County, FL; and Chicago, IL) were randomized to either the standard-of-care approach (Safe Dates) or a comprehensive approach (Dating Matters®). In consultation with each of the sites, the evaluation contractor and CDC determined that a within-site simple random assignment of middle schools to one of these two prevention models was appropriate. In three of the four sites, all students enrolled in the middle schools were asked to complete a survey following appropriate parental consent and student assent procedures. In the fourth site, Broward County, a random sample of students within each school was recruited for participation, given the large school sample sizes. Students with parental consent were surveyed twice per year over the course of the intervention (September 2012-May 2016) and are now being followed and surveyed once per year (Spring 2014-Spring 2018) as they matriculate into high school.
Population. The original study population included students in 6th, 7th, and 8th grades at the middle schools in the four participating sites in the 2012-13 school year, plus the additional incoming 6th-grade classes in the 2013-14 and 2014-15 school years, for a total of five cohorts being surveyed longitudinally. The current sample for high school follow-up surveys consists of the students with parental consent who completed at least one survey in middle school (n=10,692). The data collection requested in this revision will continue to follow the students in the sample with follow-up surveys from the 2016-2017 school year through grade 12 or 2019, whichever comes first.
As part of the original study approved in OMB package #0920-0941, we surveyed five cohorts of middle school students (6th, 7th, and 8th grades in the 2012-13 school year, plus the additional incoming 6th-grade classes in the 2013-14 and 2014-15 school years) from schools in the four sites twice annually while the students were in middle school and once annually once they reached high school, over a 4-year data collection period (Fall 2012-Spring 2016). The figure below reflects the anticipated sample size for each of the data collection years covered in the current revision request. In the 2016-2017 year, the first four cohorts will be in high school. The first cohort (students who were in 8th grade in the first recruitment year) will then graduate, so in the 2017-2018 year the final four of the five cohorts will be in high school. In the 2018-2019 year, only the final three cohorts will be in high school and will be surveyed. Because of the difficulty of following students in high-risk communities across multiple years, and because we are no longer able to survey all students in the school setting, we anticipate a 70% retention rate for each year.
Figure 2: Annual Student Sample

| Data collection year | Anticipated sample |
| 2016-2017 high school sample (4 cohorts), assuming 70% participation rate | 4,399 |
| 2017-2018 high school sample (4 cohorts), assuming 70% retention rate | 4,629 |
| 2018-2019 high school sample (3 cohorts), assuming 70% retention rate | 3,583 |
| Final sample for 2016-2019, assuming 70% participation rate | 12,611 |
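The totals in Figure 2 can be checked with simple arithmetic. The short sketch below is illustrative only: the per-year figures are taken from Figure 2, and the implied eligible-student counts are back-calculated from the stated 70% rate rather than taken from the protocol.

```python
# Illustrative check of the Figure 2 projections (not part of the protocol).
# Per-year anticipated samples are taken from Figure 2; the "eligible" counts
# are back-calculated from the stated 70% participation/retention rate.
annual_samples = {"2016-2017": 4399, "2017-2018": 4629, "2018-2019": 3583}
rate = 0.70

for year, n in annual_samples.items():
    eligible = n / rate  # implied eligible students that year (illustrative)
    print(f"{year}: {n} anticipated respondents (~70% of {eligible:.0f} eligible)")

total = sum(annual_samples.values())
print(f"Final 2016-2019 sample: {total:,}")  # 12,611, matching Figure 2
```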
B.2. Procedures for the Collection of Information
As part of the original study approved in OMB package #0920-0941, middle school students were surveyed during the school day. Because many of the students in our sample across the four sites matriculated to many different high schools, we are able to survey some students during the school day in high school (pending each school’s permission), but we must also employ other survey methods, such as mail, online, phone, and in-person administration. These methods have been approved for high school data collection under the current approval (OMB package #0920-0941), and we are requesting to continue them in the current revision request.
In-school survey administration: NORC (the evaluation contractor) will identify schools with clusters of students from our sample that are large enough (e.g., more than 5 students in one school) to make attempting in-school administration of surveys a cost-effective approach. NORC will notify parents by letter that their student will be surveyed again as part of the Dating Matters® evaluation study, in which they previously consented for their child to participate, and will give parents the opportunity to opt their child out of the current and future surveys. Students will be surveyed in school in either paper-and-pencil or online format, depending on the preferences and resources available at the school. Student assent will be obtained on the day of survey administration (see Attachment TT). Students will have the chance to ask any questions before signing or deciding not to participate. Non-participating students will be directed to complete other work as assigned by the teacher or to return to their classroom while assenting peers complete the surveys.
For paper-and-pencil administration, each survey comes with a pre-attached unique identifier number assigned to the student at their first survey administration in the study. In addition, each survey has a removable sticker with the student’s name affixed, allowing staff to distribute surveys easily in classrooms. Students will be instructed to remove the name portion of the label before returning completed surveys to the staff. The ID-to-name code matrix will be available only to the research team and will be secured in the contractor’s project director’s office. The assessments will be administered during a class period or other school-designated time during the school day. Students will read the items silently to themselves and answer the questions. When they are finished, they will insert the survey into a secure envelope monitored by the data collector.
For online administration, each student will be given a unique username and password to link them to their survey on NORC’s secure server. Students will read the assent statement online and click a box to indicate their assent before continuing on to the survey. Students will have the same opportunity to ask questions of the proctor as those taking the survey in the paper and pencil format.
Mail administration: Before in-school administration begins, all students in the sample will be mailed a cover letter, an assent form, a paper-and-pencil survey bearing their unique identifier code, and a pre-addressed stamped envelope. The cover letter will remind them of the study and ask them to complete the survey either in paper-and-pencil format or online, using a unique username and password associated with their unique identifier. If they complete the paper-and-pencil format, they are asked to mail the signed assent form and completed survey back to NORC in the envelope provided.
Phone administration: For those students who do not complete a survey via mail, online, or in school, we will attempt to reach participants by phone. Interviewers will either assist the student in completing the survey online (if possible) or will verbally obtain assent and then read the questions to the student, recording their answers in a computer-based survey platform like the online survey completed by the students.
In-person administration: In the last phase of data collection each year, students who have not completed a survey through other means will be contacted via an in-person visit to their homes. Proctors will explain who they are to parents, obtain assent from the student, and administer the survey on a laptop, either allowing the student to complete the survey online or administering it verbally and keying in the answers themselves, depending on the student’s preferences and reading abilities.
The administration of student surveys will be coordinated and conducted by NORC field personnel. These staff have been trained on the survey instruments, the purpose and goals of the project, legal requirements, human subjects guidelines, survey administration protocols, data security protocols, and coordination with school personnel involved in the administration. NORC employs quality control measures to ensure appropriate school survey administration. These measures include supervisory review of checklists, random calls (by the contractor’s project manager) to site personnel to confirm adherence to protocols, the availability of refresher training for survey staff, helpline (email/telephone) support for timely assistance of field personnel, scheduling standards for completion of classroom surveys in each school within a set window, and central review of survey packages delivered by field personnel. Field staff may make accommodations for students on a case-by-case basis if issues, such as low literacy, impede a youth’s ability to complete the survey. Accommodations range from answering questions about the meaning of a word or item to reading survey questions aloud.
Statistical Concerns and Power Estimations:
Outcome Evaluation:
The comprehensive Dating Matters® initiative employs interventions targeting multiple levels of the social ecology (student, parent, school, community) and is therefore more “expensive” from a resource perspective than the existing, widely disseminated, evidence-based student curriculum, Safe Dates. Comparing Dating Matters® to Safe Dates will therefore allow us to see whether the extra investment of resources is “worth it” in terms of effectiveness. However, this comparison raises one major statistical concern: because Safe Dates is likely to have an effect (it did in the original trial, although it has never been tested in this population), our study must be sufficiently powered to detect an improved effect of Dating Matters®. Because we randomized to condition at the school level, the school is the unit of analysis used for our power analysis. We determined that we would need to randomize at least 40 schools in order to have sufficient power to detect effects on our main outcomes. Therefore, each of the four sites was asked to identify 10-12 schools that we then randomized to condition. Because there was the potential for attrition at the school level, power analyses were conducted on the minimum anticipated sample of 40 schools. Additionally, because attrition did occur, sites were asked to replace schools that dropped out in the first two years of implementation. Variation in time in study will be accounted for in analyses. The principal assumptions informing the outcome evaluation power analyses were as follows:
Using prior knowledge that the prevalence of dating violence is around 20%, we assumed control group proportions ranging from 10% to 20%.
We assumed that the intervention lowers dating violence; thus, the proportion of violence in the treatment group was assumed to be lower than that in the control group, with treatment group proportions ranging from 4% to 25%.
We assumed a Type I error rate of 5%; this is the significance level (alpha).
We assumed that plausible values for the true control group rate of dating violence lie between 8% and 50%. Constraining the control group proportions helps improve accuracy in estimating the required sample size, and narrowing the range of plausible values of the control group proportion increases statistical power.
We assumed a grade effect (J=3); that is, the 6th, 7th, and 8th grades vary. However, we assumed no classroom effect; the different classroom groups within the same grade are assumed not to differ significantly.
We assumed that there is a school effect (conservatively, K=40).
The power analysis was based on a three-level cluster randomized trial (3-level CRT) design in which students are nested within classrooms and classrooms are nested within schools. At the individual student level, the study will have >90% power to detect differences as small as 9 percentage points between the intervention and comparison conditions.
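To make the role of these assumptions concrete, the following sketch approximates the power calculation using a common simplification: a two-proportion comparison deflated by a design effect for school-level clustering. It is a minimal sketch, not the full three-level CRT model used for the study; the average cluster size and intraclass correlation (ICC) below are illustrative assumptions, not protocol values.

```python
# Minimal, illustrative power sketch for a cluster randomized comparison of
# two proportions. Assumes statsmodels is installed; the cluster size and ICC
# are hypothetical values chosen for illustration, not taken from the protocol.
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

p_control, p_treatment = 0.20, 0.11  # a 9-percentage-point difference
alpha = 0.05                         # Type I error rate from the assumptions
k_schools = 40                       # schools randomized (20 per condition)
m_students = 150                     # assumed average students per school
icc = 0.02                           # assumed intraclass correlation

# Deflate the per-arm sample size by the design effect for clustering.
design_effect = 1 + (m_students - 1) * icc
n_per_arm = (k_schools / 2) * m_students
n_effective = n_per_arm / design_effect

h = proportion_effectsize(p_control, p_treatment)  # Cohen's h
power = NormalIndPower().solve_power(effect_size=h, nobs1=n_effective,
                                     alpha=alpha, ratio=1.0)
print(f"Design effect: {design_effect:.2f}; effective n per arm: {n_effective:.0f}")
print(f"Approximate power: {power:.3f}")
```

Under these illustrative values the approximate power exceeds 0.99, consistent with the >90% power stated above; power falls as the assumed ICC or cluster size grows, which is why the constrained ranges in the assumptions matter.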
B.3. Methods to Maximize Response Rates and Deal with Non-Response
The response rates expected for the student samples are detailed in Figure 2, above. The methods that will be applied to achieve these response rates are described in section B.2, Procedures for the Collection of Information. These methods are based upon prior work carried out by the evaluation contractor (Taylor, Stein, Woods, & Mumford, 2011).
The approach to ensuring the highest possible retention of the student sample begins with a survey instrument designed for a high school population and field protocols that have been tested in prior work with the target population. Intensive proctor/interviewer training will support informed responses to student concerns about participating and help attain the highest possible participation rate. NORC field staff are consistent and persistent, without being pushy, when contacting students outside of school (by phone and in-person visits) to complete surveys, and have attained an overall 70% response rate in the first two years of high school data collection under the current approval. This response rate is high given the high-risk nature of these communities and their high level of mobility. Additionally, as currently approved for high school participants in OMB package #0920-0941, high school students are given a $15 gift card as compensation for their time spent completing the survey. This has proven to be effective compensation under the currently approved protocol. High school students typically finish the surveys once they start them, but if students completing in-school administration are unable to finish their surveys in the allotted time, NORC will work with the students and the schools to allow them to complete their surveys at another time.
B.4. Test of Procedures or Methods to be Undertaken
As detailed in the current approval (0920-0941), all surveys and procedures were developed with the extensive input of internal experts and external expert consultants and have been used and implemented without problems for several years under the current study. When possible, and in most cases, we have utilized pre-existing surveys that have been used in samples similar to ours. Sources of items are documented, and all items used in the outcome instruments have been tested for reliability and validity. We worked with consultants within and outside of CDC to determine the appropriate length and reading level for the respondent population, noting in particular the high-risk urban settings in which this study is being conducted by design. The student survey was designed to be completed within 45 minutes.
B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Information
All instruments and procedures have been reviewed extensively by CDC. The following individuals have worked closely in developing the instruments and procedures that will be used and will be responsible for data collection and/or analysis:
CDC, National Center for Injury Prevention and Control, Division of Violence Prevention
Phyllis Holditch Niolon
Natasha Elkovitch Latzman
Sarah DeGue
Andra Teten Tharp
Alana Vivolo-Kantor
Tessa Burton
Linda Johnson
Craig Bryant
Frank Luo
Henrietta Kuoh
Kevin Vagi
2M (data analysis/statistical contractor)
Sharon Ghazarian
Todd Little
NORC at the University of Chicago (evaluation/data collection contractor)
Bruce Taylor
Elizabeth Mumford
Michael Schoeny
Shannon Nelson
The expert panel on evaluation design (see Supporting Statement A) also consulted on statistical concerns with various evaluation designs. CDC staff listed above will be responsible for leading data analysis in consultation and collaboration with the 2M contractors listed.
Reference
Taylor, B. G., Stein, N., Woods, D., & Mumford, E. A. (2011). Dating violence prevention programs in New York City public middle schools: A multi-level experimental evaluation. Final report. National Institute of Justice. Washington, DC: Government Printing Office.