HHS/CDC/NCIPC

SUPPORTING STATEMENT FOR

OMB INFORMATION COLLECTION REQUEST



OMB# 0920-0941



Part B


September 3, 2013


Evaluation of Dating Matters: Strategies to Promote Healthy Teen Relationships


Supported by:


Department of Health and Human Services

Centers for Disease Control and Prevention

National Center for Injury Prevention and Control

Division of Violence Prevention



Government Project Officers:


Point of Contact for OMB

Andra Tharp, PhD (Project Lead)

[email protected]

770-488-3936




Table of Contents


B. Collections of Information Employing Statistical Methods

1. Respondent Universe and Sampling Methods

2. Procedures for the Collection of Information

3. Methods to Maximize Response Rates and Deal with Non-Response

4. Test of Procedures or Methods to be Undertaken

5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

B. Collections of Information Employing Statistical Methods

B.1. Respondent Universe and Sampling Methods


Outcome Evaluation

Estimates of the respondent universe are based on the following information:

  • Number of communities/sites: 4

  • Number of schools across 4 communities/sites: 44 (12 in 3 communities, 8 in 1 community)

  • Number of students sampled in each middle school: 270 (3 classrooms per grade * 3 grades * 30 students); 270 students * 44 schools = 11,880 students

  • Number of school staff in each school: 40

  • Number of schools implementing the standard model of TDV prevention: 23 (across 4 sites/communities)

  • Number of schools implementing the comprehensive model of TDV prevention: 21 (across 4 sites/communities)

Design. Four implementation sites have been funded, and they have identified 44 schools in total to implement the two models of teen dating violence prevention. In consultation with each of the sites, the evaluation contractor and CDC have determined that a within-site simple random assignment of schools to one of the two prevention models is appropriate for three of the four funded communities: Alameda County, Broward County, and Chicago. The advantages of simple random assignment are that it is easy to implement, is easily understood by stakeholders, and yields data that can be analyzed straightforwardly. When only a small number of units is randomized, however, there is a concern that the groups might not end up as comparable as they would in designs that randomize a large number of units. This is of particular concern when the schools being randomized start out with large differences on important factors.


In the fourth funded community, Baltimore, two of the study schools cover grades 6-12, thus forming a separate block from the remaining middle schools, which cover grades 6-8. In Baltimore, therefore, blocking by school type is the most effective approach. For the Baltimore site, 11 schools entered the study in October 2011 (covering our first two randomization blocks) and one more school entered the study in April 2012 (our third randomization block). Block 1 has two schools (covering grades 6-12), Block 2 has ten schools (covering grades 6-8), and Block 3 has one school (covering grades 6-8, the school that entered the study in April 2012). We conducted our random assignment of schools to one of the two study conditions within each of these blocks, as illustrated in the sketch below.
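
For illustration, a minimal sketch of the two assignment procedures described above, assuming hypothetical school identifiers and a placeholder random seed; this is not the contractor's actual randomization code:

    import random

    def simple_random_assignment(schools, rng):
        # Shuffle a site's schools, then split them between the two models.
        shuffled = list(schools)
        rng.shuffle(shuffled)
        half = len(shuffled) // 2
        return {"comprehensive": shuffled[:half], "standard": shuffled[half:]}

    def blocked_random_assignment(blocks, rng):
        # Assign within each block, as in Baltimore's grade-span blocks.
        # An odd-sized block leaves its extra school in the standard arm.
        arms = {"comprehensive": [], "standard": []}
        for block in blocks:
            result = simple_random_assignment(block, rng)
            arms["comprehensive"] += result["comprehensive"]
            arms["standard"] += result["standard"]
        return arms

    rng = random.Random(20130903)  # placeholder seed, for reproducibility only
    # Hypothetical Baltimore blocks: two 6-12 schools, ten 6-8 schools, one late entrant.
    blocks = [["S01", "S02"],
              ["S03", "S04", "S05", "S06", "S07", "S08", "S09", "S10", "S11", "S12"],
              ["S13"]]
    print(blocked_random_assignment(blocks, rng))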


Population. The study population includes students in 6th, 7th, and 8th grades at 44 schools in the four participating sites. At most, schools are expected to have 6 classrooms per grade, with an average of 30 students per classroom, yielding a population of 23,760 students (44 schools * 3 grades * 6 classrooms per grade * 30 students per classroom). Please note that these youth will be recruited into the study in grades 6-8 but followed through grade 12.


The sampling frame for parents, given that we would only include one parent per student, is also 23,760 for the three years of data collection covered by this package. Based on our research and consultation with middle schools, most schools with 500-600 students have approximately 40 staff. If we assume 40 educators per school, the sampling frame for the educator sample is 1,760.


Students: The study will survey samples of classrooms from all three middle school grade levels in the 44 schools, annually over a 4-year data collection period (see Figure 2). (Please note that we recognize that our OMB approval will expire after 3 years; we will submit a new package at that time so that the full life of the project is approved.) In each year of data collection, we will recruit 30 students per classroom * a sample of 3 classrooms per grade * 3 grades * 44 schools, resulting in a student sample of 11,880. We assume a 95% participation rate (n = 11,286) for the baseline student survey (due to students being absent and parents not providing consent for student participation). Because this is a longitudinal data collection, the follow-up survey will lose some students to attrition (e.g., students are absent; students move out of the district; parents withdraw permission). At follow-up (at the end of the school year), we assume a retention rate of 90% of the 11,880 students (n = 10,692).


Figure 2: Population and Samples

ANNUAL STUDENT SAMPLE
  Annual student sample: 11,880
  Baseline sample, assuming 95% participation rate: 11,286
  Follow-up sample, assuming 90% retention rate: 10,692

ANNUAL PARENT SAMPLE
  Initial sampling frame (based on student sample): 11,880
  Random selection of 17% of parents of student sample: 2,020
  Baseline sample, assuming 95% participation rate: 1,919
  Follow-up sample, assuming 90% retention rate: 1,818

ANNUAL EDUCATOR SAMPLE
  Total number of participating schools: 44
  Number of educators per school: 40
  Initial sampling frame: 1,760
  Baseline sample, assuming 95% participation rate: 1,672
  Follow-up sample, assuming 90% participation rate: 1,584
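
The figures above follow directly from the stated design parameters. A minimal sketch reproducing the arithmetic, using only constants given in this section:

    SCHOOLS = 44
    STUDENTS = SCHOOLS * 3 * 3 * 30    # 3 classrooms x 3 grades x 30 students = 11,880
    PARENTS = round(STUDENTS * 0.17)   # 17% of student sample: 11,880 x 0.17 = 2,019.6 -> 2,020
    EDUCATORS = SCHOOLS * 40           # 40 educators per school = 1,760

    for label, frame in [("students", STUDENTS), ("parents", PARENTS), ("educators", EDUCATORS)]:
        baseline = round(frame * 0.95)   # 95% participation rate at baseline
        follow_up = round(frame * 0.90)  # 90% retention/participation rate at follow-up
        print(f"{label}: frame={frame:,} baseline={baseline:,} follow-up={follow_up:,}")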


Parents: We will recruit parents of 17% of the student sample (11,880), inclusive of parents participating in the parent curricula (2%) and those not receiving the parent curricula (15%), from both the Dating Matters schools and the standard schools. Although we will keep track of which parents stay in the sample in subsequent years, we will recruit the parent participants of the curricula in the intervention schools for that year only; if a parent participates when their child is in 6th grade but does not participate in the 7th grade parent curriculum the next year, we will not attempt to re-survey that parent. New parents may have to be recruited from the standard schools if the comprehensive parent samples change drastically from year to year. However, this will not change the total burden on parents in our eligible population, since dropouts will be replaced with the same number of replacement parent participants. We will recruit a sample of 17% of eligible parents per grade per school, for a total of 2,020 parents. We expect that 95% of the 2,020 parents will agree to participate at baseline (n = 1,919) and that 90% will participate in the follow-up survey (n = 1,818).


Educators: We will attempt to recruit all educators in each school (44 schools * 40 educators per school = 1,760), who are assumed to stay in their positions over the study period (in contrast to the cohorts of students moving through the school). We expect a 95% participation rate for an estimated sample of 1,672 educators at baseline and 90% participation rate at follow-up for an estimated sample of 1,584.


School data extractors: We will recruit one data extractor in each of the 44 schools to extract school data to be used in conjunction with the outcome data for the students. Individual-level school data will only be collected for students participating in the evaluation, so these data will reflect the same sampling frame as the student survey data. As a result, the data extractors in each school will access individual school-level data for those students in their school who consented and participated in the baseline student survey (3 * 4 * 30 * 95% = 342).


Implementation Evaluation

Focus Groups

The evaluation contractor will conduct both student and implementer focus groups.


For the student focus groups, the contractor will work with teachers and principals to determine how students are selected and grouped, resulting in groups of 10 students each. Two groups will be held at each of the 4 sites (10 x 2 x 4 = 80 total student participants), moderated in a uniform manner according to the student focus group guide (Attachment ZZ).


Student program implementer focus groups will be organized by site (moderated according to guidance in Attachments AAA and BBB), with two annual focus groups per site and 10 implementers in each group (10 x 2 x 4 = 80 total student program implementer participants).


Communications focus groups will be organized by site, with up to four groups of 6 youth per site (4 x 4 x 6 = 96 total student participants), moderated in a uniform manner according to the communications focus group guide (Attachment KKKK).


Parent program implementer focus groups will be organized by site (moderated according to guidance in Attachments AAA and BBB), with two annual focus groups per site and 10 implementers in each group (10 x 2 x 4 = 80 total parent program implementer participants).


All other instruments

Due to the exploratory nature of the implementation evaluation, sampling techniques will not be employed for the Brand Ambassador survey, Communications Coordinator Tracking Form, Student Program Master Trainer, Parent Program Manager, or Fidelity Session Logs. Key health department, community advisory board, and school leadership will participate in the capacity and readiness assessments.


Because the implementation evaluation is a crucial aspect of the program implementation and potential respondents represent a relatively small number of individuals, we will implement data collection methods to obtain a high rate of participation (described below). These respondents will comprise the target population, and entire sampling frame, for the purpose of the implementation evaluation. The specific number of respondents is outlined in detail below:


School leadership: Based on the predicted number of two school leaders (e.g., principal, vice principal) per comprehensive school (21 schools), the number of respondents will be 42.


Local Health Department representatives: Based on the predicted number of four communities/sites and four local health department representatives working on Dating Matters per community, the number of respondents will be 16. Local health department representatives will complete both the local health department capacity/readiness assessment and the short cost survey (Attachments SS and IIII).


Community Advisory Board representatives: Based on the predicted number of 20 community representatives in each of the 4 communities/sites, the number of respondents will be 80.


Parent Program Manager: With a maximum of one parent program manager per community/site, the number of program manager respondents will be 4. It is anticipated that each will receive up to 50 TA requests per year and complete the form 50 times.


Student Program Master Trainer TA Form: With a maximum of 3 master trainers per community, there will be 12 master trainers. It is anticipated that each will receive up to 50 TA requests per year and complete the form 50 times.


Parent Curricula Implementers: It is expected that each school implementing the comprehensive approach (n = 21) will have one male and one female parent implementing the parenting program. Therefore, we will have 42 parent program implementer respondents (2 parents x 21 schools). Please note that on the burden table the number of respondents is multiplied by the number of sessions in each parent program.


For example, the 6th grade program has 6 sessions, and 210 respondents (42 x 5) are listed.


The 7th grade program has 3 sessions, and 126 respondents (42 x 3) are listed.


The 8th grade parent curriculum is mailed to parents and, as such, does not involve implementers or session logs.


Student Curricula Implementers: Based on the predicted number of 20 student curricula implementers per grade per site completing fidelity instruments, the total number of respondents will be 80 per grade (20 x 4). Please note that on the burden table, the number of respondents is multiplied by the number of sessions in each student curriculum.


For example, the 6th grade curriculum has 6 sessions, so a total of 480 respondents are listed (80 x 6).


The 7th grade program has 7 sessions, so a total of 560 respondents are listed (80 x 7).


The 8th grade comprehensive program has 10 sessions, so a total of 800 respondents are listed (80 x 10).


The 8th grade standard program has 10 sessions, so a total of 800 respondents are listed (80 x 10).


Brand Ambassadors: The Brand Ambassador Implementation Survey will be provided to each brand ambassador in each community. With a maximum of 20 brand ambassadors per community, the feedback form will be collected from a total of 80 brand ambassadors.


Communications Implementers (“Brand Ambassador Coordinators”): The Communications Campaign Tracking form will be provided to each brand ambassador coordinator in each community. With a maximum of one brand ambassador coordinator per community (n = 4), the feedback form will be collected from a total of 4 brand ambassador coordinators.


Parent Program Participants: The 6th and 7th grade parent satisfaction questionnaires will be completed by parents participating in the parent program in each community. With a maximum of 18 parents per parent group and up to 5 groups per grade per comprehensive school, there is a maximum of 1,890 parent respondents (18 x 5 x 21) for the 6th grade satisfaction questionnaire and 1,890 for the 7th grade satisfaction questionnaire.


Our exclusion criteria, and the rationale for each, are as follows:

  • Exclusion criterion: Students and educators who cannot complete surveys in English will be excluded from the study.
    Rationale: We do not have the resources to conduct data collection in languages other than English. We assume that most students and educators who are attending U.S. schools will be able to complete the survey in English.

  • Exclusion criterion: Parents who cannot complete the surveys in English or Spanish will be excluded from the survey.
    Rationale: Because we anticipated that a number of parents would not be able to participate in the curriculum or the surveys unless they were offered in Spanish, we are translating our materials and planning to be ready to collect data in Spanish. However, we do not have the resources to translate into, or collect data in, other languages.

  • Exclusion criterion: Brand Ambassadors who cannot complete the survey in English will be excluded.
    Rationale: We do not have the resources to conduct the Brand Ambassador data collection in languages other than English. We assume that most students who are attending U.S. schools will be able to complete the survey in English.

  • Exclusion criterion: Parents participating in the parent curricula groups whose children are not in the student curricula groups will be excluded from the survey.
    Rationale: Although any parent can participate in the parent curricula, for the purpose of evaluating the impact of Dating Matters, it is unnecessary to collect data from parents whose child is not receiving the Dating Matters program.

  • Exclusion criterion: All other respondents who cannot complete the implementation focus groups in English will be excluded.
    Rationale: We do not have the resources to conduct focus groups in languages other than English. We assume that most individuals working in the types of agencies eligible to participate in the implementer focus groups will be able to participate in English.


B.2. Procedures for the Collection of Information


Outcome Evaluation:

Students: While some schools will have classrooms randomly sampled to participate in the evaluation surveys, other schools (with fewer students) will have all eligible students in the building complete the survey. All students in the participating classrooms will be approached to complete the survey following IRB-approved consenting/assenting guidelines. Via the parent letter and consent form, parents will be informed about the survey and the potential release of school data, including the topics covered by the survey. The contractor will work with each school to include an announcement about the survey in the school newsletter or other school publication. The exact notification mechanism and content of the letters/announcement will be determined by the contractor in consultation with each school in which either the comprehensive initiative or standard practice will be implemented. The notification letter will encourage parents to discuss the survey with their children and will provide contact information for a school staff member whom parents may contact to request a copy of the survey, ask questions about the survey, and/or consent to or refuse permission for their child to take part in the survey. Permission will also be sought to obtain the student's school data. Consent forms will be distributed several weeks before the administration of the first survey. Parents will be able to consent to or refuse their child's participation until the last business day before the start of data collection. Figure 3 (below) represents the student follow-up paradigm over the course of the study.




[Figure 3: Student follow-up paradigm over the course of the study; graphic not reproduced in this text version.]

Student assent will be obtained on the day of the baseline survey administration. The assent form (see Attachment TT) will be distributed to and read by the students and then verbally summarized by the survey proctor. Students will have the chance to ask questions before signing or deciding not to participate. Non-participating students will be directed to complete other work assigned by the teacher while assenting peers complete the surveys.


The proctoring of the student surveys will be coordinated and conducted by NORC field personnel. These staff will be trained regarding the survey instruments, the purpose and goals of the project, legal requirements, human subjects guidelines, survey administration protocols, data security protocols, and the coordination of school personnel involved in the administration. NORC will develop data collection training materials and conduct training for the data collection staff (at a central location in each site or, if it is more cost-efficient, at a single centralized training session for all data collection staff). NORC staff will employ quality control measures to ensure appropriate school survey administration. These measures will include supervisory review of checklists, random calls (by the contractor's project manager) to site personnel to confirm adherence to protocols, the availability of refresher training for survey staff, helpline (email/telephone) assistance for timely support of field personnel, scheduling standards for completion of classroom surveys in each school within a set window, and central review of survey packages delivered by field personnel. Field staff may make accommodations for students on a case-by-case basis if issues, such as low literacy, impede a youth's ability to complete the survey. Accommodations range from answering questions about the meaning of a word or item to reading survey questions aloud.


The baseline and follow-up assessments will be in paper-and-pencil format on scannable forms. Surveys will come with a pre-attached unique identifier number generated through a sequential process based on the students’ school and classroom. In addition, each survey will have a removable sticker with the student’s name affixed. This will allow staff to distribute surveys easily in classrooms. Students will be instructed to remove the name portion of the label before returning completed surveys to the staff. This process will occur at pre-test and at all post-tests. The ID-to-name code matrix will only be available to the research team and will be secured in the (contractor) project director’s office. The assessments will be administered during a class period or other school designated time during the school day. Students will read the items silently to themselves and answer the questions. When they are finished, they will insert the survey into a secure envelope being monitored by the data collector. Procedures for having students complete unfinished surveys or complete surveys if they were absent on the day of survey administration will be negotiated between the schools and the evaluation contractor. The follow-up survey will follow similar procedures to the baseline survey except that students will not need to be re-assented.
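
For illustration, a minimal sketch of one way the pre-attached, sequential identifiers described above could be generated; the ID layout (school-classroom-sequence) and field widths are assumptions, not the study's actual numbering scheme:

    def make_survey_ids(school, classroom, n_students=30):
        # Sequential IDs keyed to school and classroom, e.g. '03-12-0007'.
        return [f"{school:02d}-{classroom:02d}-{seq:04d}"
                for seq in range(1, n_students + 1)]

    ids = make_survey_ids(school=3, classroom=12)
    # The ID-to-name matrix pairing each ID with the student named on the
    # removable sticker would be stored separately and secured, as described above.
    print(ids[:3])  # ['03-12-0001', '03-12-0002', '03-12-0003']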


To assess possible attrition bias from participating students who drop out of the study, we will employ multiple contact methods and conduct non-response analyses, as sketched below. Additional efforts will be used to retain students in the sample who are lost to follow-up (e.g., absent for one or more surveys), and their survey responses will be compared to those of students regularly surveyed in their classroom settings. The main goal is to ensure that subsequent outcome analyses are adjusted for any identified biases in student participation.
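
A minimal sketch of one standard form of such a non-response analysis, assuming pandas and statsmodels are available and using a hypothetical student-level data frame with illustrative column names; the study's actual adjustment models are not specified here:

    import pandas as pd
    import statsmodels.api as sm

    def attrition_analysis(df: pd.DataFrame):
        # Regress a dropout indicator on baseline covariates; significant
        # coefficients flag characteristics associated with attrition that
        # later outcome models may need to adjust or weight for.
        covariates = sm.add_constant(df[["grade", "baseline_tdv_score"]])
        model = sm.Logit(df["dropped_out"], covariates).fit(disp=False)
        return model.params

    # Usage (hypothetical): attrition_analysis(baseline_df)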


Parents: The following material covers parent sample description, recruitment, survey administration, and response rates. The study involves recruiting three parent samples as follows:

  1. Parents participating in the parenting program (likely to be a small sample in each school);

  2. Parents in treatment schools but not participating in the parenting program; and

  3. Parents in comparison schools and thus not offered the parenting program.


We will recruit parents of 17% of the student sample (11,880), inclusive of parents participating in the parent curricula (2%) and those not receiving the parent curricula (15%), from both the Dating Matters schools and the standard schools. The 15% figure consists of parents of students in the comparison condition (where the parenting program is not available) and parents of students in the comprehensive condition who decline to participate in the parenting component.


Participants in the parent curricula (Group I) will be consented and will complete the baseline survey, where possible, before the first parent group and the receipt of any parts of the curriculum. In cases where advance notice of the timing of the parent session is available, NORC staff will coordinate planned site visits around these points to allow for in-person distribution and collection of the parent surveys. These surveys are designed to be self-administered, and respondents will be given the option to complete the survey on paper through the U.S. Mail or online. In some cases, it might not be possible to administer the surveys prior to the first parent session; these surveys would then be administered as soon after the first parenting session as possible. Follow-up surveys for parents participating in the Dating Matters parenting programs will be delivered in the spring via the same methods (paper copies in the mail or web surveys) offered to parents in Groups II and III of the full parent sample. Because the scheduling of the Dating Matters parenting programs in each site is not yet known, exact plans for delivering the baseline and follow-up surveys will be finalized in coordination with sites and schools. In the case of the 8th grade parent intervention (Families for Safe Dates), which is delivered through materials sent by mail, parents will be mailed consent forms and the survey before the other materials and asked to complete and mail back the survey before beginning the curriculum. Parents in Groups II and III will also be mailed an introductory letter with the opportunity to complete the survey online or to return a postage-paid postcard to request a hard copy of the survey, which they may complete on paper and return in a stamped, addressed envelope. Parents will be asked to participate in a follow-up survey at the end of the school year as well.


The administration of the parent surveys will be coordinated both by NORC field personnel and central staff, reflecting the different parent samples and survey modalities. For parents in Group I, who physically participate in the Dating Matters parenting programs, survey data will be collected to the extent possible onsite during one of the sessions (depending on the survey round/timing). As necessary in a given site, alternative means of surveying parents in Group I will be parallel to the aforementioned protocols for Groups II and III. As indicated above, for survey administration of the parent survey during Dating Matters parenting program sessions, NORC staff (where possible) will coordinate planned site visits around these points to allow for in-person distribution and collection of the parent surveys. Although the surveys will be distributed and collected by NORC field staff, they will still be self-administered by the parents.


To assess possible non-response bias from recruited parents refusing participation in the study, we will employ multiple contact methods and conduct non-response analyses. The main goal is to ensure that subsequent outcome analyses are adjusted for any identified biases in parent participation. The eventual analyses of the parent data will be strengthened by propensity score matching of parents receiving the parenting program to comparison school parents. Propensity score matching uses the predicted probability of membership in the treatment versus control group, based on observed predictors obtained from a regression, to create a counterfactual group. The matching would involve calculating the distance (e.g., Mahalanobis distance) between each parent participant who attended the parenting program and each comparison school parent who completed a survey, as sketched below. The nonparticipant with the minimum distance is chosen as the match for the treatment parent participant, and both cases are put into a separate analytic comparison pool. This approach will strengthen the evaluation of the delivery of the Dating Matters parenting programs.
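
A minimal sketch of the nearest-neighbor matching step described above, applying Mahalanobis distance to illustrative covariate arrays; a production analysis would add balance diagnostics, caliper rules, and handling of ties and matching with or without replacement:

    import numpy as np

    def mahalanobis_match(treated, controls):
        # Inverse covariance matrix estimated from the pooled sample.
        pooled = np.vstack([treated, controls])
        vi = np.linalg.inv(np.cov(pooled, rowvar=False))
        matches = []
        for t in treated:
            d = [np.sqrt((t - c) @ vi @ (t - c)) for c in controls]
            matches.append(int(np.argmin(d)))  # nearest control's index
        return matches

    # Illustrative covariates (e.g., parent age, an involvement score):
    treated = np.array([[35.0, 2.1], [42.0, 3.4]])
    controls = np.array([[33.0, 2.0], [50.0, 1.0], [41.0, 3.0]])
    print(mahalanobis_match(treated, controls))  # indices of matched controls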


Because some parents (and all educators, see below) will be contacted by email, NORC will train staff to ensure consistent and effective communications.


Educators: Educators will be recruited via email to participate in an online survey, as they will not be providing any personally identifiable information. NORC will collaborate with each school to obtain a list of Educators' school email addresses. The Educator surveys will be administered through a web link distributed to all Educators in the participating schools via their individual school email addresses. NORC will collaborate with site and school representatives to send Educators an email prompt and links from a centralized local source (e.g., someone in each school's administration, or one email from the Site Coordinator). Follow-up with non-respondent Educators will be conducted directly through email, with up to ten email follow-up requests for participation sent to the evaluation point of contact at each school. Each Educator will be emailed a unique PIN and password to log in to the survey, so that we will be able to determine whether the Educator has participated in the survey but will not know which answers are his or hers. The web mechanism will include a disclosure statement and informed consent before Educators can complete the respective surveys. The consent "form" will appear on the screen before the actual survey, and the Educator will imply consent by clicking to continue on to the survey. The web-based survey will request Educators to enter a unique password code, which will (1) allow Educators to save their entries and return to the survey if interrupted, and (2) allow NORC to ensure only one survey response per Educator. Password identifiers will be stripped for confidentiality, as the Educator survey component is a repeated cross-sectional design. Educators will be surveyed at the beginning of the school year, before any implementation has commenced, and at the end of each school year thereafter. Some participating Educators will also potentially be implementers of the curriculum, and this dual role will be documented in the survey.
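
A minimal sketch of issuing unique PINs of this kind, assuming a hypothetical roster of school email addresses; the PIN format and length are illustrative, and the production system would pair each PIN with a password as described above:

    import secrets

    def issue_pins(emails, length=8):
        # One random PIN per Educator; the alphabet omits look-alike
        # characters (0/O, 1/I/L) to reduce login errors.
        alphabet = "ABCDEFGHJKMNPQRSTUVWXYZ23456789"
        return {email: "".join(secrets.choice(alphabet) for _ in range(length))
                for email in emails}

    pins = issue_pins(["teacher@school.example"])  # hypothetical address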


School data extractors: School data extractors will be recruited by the evaluation contractor, in collaboration with onsite evaluation study coordinators. Depending on the preference and regulations regarding student data at each site, extractors will either be representatives of the contracting agency, local health department, or school personnel. One extractor per school will be identified.


School leadership: School leaders will be identified by grantees as representatives of schools participating in the Dating Matters initiative. Grantees will identify one school administrator or staff member who has sufficient knowledge of the school environment to be able to complete the school component of the capacity/readiness assessment. A representative of school leadership will likely be identified as the main point of contact at each school for the school implementation of the Dating Matters curricula or that point of contact’s designee. Completion of the capacity/readiness assessment is a program activity within the Dating Matters initiative and within the duties defined for staff working on the initiative.


Local Health Department representatives: The Local Health Department representative will be the staff member responsible for overseeing the Dating Matters initiative (i.e., project coordinator) and up to three other staff members identified by the project coordinator. Staff member participants will be individuals with sufficient experience at the Local Health Department to be able to complete questions about the organization. Completion of the capacity/readiness assessment is a program activity within the Dating Matters initiative and within the duties defined for one or more staff working on the initiative.


Community Advisory Board representatives. The local health department works closely with its community advisory board to conduct the Dating Matters initiative. Members of the community advisory board will be recruited to complete the capacity/readiness assessment. Completion of the capacity/readiness assessment is a program activity within the Dating Matters initiative and within the duties defined for one or more staff working on the initiative.


Parent Curricula Implementers: Parent curricula implementers will be recruited by the program manager and will be trained and certified to implement the curricula. Program managers will be given detailed instructions about the type of parents in the community who will be suitable as implementers. These instructions have been used across the world in implementations of Parents Matter! CDC and the TA contractor will work during the planning year to develop a detailed plan to recruit the implementers and administer the surveys.


Student Curricula Implementers (6th, 7th, 8th grade comprehensive and 8th grade standard): Student curricula implementers will be recruited by the local health department and schools. It is expected that in most sites, teachers in the schools implementing Dating Matters will implement the student curricula, but it is possible that some sites may opt to use community organizations to deliver the curricula. The TA contractor, program developers, and CDC will work with the sites to assist them in recruiting implementers. CDC and the TA contractor will work during the planning year to develop a detailed plan to recruit the implementers and administer the surveys.


Brand Ambassadors: Brand ambassadors will be recruited exclusively from the comprehensive school communities and selected through a competitive application process. The paper-and-pencil application will be made available at the comprehensive schools at the beginning of each academic year. Selections will be made by the Brand Ambassador Coordinators in each community. Once a brand ambassador has been selected, they will be required to obtain parental permission (form provided) and also sign a student assent form. Both forms outline guidelines for participation in the brand ambassador program as well as any evaluation activities.


Communications focus group participants: Communications focus group participants will be recruited by research facilities in the Dating Matters™ communities, and the groups will be led by a moderator with extensive experience working with youth. Traditional recruitment techniques (phone lists) will be used, supplemented by community-based outreach. In addition, some of the 15- to 18-year-old youth may be recruited from the Brand Ambassador program. Once recruited, participants will provide assent and their parents will provide consent.


Communications Coordinators: Communications implementers will be selected by the Health Department (or its communications designee) in each community. This staff position was identified in the Dating Matters initiative program funding opportunity and all sites have identified an individual who will serve in this role.


Student Program Master Trainers: Student program master trainers will be selected by the Health Department in each community and will receive training to train student program implementers in Safe Dates and the CDC-developed curricula. Master trainers will be the first line of TA for the program implementers.


Parent Program Managers. Parent program managers will be selected by the Health Department in each community and will receive training to oversee the parent program implementers. Parent program managers will be the first line of TA for the parent program implementers.


6th and 7th grade Parent Program Participants. Parent program participants will be recruited by the parent program implementers and program managers to participate in the parent programs. Parents will be recruited according to the eligibility criteria outlined in the parent programs materials and training. At the end of their participation in the program, parents will complete the satisfaction questionnaire.


Statistical Concerns and Power Estimations:

Outcome Evaluation:

The design of this study makes a great deal of practical sense. The comprehensive Dating Matters initiative employs interventions targeting multiple levels of the social ecology (student, parent, school, community) and is therefore more "expensive" from a resource perspective than the existing, widely disseminated, evidence-based student curriculum, Safe Dates. Comparing Dating Matters to Safe Dates will therefore allow us to see whether the extra investment of resources is "worth it" in terms of effectiveness. However, it raises one major statistical concern: because Safe Dates is likely to have an effect (it did in the original trial, although it has never been tested in this population), our study must be sufficiently powered to detect an improved effect of Dating Matters. As we are randomizing to condition at the school level, the school is the unit of analysis for our power analysis. We determined that we would need to randomize at least 40 schools in order to have sufficient power to detect effects on our main outcomes. The sites have collectively identified 44 schools. Because there is the potential for attrition at the school level, power analyses have been conducted on the minimum anticipated sample of 40 schools. Of course, if all 44 schools maintain participation, the outcome analysis will have greater power to detect an effect of the intervention. The principal assumptions informing the outcome evaluation power analyses are as follows:


  1. Using prior knowledge that the prevalence of dating violence is around 20%, we assumed control group proportions varying from 10% to 20%.

  2. We assumed that the intervention lowers dating violence; thus, the proportion of violence in the treatment group would be lower than that of the control group. We assumed treatment group proportions varying from 4% to 25%.

  3. We assumed a type I error of 5%. This is the significance level (alpha).

  4. We assumed that the plausible values for the true control group rate of dating violence lie between 8% and 50%. Constraining the control group proportions helps improve the accuracy of the required sample size estimate; narrowing the range of plausible control group proportions increases statistical power.

  5. We assumed a grade effect (J = 3); that is, the 6th, 7th, and 8th grades may differ. However, we assumed no classroom effect; the different classroom groups within the same grade are assumed not to differ significantly.

  6. We assumed a school effect (conservatively, K = 40).


Based on a power analysis for an RCT design, at the individual student level the study will have >90% power to detect differences as small as 9 percentage points between the intervention and comparison conditions.


Based on a power analysis for an RCT design, the parent analyses (assuming n = 1,818) will have acceptable statistical power (>80%) to detect differences as small as 9 percentage points between the intervention and comparison conditions (e.g., 20% compared to 11% for the prevalence of dating violence). As with the student power analysis, this power analysis was based on a three-level cluster randomized trial (3-level CRT) design in which students are nested within classes and classes are nested within schools. In fact, even if the parent sample drops as low as 1,680 parents, it will still provide an 80% power level. The issue of greater concern is obtaining a representative sample of parents and avoiding any bias introduced by non-response.
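
For illustration, a minimal sketch of a simpler two-proportion power calculation inflated by a cluster design effect, assuming scipy is available; the study's actual power analyses used the three-level model described above, and the ICC and cluster size shown here are illustrative assumptions, not study parameters:

    from scipy.stats import norm

    def n_per_arm(p1, p2, alpha=0.05, power=0.90, cluster_size=270, icc=0.01):
        # Standard two-proportion sample size, then inflated by a design
        # effect for the clustering of students within schools.
        z_a, z_b = norm.ppf(1 - alpha / 2), norm.ppf(power)
        n = (z_a + z_b) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2)) / (p1 - p2) ** 2
        deff = 1 + (cluster_size - 1) * icc  # design effect for clustering
        return n * deff

    # e.g., detecting a drop from 20% to 11% (a 9-percentage-point difference):
    print(round(n_per_arm(0.20, 0.11)))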


Implementation Evaluation:

A critical component of an independent evaluation is the collection of multiple forms of data that can serve both as inputs to an ongoing process evaluation and to the eventual outcome evaluation. The implementation evaluation will track and monitor what activities are being implemented, barriers to implementation, implementers' level of fidelity to program curricula, and the integrity of the random assignment process, in addition to capturing the cost of implementation, the nature of TA provided, and the capacity and readiness of the agencies implementing Dating Matters (local health departments, community advisory boards, and schools). In general, the implementation evaluation data will be used to describe implementation and to enhance program materials, training, and technical assistance for program improvement; no statistical analyses will be performed on these data. However, fidelity instruments will be taken into account in the analyses assessing the impact of Dating Matters, and the capacity and readiness assessments may be used to calculate a threshold of capacity needed to implement comprehensive prevention programs.


B.3. Methods to Maximize Response Rates and Deal with Nonresponse


The response rates expected for the student and parent samples are detailed in Figure 2: Population and Samples, above. In section B2, Procedures for Collection of Information, the methods that will be applied to achieve these response rates are described. These methods are based upon prior work carried out by the evaluation contractor.


The approach to ensuring the highest possible retention of the student sample begins with a survey instrument designed for a middle school population, and field protocols which have been tested in prior work with the target population. Proctor training will support informed responses to student concerns about participating. Administrative support for students absent on the day of surveys will be coordinated by school and site to ensure maximum participation of assenting students whose parents have waived consent.


The exact procedures that will be used to maximize cooperation and to achieve the desired high response rates will be determined in collaboration with the sites and the contractors. Because schools have committed to participate through Memorandums of Understanding in their health department’s application process, we are optimistic that they will be willing to work closely with us to maximize response rates. For students, procedures will be put in place to allow students who do not finish during the allotted data collection time to complete their surveys and to allow students who are absent to complete their surveys. For the hardcopy parent surveys, a pre-survey letter will also be mailed (see Attachment WWW). The respondents will be provided with a stamped, pre-addressed envelope in which to return their surveys. For the online version of the parent surveys, the consent “form” and survey data will be received through secure connections on the web. The contractor will utilize either email or postcards to send the sampled parents several follow-up reminders encouraging participation (see Attachment YYY). Educators will be sent reminder emails (see Attachment ZZZ) and encouraged by school administrators to complete the surveys.


As noted above, school commitment was obtained prior to sites applying for Dating Matters funding. Moreover, in their applications, grantees agreed to complete the capacity and readiness assessments and participate in the implementation evaluation as part of their programmatic activities. Response rates to the implementation evaluation instruments including the focus group guided discussions will be maximized by integrating the instruments into grantee activities and trainings and by employing both on-site program oversight (e.g., master trainers, parent program managers, communications coordinator) and off-site oversight (training and technical assistance provider) and support. The importance of participating in the implementation evaluation in order to improve program quality will be emphasized to implementers and strategies to increase efficiencies of information collection and decrease burden will be used, such as administering the capacity and readiness assessments electronically. Furthermore, information from the implementation evaluation will be used in a feedback loop to implementers. This information will help the TA provider determine whether program components are being delivered adequately, and which areas of program implementation may require subsequent or specific technical assistance. Feeding information back to the local health department and schools (e.g., report from capacity and readiness assessments) and to implementers from the implementation evaluation will increase the relevance of the evaluation for the grantees and increase participation. Finally, information about the integrity of the random assignment process is important to subsequent outcome analyses.


B.4. Test of Procedures or Methods to be Undertaken


All surveys and procedures will be reviewed and “walked through” by the evaluation contractor and the liaison at the health department. Survey instruments have been developed with the extensive input of expert consultants prior to awarding the evaluation contract, and through a secondary process of expert consultation under the evaluation contract. When possible, and in most cases, we have utilized pre-existing surveys that have been used in samples similar to ours. Sources of items are documented; all items used in the outcome instruments have been tested for reliability and validity. We worked with consultants within and outside of CDC to determine the appropriate length and reading level for the respondent population of the three surveys (middle school student survey; parent survey; educator survey), noting in particular the high-risk urban settings in which this study is being conducted by design. The student instruments are designed to be completed within one 45-minute classroom period. Student, parent, and educator surveys have been streamlined slightly since the original OMB submission to ensure that students, parents and educators can understand the questions and complete the survey in a reasonable amount of time.


All capacity/readiness assessments have been piloted with 4 local health departments by the contractor who developed the instrument.


The fidelity instruments and satisfaction questionnaires were adapted from instruments that have been used in past evaluations of the programs we utilize in Dating Matters (e.g., Safe Dates and Parents Matter!).


For the focus groups, we relied upon past CDC protocols and protocols used by other researchers studying primary prevention programs in the area of youth dating violence (Taylor, Stein, Woods, & Mumford, 2011). Also, given the organic nature of focus groups, the questions set forth in our submitted focus group guides should be considered exemplary. Focus groups are used as an exploratory research tool (Greenbaum, 1993; Vaughn, Schumm, & Sinagub, 1996) and involve the explicit use of group interaction and exploration to produce data and insights that would be less accessible without the interaction and exploratory probing found in a group (Morgan, 1988, p. 12). Given the exploratory approach inherent in focus groups, there could be some deviation from the focus group protocols included in this OMB submission (Greenbaum, 1993).


Other instruments, such as the cost questions and TA tracking forms, were developed specifically for use in Dating Matters in consultation with a health economist and the TA provider, respectively.

B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


All instruments and procedures have been reviewed extensively by CDC. The following individuals have worked closely in developing the instruments and procedures that will be used, and will be responsible for data analysis: Phyllis Holditch Niolon, Natasha Elkovitch Latzman, Linda Anne Valle, Andra Tharp, Dawn Fowler, Tessa Burton, Linda Johnson, Kimberly Friere, Craig Bryant, Frank Luo, and Kevin Vagi; and, from the evaluation contractor (NORC at the University of Chicago) team, Bruce Taylor, Elizabeth Mumford, Deborah Gorman-Smith, Michael Schoeny, Dorothy Espelage, Elizabeth Hair, and Pamela Loose. The expert panel on evaluation design (see Supporting Statement A) also consulted on statistical concerns with various evaluation designs. The expert panel on capacity and readiness (see Supporting Statement A) consulted on the development and use of the capacity and readiness assessments. The evaluation contractors are in the process of being identified, but offerors have extensive expertise in methodological design and data collection. CDC staff listed above will be responsible for analyzing the data.


References.

Greenbaum, T. L. (1993). The handbook for focus group research. New York: Macmillan.

Krueger, R. A. (1994). Focus groups: A practical guide for applied research. Thousand Oaks, CA: Sage.

Morgan, D. L. (1988). Focus groups as qualitative research. Newbury Park, CA: Sage.

Taylor, B. G., Stein, N., Woods, D., & Mumford, E. A. (2011). Dating violence prevention programs in New York City public middle schools: A multi-level experimental evaluation (Final report). Washington, DC: National Institute of Justice.

Vaughn, S., Schumm, J. S., & Sinagub, J. (1996). Focus group interviews in education and psychology. Thousand Oaks, CA: Sage.
