OMB Control No. 0937-0207


Supporting Statement B for the
Office of the Assistant Secretary for Health
SMARTool Pilot Replication Project










Submitted to

Office of Management and Budget
Office of Information and Regulatory Affairs




Submitted by

Department of Health and Human Services
Office of the Assistant Secretary for Health






June 2019















B1. Respondent Universe and Sampling Methods

The project includes two Implementing Organizations (IOs) delivering two different SRA programs: one IO is delivering its SRA program in 4 schools, and the second is implementing its program in 10 schools. The IOs were chosen in part for their significant experience implementing SMARTool-aligned SRA curricula, and in part because they met the evaluation's other requirements: a large enough sample of students, the feasibility of a within-school comparison group, and SRA programs that serve 9th or 10th graders and run less than 4 months. The respondent universe for the youth proximal evaluation includes intervention and comparison group youth in the 14 schools across the two IOs. Youth must be in the 9th or 10th grade to be eligible to participate in the program and the evaluation.

  • Parental consent/youth assent. In each of the two IOs, we expect to have approximately 600 youth enrolled in the intervention and 600 youth in the comparison group (1,200 per IO), for a total of 2,400 youth across both IOs.

We expect that 50-70% of those youth will receive parental consent to participate in the evaluation (see Exhibit 1). We base these assumptions on consent rates for previous teen pregnancy prevention evaluations, including the evaluation of It’s Your Game… Keep It Real in South Carolina (Coyle, Potter, Glassman, McDade-Montez, & Unti, 2015), the evaluation of It’s Your Game… Keep It Real in Houston, Texas (Coyle, Anderson, Laris, Unti, Franks, & Glassman, 2016), and the evaluation of Healthy Futures (Calise, Chow, Doré, & JSI Research & Training Institute, Inc., 2015). Parental consent rates for these evaluations ranged from 57% to 73%. For our burden estimates in Supporting Statement Part A, we assume 70%, which is on the high end of this range, to avoid underestimating respondent burden. For our power calculations below, we assume 50% consent, which is on the low end of this range, to avoid overstating statistical power.

  • Classroom rosters. Each of the 14 schools participating in the study will need to provide a roster and administrative data for eligible youth.

  • Baseline and follow-up youth surveys. All youth who receive parental consent will be invited to participate in the baseline and follow-up youth surveys. We expect that 95% of the youth with parental consent will participate in the baseline survey and 95% in the follow-up survey. These assumptions are based on RTI International's experience with similar studies, such as School Safety and School-Based Mental Health Services in a Large Metropolitan School District and the Shelby School Safety Study (both funded by the National Institute of Justice). Thus, we expect to obtain 1,596 responses to the baseline survey and 1,596 responses to the follow-up survey (the sketch following this list reproduces the expected counts shown in Exhibit 1).

Data for the process evaluation will come from two surveys, key informant interviews, an attendance form, and a session log. The respondent universes are as follows:

  • Participant survey. The respondent universe will be all youth receiving the intervention. We anticipate that half of the 1,680 youth who receive parental consent to participate, or 840, will be in the intervention group. We anticipate that 90% of these youth will participate in the process evaluation participant survey. All members of the respondent universe whose parents have consented and who themselves have assented will be asked to complete the survey; there will be no sampling. Thus, we expect to obtain 756 responses to the participant survey.

  • Facilitator survey. The respondent universe comprises all facilitators who deliver the SMARTool-aligned curricula to the youth. We anticipate that approximately 4 facilitators will be involved with the intervention per IO, for a total of 8 facilitators. All members of the respondent universe will be asked to complete the survey; there will be no sampling. Because all facilitators will be staff employed by the IOs, we expect the response rate to be 100%. Thus, we expect to obtain 8 responses to the facilitator survey.

  • Facilitator key informant interviews. The respondent universe consists of all 8 facilitators involved in the intervention. A sample of 3 facilitators will be selected from each IO for the interviews, for a total of 6 facilitators. Facilitators will be purposively selected to ensure representation of the sites’ sociodemographic and geographic characteristics. These characteristics are defined by population density (rural, suburban, urban), racial and ethnic profiles, and socioeconomic status (e.g., the percentage of youth in the school lunch program). Because all facilitators will be staff employed by the IOs, we expect the response rate to be 100%. Thus, we expect to obtain 6 responses to the facilitator key informant interviews.

  • School representative key informant interviews. The respondent universe consists of administrators or managers of all the schools where the interventions take place who have responsibility for overseeing a school's involvement in the intervention. The school representatives are expected to be a school principal, vice principal, or school liaison who has been approved by the school principal to help facilitate logistics for the data collection. A sample of 2 schools will be purposively selected from each IO to ensure that a variety of schools is represented, for a total of 4 school representatives across both IOs. We anticipate that all school representatives will agree to participate in the interview because of their role in the intervention; if any are unable to participate for any reason, we will select representatives from back-up schools. Thus, we expect to obtain 4 responses to the school representative key informant interviews.

  • Facilitator session log. The respondent universe for the session logs consists of all 8 facilitators participating in the intervention. We anticipate that each facilitator will teach a total of 75 sessions as part of the intervention, for an estimated total of 600 sessions across all IOs. Because all facilitators will be staff employed by their IO, we expect the response rate to be 100%. We thus expect to obtain 600 session logs.

  • Attendance form. Depending on the school, attendance data may be obtainable from a centralized school system before the end of the school year. Otherwise, we would ask school staff (e.g., classroom teachers) or program facilitators to complete an attendance form for each session offered. To be conservative, our burden estimate assumes that all attendance data would be provided by school staff. We expect the response rate to be 100%. We expect to obtain one attendance form per session, or 600 forms.
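The counts in Exhibit 1 below follow directly from these respondent universes and response rates. The following is a minimal cross-check sketch (Python); the instrument names, universes, and rates are taken from the bullets above, and per-IO counts assume an even split between the two IOs.

```python
# A minimal cross-check of Exhibit 1: expected completed instruments =
# respondent universe x expected response rate, using the figures stated
# in the bullets above. Per-IO counts assume an even split across 2 IOs.
universes_and_rates = {
    "Parental consent form": (2_400, 0.70),  # 0.50 at the low end
    "Classroom roster report": (14, 1.00),
    "Outcome baseline survey": (1_680, 0.95),
    "Outcome follow-up survey": (1_680, 0.95),
    "Facilitator session logs": (600, 1.00),
    "Facilitator survey": (8, 1.00),
    "Facilitator key informant interviews": (6, 1.00),
    "School representative key informant interviews": (4, 1.00),
    "Participant survey": (840, 0.90),
    "Attendance forms": (600, 1.00),
}

for instrument, (universe, rate) in universes_and_rates.items():
    total = round(universe * rate)
    print(f"{instrument}: {total} total; {total // 2} per IO")
```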

Exhibit 1. Numbers of Respondents

Instrument | Respondent Universe | Expected Response Rate | Total Expected Completed Instruments | Expected Completed Instruments per IO (N = 2 IOs)
Parental consent form (to participate in the evaluation) | 2,400 | 50-70% | 1,200-1,680 | 600-840
Classroom roster report | 14 | 100% | 14 | 7
Outcome baseline survey | 1,680 | 95% | 1,596 | 798
Outcome follow-up survey | 1,680 | 95% | 1,596 | 798
Facilitator session logs | 600 | 100% | 600 | 300
Facilitator survey | 8 | 100% | 8 | 4
Facilitator key informant interviews | 6 | 100% | 6 | 3
School representative key informant interviews | 4 | 100% | 4 | 2
Participant survey | 840 | 90% | 756 | 378
Attendance forms | 600 | 100% | 600 | 300



Sample Size Estimates

We developed our sample size requirements for each of our IOs so that the planned statistical tests for the evaluation will have enough power to detect statistically meaningful differences in outcomes between treatment and comparison group members if those differences do in fact exist. Because each IO is implementing a different SRA program, the study is powered to measure the effect of each program separately.


We selected a sample size that could detect impacts within each of our two IOs comparable to those seen in other evaluations of pregnancy prevention programs for adolescents. Our sample size estimates were based on a standardized "effect size," computed by dividing the intervention's impact by the standard deviation of the outcome.


Sample requirements were estimated using an ANCOVA, or regressor, model approach (Allison, 1990) and Excel workbooks developed by the International Initiative for Impact Evaluation (3ie; Djimeu & Houndolo, 2016). The workbooks incorporate well-established formulas and methods for power calculation in simple and multilevel (clustered) research designs with both categorical and continuous outcome variables (Hayes & Bennett, 1999; Raudenbush, Liu, Congdon, & Martinez, 2011). Power calculations focused on determining the minimal detectable effect size (MDES; Bloom, 1995) that this model could find statistically significant with 0.80 power in a two-tailed test of the difference between treatment group means at follow-up, controlling for baseline values.

The power estimates required several assumptions. First, based on previous SRA-related research at the classroom level (Scull, Kupersmidt, Malik, & Morgan-Lopez, in press), an intraclass correlation (ICC) of 0.03 was used. An ICC of this magnitude is not large, but it is sufficient to bias results if not accounted for in the sample design and analysis. Second, a baseline-to-follow-up correlation of 0.6 was assumed. A scan of test-retest reliability for behaviors, knowledge, and attitudes related to sexual activity in school-age samples yielded values from about 0.25 to 1.0. In general, the correlation was higher for behaviors such as vaginal intercourse, substance use, and parent-related measures such as communication, and correlations were consistent across varied samples within and outside the United States. The assumed value of 0.6 falls midway in the range of reported values, making it a reasonable representative test-retest coefficient; it implies a level 1 covariate R² of 0.36 (0.6²). Class, the level of assignment and the level 2 unit for multilevel analysis, varies in size within schools but was assumed to have 20 students for the power calculation. This is expected to be a floor estimate, with few classes smaller than that value and many larger.
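To make the calculation concrete, the following is a minimal sketch, not the 3ie workbook itself: it applies the standard two-level MDES formula for cluster-level assignment (Bloom, 1995; Hayes & Bennett, 1999) under the assumptions above, plus an assumed 50/50 treatment split, no level 2 covariate, and a normal-approximation multiplier. Class counts and the 600-youth analytic samples are taken from Exhibit 2 and the surrounding text.

```python
# A minimal sketch of the MDES calculation under the stated assumptions
# (ICC = 0.03, level-1 covariate R^2 = 0.36). The 50/50 treatment split
# and normal-approximation multiplier are assumptions, not study inputs.
from scipy.stats import norm

def mdes(n_classes: int, n_students: int, icc: float = 0.03,
         r2_l1: float = 0.36, p_treat: float = 0.5,
         alpha: float = 0.05, power: float = 0.80) -> float:
    """MDES for a two-level design: classes assigned to condition,
    students nested in classes, baseline covariate at level 1."""
    m = norm.ppf(1 - alpha / 2) + norm.ppf(power)  # ~2.80
    n_per_class = n_students / n_classes
    pq = p_treat * (1 - p_treat)
    var = (icc / (pq * n_classes)
           + (1 - icc) * (1 - r2_l1) / (pq * n_classes * n_per_class))
    return m * var ** 0.5

print(f"IO A: {mdes(17, 600):.2f}")  # ~0.30, matching Exhibit 2
print(f"IO B: {mdes(56, 600):.2f}")  # ~0.22, near Exhibit 2's 0.20
```

Under these assumptions the approximation reproduces the 0.30 MDES for IO A and yields roughly 0.22 for IO B, close to the 0.20 reported in Exhibit 2; the exact inputs used in the study's workbook may differ.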



Exhibit 2 shows the minimal detectable effect size (MDES), the smallest effect that a given sample will have adequate power to detect, for each of the IOs. The estimated analytic sample size for each of the two IOs (600 youth) is a conservative estimate of the number of students who are enrolled in the classrooms, receive parental consent, and complete the baseline and follow-up youth surveys.

The component of the study that examines the impact of the REAL Essentials curriculum in IO A will have a larger MDES because of its smaller number of classrooms and larger class sizes. The component that examines the impact of the Represent® curriculum in IO B will have a smaller MDES because of its larger number of classrooms and smaller class sizes.



Exhibit 2. Estimated Sample Sizes and Minimal Detectable Effect Sizes for Each of the IOs

SRA Program | Expected No. of Schools | Estimated No. of Classrooms | Estimated No. of Students | MDES (in SD units): Attitudes/Beliefs
IO A (REAL Essentials) | 4 | 17-20 classrooms (30-35 students per classroom) | 1,200 | 0.30
IO B (Represent®) | 10 | 56-60 classrooms (20-25 students per classroom) | 1,200 | 0.20



The MDES of 0.30 in IO A (REAL Essentials curriculum) is consistent with changes in attitudes and beliefs observed in several studies, although it is larger than the average effect size. The MDES of 0.20 in IO B (Represent curriculum) is similar to the average effect sizes observed in previous studies for both knowledge/attitude/belief and behavioral outcomes. The studies examined included:

  • Scull, T. M., Kupersmidt, J. B., Malik, C. V., & Morgan-Lopez, A. A. (in press). Media literacy education to promote adolescent sexual health: A short-term randomized control trial of Media Aware, a comprehensive sexual health program for middle school students. Journal of Health Communication.

  • Lieberman, L., & Su, H. (2012). Impact of the choosing the best program in communities committed to abstinence education. SAGE Open. https://doi.org/10.1177/2158244012442938

  • Borawski, E. A., Trapl, E. S., Lovegreen, L. D., Colabianchi, N., & Block, T. (2005). Effectiveness of abstinence-only intervention in middle school teens. American Journal of Health Behavior, 29, 423–434.

  • Clark, M. A., & Devaney, B. (2006). First-year impacts of the Heritage Keepers® Life Skills Education Component. Princeton, NJ: Mathematica Policy Research, Inc.

  • Clark, M. A., Trenholm, C., Devaney, B., Wheeler, J., & Quay, L. (2007). Impacts of the Heritage Keepers® Life Skills Education Component. Final report. Princeton, NJ: Mathematica Policy Research, Inc.

Notably, demographic information for each school is available from the National Center for Education Statistics (NCES). Exhibit 3 provides demographic information for the schools that will be included in this study.

Exhibit 3. Demographic Information on Schools Included in Study

School* | Male | Female | Am. Indian/Alaska Native | Asian | Black | Hispanic | Hawaiian/Pacific Islander | White | Two or More Races | Free Lunch | Reduced Lunch

Schools in IO A (REAL Essentials curriculum)
School 1 | 52% | 48% | <1% | <1% | 68% | 23% | <1% | 6% | 2% | 77% | 7%
School 2 | 47% | 53% | <1% | 1% | 90% | 5% | <1% | 3% | 1% | 78% | 5%
School 3 | 55% | 45% | 1% | 4% | 26% | 45% | <1% | 21% | 2% | 66% | 8%

Schools in IO B (Represent curriculum)
School 1 | 49% | 51% | <1% | <1% | 85% | 3% | <1% | 8% | 4% | 55% | 10%
School 2 | 52% | 48% | <1% | 3% | 5% | 5% | <1% | 78% | 8% | 25% | 5%
School 3 | 54% | 46% | <1% | <1% | 31% | 26% | <1% | 36% | 7% | 67% | 4%
School 4 | 52% | 48% | <1% | 1% | 3% | 1% | <1% | 91% | 3% | 28% | 6%
School 5 | 48% | 52% | <1% | <1% | <1% | <1% | <1% | 98% | 1% | 44% | 17%
School 6 | 52% | 48% | <1% | 1% | 5% | 2% | <1% | 89% | 3% | ** | **
School 7 | 50% | 50% | <1% | 1% | 17% | 2% | 1% | 74% | 4% | 25% | 8%
School 8 | 49% | 51% | <1% | 1% | <1% | <1% | <1% | 98% | 1% | 28% | 9%
School 9 | 53% | 47% | <1% | <1% | 1% | <1% | <1% | 97% | 1% | 65% | 29%
School 10 | 47% | 53% | <1% | 5% | 6% | 19% | <1% | 64% | 6% | 52% | 11%

* Data source: National Center for Education Statistics, school directory data (2017-18) and school demographic data (2016-17).

** Missing data (unavailable in the NCES 2016-17 data and in the FY 2019 State B Department of Education database).





B2. Procedures for the Collection of Information

Parental Consent and Youth Assent

The study team will provide each participating school with parental consent forms to be distributed by the classroom teacher or other school staff. The consent forms will be double-sided and will be tailored to the schools served by each IO; the parental consent form is provided in Appendix A. Approximately 4 weeks before the baseline survey date, the youth's classroom teacher or other school staff will give each youth a parental consent form to take home. The form explains the evaluation and asks the parent to sign the form and have the youth return it to the classroom teacher or other school staff. To improve response rates, the study team will work with the IOs and their partner schools to include parental consent forms in packets sent home to parents at the beginning of the school year (or over the summer). Teachers or school staff will also distribute parental consent forms to participating youth to take home, or to parents during orientation or other in-person events such as parent meetings.

The consent form will ask parents to consent to the baseline and follow-up surveys, so the forms will need to be distributed only once, before the baseline survey.

Youth will provide written assent to participate in the evaluation. The youth assent form is provided in Appendix B. The study staff will distribute the assent form to the youth who have parental consent before the baseline and follow-up surveys. The youth will read along while the study staff reads aloud. Any youth who do not assent to participate will be given an alternative task that the teacher or other school staff has identified for that class ahead of time.

Before the follow-up survey and the process evaluation participant survey, data collectors will review the portions of the assent that are relevant to those surveys and provide youth the opportunity to opt out if they do not want to participate.

Youth Baseline and Follow-up Outcome Surveys

The youth baseline and follow-up outcome surveys will be in paper-and-pencil form and will be administered in school settings (e.g., classroom, library, cafeteria) during the school day. The questionnaires, along with an introduction, are provided as Appendices I and J. The youth baseline and follow-up outcome surveys are designed to take 30 minutes to complete. The baseline survey will be administered immediately (within a week) before the intervention begins, and the follow-up survey will be administered immediately (within a week) after the intervention is complete.

As part of planning for data collection, a staff person at each school will be identified to serve as school liaison and assist the study team with logistics. Study team field staff will be hired and trained by RTI International (RTI) to work locally in assigned schools and will be supervised by an RTI study team site coordinator at RTI’s headquarters. The study team field staff will administer the baseline and follow-up surveys and will coordinate with IO staff and the site liaisons to schedule the times for survey administration. The study team field staff will follow all security protocols for entering the school and will set up the survey location. The study team field staff will provide instructions to the youth for completing the survey, including steps to take once they have completed the survey (e.g., placing the questionnaires in the envelopes provided and leaving them on their desks).

Rosters/Administrative Data Files

The RTI study team site coordinator will request rosters/administrative data from the school office on all youth targeted to receive the program in intervention classrooms and all youth in comparison classrooms. The site coordinator will consult with each school principal, site liaison, or other school staff on the best way to obtain the needed lists of youth and associated demographic data. We will request these data from schools prior to the beginning of the fall 2019 semester to prepare parental consent forms for distribution. The rosters will include student names, ID numbers, date of birth, grade, sex, race, ethnicity, free and reduced-price lunch (FRPL) status (if feasible), English language learner status (if feasible), homeroom teacher (if feasible), Individualized Education Plan status (if feasible), and grade-point average (GPA) (if feasible). At the end of the school year, we will request updated rosters and administrative data on these same variables at the student level. School administrators or school staff will upload the rosters and administrative data files to a secure file transfer protocol (SFTP) environment. Schools or districts may charge a small fee for the rosters/administrative data, especially if data files require additional filtering by classrooms, grades, or IDs. Instructions and an example template for schools are provided as Appendix C.

Participant Survey

The process evaluation participant survey will be appended to the youth follow-up outcome survey for all youth in the intervention group. The questionnaire is provided as Appendix F. The survey will be administered with a paper-and-pencil questionnaire that is designed to take about 10 minutes to complete. A study team field staff member will coordinate with IO staff to schedule the times for survey administration.

Study team field staff will follow all security protocols for entering the school and will coordinate with the facilitator to determine when during the session the questionnaires will be distributed. The study team field staff will provide instructions to the youth for completing the survey, including steps to take once they have completed the survey (e.g., placing the questionnaires in the envelopes provided and leaving them on their desks).

Facilitator Survey

The facilitator survey will be administered at the end of the intervention. The questionnaire, along with an introduction, is provided as Appendix E. It will be administered as a paper-and-pencil questionnaire. The study team field staff will hand-deliver the questionnaire to the facilitator approximately 3 days before the last scheduled participant survey involving a class taught by the facilitator. The study team field staff will ask the facilitator to complete the survey before their last class takes the participant survey. The study team field staff will collect completed questionnaires from facilitators and place them in a sealed envelope for transmittal to the study team.

Facilitator Key Informant Interviews

The key informant interviews will be administered at the end of the intervention, after the facilitator surveys. Facilitator interviews will take place at the site or IO offices, whichever is more convenient for the facilitator. The interviews are expected to take an average of 1 hour. Prior to beginning a key informant interview, the interviewer will read respondents a verbal consent form that informs them of their rights, including the right to not answer any question, and asks for their consent to participate in the discussion. Detailed notes will be taken on each interview and will be stored on a secure server to which only the study team has access. Information collected by interviews will be reported only in aggregate and individual respondents will not be identified.

A procedures manual will be developed for the administration of the interviews and training will be provided to all interviewers and note takers regarding interview procedures and materials.

The facilitator interview guide, along with an introduction and the consent form, is provided as Appendix G.

School Representative Key Informant Interview

School representative interviews will take place at the school. The interviews are expected to take an average of 1 hour. The school representative interviews will follow the same procedure as the facilitator key informant interviews. The school representative interview guide, along with an introduction and the consent form, is provided as Appendix G.

Facilitator Session Log

Facilitator session logs will be completed by each facilitator. The session log template is provided as Appendix D. Facilitators will be trained on how to complete the session logs. They will complete one session log for each session and submit the logs to the IO. The IO will record and track receipt of the session logs and forward them to the study team weekly.

Attendance Forms

The attendance forms may be completed by the regular classroom teacher, other school staff, or the facilitator for submission to the IO. The IO will track receipt of attendance forms and forward them to the study team weekly. The attendance form template is provided as Appendix H.

B3. Methods to Maximize Response Rates and Address Nonresponse

To maximize response rates, we intend to use strategies that have worked successfully in other major studies we have conducted, including

  • Middle Grades Longitudinal Study of 2017–18 (MGLS:2017) OMB number 1850-0911 v.19;

  • Impact Evaluation of a School-Based Violence Prevention Program, OMB number 1850‑0814;

  • The Comprehension of Emergency Operations Protocols Study (CEOP);

  • Shelby School Safety Study;

  • School Safety and School-Based Mental Health Services in a Large Metropolitan School District (SBMH);

  • Redesigned High Schools for Transformed STEM Learning Study (TSL);

  • Now Is the Time – Project AWARE (Advancing Wellness and Resilience Education), OMB number 0930-0364; and

  • Evaluating the Prevention Effects of Men of Strength (MOST) Clubs on Sexual Violence and Teen Dating Violence Perpetration.


The data collection plan approaches the school as a community. We aim to establish rapport with the whole community—principals, teachers, parents, and students. The school community must be approached with respect and sensitivity to achieve high levels of participation. Study team field staff will be trained in all tasks, from securing school and teacher cooperation to completing youth surveys. This approach provides continuity in contact with the school community and helps build rapport with all types of respondents. We will also leverage the strong relationships between the IOs and their communities and schools to establish rapport and foster high levels of participation.


Additional methods are described below.


Parental Consent and Youth Assent Process

The study team will use a variety of techniques to maximize the proportion of youth who receive parental permission to participate in the study.


  • Experienced study team field staff. Study team field staff will have established records of successfully conducting data collection in school districts and schools. Field staff will demonstrate flexibility in working with the school and assuming as much of the burden as possible while conducting the student surveys.

  • Previews of the questionnaires. Per the Protection of Pupil Rights Amendment, copies of the baseline and follow-up surveys will be made available in an office at the school for parents to review before signing the consent form.

  • Follow-ups and reminders for parental consent forms. Study team field staff will track the return of parent consent forms up to 4 weeks. Each week, they will provide copies of reminder notes and consent forms to classroom teachers or other school staff to distribute to youth whose parents have not yet returned their consent forms.

  • Youth, class, or teacher/staff incentives. Depending on the preferences of the schools, the study team will provide youth, class, or teacher/staff incentives for the return of parental consent forms (regardless of whether the parents provide consent to participate). Options include the following:

  • Youth incentives. Eligibility to participate in two $25 gift card drawings for all youth who return a signed parental consent form. In addition, each youth who returns a signed form will receive a small incentive equivalent to $1 in value (e.g., a pen or notepad). These incentives are comparable to those we have used successfully in other, similar studies (e.g., MOST, CEOP).

  • Class incentives. $50 for a class party for the class in each grade with the highest return rate at the school. These incentives are comparable to those we have used successfully in other, similar studies (e.g., MOST, CEOP).

  • Teacher incentives. A $25 gift card to each teacher for each classroom in which at least 90% of the parental consent forms are returned. Gift cards may be used to purchase classroom supplies or to pay for a class party. These incentives are comparable to what we have used successfully in other, similar studies (e.g., TSL, CEOP).

  • IO and school encouragement. As a condition of participating in the project, IOs and schools must agree to encourage their youth to complete the surveys and to allocate time and space for youth to do so.

  • Assurances of privacy. Respondents will receive assurances of privacy to encourage participation and minimize item nonresponse.

  • Scheduling. For most of the sites, the youth baseline and follow-up surveys will be implemented during the school day. For any youth who may be absent on the days that the surveys are administered, the study team field staff will work with the site liaison to schedule make-up times to administer the surveys. For the facilitator survey, facilitators will be given the questionnaire 3 days before their last class takes the participant survey to provide them with flexibility in completing the form. A study team member will personally collect the completed facilitator questionnaires.

  • Questionnaire design. To minimize item nonresponse, the questionnaires have been designed to minimize the time required to complete them: 30 minutes for the baseline and follow-up surveys; 10 minutes for the process evaluation participant survey; and 25 minutes for the facilitator survey. Each questionnaire is also designed using clear, age-appropriate language.



Methods to Deal With Nonresponse

Nonresponse falls into two primary categories. Survey nonresponse occurs when a sampled person completes none of the measurement instrument and is essentially not part of the study; reasons include refusal, changing schools, or lack of comprehension or ability to complete the instrument. Item nonresponse, typically referred to as missing data, occurs when individual items on a questionnaire are not completed. Analytical methods for addressing each type are briefly discussed below.

The evaluation team will take steps to understand the nature of any nonresponse and to account for the threat it may pose to the validity of the study's impact estimates. Using data from the baseline survey, evaluation team members will first test for statistically significant differences across demographic and baseline outcome variables between respondents and nonrespondents (as sketched below). Any such differences will be documented in the evaluation impact reports. The team will also test for differences between the research groups in their baseline characteristics and control for these differences using covariates when estimating program impacts.
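The following is a minimal sketch of the respondent-versus-nonrespondent comparison, assuming a pandas DataFrame of baseline records with a hypothetical responded flag; the covariate names in the usage example are illustrative, not the study's variable names.

```python
# A minimal sketch: compare respondents and nonrespondents on baseline
# characteristics with Welch two-sample t-tests. The `responded` flag
# and covariate names are illustrative assumptions.
import pandas as pd
from scipy import stats

def nonresponse_checks(baseline: pd.DataFrame,
                       covariates: list[str]) -> pd.DataFrame:
    """T-test of each baseline covariate by follow-up response status."""
    resp = baseline[baseline["responded"] == 1]
    nonresp = baseline[baseline["responded"] == 0]
    rows = []
    for var in covariates:
        t, p = stats.ttest_ind(resp[var].dropna(),
                               nonresp[var].dropna(), equal_var=False)
        rows.append({"variable": var,
                     "respondent_mean": resp[var].mean(),
                     "nonrespondent_mean": nonresp[var].mean(),
                     "t": t, "p_value": p})
    return pd.DataFrame(rows)

# e.g., nonresponse_checks(baseline, ["age", "gpa", "baseline_attitudes"])
```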

For survey nonresponse and overall attrition, the evaluation team will also conduct analyses of participant attrition at various steps in the study process, from eligibility to parental consent to youth assent to youth baseline and follow-up survey completion. Analyses of attrition and survey nonresponse both overall and by intervention versus comparison group (i.e., differential attrition) will inform the evaluation team on how the study fared relative to preferred attrition and nonresponse standards in the field of teen pregnancy prevention.

The team will also examine casewise nonresponse to identify participants who left a large percentage (e.g., 70% or more) of the survey items incomplete.

For item-level nonresponse, the team will examine and report statistics on missing data by item. To address missing item-level responses for each of the TPP-relevant study variables, the Federally Funded Research and Development Center (FFRDC) team will use a cost-effective and validated "mean value imputation" approach (cf. Puma, Olsen, Bell, & Price, 2009). For each TPP-relevant item variable, we will create a flag (dummy variable) that identifies cases with missing values. In a newly created version of the variable, each missing data point will be replaced with the mean calculated across all nonmissing cases for that variable (Puma et al., 2009).
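A minimal sketch of this flag-and-fill approach, assuming a pandas DataFrame of survey items; the column names in the usage example are illustrative.

```python
# A minimal sketch of mean value imputation with missing-data flags,
# as described above (cf. Puma et al., 2009). Column names are
# illustrative assumptions, not the study's actual item names.
import pandas as pd

def impute_with_flags(df: pd.DataFrame, items: list[str]) -> pd.DataFrame:
    """For each item, add a dummy flagging missing values, then replace
    missing values with the mean of the nonmissing cases."""
    out = df.copy()
    for item in items:
        out[f"{item}_missing"] = out[item].isna().astype(int)  # flag
        out[item] = out[item].fillna(out[item].mean())         # mean fill
    return out

# e.g., analysis_df = impute_with_flags(df, ["attitude_1", "attitude_2"])
```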

The evaluation team will also examine overall and differential attrition for the study and will provide descriptive information on sources of attrition (e.g., parental consent, youth absences from attendance data, survey or item nonresponse). These analyses will help to contextualize the study findings, and a discussion of study limitations due to patterns in attrition will be included in the final report.

B4. Tests of Procedures or Methods to Be Undertaken

As much as possible, the data collection instruments for the study draw on surveys, forms, and protocols that have been used successfully in previous federal studies. For example, the youth proximal outcome questionnaires and participant survey were modeled on instruments used in previous studies addressing similar topics with similar populations, including the following:

  • Adolescent Family Life Prevention Programs, Core Baseline Questionnaire. OMB number 0990-0291 (expiration date: 02/29/2012).

  • 2017 National High School Youth Risk Behavior Survey. Form approved OMB number 0920-0493 (expiration date: 11/30/2019).

  • Drug-Free Schools and Communities Act Outcomes Study—Student Survey, U.S. Department of Education. OMB number 1875-0070 (expiration date: 5/31/1995).

  • BFY Building Futures for Youth: My Life, My Choices, My Future! female and male student surveys, 2006–2007. No known OMB number.

  • Evaluation of Adolescent Pregnancy Prevention Approaches and Impact Evaluation of the Teen Pregnancy Prevention Program Grantees, by Mathematica Policy Research. OMB number 0970-0360 (expiration date: 07/31/2013).

  • Profiles of Romantic and Sexual Relationships in Emerging Adulthood: A National Study. No known OMB number (McGroder & Rue, 2010).

  • Five items that Mindy Scott of Child Trends helped the FFRDC team develop specifically to measure teenagers' understanding of key components of healthy relationships. No known OMB number (Scott, Moore, Fish, Benedetti, & Erikson, 2015).

  • Engender Health for a Better Life. Re:MIX Program Evaluation Survey. Child Trends. No OMB number.

  • The National Longitudinal Study of Adolescent to Adult Health (Add Health) Wave I. University of North Carolina, Carolina Population Center. No OMB number (McGroder & Rue, 2010).

  • Healthy Respect Youth Development High School Survey Follow-Up; Pre-course Healthy Respect Youth Development Program Survey. No known OMB number.

  • Operation Keepsake Program Pretest 2008–2009. No known OMB number.

  • 2009 National Youth Risk Behavior Survey. OMB number 0920-0493.

  • Pew Research Center’s Internet & American Life Project.

  • 2013–2015 National Survey of Family Growth. Female Questionnaire. OMB number 0920‑0314.

  • Dating Matters: Strategies to Promote Healthy Teen Relationships™ Initiative. OMB number 0920-0941.

  • “It’s Your Game… Keep It Real” Student Reaction Survey (sponsored by the HHS Office of Adolescent Health). No known OMB number (Potter, Coyle, Glassman, Kershner, & Prince, 2016).

  • Promoting Health Among Teens (PHAT) Participant Debriefing Questionnaire. No known OMB number (Walker, Inoa, & Coppola, 2016).

  • Youth Empowerment IDEAS Youth Survey (DRAFT). (Mathematica Policy Research, October 8, 2018).

Most of the questions in the baseline questionnaire are based on questions used in previous questionnaires or modified to align with SMARTool targets. Several questions were created for this study to address specific topics of interest not covered in the other questionnaires reviewed. We conducted cognitive testing of the youth baseline questionnaire with nine youth in the 9th or 10th grade. The cognitive testing identified a few questions that were not clearly worded, so the study team revised the questionnaire accordingly.

We pretested the baseline questionnaire with nine more youth in the 9th or 10th grade. The purpose of the pretests was to identify problems that study respondents might have providing the requested information and to confirm the level of burden. Six of the nine youth completed the questionnaire in 30 minutes or less, but three required more time. As a result, the study team shortened the questionnaire to ensure that it can be completed in 30 minutes or less.

Similarly, many items in the facilitator questionnaire were derived from one of the following sources:

  • Attitudes of Alabama Parents of Public School Children. No known OMB number (Millner, Turrens, & Shaw, 2017).

  • Working to Institutionalize Sex Education (WISE) Teacher Survey. No known OMB number (Butler, Sorace, & Beach, 2018).

  • “It’s Your Game… Keep It Real” Teacher Survey (sponsored by the HHS Office of Adolescent Health). No known OMB number (Potter et al., 2016).

  • Teachers’ Attitude and Comfort Scale. No known OMB number (Perez, Luquis, & Allison, 2004).

  • All4You! Implementation Fidelity Log. No known OMB number (Coyle et al., 2006).

Study team field staff will be available to answer questions throughout the data collection period. Staff will be trained to respond to questions about the study and individual forms, so they can provide technical assistance and report any issues that come up in the field.

B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The agency official responsible for receiving and approving contract deliverables is

OASH Contract Officer

Office of the Assistant Secretary for Health

Department of Health and Human Services

Individuals consulted on the statistical aspects of the study are listed in Exhibit 4.

Exhibit 4. Individuals Consulted on Statistical Design

Name | Title | Telephone Number
Jason Williams | Lead Statistician, RTI | (919) 541-6734
Antonio Morgan-Lopez | Senior Statistical Advisor, RTI | (919) 316-3436
Karol Krotki | Senior Statistical Advisor, RTI | (202) 728-2485
Stefanie Schmidt | Senior Project Lead, MITRE | (703) 983-4074











Individuals responsible for data collection and analysis are listed in Exhibit 5.

Exhibit 5. Individuals Responsible for Design, Data Collection, and Analysis

Name | Title | Telephone Number
Barri Burrus | Senior Advisor, RTI | (919) 597-5109
Heather Kane | Project Director, RTI | (919) 541-6738
Elvira Elek | Task Leader, Design Lead, RTI | (202) 728-2048
Suyapa Silvia | Data Collection Task Leader, RTI | (919) 541-5851
Terri Dempsey | Lead Site Coordinator, RTI | (919) 541-6886
Kristen Klein | Project Lead, MITRE | (703) 983-4047
Jodie Royan Beltz | Technical Lead, MITRE | (703) 983-7198





References

Allison, P. D. (1990). Change scores as dependent variables in regression analysis. Sociological Methodology, 20, 93–114.

Bloom, H. S. (1995). Minimum detectable effects: A simple way to report the statistical power of experimental designs. Evaluation Review, 19, 547–556.

Butler, R. S., Sorace, D., & Beach, K. H. (2018). Institutionalizing sex education in diverse U.S. school districts. Journal of Adolescent Health, 62, 149–156.

Calise, T. V., Chow, W., Doré, K. F., & JSI Research & Training Institute, Inc. (2015, December 30). Evaluation of healthy futures in three northeastern Massachusetts cities: Findings from an innovative teen pregnancy prevention program. Final impact report for The Black Ministerial Alliance of Greater Boston, Inc. Prepared for the Office of Adolescent Health, U.S. Department of Health and Human Services.

Coyle, K., Anderson, P., Laris, B. A., Unti, T., Franks, H., & Glassman, J. (2016, July 14). Evaluation of It’s Your Game… Keep It Real in Houston, TX: Final impact report for University of Texas Health Science Center - Houston. Scotts Valley, CA: ETR Associates. Retrieved from https://www.hhs.gov/ash/oah/sites/default/files/ash/oah/oah-initiatives/evaluation/grantee-led-evaluation/reports/uthsc-final-report.pdf

Coyle, K. K., Kirby, D. B., Robin, L. E., Banspach, S. W., Baumler, E., & Glassman, J. (2006). All4You! A randomized trial of an HIV, other STDs, and pregnancy prevention intervention for alternative school students. AIDS Education and Prevention, 18, 187–203. doi:10.1521/aeap.2006.18.3.187

Coyle, K. K., Potter, S. C., Glassman, J. R., McDade-Montez, L., & Unti, T. (2015, December 7). Evaluation of It’s Your Game… Keep It Real in South Carolina: Final impact report for South Carolina Campaign to Prevent Teen Pregnancy. Scotts Valley, CA: ETR Associates.

Djimeu, E. W., & Houndolo, D.-G. (2016). Power calculation for causal inference in social science: Sample size and minimum detectable effect determination (3ie Working Paper 26). New Delhi: International Initiative for Impact Evaluation (3ie).

Hayes, R. J., & Bennett, S. (1999). Simple sample size calculation for cluster-randomized trials. International Journal of Epidemiology, 28, 319–326.

McGroder, S. M., & Rue, L. (2010). A guide to measures targeting the delay of sexual initiation and/or the prevention of risky sexual behaviors. Prepared for the Family and Youth Services Bureau, U.S. Department of Health and Human Services.

Millner, V., Turrens, J., & Shaw, T. (2017). Attitudes of Alabama parents of public school children regarding sex education for their children in public schools. N.p.: University of South Alabama Research. Retrieved from http://acptp.org/wp-content/uploads/2017/08/Alabama-Parental-Attitudes-Study.pdf

Perez, M. A., Luquis, R., & Allison, L. (2004). Instrument development for measuring teachers’ attitudes and comfort in teaching human sexuality. American Journal of Health Education, 35, 24–29. https://doi.org/10.1080/19325037.2004.10603601

Potter, S. C., Coyle, K. K., Glassman, J. R., Kershner, S., & Prince, M. S. (2016). It’s Your Game… Keep It Real in South Carolina: A group randomized trial evaluating the replication of an evidence-based adolescent pregnancy and sexually transmitted infection prevention program. American Journal of Public Health, 106(Suppl. 1): S60–S69. doi:10.2105/AJPH.2016.303419

Puma, M. J., Olsen, R. B., Bell, S. H., & Price, C. (2009). What to do when data are missing in group randomized controlled trials (NCEE 2009-0049). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance.

Raudenbush, S. W., Liu, X.-F., Congdon, R., & Martinez, A. (2011). Optimal design for longitudinal and multilevel research: Documentation for the Optimal Design software (Version 3.0).

Scott, M. E., Moore, K. A., Fish, H., Benedetti, A., & Erikson, S. (2015). Healthy marriage and relationship education: Recommended outcome measures for adolescents (OPRE Report No. 2015-65a). Washington, DC: U.S. Department of Health and Human Services, Administration for Children and Families, Office of Planning, Research and Evaluation.

Scull, T. M., Kupersmidt, J. B., Malik, C. V., & Morgan-Lopez, A. A. (in press). Media literacy education to promote adolescent sexual health: A short-term randomized control trial of Media Aware, a comprehensive sexual health program for middle school students. Journal of Health Communication.

Walker, E. M., Inoa, R., & Coppola, N. (2016). Evaluation of Promoting Health Among Teens! abstinence-only intervention in Yonkers, NY. Princeton, NJ: Sametric Research. Retrieved from https://www.hhs.gov/ash/oah/sites/default/files/ash/oah/oah-initiatives/evaluation/grantee-led-evaluation/reports/program-reach-final-report.pdf

