
Part A: Justification for the Collection of Follow-up Survey Data – Federal Evaluation of Making Proud Choices!

OMB Control Number 0990-0452

August 2017

Office of Adolescent Health
Office of the Director
U.S. Department of Health and Human Services

1101 Wootton Parkway, Suite 700

Rockville, MD 20852

Project Officer: Amy Farb







CONTENTS

CONTENTS

TABLES

ATTACHMENTS

PART A INTRODUCTION

A.1. Circumstances Making the Collection of Information Necessary

1. Legal or Administrative Requirements that Necessitate the Collection

2. Study Objectives

A.2. Purpose and Use of the Information Collection

A.3. Use of Information Technology to Reduce Burden

A.4. Efforts to Identify Duplication and Use of Similar Information

A.5. Impact on Small Businesses

A.6. Consequences of Not Collecting the Information/Collecting Less Frequently

A.7. Special Circumstances

A.8. Federal Register Notice and Consultation Outside the Agency

A.9. Payments to Respondents

A.10. Assurance of Confidentiality

A.11. Justification for Sensitive Questions

A.12. Estimates of the Burden of Data Collection

A.13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

A.14. Annualized Cost to Federal Government

A.15. Explanation for Program Changes or Adjustments

A.16. Plans for Tabulation and Publication and Project Time Schedule

1. Analysis Plan

2. Time Schedule and Publications

A.17. Reason(s) Display of OMB Expiration Date is Inappropriate

A.18. Exceptions to Certification for Paperwork Reduction Act Submissions

SUPPORTING REFERENCES





TABLES

Table A1.1. Summary of Outcome Domains and Constructs

Table A9.1. OMB-Approved Incentives on Similar Federal Projects

Table A9.2. Thank You Payments for the Follow-up Data Collections

Table A11.1. Summary of Sensitive Topics to be Included on the Follow-up Survey and Their Justification

Table A.12.1. Calculations of Annual Burden Hours and Cost for Youth Participants for the Follow-up Survey

Table A.12.2. Calculations of Annual Burden Hours and Costs to Date

Table A.16.1. Timeline for Use of Data Collection Instruments





ATTACHMENTS

ATTACHMENT A: QUESTION-BY-QUESTION SOURCE TABLE FOR THE FOLLOW-UP SURVEY

ATTACHMENT B: SOURCES REFERENCED FOR THE FOLLOW-UP SURVEY

ATTACHMENT C: PERSONS CONSULTED ON INSTRUMENT DEVELOPMENT AND/OR ANALYSIS OF THE FOLLOW-UP SURVEY

ATTACHMENT D: CONFIDENTIALITY PLEDGE

ATTACHMENT E: ANALYSIS PLAN

ATTACHMENT F: 60-DAY FEDERAL REGISTER NOTICE

ATTACHMENT G: PRETEST MEMO





INSTRUMENTS

Instrument 1: Follow-up Survey



PART A INTRODUCTION

This package is a revision to an existing approval (OMB Control #0990-0452) to add a follow-up survey.

Research on programs to prevent teen pregnancy is at a turning point. Much of the available research evidence dates to the late 1980s and early 1990s, when public health officials were facing the twin threats of the emerging HIV/AIDS epidemic and a sharp, unexpected increase in the teen birth rate in the United States. In response to these threats, researchers launched a broad, sustained effort to identify and test new programs and curricula with the potential to reduce high rates of teen pregnancy, sexually transmitted diseases (STDs), and associated sexual risk behaviors.

Much has changed in the intervening years. The teen birth rate ultimately peaked in the early 1990s and has now plunged to historic lows (Ventura et al. 2014). Researchers have succeeded in identifying dozens of prevention programs with demonstrated evidence of success in reducing adolescent sexual risk behaviors (Goesling et al. 2014), and the federal government has invested millions of dollars in disseminating knowledge of the programs and implementing them in communities around the country (Kappeler and Farb 2014; Zief et al. 2013). Overall rates of adolescent sexual activity have also declined since the early 1990s, but there has been less progress in addressing disparities in rates by race and ethnicity. This current context shifts the research agenda toward a new primary challenge: how to use existing evidence-based programs to sustain the ongoing decline in teen birth rates in the United States and reduce remaining disparities in rates across communities and between racial/ethnic groups.

In response to this shifting research agenda, the Office of Adolescent Health (OAH) seeks to design a new large-scale, multisite random assignment evaluation of an abstinence-based teen pregnancy prevention program that will make a significant contribution to the growing portfolio of research activities OAH has sponsored since the office was established in 2010. Much of OAH’s existing evaluation work focuses on documenting and evaluating the first cohort of grantees funded under the OAH Teen Pregnancy Prevention (TPP) program. With this new evaluation, OAH seeks to launch a “second generation” of evaluation activities - one that addresses a more targeted set of research questions of significant practical relevance to OAH and the broader field. In particular, the new evaluation will seek to advance the existing evidence base by identifying and testing a replication of a commonly used but understudied abstinence-based teen pregnancy prevention program.

To meet this objective, OAH is designing a randomized controlled trial (RCT) of Making Proud Choices! (MPC!). The MPC! curriculum aims to increase students’ knowledge of STDs and HIV, as well as their understanding of the effectiveness of abstinence, condoms, and contraceptives at reducing STDs and pregnancy.

OMB approved the instruments associated with two data collection efforts for the MPC! Evaluation: (1) collection of baseline data for the impact study through the baseline survey; and (2) collection of information on program implementation (OMB Control #0990-0452; approved on January 17, 2017).

With this submission, OAH requests OMB approval for the follow-up survey instrument, which will be used to collect data from study participants approximately 9 and 15 months post baseline. The follow-up survey consists primarily of items that appear on the OMB-approved baseline survey. Modifications made for the follow-up survey include dropping items that are not relevant for follow-up data collection and adding items that address key outcomes aligned with the MPC! program. Attachment A is a question-by-question review of items on the follow-up survey. It notes which items on the follow-up survey are also found on the baseline survey and, where applicable, describes modifications made for the follow-up survey. It also identifies new items on the follow-up survey and their sources.

Additionally, this submission describes a revision to the study in Section A.15. OAH originally designed the study to address the relative effectiveness of the program implemented by school health teachers and professional health educators. The original study design used a three-armed RCT, where schools were to be randomly assigned to receive (1) MPC! implemented by health educators, (2) MPC! implemented by school teachers, or (3) business as usual health programming as the control condition. After input from an expert panel, OAH has changed the study design to test the effectiveness of the program as delivered by health educators. Schools will now be randomized to one of two groups: (1) MPC! implemented by health educators, and (2) business as usual health programming as the control condition. The same number of schools are expected to participate in the study, which does not change the original burden assumptions.

A.1. Circumstances Making the Collection of Information Necessary

1. Legal or Administrative Requirements that Necessitate the Collection

The current federal emphasis on evidence-based approaches to teen pregnancy prevention began in 2010 with congressional authorization of the TPP program and creation of OAH. The TPP program was one of six early evidence-based initiatives authorized by Congress to increase the use of data and evidence in social policy (Haskins and Margolis 2015). The program provides roughly $100 million annually to state and local organizations to implement evidence-based and promising new teen pregnancy prevention programs. As with several of the other federal evidence-based initiatives, the TPP program features a “tiered evidence” grant structure: the majority of funding goes to disseminate and scale up Tier 1 programs that have some existing evidence of effectiveness, whereas a smaller amount supports Tier 2 demonstration projects, which foster innovation in the field by developing and rigorously testing promising new approaches to teen pregnancy prevention.

The first cohort of TPP grantees was announced in fall 2010 and consisted of five-year awards running through fall 2015 (Kappeler and Farb 2014). A total of 75 organizations received funding under Tier 1 of the TPP program, with awards ranging from roughly $400,000 to $4 million annually. In line with the program’s emphasis on evidence-based approaches, grantees were required to select from a list of 28 existing programs and curricula that the U.S. Department of Health and Human Services (HHS) had identified as having demonstrated evidence of effectiveness in reducing teen pregnancy, STDs, or associated sexual risk behaviors. More than three-quarters of these eligible programs (23 of 28) were selected for use by at least one grantee. The TPP program was successful in reaching a very large segment of the population, with about 100,000 youth per year receiving services across a broad network of schools and other community organizations (Wilson and Lawson 2014). In addition, nearly 20 of these grantees conducted impact evaluations of their TPP program.

The experience of the first cohort of TPP grantees highlighted challenges local communities can face when implementing evidence-based programs (Margolis and Roper 2014). For example, grantees needed practical guidance on how to replicate evidence-based programs with fidelity within the time and scheduling constraints of their local schools and community-based organizations. In other cases, grantees found that the content of some of the older evidence-based programs was outdated or did not resonate with local youth. Implementation fidelity was often difficult to maintain and varied based on the setting and mode. OAH drew on these lessons when developing plans for the second cohort of TPP grantees, for which awards were announced on July 6, 2015. For this second cohort, OAH retained the overall tiered structure of the grant program, again providing the greatest funding for the replication of evidence-based programs (Tier 1). Under Tier 1, a total of 50 organizations received funding to replicate evidence-based programs in high-need communities (Tier 1B). In addition, eight organizations received funding to serve as intermediaries to support capacity building for implementing and evaluating evidence-based programs (Tier 1A).

This evaluation, designed to provide new evidence to guide the identification of evidence-based TPP programs, is authorized under Section 301 of the Public Health Service Act (42 U.S.C. 241).

2. Study Objectives

OAH has designed a new research agenda to complement the second cohort of TPP funding. Building on the experiences of the first cohort of grantees and the grantee-led evaluations, OAH seeks to launch a “second generation” of evaluation activities - one that addresses a more targeted set of research questions of significant practical relevance to OAH and the broader field. This new evaluation of MPC! will seek to advance the existing evidence base by identifying and testing the replication of a commonly used but understudied abstinence-based program, intended to increase students’ knowledge of STDs and HIV, and understanding of the effectiveness of abstinence, condoms, and contraceptives at reducing STDs and pregnancy. The MPC! curriculum emphasizes abstinence as the safest choice for avoiding pregnancy and STDs, but also encourages youth to use condoms if they do have sex. The curriculum also covers refusal methods to improve participants’ feelings of self-efficacy regarding condom use and sexual activity.

MPC! is a very popular program across federal grant programs, implemented by over 100 providers nationwide. Yet the program’s evidence of effectiveness is limited to a single study that meets HHS evidence review standards (Jemmott et al. 1998). That study is nearly 20 years old and was conducted in a highly controlled implementation context by the program developers. New evidence is needed on the effectiveness of the program as it is replicated nationwide, and in schools.

The study will be designed to address this question:

  1. Does MPC!, implemented by health educators in schools, change youth sexual behavioral outcomes, relative to a business as usual sexual health program?

The evaluation will be conducted in required health classes in approximately 39 high schools.

It is expected that most youth in the study will be 9th graders enrolled in a school’s required health class. Schools will be randomized to one of two conditions: (1) a treatment group taught MPC! by an outside health educator from a local health department or community-based organization, or (2) a control group that receives the health curriculum the school’s health teacher normally provides (i.e., a business-as-usual control condition). Eligible evaluation youth will be those who are expected to take a required health class.

Survey data will be collected from youth study participants at baseline (before MPC! programming begins for treatment youth) and approximately 9 months and 15 months after baseline. The baseline survey data will be used to describe the evaluation sample, to define subgroups of interest (gender and sexual experience at baseline), and as a source of covariates for the impact estimation models. The follow-up survey data will be used to estimate program impacts on knowledge, attitudes, beliefs, and behaviors such as sexual initiation and contraception use (see Table A1.1 for information on outcome domains and constructs). Survey items that measure each outcome construct will serve as the dependent variable in a regression analysis used to estimate intent-to-treat impacts of the MPC! program. The impact study will be complemented by the implementation and fidelity assessment. This study component will take a detailed look at program operations along four key aspects: (1) inputs required for implementation to succeed and be sustained, (2) contextual factors that influence implementation, (3) fidelity and quality of program implementation, and (4) participants’ responsiveness to services.1
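For illustration only (the detailed specification appears in the analysis plan in Attachment E), a stylized intent-to-treat impact model for a design in which schools are randomized might take the form

y_ij = a + b*T_j + X_ij'g + u_j + e_ij

where y_ij is the outcome for youth i in school j; T_j is an indicator equal to 1 if school j was assigned to receive MPC! and 0 otherwise; X_ij is a vector of baseline covariates; u_j is a school-level error component that accounts for the clustering induced by school-level random assignment; and e_ij is an individual-level error term. The coefficient b is the intent-to-treat impact estimate. The notation here is illustrative and is not drawn from the analysis plan itself.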

Table A1.1. Summary of Outcome Domains and Constructs

Outcome Domain: Outcome Constructs

Exposure to information:
  • Attended classes on reproductive health topics
  • Received information about birth control from a doctor, nurse, or clinic

Knowledge:
  • Knowledge about condoms
  • Knowledge about birth control pills
  • Knowledge about STIs
  • Knowledge about IUDs
  • Knowledge about other hormonal methods of birth control
  • Knowledge about pregnancy

Attitudes:
  • Support for abstinence
  • Support for condom use

Refusal skills:
  • Perceived refusal skills

Communication with parents:
  • Communication about romantic relationships and sex

Intentions:
  • Intentions to have sexual intercourse
  • Intentions to use birth control

Sexual risk behavior:
  • Sexual initiation
  • Sex in the past three months
  • Sex without a condom in past three months



OAH is currently requesting OMB approval for the collection of the follow-up data for the impact study. Enrollment into the study will be conducted over approximately two and a half years - from winter 2017 to spring 2019. Follow-up surveys will occur approximately 9 and 15 months following study enrollment; therefore, a three-year clearance is requested for follow-up data collection.

A.2. Purpose and Use of the Information Collection

Data collected with the Federal Evaluation of MPC! follow-up survey will be a central component of the impact study. The follow-up data collection for which approval is now sought will focus on two types of outcomes outlined in Table A1.1 – both of which can be measured only through surveys of youth. The first are sexual risk outcomes, including the extent and nature of sexual activity, use of contraception (if sexually active), pregnancy, and testing for and diagnoses of STDs. The second are a series of intermediate outcomes that may be associated with the sexual risk outcomes and are thus important to measure as potential pathways of any program effects on sexual risk behavior. Examples of these intermediate outcomes include participation in and exposure to pregnancy prevention programs and services; intentions and expectations regarding sexual activity; knowledge of contraception, condom use self-efficacy and negotiation skills, and sexual risks; dating behavior; and alcohol and drug use. In addition, the survey includes a small number of questions that identify socio-demographic or other characteristics of youth in the study sample, which may be used for descriptive purposes. Finally, for sample youth who report not being sexually active, the survey includes questions to support a descriptive analysis of these youth and a future investigation of their propensity for later risky behaviors.2

Follow-up data will be used to address the following research questions on program impact:

  • Is MPC! effective at meeting its immediate objectives, such as improving exposure, knowledge, and attitudes?

  • What is the effect of MPC! on sexual behavior outcomes, such as postponing sexual activity, and reducing or preventing sexual risk behaviors and STDs?

  • Does MPC! work better for some groups of adolescents than for others?

The primary impact analysis will focus on those who provide follow-up survey data, regardless of their level of participation in the program, or whether they complete the baseline survey. Doing so enables the team to conduct a rigorous, intent-to-treat impact analysis that meets the standards of the HHS Teen Pregnancy Prevention Evidence Review. We also plan to conduct analyses on subgroups defined by baseline measures. These analyses will be considered exploratory, and will not be used as a primary test of the effectiveness of the intervention. Instead, they are intended to help program providers and practitioners understand whether the pattern of the findings for the full sample is similar to or different from trends observed for particular subgroups. We will observe trends for subgroups defined by (1) gender, and (2) sexual experience at baseline.

We acknowledge that statistical power for these exploratory analyses may be insufficient due to smaller sample sizes within the subgroups. For that reason, these analyses are intended not as a primary test of the intervention’s effectiveness, but instead as a means to understand whether the overall pattern of findings is similar to trends observed within and across particular subgroups.

Many of the items included on the follow-up survey3 are identical to the items in the already-approved baseline survey instrument (OMB Control Number 0990-0452). These items are similar to those in surveys OMB has already approved for use in comparable evaluations, including the Evaluation of Adolescent Pregnancy Prevention Approaches (PPA, OMB Control Number 0990-0382), the Teen Pregnancy Prevention Replication Study (OMB Control Number 0990-0394), the Personal Responsibility Education Program Multi-Component Evaluation (PREP, OMB Control Number 0970-0398), and the Pregnancy Assistance Fund Study (PAF, OMB Control Number 0990-0424).4

A.3. Use of Information Technology to Reduce Burden

As with the baseline, the follow-up data collection plan reflects sensitivity to issues of efficiency, accuracy, and respondent burden. Whenever possible, the follow-up will be a web-based survey administered in school, in a group setting.5 Trained Mathematica field staff will provide participants with smartphones, along with unique login information to access the survey from the device.

Web-based surveys are an attractive option for surveys of adolescents and young adults, especially surveys that ask sensitive questions and have various pathways based on responses to those questions. Web-based surveys can decrease respondent burden and improve data quality. The web-based application will include built-in skips that route respondents to the next appropriate question based on their answers, so respondents never see questions that are not relevant to them and do not have to navigate the various paths themselves. Additionally, data checks can be programmed into the survey to prevent out-of-range and conflicting responses.

Students who have moved out of the area or have transferred to a non-study school, or those who are otherwise unable to complete the survey through the in-school data collection, will be sent advance letters with the information necessary to log on to the web survey and complete it on their own time. For those who do not respond, we will follow up with postcards, emails and texts (with permission provided at the time of consent) and phone calls. These participants will also be given the option to complete the survey over the phone with a trained Mathematica interviewer. On the PAF study follow-up survey (OMB Control Number 0990-0424), respondents are offered web, followed by phone, and completion rates are 90 percent, with about the same percentage completing the survey by web as by phone.

A.4. Efforts to Identify Duplication and Use of Similar Information

The information collection requirements for the Federal Evaluation of MPC! have been carefully reviewed to avoid duplication with existing studies. Although the one prior study of MPC! that meets the HHS evidence review standards (Jemmott et al. 1998) adds to our understanding of the curriculum’s effects on behavioral outcomes, OAH does not believe it provides current information on the program’s effectiveness with a broader population of youth in schools. The data collection for the Federal Evaluation of MPC! is a critical step in providing essential information on the effectiveness of this very popular program as it is implemented in today’s schools.

A.5. Impact on Small Businesses

No small businesses are expected to be impacted. Mathematica staff will work with the study sites (schools) to lead and coordinate the data collection activities. The data collection plan is designed to minimize burden on schools by providing staff from Mathematica to lead the follow-up data collection activities.

A.6. Consequences of Not Collecting the Information/Collecting Less Frequently

Outcome data are essential to conducting a rigorous evaluation of the MPC! program. Without outcome data, we cannot estimate the short-term effect of the intervention following program implementation (using the 9-month follow-up survey), or whether those effects are sustained long term or translate to the expected behavioral outcomes (using the 15-month follow-up survey).

A.7. Special Circumstances

There are no special circumstances for the proposed data collection efforts.

A.8. Federal Register Notice and Consultation Outside the Agency

A 60-day Federal Register Notice was published in the Federal Register on February 10, 2017, Vol. 82, No. 27; pp. 10364-10365 (see Attachment F). No public comments were received.

The names and contact information of the persons consulted in the drafting and refinement of the follow-up survey instrument are in Attachment C.

A.9. Payments to Respondents

We propose offering a combination of non-monetary gifts and gift cards (detailed below) to study participants in appreciation of their ongoing participation in the study by responding to the follow-up surveys. Our surveys include highly sensitive questions and thus impose some burden on respondents. Research has shown that such payments are effective at increasing response rates for populations similar to those participating in this study.6,7 Research also suggests that providing an incentive for earlier surveys may contribute to higher response rates for subsequent surveys.8 Therefore, providing a modest gift of appreciation at the first follow-up can reduce attrition for the second follow-up data collection.

Achieving a high survey response rate at follow-up is critical for three reasons. First, it is necessary for achieving the minimum sample size needed to demonstrate the statistical significance of the findings; the federal government makes great efforts to ensure that it is investing in well-powered studies. Second, rigorous evidence reviews (such as the HHS Teen Pregnancy Prevention Evidence Review, which will eventually assess the evidence from this study) use follow-up survey response rates to assess study attrition when determining the evidence rating for a study. High attrition can result in a low evidence rating, which would indicate that the federal government had invested in a study that lacked sufficient internal validity to draw conclusions about program effectiveness. Third, OAH has directed its contractor to study the effectiveness of MPC! in schools in low-income, disadvantaged areas, and OAH intends that the study population be representative of the youth in these school districts. If high response rates are not achieved, the study sample could be biased in the direction of the higher achieving, higher-income, more highly motivated youth with more supportive parents – the population that is likely to complete a follow-up survey early and with little or no incentive.9 Unlike other large-scale federal survey efforts, such as the Census, the study sample of a randomized controlled trial is set at the beginning of the study; there is no opportunity to add sample members over time to replace survey non-respondents. Therefore, a plan to achieve high response rates among randomized study participants is a critical part of a follow-up data collection plan.

The incentive structure proposed in this ICR is a modified version of the incentive structure used on other federally funded studies with similar populations, including the Evaluation of Adolescent Pregnancy Prevention Approaches (PPA, OMB Control Number 0990-0382), PREP (OMB Control Number 0970-0398), and the Strengthening Relationship Education and Marriage Services (STREAMS) Evaluation (OMB Control Number 0970-0481, approved July 5, 2016); see Table A9.1. These studies have achieved high response rates on the first and second follow-ups.10 On PREP, the first follow-up response rates in two of the school-based sites are 90 to 94 percent, and second follow-up response rates are 84 to 91 percent.11

The modifications made for this ICR reflect OMB’s efforts to lower the amount of the incentives and to provide non-monetary incentives for students completing surveys in schools.

Table A9.1. OMB-Approved Incentives on Similar Federal Projects

Project (OMB Control No.) | ICR Reference # | Follow-up 1, In-school | Follow-up 1, Out of school | Follow-up 2, In-school | Follow-up 2, Out of school
PREP (0970-0398) | 201401.0970.001 | $15 | $20 | $20 | $25
STREAMS (0970-0481) | 201703.0970.011 | $15 | $20 | NA | NA
PPA (0990-0382) | 201502.0990.002 | $10 | $25 | $10 | $25






In an evaluation such as Making Proud Choices!, students completing surveys one to two years after the initial baseline survey are no longer organized in the class groupings in which the study and intervention initially took place. Instead, at the time of the follow-up surveys, schools gather students during congregate times, such as lunch, study halls, and advisory periods. Youth have discretion as to whether or not they meet with the survey team.


Since students in school can easily be deterred from attending a follow-up survey session during lunch or study hall, it is important to give them an incentive to attend the survey administration and to ensure they are fully informed about the opportunity to participate. Therefore, we propose to use a small non-monetary incentive to encourage students to attend the survey administration, regardless of whether they then assent to participate.


Mathematica will work with the participating districts and schools to determine a specific non-monetary incentive based on what is available to students in these underserved, low-income communities. For example, if pizza is readily available, we will propose to offer a pizza lunch following survey administration. If pizza is not readily available, we could instead offer a voucher for movie tickets if a theater is accessible to the students in the communities in which the study is taking place. We will work with the schools to identify the most appropriate and accessible activity for a non-monetary incentive. The incentive will be valued at no more than $5 per respondent, which is consistent with the value of the non-monetary incentive OMB approved to provide to students to return the parent permission form at the time of study enrollment.


Maintaining high response rates among adolescents who are not available for in-school administration is much more challenging. For example, on the OAH-sponsored study of the Pregnancy Assistance Fund, where survey respondents are not available for in-school administration, 85 percent response rates are achieved only with the use of $25 gift cards (OMB approval 0990-0424). For the Making Proud Choices! study, we estimate that up to 30 percent of our sample may respond to the survey outside of school time. Students we are unable to reach in school, either because they have moved or are chronically absent, must put in added time and effort outside of the school day to complete the survey.


For those participants who complete the survey outside of school, either because group administration is not feasible or because they are not able to attend a group administration, a $10 gift card will be provided for completing the first follow-up survey and a $15 gift card for completing the second follow-up survey. Slightly larger gift cards are offered for the second follow-up survey, whether completed by web or by phone, in order to ensure high response rates. Attrition from surveys tends to increase over time due to the mobility of participants and study fatigue, so higher incentives are needed to continue to ensure participant responses. The typically lower response rates for second follow-ups increase the value of each response, making slightly higher incentives cost-effective.



Table A9.2 summarizes the non-monetary incentives and gift cards to be provided to participants for the follow-up survey data collection, based on where the survey will be administered.





Table A9.2. Thank You Payments for the Follow-up Data Collections

Type of Administration | Length of Activity | First Follow-up | Second Follow-up
Group Administration (In school) | 30 minutes | Non-monetary incentive valued at $5 | Non-monetary incentive valued at $5
Individual Administration (Out of school) | 30 minutes | $10 gift card | $15 gift card


A.10. Assurance of Confidentiality

Mathematica will be responsible for obtaining Institutional Review Board (IRB) approval for all data collection activities, and for any additional local IRB or Research Review Board (RRB) approvals required by each school district, prior to information collection. We have approval from the New England IRB for all data collection and implementation study activities (including the follow-up data collection) and for the baseline data collection and implementation study instruments, dated January 12, 2017 for the protocol and February 3, 2017 for the consent forms. IRB approval for all follow-up survey instruments will be acquired following OMB approval and prior to the start of data collection. We expect to work with three school districts for this study. The first district, Mobile County Public Schools, has agreed to participate in the first cohort, which will begin enrolling in February 2017. Mobile County Public Schools does not have a local department of education IRB. If additional districts recruited for the study have a local IRB or RRB, we will follow all protocols for approval.

Before collecting any data, Mathematica will seek active consent from a parent or legal guardian. The consent form will explain the purpose of the study, the data being collected, and the way the data will be used. As with the baseline survey, prior to the administration of each follow-up survey the evaluation team will seek assent from youth with parental consent.12 The assent form states that (1) answers will be kept private and will not be seen by anyone outside the study team, (2) participation is voluntary, and (3) youth may refuse to participate at any time without penalty. Participants will be told that, to the extent allowable by law, individual identifying information will not be released or published; rather, results will be published only in summary form with no identifying information at the individual level.

In addition, our protocol during the self-administration of the web instrument and during CATI interviews will provide reassurance that we take the issue of privacy seriously. It will be made clear to respondents that identifying information will be kept separate from questionnaires. To access the web survey application, each questionnaire will require a unique login, and respondents will have to enter a verification code, such as their date of birth, to begin the survey; this will prevent unauthorized users from accessing the web application. Any personally identifiable information will be stored in secure files, separate from survey and other individual-level data. Field staff will collect the smartphones used for survey administration at the end of the survey and will be trained to keep the devices in a secure location at all times. All field staff and phone interviewers are required to sign a confidentiality pledge when hired by Mathematica (Attachment D).

If there is an internet outage at the school on the day of the survey, field staff will facilitate a self-administered pencil and paper instrument (PAPI). In those instances, the questionnaire and outer packet envelope will have a label with a unique ID number; no identifying information will appear on the questionnaire or return envelope. The hard copy instrument has three parts (Part A, Part B1, and Part B2) that mirror the web programming, to avoid asking youth who are not sexually experienced detailed questions about their sexual activities. Part A asks for background information and concludes with a single screening question about sexual experience. Youth with sexual experience will complete Part B1; those without will complete Part B2. Before turning completed questionnaires in to field staff, respondents will place them in blank return envelopes and seal them. Field staff are trained to keep all data collection forms in a secure location and are instructed not to share any materials with anyone outside of the study team. Completed surveys are shipped immediately via FedEx to Mathematica’s Survey Operations Center for receipting.

Mathematica has established security plans for handling data during all phases of the data collection. The plans include a secure server infrastructure for online data collection of the web-based survey, which features HTTPS encrypted data communication, user authentication, firewalls, and multiple layers of servers to minimize vulnerability to security breaches. Hosting the survey on an HTTPS site ensures that data are transmitted using 128-bit encryption; transmissions intercepted by unauthorized users cannot be read as plain text. This security measure is in addition to standard user unique login authentication that prevents unauthorized users from accessing the web application.

All electronic data will be stored in secure files, with identifying information kept in a file separate from survey and other individual-level data. Survey responses will be stored on a secure, password-protected computer shared drive.

Privacy Act Considerations

Based on the following two considerations, the Privacy Act does not apply for this information request. First, the records collected in this study will not be retrieved by personal identifiers and second, according to the 1975 OMB Privacy Act Guidance, which was reaffirmed in the recent issuance of Circular A-108, the Privacy Act only applies to systems of records that are required to be managed by the agency; the data collection for this study is discretionary.

For the first consideration, each sample member in the study is assigned a unique study identification (ID) number. Only Mathematica team members have access to these study ID numbers. When creating a survey data file, only the study ID number is included in the file, not the student name or any other PII. Student names and other PII are kept separate from the survey data. When retrieving information about a case, team members pull up the case using its study ID number, not the student name.

All PII data are stored separately and securely from de-identified survey data.  Any files containing PII are stored on Mathematica’s network in a secure project folder whose access is limited to select project team members. Only the principal investigator, project director and key study staff have access to this folder. Furthermore, approved study team members can only access this folder after going through multiple layers of security.

For the second consideration, the data collection is contracted and discretionary, and therefore not covered by the Privacy Act. The proposed data collection will not be conducted by OAH but rather through its contractor, Mathematica. OMB Privacy Act Implementation: Guidelines and Responsibilities (July 9, 1975) describes the terms under which data collected by a contractor under contract to the federal government are covered by the Privacy Act:

“Not only must the terms of the contract provide for the operation (as opposed to design) of such a system, but the operation of the system must be to accomplish an agency function. This was Intended to limit the scope of the coverage to those Systems actually taking the place of a Federal system which, but for the contract, would have been performed by an agency and covered by the Privacy Act.” (40 FRN 28976)

The proposed data collection does not create a system to “accomplish an agency function,” and is not a system that is “taking the place of a Federal system which, but for the contract, would have been performed by an agency.” OAH has discretion as to whether and how to carry out this data collection. Thus, the proposed data collection is discretionary, not required, and the Privacy Act does not apply.

A.11. Justification for Sensitive Questions

A key objective of MPC! is to prevent teen pregnancy through a delay in sexual initiation, decrease in sexual activity, and/or an increase in contraceptive use. Because this is the primary focus of the program, some questions on the follow-up survey are necessarily related to these sensitive issues.

Table A11.1 lists the sensitive topics on the follow-up survey, along with a justification for their inclusion. Questions about sensitive topics are drawn from previously successful youth surveys and similar federal surveys (see Attachments A and B). Careful selection of these items was guided by experience in determining whether the benefits of the measures outweigh concerns about heightened sensitivity to specific issues among sample members, parents, and program staff. Although these topics are sensitive, they are commonly and successfully asked of high school youth similar to those who will be in the Federal Evaluation of MPC!

Table A11.1. Summary of Sensitive Topics to be Included on the Follow-up Survey and Their Justification

Topic: Gender identity14
Justification: The MPC! program aims to be sensitive and inclusive of all students. This question asks for the student’s self-identified gender, and includes options for transgender, unsure, and other.
Similar federally funded surveys13: BRFSS

Topic: Sexual orientation
Justification: OAH has a strong interest in improving programming that serves lesbian, gay, bisexual, and questioning youth. This question will provide documentation of the proportion of youth in the study that are part of this subpopulation.
Similar federally funded surveys: PREP, STREAMS, YRBSS, PPA, TPP Replication Study

Topic: Sexual activity, incidence of pregnancy, and contraceptive use
Justification: Sexual activity, incidence of pregnancy, and contraceptive use are all key outcomes for the evaluation.
Similar federally funded surveys: Title V Abstinence Study, PPA, PREP, PAF, STREAMS, YRBSS, ADD Health, TPP Replication Study

Topic: Intentions regarding sexual activity
Justification: Intentions regarding engaging in sex and other risk-taking behaviors are extremely strong predictors of subsequent behavior (Buhi and Goodson, 2007). Intentions are strongly related to behavior and will be an important mediator predicting behavior change.
Similar federally funded surveys: Title V Abstinence Study, PREP, PAF, PPA, TPP Replication Study

Topic: Drug and alcohol use and violence
Justification: There is a substantial body of literature linking various high-risk behaviors of youth, particularly drug and alcohol use, sexual intercourse, and risky sexual behavior. The effectiveness of various program strategies is expected to differ for youth who are and are not experimenting with or using drugs and alcohol (Tapert et al., 2001; Li et al., 2001; Boyer et al., 1999; Fergusson and Lynskey, 1996; Sen, 2002; Dermen et al., 1998; Santelli et al., 2001).
Similar federally funded surveys: Title V Abstinence Study, PPA, PREP, PAF, YRBSS, ADD Health, TPP Replication Study


In addition, the follow-up survey instrument will be designed so that only sexually active youth will receive most of these sensitive questions. The survey will ask all youth for background information and will include a screening question about sexual experience. The survey will route youth who report ever having sexual experience to additional questions about sexual behavior; those who report never having sex will be routed to other questions. Thus, many of the sensitive items related to sexual activity will be asked only of sample members who report being sexually active. This structure has been used successfully in other federally funded evaluations of teen pregnancy prevention programs, such as the Evaluation of the Title V, Section 510 Abstinence Education Program (OMB Control Numbers 0990-0233 and 0990-0237), the Evaluation of Adolescent Pregnancy Prevention Approaches (PPA-OMB Control Number 0990-0382), the TPP Replication Study (OMB Control Number 0990-0394), and the Personal Responsibility Education Program (PREP) Multi-Component Evaluation (OMB Control Number 0970-0398). As an added protection and to make respondents feel more comfortable answering these sensitive questions, the smartphones will be equipped with privacy screens.

A.12. Estimates of the Burden of Data Collection

OAH is requesting three years of clearance for the Federal Evaluation of MPC! As noted earlier, a three-year clearance is needed because enrollment into the study will be conducted across nearly three years, and follow-up data collection will occur approximately 9 and 15 months after baseline. Table A.12.1 provides the estimated annual reporting burden for study participants as a result of the follow-up survey data collection for youth.

We anticipate that 3,900 youth enrolled in the expected 39 study schools will be eligible to participate in the study because they are enrolled in a required health class. We expect to consent 70 percent of the eligible youth, for a total sample size of 2,730, of which we expect 90 percent will complete the first follow-up survey (n=2,457) and 85 percent will complete the second follow-up survey (n=2,321). These response rates are consistent with those achieved in two of the PREP (OMB Control Number 0970-0398) school based sites, which used nearly identical evaluation structures, where the first follow-up response rates are 90 to 94 percent and second follow-up response rates are 84 to 91 percent.

Based on experience with similar questionnaires, it is estimated that it will take youth 30 minutes (30/60 hour) to complete each follow-up survey, on average. The total burden for the first follow-up is 1,228.5 hours (2,457 × 30/60), and the total burden for the second follow-up is 1,160.5 hours (2,321 × 30/60). Across the three years of administration, this is a total burden for the follow-up surveys of 2,389 hours, or an average of 796.5 hours a year (computed from the rounded annual respondent counts shown in Table A.12.1: 819 × 30/60 + 774 × 30/60).

Table A.12.1. Calculations of Annual Burden Hours and Cost for Youth Participants for the Follow-up Survey

Instrument | Type of Respondent | Total Number of Respondents | Annual Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Annual Burden Hours | Annual Burden Hours for Youth Age 18 or Older | Hourly Wage Rate | Total Annual Costs
1. Follow-up survey (9 months post baseline) | Youth study participants | 2,457 | 819 | 1 | 30/60 | 409.5 | 41 | $7.25 | $297.25
2. Follow-up survey (15 months post baseline) | Youth study participants | 2,321 | 774 | 1 | 30/60 | 387 | 77.5 | $7.25 | $561.88
Estimated Annual Burden for Youth Follow-up Survey Participants | | | | | | 796.5 | 118.5 | | $859.13

NOTE: We assumed 10% of the sample will be age 18 or older at the 9-month survey and 20% at the 15-month survey. The federal minimum wage was used to calculate annual costs.
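For reference, the figures for youth age 18 or older follow arithmetically from the assumptions in the note above: 819 annual respondents × 10% × 30/60 hour ≈ 41 hours for the 9-month survey, and 774 × 20% × 30/60 hour ≈ 77.5 hours for the 15-month survey, for a subtotal of 118.5 hours. At the federal minimum wage, the corresponding costs are 41 × $7.25 = $297.25 and 77.5 × $7.25 = $561.88, for a total annual cost of $859.13.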

Table A.12.2 details the overall annual burden requested for data collection associated with the MPC! Study. A total of 595.8 annual hours (and an annual cost of $1,583.98) was approved in the prior ICR for this project (Baseline Survey and Implementation and Fidelity Assessment). A total of 796.5 annual hours (and an annual cost of $859.13) is requested in this ICR.

Table A.12.2. Calculations of Annual Burden Hours and Costs to Date

Data Collection Instrument | Type of Respondent | Annual Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Annual Burden Hours | Annual Burden Hours for Respondents Age 18 or Older | Hourly Wage Rate | Total Costs

Baseline and Implementation and Fidelity Assessment Data Collection (Approved January 17, 2017)
Baseline survey | Youth study participants | 865 | 1 | 30/60 | 432.5 | NA | NA | NA
Master topic guide staff interviews | School administrators, health educators | 39 | 1 | 1 | 39 | 39 | $20.76 | $809.64
Staff survey | Health educators | 26 | 1 | 30/60 | 13 | 13 | $20.76 | $269.88
Program attendance data collection protocol | Health educators | 17.33 | 14 | 1/60 | 4.9 | 4.9 | $20.76 | $101.72
Program fidelity checklist | Health educators | 17.33 | 14 | 5/60 | 19.4 | 19.4 | $20.76 | $402.74
Youth focus group | Program participants | 87 | 1 | 1 | 87 | NA | NA | NA
Subtotal: Annual burden approved to date | | | | | 595.8 | 76.3 | | $1,583.98

Follow-up Survey Data Collection (Requested in this ICR)
Follow-up survey (9 months post baseline) | Youth study participants | 819 | 1 | 30/60 | 409.5 | 41 | $7.25 | $297.25
Follow-up survey (15 months post baseline) | Youth study participants | 774 | 1 | 30/60 | 387 | 77.5 | $7.25 | $561.88
Estimated Total Annual Burden | | | | | 1,392.3 | 194.8 | | $2,443.11


A.13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

These information collection activities impose no capital costs or operation and maintenance costs on respondents or record keepers.

A.14. Annualized Cost to Federal Government

Data collection will be carried out by Mathematica, under contract with OAH to conduct the Federal Evaluation of MPC!

In the previously approved ICR (OMB Control #0990-0452; approved on January 17, 2017), the initial cost to the federal government for the baseline survey data collection and the implementation and fidelity assessment was $500,150.00. The baseline survey data collection and implementation and fidelity assessment are incrementally funded, and the current total cost to the federal government is $1,675,350. The estimated annualized cost to the federal government for the baseline survey data collection and implementation and fidelity assessment is $558,450 ($1,675,350/3).

For this current ICR, the total cost to the federal government for the 9-month follow-up survey is $693,000, and the total cost to the federal government for the 15-month follow-up survey is $748,000. The estimated annualized cost to the federal government is $231,000 ($693,000/3) for the 9-month follow-up survey and $249,333 ($748,000/3) for the 15-month follow-up survey.

The total annualized cost to the federal government is $1,038,783 ($558,450 + $231,000 + $249,333).

A.15. Explanation for Program Changes or Adjustments

OMB gave approval on January 17, 2017 for the impact study baseline data collection and the implementation and fidelity assessment data collections (OMB Control Number 0990-0452). OAH now seeks approval for the data collection associated with the administration of the 9- and 15-month follow-up survey instrument. This request will increase the total annual burden requested for the MPC! Evaluation from 595.8 hours to 1,392.3 hours.

With this ICR, OAH is also explaining a modest revision to the study design that does not affect original burden assumptions. OAH originally designed the study to address the relative effectiveness of the program implemented by school health teachers and professional health educators. The original study design used a three-armed RCT, where schools were to be randomly assigned to receive (1) MPC! implemented by health educators, (2) MPC! implemented by school teachers, or (3) business as usual health programming as the control condition. After input from an expert panel, OAH has changed the study design to test the effectiveness of the program as delivered by health educators. Schools will now be randomized to one of two groups: (1) MPC! implemented by health educators, and (2) business as usual health programming as the control condition. The same number of schools are expected to participate in the study, which does not change the original burden assumptions.

A.16. Plans for Tabulation and Publication and Project Time Schedule

1. Analysis Plan

Data from the follow-up surveys will be used to estimate the effect of the intervention on the outcomes of interest – both the sexual behavior measures as distal outcomes, and the more proximal, mediating variables (knowledge, attitudes, and intentions).

As noted in Section A.2, the primary impact analysis will focus on those who provide follow-up survey data, regardless of their level of participation in the program or whether they complete the baseline survey. Doing so enables the team to conduct a rigorous, intent-to-treat impact analysis that meets the standards of the HHS Teen Pregnancy Prevention Evidence Review. Many baseline measures will be measured again at follow-up; for youth with both baseline and follow-up data, the baseline values can be included as covariates in the impact models to improve the precision of the impact estimates.

We also plan to conduct exploratory analyses on subgroups defined by baseline measures. These analyses will be considered exploratory and will not be used as a primary test of the effectiveness of the intervention. Instead, they are intended to help program providers and practitioners understand whether the pattern of the findings for the full sample is similar to or different from trends observed for particular subgroups. We will observe trends for subgroups defined by (1) gender and (2) sexual experience at baseline.

We acknowledge that statistical power for these exploratory analyses may be insufficient as a result of smaller sample sizes within the subgroups. For that reason, these analyses are not intended as a primary test of the intervention’s effectiveness, but instead as a means of understanding whether the overall pattern of findings is similar to trends observed within and across particular subgroups.

A detailed analysis plan is in Attachment E.

2. Time Schedule and Publications

OAH expects that the Federal Evaluation of MPC! will be conducted over five years, beginning in September 2015. This request is for a three-year period beginning in July 2017. A schedule of the data collection efforts for the follow-up survey follows. Reporting on the results of the follow-up surveys will occur in 2019 (9-month follow-up) and 2020 (15-month follow-up).

Table A.16.1. Timeline for Use of Data Collection Instruments

Instrument | Date of 60-Day Submission | Date of 30-Day Submission | Date Clearance Needed | Date for Use in Field
Instrument 1: Follow-up Survey (to be administered 9 months and 15 months post-baseline) | February 2017 | April 2017 | July 2017 | Fall 2017 for 9-month survey; Spring 2018 for 15-month survey



A.17. Reason(s) Display of OMB Expiration Date is Inappropriate

All instruments, consent and assent forms, and letters will display the OMB Control Number and expiration date.

A.18. Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary for this information collection.

SUPPORTING REFERENCES

Boyer, Cherrie B., Jeanne M. Tschann, and Mary-Ann Shafer. “Predictors of Risk for Sexually Transmitted Diseases in Ninth Grade Urban High School Students.” Journal of Adolescent Research, vol. 14, no. 4, 1999, pp. 448–465.

Buhi, Eric R., and Patricia Goodson. “Predictors of Adolescent Sexual Behavior and Intention: A Theory-Guided Systematic Review.” Journal of Adolescent Health, vol. 40, no. 1, 2007, pp. 4–21.

Dermen, K. H., M. L. Cooper, and V. B. Agocha. “Sex-Related Alcohol Expectancies as Moderators of the Relationship Between Alcohol Use and Risky Sex in Adolescents.” Journal of Studies on Alcohol, vol. 59, no. 1, 1998, pp. 71–77.

Fergusson, David M., and Michael T. Lynskey. “Alcohol Misuse and Adolescent Sexual Behaviors and Risk Taking.” Pediatrics, vol. 98, no. 1, 1996, pp. 91–96.

Jemmott, J. B., L. S. Jemmott, and G. T. Fong. “Abstinence and Safer Sex HIV Risk-Reduction Interventions for African American Adolescents: A Randomized Controlled Trial.” Journal of the American Medical Association, vol. 279, no. 19, 1998, pp. 1529–1536.

Li, Xiaoming, Bonita Stanton, Lesley Cottrell, James Burns, Robert Pack, and Linda Kaljee. “Patterns of Initiation of Sex and Drug-Related Activities among Urban Low-Income African-American Adolescents.” Journal of Adolescent Health, vol. 28, no. 1, 2001, pp. 46–54.

Santelli, John S., Leah Robin, Nancy D. Brener, and Richard Lowry. “Timing of Alcohol and Other Drug Use and Sexual Risk Behaviors Among Unmarried Adolescents and Young Adults.” Family Planning Perspectives, vol. 33, no. 5, 2001, pp. 200–205.

Sen, Bisakha. “Does Alcohol Use Increase the Risk of Sexual Intercourse Among Adolescents? Evidence from the NLSY97.” Journal of Health Economics, vol. 21, no. 6, 2002, pp. 1085–1093.

Tapert, Susan F., Gregory A. Aarons, Georganna R. Sedlar, and Sandra A. Brown. “Adolescent Substance Use and Sexual Risk-Taking Behavior.” Journal of Adolescent Health, vol. 28, no. 3, 2001, pp. 181–189.

1 Data collection instruments associated with the implementation and fidelity assessment were approved by OMB on 1/17/2017 (OMB Control No. 0990-0452).

2 To ensure the privacy of survey respondents, we have timed the series of questions for non-sexually active youth to approximate the length of the series for sexually active youth.

3 The participant-facing name of the study is the Attitudes, Behaviors, and Choices (or ABC) Study. This name appears on the instrument (Instrument 1).

4 ACF received initial OMB approval for the PPA baseline survey on July 26, 2010 (OMB Control Number 0970-0360). In summer 2011, oversight of PPA was transferred to the Office of Adolescent Health (OAH) within the Office of the Assistant Secretary, and the project is now tracked with a different OMB Control Number (0990-0382). The OMB Control Number for the Teen Pregnancy Prevention Replication Study is 0990-0394. OMB approval for the PREP baseline survey was received on March 12, 2013 (OMB Control Number 0970-0398). OMB approval for the PAF baseline survey was received on August 30, 2014 (OMB Control Number 0990-0424).

5 In the event of an internet outage at the school on the day of survey administration, field staff will have materials ready for a self-administered pencil and paper (PAPI) survey administration, and the school therefore will not face the added burden of rescheduling. PAPI survey administration was used successfully on the PREP evaluation (OMB Control #0970-0398).

6 Berlin, Martha, Leyla Mohadjer, Joseph Waksberg, Andrew Kolstad, Irwin Kirsch, D. Rock, and Kentaro Yamamoto. 1992. An experiment in monetary incentives. In JSM proceedings, 393–98. Alexandria, VA: American Statistical Association.

7 James, Jeannine M., and Richard Bolstein. 1990. The effect of monetary incentives and follow-up mailings on the response rate and response quality in mail surveys. Public Opinion Quarterly 54 (3): 346–61.

8 Singer, Eleanor, John Van Hoewyk, and Mary P. Maher. 1998. Does the payment of incentives create expectation effects? Public Opinion Quarterly 62:152–64.

9 Singer, Eleanor, and Richard A. Kulka. 2002. Paying respondents for survey participation. In Studies of welfare populations: Data collection and research issues, eds. Michele Ver Ploeg, Robert A. Moffitt, and Constance F. Citro, 105–28. Washington, DC: National Academy Press.

10 STREAMS follow-up data collection is scheduled to begin in fall 2017.

11 Response rates provided are for two school-based sites on PREP. First follow-up data collection is ongoing in one site and closed in the other; second follow-up data collection is ongoing in both sites.

12 The consent and assent forms were approved by OMB on January 17, 2017 (OMB# 0990-0452).

13 PPA, OMB Control Number 0990-0382; PREP, OMB Control Number 0970-0398; PAF, OMB Control Number 0990-0424; STREAMS, OMB Control Number 0970-0481; Centers for Disease Control and Prevention (CDC) YRBSS: 2017 Youth Risk Behavior Surveillance System Questionnaire; National Longitudinal Study of Adolescent to Adult Health (ADD Health), sponsored by the Eunice Kennedy Shriver National Institute of Child Health and Human Development with co-funding from 17 other federal agencies; Title V Abstinence Evaluation, OMB Control Numbers 0990-0233 and 0990-0237; 2016 Behavioral Risk Factor Surveillance System Questionnaire (BRFSS), sponsored by the Centers for Disease Control and Prevention; and the Teen Pregnancy Prevention (TPP) Replication Study, OMB Control Number 0990-0394.

14 In consultation with OMB, we are retaining on the follow-up survey the version of the gender identity question that was approved on the baseline survey. In future ICRs we will incorporate current OMB guidance for gender identity items in baseline submissions.
