
Part B: Statistical Methods for the Collection of Follow-up Survey Data - Federal Evaluation of Making Proud Choices!

OMB Control Number 0990-0452

August 2017

U.S. Department of Health and Human Services
Office of Adolescent Health
Office of the Director

1101 Wootton Parkway, Suite 700

Rockville, MD 20852

Project Officer: Amy Farb



CONTENTS

PART B INTRODUCTION

B1. Respondent Universe and Sampling Methods

B2. Procedures for Collection of Information

B3. Methods to Maximize Response Rates and Deal with Non-Response

B4. Test of Procedures or Methods to be Undertaken

B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

TABLES

B1.1. Minimum Detectable Impacts for the Federal Evaluation of MPC!





ATTACHMENTS

ATTACHMENT A: QUESTION-BY-QUESTION SOURCE LIST FOR THE FOLLOW-UP SURVEY

ATTACHMENT B: SOURCES REFERENCED FOR THE FOLLOW-UP SURVEY

ATTACHMENT C: PERSONS CONSULTED ON INSTRUMENT DEVELOPMENT AND/OR ANALYSIS OF THE MPC! FOLLOW-UP SURVEY

ATTACHMENT D: CONFIDENTIALITY PLEDGE

ATTACHMENT E: ANALYSIS PLAN

ATTACHMENT F: 60-DAY FEDERAL REGISTER NOTICE

ATTACHMENT G: PRETEST MEMO


INSTRUMENTS

Instrument 1: FOLLOW-UP SURVEY



PART B INTRODUCTION

In response to a shifting research agenda in the field of teen pregnancy prevention, the Office of Adolescent Health (OAH) seeks to design a new large-scale, multisite random assignment evaluation of an abstinence-based teen pregnancy prevention program, one that makes a significant contribution to the growing portfolio of research activities OAH has sponsored since the office was established in 2010. Much of OAH’s existing evaluation work focuses on documenting and evaluating the first cohort of grantees funded under the OAH Teen Pregnancy Prevention (TPP) program. With this new evaluation, OAH seeks to launch a “second generation” of evaluation activities, one that addresses a more targeted set of research questions of significant practical relevance to OAH and the broader field. In particular, this new evaluation will advance the existing evidence base by testing the replication of a commonly used but understudied evidence-based teen pregnancy prevention program, Making Proud Choices! (MPC!). The abstinence-based MPC! curriculum aims to increase students’ knowledge of sexually transmitted diseases (STDs) and HIV, and their understanding of the effectiveness of abstinence, condoms, and contraceptives in reducing STDs and pregnancy.

On January 17, 2017, OMB approved the instruments associated with two data collection efforts for the MPC! Evaluation: (1) collection of baseline data for the impact study through the baseline survey; and (2) collection of information for the implementation and fidelity assessment through master topic guides for interviews, staff surveys, program attendance data, fidelity logs, and youth focus groups (OMB Control #0990-0452).

With this submission, OAH is requesting OMB approval for the follow-up survey instrument, which will be used to collect data from study participants. This survey, to be administered approximately 9 and 15 months post-baseline, contains several items that also appear on the OMB-approved baseline survey. Modifications made for the follow-up survey include dropping items not relevant to follow-up data collection and adding items that address key outcomes.

In a randomized controlled trial, baseline measures of the outcomes are not technically needed for analysis: randomly determining treatment and control groups ensures that there are no systematic differences between the two groups that are correlated with the intervention, so any differences observed are due to chance and do not introduce bias into the experiment. A valid estimate of program effectiveness is possible by simply testing the statistical significance of mean differences in outcomes across the treatment and control groups. The government nonetheless asks contractors to collect measures at baseline to (1) demonstrate to report readers and evidence reviews that random assignment did, indeed, result in two groups equivalent on observables, and (2) improve the precision of the impact estimates. If an outcome is not measured at baseline, it is common practice to use a baseline proxy of that measure as a covariate in impact estimation models to improve precision.
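To illustrate the precision argument, the following sketch (simulated data and hypothetical variable names, not the study's actual analysis code) fits the same impact model with and without a baseline covariate; the treatment estimate is unbiased in both, but the covariate-adjusted standard error is smaller:

```python
# Illustrative sketch only: why a baseline covariate improves precision
# in a randomized trial. All names and values are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
baseline = rng.normal(size=n)                # baseline proxy of the outcome
treat = rng.integers(0, 2, size=n)           # random assignment (0/1)
outcome = 0.1 * treat + 0.7 * baseline + rng.normal(size=n)
df = pd.DataFrame({"outcome": outcome, "treat": treat, "baseline": baseline})

# Unadjusted: simple difference in means across groups
unadj = smf.ols("outcome ~ treat", data=df).fit()
# Adjusted: same unbiased impact estimate, smaller standard error
adj = smf.ols("outcome ~ treat + baseline", data=df).fit()
print(unadj.bse["treat"], adj.bse["treat"])  # adjusted SE is smaller
```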

For the MPC! follow-up survey, we added 8 items and a small number of sub-items that are directly related to the intervention and are designed to measure outcomes in the domains of knowledge, attitudes, refusal skills, and intentions (SSA, Table A1.1). Prior to adding these 8 items and sub-items, we dropped 19 items that are considered time invariant or will not be used for impact or exploratory analyses. The new items are 2.7, 2.18, 3.14, 3.15, 3.20, 3.21, 3.26, and 3.27, and the new sub-items appear within items 2.2, 2.6, and 3.37 (Instrument 1). In addition, we added an item on high school completion (item 1.5). This item is typical of surveys of high school youth. It was not measured at baseline because all sample members were enrolled in school at the time of study enrollment. While it is not an outcome for this intervention (and will not be used for impact analysis), it may be used for secondary or exploratory analyses. In addition, this contract will deliver a data file to OAH, and if OAH makes the data file available as a restricted or public use file, a measure of high school completion in a file that also contains adolescent attitudes, beliefs, and behaviors may well inform other research questions addressed through secondary data analysis. Attachment A, a question-by-question review of items on the follow-up survey, notes which items are also found on the baseline survey and describes any modifications made for the follow-up survey. It also identifies new items on the follow-up survey and their sources.

Additionally, this submission describes a revision to the study design in Section A.15. OAH originally designed the study to address the relative effectiveness of the program when implemented by school health teachers versus professional health educators. The original design used a three-armed randomized controlled trial (RCT), in which schools were to be randomly assigned to receive (1) MPC! implemented by health educators, (2) MPC! implemented by school teachers, or (3) business-as-usual health programming as the control condition. After input from an expert panel, OAH has changed the design to test the effectiveness of the program as delivered by health educators. Schools will now be randomized to one of two groups: (1) MPC! implemented by health educators, or (2) business-as-usual health programming as the control condition. The same number of schools is expected to participate in the study, which does not change the original burden assumptions.

B1. Respondent Universe and Sampling Methods

The evaluation will be conducted in required health classes in approximately 39 high schools.



It is expected that most youth in the study will be 9th graders enrolled in a school’s required health class. Schools will be randomized to one of two conditions: (1) a treatment group taught MPC! by an outside health educator from a local health department or community-based organization, or (2) a control group that receives the health curriculum the school’s health teacher normally provides (i.e., a business-as-usual control condition).

Eligible evaluation youth will be those who are expected to take a required health class. We anticipate that 3,900 youth enrolled in the expected 39 study schools will be eligible to participate because they are enrolled in such a class. We expect to consent 70 percent of the eligible youth, for a total sample size of 2,730, of which we expect 90 percent to complete the first follow-up survey (n=2,457) and 85 percent to complete the second follow-up survey (n=2,321). These response rates are in line with other federally funded projects using a similar design, such as PREP (OMB Control Number 0970-0398), where first follow-up response rates in two of the school-based sites are 90 to 94 percent and second follow-up response rates are 84 to 91 percent.1 Similarly, on PPA in the Chicago school-based site (OMB Control Number 0990-0382), the first follow-up response rate was 94 percent and the second follow-up response rate was 90 percent.
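As a quick check of this expected sample flow, the arithmetic can be sketched as follows (all figures are taken from the text above):

```python
# Back-of-the-envelope check of the expected sample flow.
eligible = 3900                       # youth enrolled across 39 study schools
consented = eligible * 0.70           # 2,730 consented youth
first_followup = consented * 0.90     # 2,457 complete the 9-month survey
second_followup = consented * 0.85    # 2,320.5, reported in the text as n = 2,321
print(consented, first_followup, second_followup)
```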

The evaluation sample is expected to be equally male and female, primarily African American or Hispanic (at least 50 percent of the sample), and low-income (with more than 50 percent of the sample qualifying for free and reduced-price lunch).

Statistical Power. In similar studies of teen pregnancy prevention programs in schools with group administration of follow-up surveys, such as the Personal Responsibility Education Program (PREP; OMB Control Number 0970-0398) and the Evaluation of Adolescent Pregnancy Prevention Approaches (PPA; OMB Control Number 0990-0382), we have achieved high response rates on follow-up surveys. On PREP, for example, two of the school-based studies have greater than 90 percent response rates at the first follow-up (12 months after baseline) and about 84 percent at the second follow-up (24 months after baseline). In the PPA Chicago school-based site, the first follow-up response rate was 94 percent and the second follow-up response rate was 90 percent. We anticipate similar response rates for this study, where the first follow-up is planned for 9 months after baseline and the second follow-up is planned for 15 months after baseline. Also, based on our experience on PREP and PPA, we expect to retain all 39 randomized clusters. Power calculations are therefore based on retaining the entire school sample.

The primary impact analysis will focus on those who provide follow-up survey data, regardless of their level of participation in the program or whether they complete the baseline survey. This will enable the team to conduct a rigorous, intent-to-treat impact analysis that meets the standards of the HHS Evidence Review. As noted above, we expect some non-response to the surveys. We expect that 95 percent of consented youth will complete the baseline survey (n=2,594), 90 percent will complete the first follow-up (n=2,457), and 85 percent will complete the second follow-up (n=2,321). At the first follow-up (9 months after baseline), for a prevalence rate of 25 percent (such as sexual initiation), we can detect a 5.5 percentage point difference between the two groups; for a prevalence rate of 50 percent (such as contraception use), we can detect a 6.4 percentage point difference between the two groups. At the longer-term follow-up (15 months after baseline), we will be able to detect minimum detectable impacts (MDIs) similar to those at the 9-month follow-up (Table B1.1).

Table B1.1. Minimum Detectable Impacts for the Federal Evaluation of MPC!

| | First follow-up (90% response rate) | Second follow-up (85% response rate) |
| --- | --- | --- |
| MDES | 0.128 | 0.175 |
| MDI (50% prevalence rate in control group) | 6.4 percentage points | 6.5 percentage points |
| MDI (25% prevalence rate in control group) | 5.5 percentage points | 5.6 percentage points |

Note: These calculations assume an intra-class correlation coefficient of .01; 25 percent of individual-level variance in the outcome explained; and 70 percent of the cluster variance explained, due to baseline measures of the outcome of interest, baseline assessment of other risk behaviors, and demographics.

MDES = Minimum Detectable Effect Size
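For reference, the conversion from an effect size (MDES) to an impact in percentage points (MDI) for a binary outcome uses the outcome's standard deviation, sqrt(p(1 - p)), at the control-group prevalence p. The sketch below (an illustration, not the study's actual power code) reproduces the first column of Table B1.1 from the first follow-up MDES of 0.128:

```python
# MDES-to-MDI conversion for a binary outcome with control-group prevalence p.
import math

def mdi(mdes: float, p: float) -> float:
    """Minimum detectable impact, in percentage points."""
    return mdes * math.sqrt(p * (1 - p)) * 100

print(round(mdi(0.128, 0.50), 1))  # 6.4 percentage points
print(round(mdi(0.128, 0.25), 1))  # 5.5 percentage points
```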

We also plan to conduct analyses on subgroups defined by baseline measures. These analyses will be considered exploratory, and will not be used as a primary test of the effectiveness of the intervention. Instead, they are intended to help program providers and practitioners understand whether the pattern of the findings for the full sample is similar to or different from trends observed for particular subgroups. We will observe trends for subgroups defined by (1) gender, and (2) sexual experience at baseline.

We acknowledge that statistical power for these exploratory analyses may be insufficient due to the smaller sample sizes within subgroups. For that reason, the analyses are intended not as a primary test of the intervention’s effectiveness, but as a means to understand whether the overall pattern of findings is similar to trends observed within and across particular subgroups.

B2. Procedures for Collection of Information

In each of the schools, all youth with parental consent will be considered for follow-up data collection. Mathematica staff will work with the schools prior to follow-up data collection to identify which study participants are still enrolled and which have moved or transferred to another school.

The data collection plan for the follow-up survey is the same across all participating schools and reflects sensitivity to issues of efficiency, accuracy, and respondent burden. The follow-up survey will be administered to consented youth approximately 9 months after completing the baseline survey and again approximately 15 months after completing the baseline survey. As with the baseline survey, the follow-up survey will be web-based and administered in a group setting at each school. The web survey will be smartphone-compatible; Mathematica will provide participants with smartphones, along with a unique login to access the survey from the device.

Mathematica will train staff on answering questions about the study, collecting student assent, and administering the follow-up survey to youth. The evaluation team will work with sites to determine the best day, time, and location for the group survey administration. The team will begin the administration by reviewing the details of the study and obtaining youth assent.2 Any student who chooses to opt out of the survey will be led to another room, along with students who do not have permission to participate in the study. Youth who agree to take the survey will be provided with a unique login to access the web application and will be prompted to enter a verification code, such as their date of birth, to begin. The survey will be self-administered. Students will be instructed to begin the survey and work through it at their own pace.

The survey asks all youth for some background information and includes a screening question about sexual experience. The survey then routes youth who report ever having sex to additional questions about sexual behavior and their use of contraceptives; those who report never having sex are routed to other questions. No personally identifying information will appear with the survey. A question-by-question list of sources for the follow-up survey is in Attachment A, and a description of the sources referenced is in Attachment B. Once they have completed the survey, youth will close the web survey application and return the smartphone to a member of the evaluation team. When the survey administration is complete, Mathematica field staff will work with school staff to arrange make-up administrations for any students who were absent.

Baseline survey administration has been completed in one study site, and web administration using smartphones was successful. However, if for any future administration of the baseline or follow-up surveys there is an internet or cellular outage at the school on the day of the survey, field staff will facilitate a self-administered paper-and-pencil survey instrument (PAPI). As a back-up, the evaluation team will have pre-identified survey packets that they can hand out to the youth whose names are on the packets, after obtaining youth assent. Each packet will consist of the MPC! follow-up survey and a sealable return envelope. The survey will have a label with a unique ID number (no personally identifying information will appear on the survey or return envelope). Youth will self-administer the survey. The hard copy instrument has three parts (Part A, Part B1, and Part B2) that mirror the web programming, to avoid asking youth who are not sexually experienced detailed questions about sexual activity. Part A asks for background information and concludes with a single screening question about sexual experience. Youth with sexual experience will complete Part B1, and those without it will complete Part B2. Two members of the evaluation team will monitor activities in each survey room. At the end of the administration, youth will place the entire survey in the return envelope, seal it, and return it to a member of the evaluation team. Completed surveys will be shipped immediately via FedEx to Mathematica’s Survey Operations Center, where they will be logged and checked for completeness. Any forms with identifying information, such as assent forms, will be shipped separately.
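The screener-based routing that both the web instrument and the hard-copy parts implement can be sketched as follows (a minimal illustration; the field name is hypothetical, not the instrument's actual identifier):

```python
# Minimal sketch of the Part A screener routing described above.
def route(ever_had_sex: bool) -> str:
    """Return the module a respondent completes after the Part A screener."""
    if ever_had_sex:
        return "Part B1"  # detailed sexual behavior and contraceptive use items
    return "Part B2"      # alternative items for youth without sexual experience

print(route(True))   # Part B1
print(route(False))  # Part B2
```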

Students who have moved out of the area, have transferred to a non-study school, or are otherwise unable to complete the survey through the in-school data collection will be sent advance letters with the information necessary to log on to the web survey and complete it on their own time. For those who do not respond, we will follow up with postcards, emails and texts (with permission), and phone calls. These participants will also be given the option to complete the survey over the phone with a trained Mathematica interviewer.

B3. Methods to Maximize Response Rates and Deal with Non-Response

OAH expects to achieve a response rate of 90 percent for the 9-month follow-up survey and 85 percent for the 15-month follow-up survey. This expectation is based on response rates achieved in prior follow-up surveys with similar populations, such as the Personal Responsibility Education Program (PREP; OMB Control Number 0970-0398) and the Evaluation of Adolescent Pregnancy Prevention Approaches (PPA; OMB Control Number 0990-0382) studies. We expect to achieve these completion rates for the Federal Evaluation of MPC! for several other reasons as well. First, the follow-up surveys will occur approximately 9 and 15 months after the baseline administration. This timing will ensure that contact data are quite current, which should minimize location problems. In many cases, youth will be enrolled in the same schools at follow-up that they were enrolled in at baseline, which will simplify locating efforts and improve response rates.

Mathematica staff will work with each school to schedule the date and time of the follow-up survey and to ensure that it does not conflict with any activities that are required or that students might find more desirable (e.g., lunch, a field trip). Prior to administration, the evaluation team will work closely with school contacts to locate respondents in their new classrooms. To maximize attendance, team members will ask schools to post reminders and make announcements prior to and on the day of the survey. Before beginning the administration, Mathematica staff will take attendance and follow up immediately with the school contact regarding any unexpected absentees.

As in the baseline process, to help attain high response rates, field staff will collaborate with each site to arrange additional make-up sessions for students who are absent on the initial day of the survey. In addition, participants who were unable to complete the survey during the in-school data collection will be sent advance letters, postcards, emails, and texts (with permission) that include instructions for logging on to the web survey and completing it online, at a time convenient for them. These participants will also be given the option to complete the survey over the phone with a trained Mathematica interviewer.

When students complete the follow-up survey, they will receive a small gift of appreciation (for in-school group administrations, a non-monetary incentive valued at $5; for out-of-school administrations, a $10 gift card for the 9-month follow-up and a $15 gift card for the 15-month follow-up). Research suggests that providing an incentive for earlier surveys may contribute to higher response rates for subsequent surveys.3 Therefore, providing a small gift during the 9-month follow-up may help boost response rates during subsequent rounds of data collection. The proposed incentive structure is revised relative to those used on other federally funded studies with similar populations, including PREP (OMB Control Number 0970-0398), where first follow-up response rates in two of the school-based sites are 90 to 94 percent and second follow-up response rates are 84 to 91 percent.4 Revisions have been made to address OMB’s concerns about the amount of the incentive and about providing incentives to students responding to the survey in school.

In addition, we expect that the sites’ continued willing assistance will be very important to maximizing the response rates; we will therefore invest significant effort in maintaining positive relationships to minimize burden on the sites and assure privacy to the youth participants. By applying identical methods for maximizing the response rates of the treatment and control groups, the evaluation team does not anticipate differences in response rates across research groups.

The evaluation team anticipates high response rates to the follow-up surveys. Even so, the team will take steps to understand the nature of any non-response and to account for the threat it may pose to the validity of the study’s impact estimates. Using data from the baseline survey, evaluation team members will first test for statistically significant differences in demographic and baseline outcome variables between the treatment and control group members who respond at follow-up, and will then control for these differences using covariates when estimating program impacts (see Attachment E).
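A minimal sketch of such a respondent balance check might look like the following (variable names are hypothetical; the actual analysis follows the plan in Attachment E):

```python
# Illustrative balance check: compare baseline characteristics of
# treatment- and control-group members who responded at follow-up.
import pandas as pd
from scipy import stats

def balance_test(df: pd.DataFrame, covariates: list[str]) -> pd.DataFrame:
    """Two-sample t-test of each baseline covariate among follow-up respondents."""
    resp = df[df["completed_followup"] == 1]
    rows = []
    for var in covariates:
        t_grp = resp.loc[resp["treatment"] == 1, var].dropna()
        c_grp = resp.loc[resp["treatment"] == 0, var].dropna()
        t_stat, p_val = stats.ttest_ind(t_grp, c_grp)
        rows.append({"covariate": var, "treat_mean": t_grp.mean(),
                     "control_mean": c_grp.mean(), "p_value": p_val})
    return pd.DataFrame(rows)

# Covariates showing significant differences would then enter the
# impact regressions as control variables.
```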

B4. Test of Procedures or Methods to be Undertaken

OAH and other offices within HHS (OPRE, ASPE) have made it a priority to align measures in the follow-up survey across evaluations of similar programs and populations. Many of the items on the follow-up survey5 are identical to items in the already-approved baseline survey instrument (OMB Control Number 0990-0452). These items are similar to those in surveys OMB has already approved for use in comparable evaluations, including the ongoing Evaluation of Adolescent Pregnancy Prevention Approaches (PPA; OMB Control Number 0990-0382), the Teen Pregnancy Prevention Replication Study (OMB Control Number 0990-0397), the Personal Responsibility Education Program (PREP) Multi-Component Evaluation (OMB Control Number 0970-0398), and the Pregnancy Assistance Fund (PAF; OMB Control Number 0990-0424 for the baseline survey) Study.

Mathematica has conducted pretests of the follow-up survey with up to nine youth. The pretests focused primarily on new items not included on the baseline survey and were designed to ensure that questions are understandable, use language and terms familiar to respondents, and are consistent with the concepts they aim to measure. The pretest was designed to help us (1) identify typical instrumentation problems, such as question wording and incomplete or inappropriate response categories; (2) measure the response burden; and (3) confirm that there are no unforeseen difficulties in administering the instrument. Youth were selected who are similar to those in the expected study population (early high school). Mathematica collected parental consent and youth assent for those who participated in the pretest. The pretest was administered using a hard-copy, paper version of the survey, and trained staff debriefed with respondents. A report describing the process used to pretest the instrument and summarizing the changes made as a result is in Attachment G.

Once the survey is finalized, it will be programmed as a web-based survey. Mathematica staff will test the web survey thoroughly to ensure that all paths are working correctly and that respondents are routed only to those questions that are appropriate, based on their responses.

B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

Follow-up survey data for the impact study will be collected and analyzed by OAH’s prime contracting organization, Mathematica Policy Research. OAH consulted with the following individuals on follow-up instrument development, and on the data collection and analysis plan.



For follow-up survey development and impact analysis:

  • Andrea Bucciarelli

Mathematica Policy Research

955 Massachusetts Avenue, Suite 801

Cambridge, MA 02139

(617) 674-8385

  • Jean Knab

Mathematica Policy Research

P.O. Box 2393

Princeton, NJ 08543-2393

(609) 945-3367

  • Russ Cole

Mathematica Policy Research

P.O. Box 2393

Princeton, NJ 08543-2393

(609) 716-4549

  • Matthew Stagner

Mathematica Policy Research

111 East Wacker Dr., Suite 920

Chicago, IL 60601

(312) 994-1044

  • John Deke

Mathematica Policy Research

P.O. Box 2393

Princeton, NJ 08543-2393

(609) 275-2230

  • Melissa Thomas

Mathematica Policy Research

P.O. Box 2393

Princeton, NJ 08543-2393

(609) 275-2231

  • Sarah Forrestal

Mathematica Policy Research

111 East Wacker Drive, Suite 920

Chicago, IL 60601

(312) 994-1017

  • Jennifer Walzer

Mathematica Policy Research

111 East Wacker Drive, Suite 920

Chicago, IL 60601

(312) 994-1042

  • Brian Goesling

Mathematica Policy Research

P.O. Box 2393

Princeton, NJ 08543-2393

(609) 945-3355

  • Susan Zief

Mathematica Policy Research

P.O. Box 2393

Princeton, NJ 08543-2393

(609) 275-2291

  • John B. Jemmott III, Ph.D.

University of Pennsylvania

Perelman School of Medicine

Department of Psychiatry

3535 Market Street, Suite 520

Philadelphia, PA 19104-3309

(215) 573-9366

  • Amy Farb

U.S. Department of Health and Human Services

Office of Adolescent Health

1101 Wootton Parkway, Suite 700

Rockville, MD 20852

(240) 453-2836

  • Loretta Sweet Jemmott, Ph.D.

Drexel University

College of Nursing and Health Professions

1601 Cherry Street

Philadelphia, PA 19102

(215) 895-2000

  • Tish Hall

U.S. Department of Health and Human Services

Office of Adolescent Health

1101 Wootton Parkway, Suite 700

Rockville, MD 20852

(240) 453-2846

  • Laura Kalb

Mathematica Policy Research

955 Massachusetts Avenue, Suite 801

Cambridge, MA 02139

(617) 301-8989

  • Tara Rice

U.S. Department of Health and Human Services

Office of Adolescent Health

1101 Wootton Parkway, Suite 700

Rockville, MD 20852

(240) 453-2846



1 PREP has two school-based sites. First and second follow-up data collection is closed in one site and ongoing in the other.

2 Youth assent is obtained prior to each round of data collection (baseline, 9-month follow-up and 15-month follow-up).

3 Singer, Eleanor, John Van Hoewyk, and Mary P. Maher. 1998. Does the payment of incentives create expectation effects? Public Opinion Quarterly 62:152–64.

4 Response rates provided are for the two school-based sites on PREP. First follow-up is ongoing in one site and closed in the other, and second follow-up is ongoing in both sites.

5 The participant-facing name of the study is the Attitudes, Behaviors, and Choices (or ABC) Study. This name appears on the instrument (Instrument 1).
