
Federal Evaluation of Making Proud Choices! (MPC!)

OMB: 0990-0452


U.S. Department of Health and Human Services

Office of Population Affairs

Office of the Director

1101 Wootton Parkway, Suite 700

Rockville, MD 20852

Project Officer: Amy Farb

Point of Contact: Tara Rice, [email protected], 240-453-8123

Part B: Statistical Methods for the Extension and Revision of OMB Clearance of the Collection of Follow-up Survey Data - Federal Evaluation of Making Proud Choices!

OMB Control Number 0990-0452

February 2020




CONTENTS

CONTENTS

TABLES

ATTACHMENTS

B1. Respondent Universe and Sampling Methods

B2. Procedures for Collection of Information

B3. Methods to Maximize Response Rates and Deal with Non-Response

B4. Test of Procedures or Methods to be Undertaken

B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

TABLES

B1.1. Minimum Detectable Impacts for the Federal Evaluation of MPC!





ATTACHMENTS

ATTACHMENT A: QUESTION-BY-QUESTION SOURCE LIST FOR THE FOLLOW-UP SURVEY

ATTACHMENT B: SOURCES REFERENCED FOR THE FOLLOW-UP SURVEY

ATTACHMENT C: PERSONS CONSULTED ON INSTRUMENT DEVELOPMENT AND/OR ANALYSIS OF THE MPC! FOLLOW-UP SURVEY

ATTACHMENT D: CONFIDENTIALITY PLEDGE

ATTACHMENT E: ANALYSIS PLAN

ATTACHMENT F: SECTION 301 OF THE PUBLIC HEALTH SERVICE ACT

ATTACHMENT G: 60-DAY FEDERAL REGISTER NOTICE



INSTRUMENTS

Instrument 1: FOLLOW-UP SURVEY



B1. Respondent Universe and Sampling Methods

The evaluation is being conducted in required health classes in 15 high schools across four school districts.

At the start of each enrollment period, most youth in the study are in 9th or 10th grade and enrolled in the school’s required health class (or its equivalent when health is not offered). Schools are randomized to one of two conditions: (1) a treatment group taught MPC! by an outside health educator from a local health department or community-based organization, or (2) a control group that receives the health curriculum the school’s health teacher normally provides (i.e., a business-as-usual control condition). Schools participating in the evaluation for multiple years are re-randomized each year, for a total of 39 randomized schools across the four years of enrollment.

As of Fall 2019, there are 2,868 students eligible to participate in the study. We obtained consent for 76 percent of these eligible youth, for a total sample of 2,180 students. Follow-up surveys have been completed with 86 percent of the sample in cohorts where follow-up administrations are complete. The extended data collection will occur in 7 of the 15 participating schools. We expect the consent and follow-up response rates to remain the same during this extension period.

The evaluation sample is expected to be equally male and female, primarily African American or Hispanic (at least 50 percent of the sample), and low-income (with more than 50 percent of the sample qualifying for free or reduced-price lunch).

Statistical Power.

The primary impact analysis will focus on those who provide follow-up survey data, regardless of their level of participation in the program or whether they completed the baseline survey. This will enable the team to conduct a rigorous, intent-to-treat impact analysis that meets the standards of the HHS Evidence Review. As stated above, we have some non-response to the surveys, with a current follow-up response rate of 86 percent across all study sites. At the follow-up (9 months after baseline), for a prevalence rate of 25 percent (such as sexual initiation), we estimate that we can detect a 5.5 percentage point difference between the two groups; for a prevalence rate of 50 percent (such as contraceptive use), we estimate that we can detect a 6.4 percentage point difference between the two groups (Table B1.1).

Table B1.1. Minimum Detectable Impacts for the Federal Evaluation of MPC!


                                              Follow-up (90% response rate)
MDES                                          0.128
MDI (50% prevalence rate in control group)    6.4 percentage points
MDI (25% prevalence rate in control group)    5.5 percentage points

Note: These calculations assume an intra-class correlation coefficient of .01; 25 percent of individual-level variance in the outcome explained; and 70 percent of the cluster variance explained, due to baseline measures of the outcome of interest, baseline assessment of other risk behaviors, and demographics.

MDES = Minimum Detectable Effect Size; MDI = Minimum Detectable Impact
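The MDI values in Table B1.1 are obtained by scaling the MDES by the standard deviation of a binary outcome, sqrt(p(1 - p)), evaluated at the control-group prevalence rate p. The short Python sketch below is a minimal illustration of this conversion (it is not the evaluation’s actual power-calculation code) and reproduces the two MDI values from the 0.128 MDES.

import math

def mdi_from_mdes(mdes, prevalence):
    """Convert a minimum detectable effect size (in standard deviation
    units) into a minimum detectable impact (in percentage points)
    for a binary outcome with the given control-group prevalence."""
    sd = math.sqrt(prevalence * (1 - prevalence))  # SD of a binary outcome
    return mdes * sd * 100  # express as percentage points

# Reproduces Table B1.1: 6.4 points at 50 percent prevalence and
# 5.5 points at 25 percent prevalence, given the 0.128 MDES.
for p in (0.50, 0.25):
    print(f"MDI at {p:.0%} prevalence: {mdi_from_mdes(0.128, p):.1f} percentage points")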

We also plan to conduct analyses on subgroups defined by baseline measures. These analyses will be considered exploratory, and will not be used as a primary test of the effectiveness of the intervention. Instead, they are intended to help program providers and practitioners understand whether the pattern of the findings for the full sample is similar to or different from trends observed for particular subgroups. We will observe trends for subgroups defined by (1) gender, and (2) sexual experience at baseline.

We acknowledge that statistical power for these exploratory analyses may be insufficient due to smaller sample sizes within the subgroups. For that reason, the analyses are intended not as a primary test of the intervention’s effectiveness, but as a means to understand whether the overall pattern of findings is similar to trends observed within and across particular subgroups.

B2. Procedures for Collection of Information

In each of the schools, all youth with parental consent will be considered for follow-up data collection. Mathematica staff will work with the schools prior to follow-up data collection to identify which study participants are still enrolled and which have moved or transferred to another school.

The data collection plan for the follow-up survey is the same across all participating schools and reflects sensitivity to issues of efficiency, accuracy, and respondent burden. The follow-up survey will be administered to consented youth approximately 9 months after they complete the baseline survey. As with the baseline survey, the follow-up survey will be web-based and administered in a group setting at each school. The web survey will be smartphone-compatible; Mathematica will provide participants with smartphones, along with a unique login to access the survey from the device.

Mathematica will train staff on answering questions about the study, collecting student assent, and administering the follow-up survey to youth. The evaluation team will work with sites to determine the best day, time, and location for the group survey administration. The team will begin the administration by reviewing the details of the study and obtaining youth assent.1 Any student who chooses to opt out of the survey will be led to another room with students who do not have permission to participate in the study. Youth who agree to take the survey will be provided with a unique login to access the web application and will be prompted to enter a verification code, such as their date of birth, to begin. The survey will be self-administered; students will be instructed to begin the survey and work through it at their own pace.

The survey asks all youth for some background information and includes a screening question about sexual experience. The survey then routes youth who report ever having sex to additional questions about sexual behavior and their use of contraceptives; those who report never having sex will be routed to other questions. No personally identifying information will appear with the survey. A question-by-question list of sources for the follow-up survey is in Attachment A, and a description of the sources referenced is in Attachment B. Once they have completed the survey, youth will close the web survey application and return the smartphone to a member of the evaluation team. When the survey administration is complete, Mathematica field staff will work with school staff to arrange make-up administrations for any students who were absent.
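As a simple illustration of the routing described above, the sketch below expresses the screener-driven branching in Python; the module names are hypothetical, and the actual skip logic is programmed directly in the web survey instrument (Instrument 1).

def next_module(ever_had_sex):
    """Route a respondent after the sexual-experience screening question.
    Module names are illustrative, not the instrument's actual labels."""
    if ever_had_sex:
        # Detailed items on sexual behavior and contraceptive use
        return "sexual_behavior_and_contraceptive_use"
    # Alternate items for youth who report never having had sex
    return "alternate_items"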

Students who have moved out of the area, have transferred to a non-study school, or are otherwise unable to complete the survey through the in-school data collection will be sent advance letters with the information necessary to log on to the web survey and complete it on their own time. For those who do not respond, we will follow up with postcards, emails, and texts (with permission), as well as phone calls. These participants will also be given the option to complete the survey over the phone with a trained Mathematica interviewer.

B3. Methods to Maximize Response Rates and Deal with Non-Response

To date, follow-up survey response rates are 86 percent, which is in line with similar evaluations of mobile populations in urban areas. We expect that our response rates during the extension period, when we will be working in 7 of our 15 schools, will remain the same (see Section B1).

Mathematica staff will work with the school to schedule the date and time of the follow-up survey and to ensure that it does not conflict with any activities that are required or that students might find more desirable (e.g., lunch or a field trip). Prior to administration, the evaluation team will work closely with school contacts to locate respondents in their new classrooms. To maximize attendance, team members will ask schools to post reminders and make announcements prior to and on the day of the survey. Before beginning the administration, Mathematica staff will take attendance and follow up immediately with the school contact regarding any unexpected absentees.

As in the baseline process, to help attain high response rates, field staff will collaborate with each site to arrange additional make-up sessions for students who are absent on the initial day of the survey. In addition, participants who were unable to complete the survey during the in-school data collection will be sent advance letters, postcards, emails, and texts (with permission) that include instructions for logging on to the web survey and completing it online, at a time convenient for them. These participants will also be given the option to complete the survey over the phone with a trained Mathematica interviewer.

When students complete the follow-up survey, they will receive a small gift of appreciation (for in-school group administrations, a non-monetary incentive valued at $5; for out-of-school administrations, a $10 gift card).

In addition, we expect that the sites’ continued, willing assistance will be very important to maximizing response rates; we will therefore invest significant effort in maintaining positive relationships, minimizing burden on the sites, and assuring privacy to the youth participants. Because identical methods will be applied to maximize the response rates of the treatment and control groups, the evaluation team does not anticipate differences in response rates across research groups.

The evaluation team anticipates high response rates to the follow-up surveys. Even so, the team will take steps to understand the nature of any non-response and to account for the threat it may pose to the validity of the study’s impact estimates. Using data from the baseline survey, evaluation team members will first test for statistically significant differences across demographic and baseline outcome variables between the treatment and control group members who are follow-up respondents, and will then control for these differences using covariates when estimating program impacts (see Attachment E).
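To make this approach concrete, the sketch below illustrates one way such a check and adjustment could be coded in Python; the variable names and model specification are hypothetical, and the authoritative specification is the analysis plan in Attachment E.

import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

# Hypothetical column names throughout; see Attachment E for the
# actual analytic specification.

def baseline_difference_pvalue(df, covariate):
    """Welch t-test comparing follow-up respondents in the treatment and
    control groups on a baseline covariate; returns the p-value."""
    treatment = df.loc[df["treatment"] == 1, covariate].dropna()
    control = df.loc[df["treatment"] == 0, covariate].dropna()
    return stats.ttest_ind(treatment, control, equal_var=False).pvalue

def estimate_impact(df):
    """Impact regression that controls for baseline covariates, with
    standard errors clustered at the school level to reflect
    school-level random assignment."""
    model = smf.ols("outcome ~ treatment + baseline_outcome + age + female", data=df)
    return model.fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})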

B4. Test of Procedures or Methods to be Undertaken

OPA and other offices within HHS (OPRE, ASPE) have made it a priority to align measures in the follow-up survey across evaluations of similar programs and populations. Many of the items on the follow-up survey2 are identical to items in the already-approved baseline survey instrument (OMB Control Number 0990-0452). These items are similar to those in surveys OMB has already approved for use in comparable evaluations, including the ongoing Evaluation of Adolescent Pregnancy Prevention Approaches (PPA; OMB Control Number 0990-0382), the Teen Pregnancy Prevention Replication Study (OMB Control Number 0990-0397), the Personal Responsibility Education Program (PREP) Multi-Component Evaluation (OMB Control Number 0970-0398), and the Pregnancy Assistance Fund Study (PAF; OMB Control Number 0990-0424 for the baseline survey).

Mathematica has conducted pretests of the follow-up survey with up to nine youth. The pretests focused primarily on new items not included on the baseline survey and were designed to ensure that questions are understandable, use language and terms familiar to respondents, and are consistent with the concepts they aim to measure. The pretest was designed to help us (1) identify typical instrumentation problems, such as question wording and incomplete or inappropriate response categories; (2) measure the response burden; and (3) confirm that there are no unforeseen difficulties in administering the instrument. Youth were selected who are similar to those in the expected study population (early high school students). Mathematica collected parental consent and child assent for youth who participated in the pretest. The pretest was administered using a hard-copy, paper version of the survey, and trained staff debriefed with respondents.

Prior to launching follow-up data collection, Mathematica staff tested the web survey thoroughly to ensure that all paths work correctly and that respondents are routed only to those questions that are appropriate, based on their responses.

B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

Follow-up survey data for the impact study will be collected and analyzed by OPA’s prime contracting organization, Mathematica. OPA consulted with the following individuals on follow-up instrument development, and on the data collection and analysis plan.

For follow-up survey development and impact analysis:

  • Russ Cole

Mathematica

P.O. Box 2393

Princeton, NJ 08543-2393

(609) 716-4549

  • Matthew Stagner

Mathematica

111 East Wacker Dr., Suite 3000

Chicago, IL 60601

(312) 994-1044

  • John Deke

Mathematica

P.O. Box 2393

Princeton, NJ 08543-2393

(609) 275-2230

  • Melissa Thomas

Mathematica

P.O. Box 2393

Princeton, NJ 08543-2393

(609) 275-2231

  • Sarah Forrestal

Mathematica

111 East Wacker Drive, Suite 3000

Chicago, IL 60601

(312) 994-1017

  • Jennifer Walzer

Mathematica

111 East Wacker Drive, Suite 3000

Chicago, IL 60601

(312) 994-1042

  • Brian Goesling

Mathematica

P.O. Box 2393

Princeton, NJ 08543-2393

(609) 945-3355

  • Susan Zief

Mathematica

P.O. Box 2393

Princeton, NJ 08543-2393

(609) 275-2291

  • John B. Jemmott III, Ph.D.

University of Pennsylvania

Perelman School of Medicine

Department of Psychiatry

3535 Market Street, Suite 520

Philadelphia, PA 19104-3309

(215) 573-9366

  • Amy Farb

U.S. Department of Health and Human Services

Office of Population Affairs

1101 Wootton Parkway, Suite 700

Rockville, MD 20852

(240) 453-2836

  • Loretta Sweet Jemmott, Ph.D.

Drexel University

College of Nursing and Health Professions

1601 Cherry Street

Philadelphia, PA 19102

(215) 895-2000

  • Tish Hall

U.S. Department of Health and Human Services

Office of Population Affairs

1101 Wootton Parkway, Suite 700

Rockville, MD 20852

(240) 453-2846

  • Laura Kalb

Mathematica

955 Massachusetts Avenue, Suite 801

Cambridge, MA 02139

(617) 301-8989

  • Tara Rice

U.S. Department of Health and Human Services

Office of Population Affairs

1101 Wootton Parkway, Suite 700

Rockville, MD 20852

(240) 453-2846

  • Jean Knab

Mathematica

P.O. Box 2393

Princeton, NJ 08543-2393

(609) 945-3367




1 Youth assent is obtained prior to each round of data collection.

2 The participant-facing name of the study is the Attitudes, Behaviors, and Choices (or ABC) Study. This name appears on the instrument (Instrument 1).
