Part B: Statistical Methods for the Collection of Baseline Survey Data - Federal Evaluation of Making Proud Choices!

OMB Control Number 0990 - new

July 2016

Submitted to:

U.S. Department of Health and Human Services
Office of Adolescent Health
Office of the Director

1101 Wootton Parkway, Suite 700

Rockville, MD 20852

Project Officer: Amy Farb


Part B: Statistical Methods for the Collection of Baseline Survey Data - Expanding the Use and Understanding of Evidence-Based Teen Pregnancy Prevention Programs

OMB Control Number 0990 - new

July 2016





CONTENTS

PART B INTRODUCTION

B1. Respondent Universe and Sampling Methods

B2. Procedures for Collection of Information

B3. Methods to Maximize Response Rates and Deal with Non-Response

B4. Test of Procedures or Methods to be Undertaken

B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

TABLES

B1.1. Minimum Detectable Impacts for the Federal Evaluation of MPC!





ATTACHMENTS

ATTACHMENT A: Question-by-Question Source List for the Baseline Survey

ATTACHMENT B: Sources Referenced for the Baseline Survey

ATTACHMENT C: Persons Consulted on Instrument Development and/or Analysis of the MPC! Baseline Survey

ATTACHMENT D: Consent Letters and Forms, Youth Assent Form (Baseline Survey)

ATTACHMENT E: Consent Form – Focus Group

ATTACHMENT F: Confidentiality Pledge

ATTACHMENT G: Analysis Plan

ATTACHMENT H: Baseline Survey Pretest Report

ATTACHMENT I: MPC! 60-Day Notice


INSTRUMENTS

Instrument 1: Baseline Survey

Instrument 2: Master Topic Guide for Interviews

Instrument 3: Master Staff Survey

Instrument 4: Program Attendance Data Collection Protocol

Instrument 5: Fidelity Facilitator Log

Instrument 6: Master Protocol for Youth Focus Groups



PART B INTRODUCTION

In response to a shifting research agenda in the field of teen pregnancy prevention, the Office of Adolescent Health (OAH) seeks to design a new large-scale, multisite random assignment evaluation of an evidence-based teen pregnancy prevention program, one that will make a significant contribution to the growing portfolio of research activities OAH has sponsored since the office was established in 2010. Much of OAH’s existing evaluation work focuses on documenting and evaluating the first cohort of grantees funded under the OAH Teen Pregnancy Prevention (TPP) program. With this new evaluation, OAH seeks to launch a “second generation” of evaluation activities, one that addresses a more targeted set of research questions of significant practical relevance to OAH and the broader field. In particular, the new evaluation will seek to advance the existing evidence base by identifying and testing (1) replications of a commonly used but understudied evidence-based teen pregnancy prevention program, and (2) the relative effectiveness of the two most prevalent implementation modes.

This proposed information collection activity is new and focuses on collecting (a) baseline survey data for the impact study, and (b) data for the implementation and fidelity assessment, which will provide a detailed understanding of program implementation in the impact study sites and of differences in delivery between health teachers and outside health educators.

B1. Respondent Universe and Sampling Methods

The evaluation will be conducted in 39 schools in which health is a required class for youth in upper middle school (such as 8th grade) or lower high school (such as 9th grade). Participating schools will be randomized to one of three conditions (13 schools in each condition): (1) MPC! taught by the school health teacher, (2) MPC! taught by an outside health educator provided by a local health department or community-based organization, or (3) a control group that receives the health curriculum the school’s health teacher normally provides. Eligible evaluation youth will be those who are enrolled in the required health class at the beginning of the study, approximately 100 per school. We expect that 70 percent of the eligible youth across the 39 schools will receive parental permission for study participation (approximately 2,730 youth). The evaluation sample is expected to be equally male and female, primarily African American or Hispanic (at least 50 percent of the sample), and low-income (with more than 50 percent of the sample qualifying for free or reduced-price lunch).

Statistical Power. In similar studies of teen pregnancy prevention programs in schools with group administration of follow-up surveys (such as PREP and PPA), we have achieved high response rates on follow-up surveys. On PREP, for example, the two school-based studies achieved response rates greater than 90 percent at the first follow-up (12 months after baseline) and greater than 85 percent at the second follow-up (24 months after baseline). We anticipate similar response rates for this study, in which the first follow-up is planned for 9 months after baseline and the second follow-up for 15 months after baseline. Also, based on our experience on PREP and PPA, we expect to retain all 39 schools randomized for the study. Power calculations are therefore based on retaining the entire 39-school sample.

At the 9-month follow-up, we expect to retain 90 percent of the youth sample, or 2,457 youth. For a prevalence rate of 25 percent (such as sexual initiation), we can detect a 7.5 percentage point difference between the two groups; for a prevalence rate of 50 percent (such as contraceptive use), we can detect an 8.6 percentage point difference. At the longer-term follow-up (15 months after baseline), we expect to retain 85 percent of the sample, or 2,320 youth.1 The minimum detectable impacts (MDIs) at the 15-month follow-up will be similar to those at the 9-month follow-up (Table B1.1).

Table B1.1. Minimum Detectable Impacts for the Federal Evaluation of MPC!

                                              First follow-up        Second follow-up
                                              (90% response rate)    (85% response rate)
MDES                                          0.180                  0.175
MDI (50% prevalence rate in control group)    8.6%                   8.7%
MDI (25% prevalence rate in control group)    7.5%                   7.6%

Note: These calculations assume an intra-class correlation coefficient of 0.02; 25 percent of individual-level variance in the outcome explained; and 50 percent of the cluster-level variance explained, due to baseline measures of the outcome of interest, baseline assessment of other risk behaviors, and demographics.

MDES = Minimum Detectable Effect Size; MDI = Minimum Detectable Impact
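The figures in Table B1.1 follow from a standard power formula for cluster-randomized designs. The Python sketch below shows the arithmetic under the variance assumptions stated in the table note; the two-arm comparison of 26 schools (13 treatment, 13 control), roughly 63 completed surveys per school, and the 2.8 multiplier (80 percent power, two-sided alpha of 0.05) are illustrative assumptions not stated in the text, so the output approximates rather than reproduces the table's exact values.

import math

def mdes(n_clusters, n_per_cluster, icc, r2_individual, r2_cluster,
         p_treatment=0.5, multiplier=2.8):
    """Minimum detectable effect size (in standard deviation units) for a
    cluster-randomized comparison, using a Bloom-style approximation."""
    denom = p_treatment * (1 - p_treatment) * n_clusters
    cluster_term = icc * (1 - r2_cluster) / denom
    individual_term = (1 - icc) * (1 - r2_individual) / (denom * n_per_cluster)
    return multiplier * math.sqrt(cluster_term + individual_term)

def mdi_percentage_points(effect_size, prevalence):
    """Convert an effect size to percentage points for a binary outcome."""
    return 100 * effect_size * math.sqrt(prevalence * (1 - prevalence))

# Illustrative inputs: 13 vs. 13 schools and ~63 respondents per school are
# assumptions; ICC and R-squared values come from the table note.
es = mdes(n_clusters=26, n_per_cluster=63, icc=0.02,
          r2_individual=0.25, r2_cluster=0.50)
print(f"MDES ~ {es:.3f}")
print(f"MDI at 25% prevalence ~ {mdi_percentage_points(es, 0.25):.1f} points")
print(f"MDI at 50% prevalence ~ {mdi_percentage_points(es, 0.50):.1f} points")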



B2. Procedures for Collection of Information

Baseline Survey. In each of the schools, all eligible youth will be considered for enrollment in the study (discussed in Section B.1). Each school will be asked to provide the evaluation team with a list of eligible youth. Mathematica staff will work collaboratively with the schools to recruit youth for the study and obtain active written consent from the responsible parent or guardian. Typically, Mathematica staff conduct an initial visit to the school to distribute consent forms and give a brief introduction to students, summarizing the study and asking them to return the signed consent form within a specific time frame. Mathematica will thoroughly train staff to explain the study, answer questions about it, collect informed consent, and administer the baseline survey, so that study participants are properly informed.

Additional visits to the school will be made until a sufficient number of consent forms have been returned, whether the parents/guardians grant or refuse permission for their child to participate in the study. Schools may also wish to mail the consent forms home, either as a separate mailing or as part of a pre-planned mailing, such as those that go out at the beginning of the school year. Mathematica staff will offer assistance in assembling these mailings. Depending on school preferences, and if needed, we can reduce burden on schools, parents, and children by offering a verbal consent process or by allowing the consent form to be completed online through a secure website.

Examples of the consent letter and form are included in Attachment D. Once consent collection has concluded, Mathematica will prepare a final roster of youth at each school with parental consent.

The data collection plan for the baseline survey is the same across the 39 schools and reflects sensitivity to issues of efficiency, accuracy, and respondent burden. The baseline survey will be administered to consented youth shortly after study enrollment.2 It will be web-based and administered in a group setting at each school. The web survey will be smartphone-compatible; Mathematica will provide participants with smartphones, along with a unique PIN and password to access the survey from each device.

The evaluation team will work collaboratively with sites to determine the best day, time, and location for the group survey administration. Mathematica staff will be carefully trained to administer the baseline survey to youth. The evaluation team will begin the administration by reviewing the details of the study and obtaining youth assent. If a student chooses to opt out of the survey, they will be led to another room to join students who do not have permission to participate in the study. Youth who agree to take the survey will be provided with a unique PIN and password to access the web survey application on the smartphone handed out by Mathematica staff. The survey will be self-administered; students will be instructed to begin the survey and work through it at their own pace.

The survey will ask all youth for background information and include a screening question about sexual experience. The survey will then route youth who report ever having sex to additional questions about sexual behavior and their use of contraceptives; those who report never having sex will be routed to other questions. No personally identifying information will appear with the survey. A question-by-question list of sources for the baseline survey is found in Attachment A, and a description of the sources referenced is found in Attachment B.
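To make the routing concrete, the Python fragment below sketches skip logic of the kind described above. The item and module names are hypothetical placeholders for illustration only; the actual survey items and routing are defined in Instrument 1.

# Illustrative skip logic only; EVER_SEX and the module names are hypothetical
# placeholders, not the actual items in Instrument 1.
def next_items(responses: dict) -> list[str]:
    """Route a respondent to the appropriate follow-up questions."""
    if responses.get("EVER_SEX") == "yes":
        # Youth who report ever having sex receive the sexual behavior and
        # contraceptive use questions.
        return ["SEX_BEHAVIOR_MODULE", "CONTRACEPTIVE_USE_MODULE"]
    # All other youth skip those questions and continue with the remaining items.
    return ["ATTITUDES_AND_INTENTIONS_MODULE"]

print(next_items({"EVER_SEX": "yes"}))   # sexual behavior path
print(next_items({"EVER_SEX": "no"}))    # alternate path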

Once they have completed the survey, youth will close the web survey application and return the smartphone to a member of the evaluation team. When the survey administration is complete, Mathematica field staff will work with school staff to arrange for make-up administrations to capture any students who may have been absent.

Implementation and Fidelity Assessment.

Site Visits for the Implementation Analysis. We plan to visit each school once for the purposes of the implementation study. The specific timing of site visits will be determined after sites are selected and specific implementation plans are known. Site visits will focus on (1) individual interviews with school administrators and (2) discussions with facilitators responsible for delivering health programming (health educators and classroom teachers). The Master Topic Guide (Instrument 2) identifies the information that will be gathered from these staff to document the program context, the implementing organization and partner organizations, implementation systems, youth participation and engagement, and actual service delivery. Site visitors will use this guide to develop interview protocols customized to each site, ensuring that they collect the needed information in an efficient, consistent way from the appropriate respondents.

Staff Survey for the Implementation Analysis. All program facilitators (health educators delivering MPC!, classroom teachers delivering MPC!, and classroom teachers delivering their regular health class) will be invited to complete the staff survey (Instrument 3). The 30-minute survey will be administered in paper-and-pencil format and is designed to capture targeted input from all of the staff implementing the program, since in-person interviews may not be practical or feasible across all site locations. The survey uses closed- and open-ended questions aligned with, and designed to provide the data for, the planned analyses. We anticipate that the survey will be administered during the site visit to each site.

Participant Focus Groups for the Implementation Analysis. Focus groups (Instrument 6) will be conducted with a subset of program participants during site visits. The objective of the focus groups will be to explore participants’ perspectives on the availability, quality, and value of the program they receive. The focus groups will be used to learn about participants’ perceptions of the benefits of the program and their overall satisfaction with the program content, activities, and facilitation.

Focus groups will be conducted with youth in each of the two treatment conditions: (1) youth receiving MPC! from classroom teachers and (2) youth receiving MPC! from health educators. Each school will have up to one focus group. Participants will be recruited randomly from among youth enrolled in the evaluation.

Site visitors will work with local staff to arrange the groups at convenient times and locations and to recruit 8 to 10 youth for each group. The study team will work with site staff to offer concrete assistance in the form of food and transportation, as necessary.

Program Attendance Data for the Implementation Analysis. Program attendance data will be collected for every participant at each school. Health educators or classroom teachers will record participant attendance on a form provided by the evaluation team (Instrument 4). These data will allow the evaluation team to document the proportion of program sessions that participants actually attended. They will also allow the evaluation team to conduct exploratory impact analyses of whether the impact of the program varies for youth who received different dosages of the program.
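As an illustration of how the attendance data could be used to document dosage, the pandas sketch below computes each participant's share of delivered sessions attended. The file name and column names are hypothetical; the actual fields are those defined by the attendance protocol (Instrument 4).

import pandas as pd

# Hypothetical file and columns (student_id, school_id, session, attended);
# 'attended' is assumed to be coded 0/1 for each delivered session.
attendance = pd.read_csv("attendance_log.csv")

# Proportion of delivered sessions each participant attended (program dosage).
dosage = (
    attendance.groupby(["school_id", "student_id"])["attended"]
    .mean()
    .rename("share_of_sessions_attended")
    .reset_index()
)

# School-level summary used to describe fidelity of exposure.
print(dosage.groupby("school_id")["share_of_sessions_attended"].describe())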

Fidelity Log Data for the Implementation Analysis. Facilitators delivering MPC! (health educators and classroom teachers in the two treatment conditions) will also maintain session fidelity logs to record the specific components they actually completed for each session (Instrument 5). These logs will enable the evaluation team to better understand the degree to which the program was implemented with fidelity (and the degree to which fidelity may have differed by type of facilitator). The logs will be completed on a web-based platform. The fidelity log is a tracking tool based on existing materials developed for the program and is designed to minimize burden on staff by using checkboxes and limiting open-ended responses where feasible.

B3. Methods to Maximize Response Rates and Deal with Non-Response

Baseline Data Collection. OAH expects to achieve a response rate of 95 percent for the baseline survey. This expectation is based on response rates achieved in prior baseline surveys with similar populations, such as the PREP and PPA studies. We expect to achieve these completion rates for the Federal Evaluation of MPC! at baseline for several reasons. First, the baseline survey will be administered shortly after study enrollment, once active parental consent has been received. This timing will help minimize the likelihood that students have moved out of the school prior to baseline administration. It will also ensure that surveys can be administered to most youth in the location where the program will take place. Second, when students return their consent form, they will receive a small gift of appreciation. Research suggests that providing an incentive for earlier surveys may contribute to higher response rates for subsequent surveys.3 Therefore, providing a small gift of appreciation during consent collection may help boost response rates during baseline data collection shortly thereafter.

Field staff will work collaboratively with each site to arrange additional make-up sessions for students who are absent on the initial day of the survey administration, to help attain high response rates. In addition, we expect that obtaining each site’s willing assistance will be very important to maximizing the response rate; we will therefore invest significant effort in gaining sites’ cooperation, minimizing burden on them, integrating an effective consent process, and assuring privacy to the youth participants. Sites will be given detailed information about the surveys, how and on what schedule they will be administered, what involvement and time will be required of school staff, and how data will be used and protected. Bringing sites into the process while minimizing burden will help ensure site support of the baseline data collection. Methods to achieve high response rates at follow-up will be discussed in future information collection requests.

Implementation and Fidelity Assessment. To ensure high response rates to data collection efforts associated with the implementation study, site visits will be planned well in advance so that all identified respondents can participate in interviews and the staff survey, as appropriate. To increase participation in focus groups, youth who volunteer to participate will be offered a $25 gift card for their time, as is customary in other federal evaluations such as PREP and PAF. To ensure that attendance and fidelity data are recorded completely and accurately, the evaluation team will routinely review attendance and fidelity information provided by sites and follow up with program staff if information is incomplete. Staff completing the attendance and fidelity logs will be reimbursed $75 to compensate them for their time.

B4. Test of Procedures or Methods to be Undertaken

Baseline Survey. OAH and other offices within HHS (OPRE, ASPE) have made it a priority to align measures in the baseline survey across evaluations of similar programs and populations. Many of the items included on the baseline survey are taken from similar surveys OMB has already approved for use in the ongoing Evaluation of Adolescent Pregnancy Prevention Approaches (PPA), the Teen Pregnancy Prevention Replication Study, and the Personal Responsibility Education Program (PREP) Multi-Component Evaluation.4 To date, 7,441 PPA baseline surveys have been administered; the Replication Study baseline survey has been administered to 7,518 adolescents; and the PREP baseline survey has been administered to 3,871 youth. Any new survey items added specifically for the Federal Evaluation of MPC! were generally drawn from established sources (see Attachments A and B).

Mathematica pretested two different versions of the baseline instrument with two groups of pretest participants to ensure that questions are understandable, use language and terms familiar to respondents, and are consistent with the concepts they aim to measure; to identify typical instrumentation problems such as question wording and incomplete or inappropriate response categories; to measure the response burden; and to confirm that there are no unforeseen difficulties in administering the instrument. Youth selected for the pretest were similar to those in the expected study population (late middle school and early high school). Mathematica collected parental consent and child assent for students who participated in the pretest, administered the pretest using a hard-copy, paper version of the survey, then debriefed with the participants. Attachment H provides a pretest report describing the process used to pretest the instrument and summarizing the changes made to the instrument as a result of the pretest.

Once the survey is finalized, it will be programmed as a web-based survey. Mathematica staff will thoroughly test the web survey to ensure that all paths work correctly and that respondents are routed only to those questions that are appropriate, based on their responses.

Implementation and Fidelity Assessment. The information collection instruments are similar to ones that have been used successfully in prior studies, such as the PAF (OAH), PREP (ACF) and PPA evaluations (OAH).

B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

Baseline Survey. Baseline and follow-up survey data for the impact study will be collected and analyzed by OAH’s prime contracting organization, Mathematica Policy Research. Attachment C lists the individuals whom OAH consulted on baseline instrument development, and the data collection and analysis plan.

Implementation and Fidelity Assessment. The implementation study site visits and staff survey will be conducted by OAH’s contracting organization, Mathematica Policy Research. Mathematica will also conduct all analyses of the data. Attendance and content coverage data will be collected and analyzed by Mathematica’s subcontractor, Decision Information Resources. Attachment C lists the individuals whom OAH consulted on the collection of the implementation study instruments.

1 Approval for the two follow-up surveys will be requested in subsequent ICRs.

2 The participant-facing name of the study is the Attitudes, Behaviors, and Choices (or ABC) Study. This name appears on the instrument (Instrument 1) and related consent and assent materials (Attachment D).

3 Singer, Eleanor, John Van Hoewyk, and Mary P. Maher. 1998. Does the payment of incentives create expectation effects? Public Opinion Quarterly 62:152–64.

4 ACF received initial OMB approval for the PPA baseline survey on July 26, 2010 (OMB Control Number 0970-0360). In summer 2011, oversight of PPA was transferred to the Office of Adolescent Health (OAH) within the Office of the Assistant Secretary, and the project is now tracked with a different OMB Control Number (0990-0382). The OMB Control Number for the Teen Pregnancy Prevention Replication Study is 0990-0394. OMB approval for the PREP baseline survey was received on May 8, 2013 (OMB Control Number 0970-0398).
