

Supporting Justification for OMB Clearance of Evaluation of Pregnancy Prevention Approaches

Part B: Statistical Methods for Baseline Data Collection


June 2011



CONTENTS

Abstract

B1. Respondent Universe and Sampling Methods

B2. Procedures for Collection of Information

B3. Methods to Maximize Response Rates and Deal With Nonresponse

B4. Tests of Procedures or Methods to be Undertaken

B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data



TABLES

B.1. PPA Program Description, Samples, and Targeted Outcomes, by Site

B.2. Expected Sample Sizes



Abstract

The Administration for Children and Families (ACF) of the U.S. Department of Health and Human Services (HHS) is assisting the HHS Office of Adolescent Health (OAH) in conducting the Evaluation of Adolescent Pregnancy Prevention Approaches (PPA), an eight-year demonstration designed to study the effectiveness of promising policy-relevant strategies to reduce teen pregnancy. The study was designed to include up to eight evaluation sites, and at this point it appears that there will be seven sites:

  • One site – Chicago Public Schools, implementing the HealthTeacher curriculum – has been recruited, and a baseline survey has been implemented.

  • Six federally-funded grantees have been recruited, and this emergency clearance is for baseline instruments specific to these sites. Clearance is needed by July 15, 2011.

Approval for outreach discussions with stakeholders, experts in the field, and program developers was received on November 24, 2008 (OMB Control No. 0970-0360). Approval for the baseline survey data collection and the collection of youth participant records was received on July 26, 2010 (OMB Control No. 0970-0360). Data collection under that latter approval began in the Chicago Public Schools site in fall 2010, after we had informed OMB of the site (per conditions of the Notice of Approval). At the time approval for baseline data collection was received, only one site had been recruited into the study (Chicago Public Schools). The original expectation was that the study would use the same baseline survey in sites subsequently recruited.

We are now seeking emergency clearance for revision to the baseline data collection instrument. Specifically, we are applying for approval of “site-specific instruments” – that is, variations that would be appropriate for the other six sites recruited into the study – based on a modified version of the previously-approved instrument. Clearance is needed by July 15, 2011, so that a site-specific instrument can be printed in time for survey administration in one site, which is slated to begin the second week of August.

Two developments have prompted this request.

First, a large group of federal staff has collaborated to modify the previously-approved PPA baseline instrument into a “concordance baseline instrument” suitable for all HHS pregnancy prevention evaluations, including but not limited to PPA. HHS is trying to maximize consistency across evaluations of federal pregnancy prevention grant programs. In 2010 and 2011, ACF and OAH (in coordination with other HHS offices overseeing pregnancy prevention evaluation) collaborated to consider revisions to the PPA instrument. Through this group and other efforts, HHS:

  • Defined core items that should, if possible, be included in surveys associated with all federal evaluations

  • Identified other ancillary measures that could be drawn on depending on the particular features of evaluation site programs and target populations

  • Defined a set of performance measures on which all grantees would report, stipulating that all grantees recruited into federal evaluations would collect participant-level data for these performance measures through their evaluation efforts.

The resultant “concordance baseline instrument” thus began with the approved PPA instrument, but now has been revised to contain core items, ancillary measures, and performance measures. This concordance baseline questionnaire is slightly different from the already approved PPA baseline instrument. A copy of the baseline concordance instrument is found in Attachment A.

The second development is the need for instruments that are appropriate for each of the remaining six sites recruited into the study – instruments we are calling “site-specific baseline instruments.” As recruitment of PPA sites has progressed, we have found that variations in the target populations and the program models we will evaluate call for site-by-site variation in the focus of baseline questions. The “concordance baseline instrument” was the starting point for the baseline data collection instrument in each of the PPA sites, and further refinements have been made to make the instruments as appropriate as possible for each site. For example, two of the six sites recruited into the study work with pregnant and parenting youth. Some questions in the concordance instrument would be inappropriate for these populations. For instance, it would be irrelevant (and perhaps insulting) to ask pregnant and parenting youth if they had ever had sexual intercourse. Additions and deletions have been integrated into each site-specific version of the data collection instrument to make it as effective as possible. Each site-specific baseline questionnaire thus reflects:

  • The “concordance baseline instrument” revisions (i.e., revisions to the previously-approved instrument)

  • Site-specific adaptations to that new standard instrument

Each site-specific instrument will provide consistent measurement of core constructs, because each will include a set of common core items and performance measures, but each will also permit evaluation of outcomes specific to its site.

To help communicate differences between the concordance instrument and the site-specific instruments, differences are pointed out in an introductory table before each site’s site-specific instrument submitted with this justification.

The feasibility of implementing a set of “site-specific instruments” depends on rapid OMB approval. Most of the new evaluation sites are scheduled to begin sample enrollment in fall 2011, some as early as August or September 2011. Before that time, it would be impossible to: a) reach agreement with sites (federally-funded grantees) on question variations within the baseline instrument; b) publish 60- and 30-day FRNs; and c) receive standard OMB approval of the baseline instrument variants. Following normal review procedures would make it impossible for site organizations to proceed with their scheduled program startup and to meet their evaluation sample enrollment requirements. If we are not able to make these adjustments to the data collection instruments, we risk losing critical information (for example, on target populations or interventions of policy interest) and substantially decreasing the value of the federal money spent on pregnancy prevention evaluation.

HHS is therefore requesting emergency clearance by July 15, 2011 for the site-specific baseline data collection. Emergency review appears justified by the guidance memorandum (M-11-07) on streamlining the PRA process, and in particular by that memo’s reference to risks of disruption of collection as a basis for invoking the emergency review process. A memo detailing the justification for emergency clearance is found in Attachment B. This memo was approved by OMB in an email dated May 4, 2011.

B1. Respondent Universe and Sampling Methods

In the PPA evaluation, HHS has identified seven study sites (including six federal grantees) that will implement different pregnancy prevention approaches.¹ In three of these sites, the programs to be tested will be operated in high schools or middle schools. In the other sites, the programs will be operated in community-based organizations (CBOs) or other non-school settings. The study will include a sample of approximately 9,000 teens across these seven sites, and the sample size in each site is sufficient to detect policy-relevant impacts by site. In each site, youth will be assigned to a treatment group that receives the program of interest or to a control group that does not. To ensure that the behavior of control group youth is not affected, or “contaminated,” by interaction with treatment group youth attending the same school or CBO program, random assignment in five of the sites will be done at the organization level (that is, the school, CBO, or other “cluster”). In the other two sites, random assignment will be done at the individual level, because youth receive services on an individual basis and the risks of contamination are low.
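To give a concrete sense of how clustering affects the impacts a site can detect, the sketch below computes a minimal detectable effect for a binary outcome in a cluster-randomized site. It is illustrative only: the intraclass correlation (0.02), the 80 percent power and 5 percent significance levels, and the outcome prevalence of 0.5 are assumptions for the example, not values from the study's power calculations.

    from math import sqrt

    def mde_cluster(num_clusters, per_cluster, icc, p=0.5,
                    z_alpha=1.96, z_power=0.84):
        """Minimal detectable effect for a binary outcome when clusters
        (schools, group homes, job sites) are split evenly between the
        program and control groups."""
        variance = p * (1 - p)                  # outcome variance at prevalence p
        deff = 1 + (per_cluster - 1) * icc      # design effect from clustering
        se = sqrt(4 * variance * deff / (num_clusters * per_cluster))
        return (z_alpha + z_power) * se

    # A school-based site: 16 schools, roughly 100 students each
    print(round(mde_cluster(16, 100, icc=0.02), 2))  # -> 0.12 (12 percentage points)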

A baseline survey will be conducted with both the program and control groups before youth in the program group are exposed to the pregnancy prevention programs. The first follow-up survey will be conducted with participating youth no sooner than 3-6 months after the end of the scheduled program intervention for each sample member (and on a corresponding schedule for the control group). A second follow-up survey will be conducted with participating youth no later than 18-24 months after the scheduled end of the program. (Approval of the follow-up surveys will be sought in separate submissions.) The exact timing of the two follow-up surveys will be determined in each site, taking into account the length of the program and the age of the target population. In a few sites, additional follow-up surveys may be required. Wherever possible, there will be group administration of the self-administered survey; when necessary to increase response rates, this method will be augmented with web survey and telephone follow-up.²

The universe of potential respondents will vary across study sites, depending on the type of program in place at each site. Table B.1 describes each of the sites, the programs being implemented, the sample size, the target population, and the targeted outcomes.


Table B.1. PPA Program Description, Samples, and Targeted Outcomes, by Site

OhioHealth Research and Innovation Institute (Columbus, Ohio)

Program Description: T.O.P.P.: an 18-month clinic/hospital-based program to delay repeat pregnancies among adolescents ages 10-19 by improving access to reproductive health services and contraceptive care. The program consists of: 1) monthly telephone calls from a nurse educator to provide contraceptive information and help coordinate access to contraceptive services; and 2) access to contraceptive services through a mobile OB/GYN trailer and transportation to clinic services.

Targeted Sample: Pregnant and parenting teens, ages 10-19, in OhioHealth hospitals and clinics

Expected Total Sample Enrolled: 600

Targeted Outcomes:

  • Repeat pregnancy

  • Repeat birth

  • Receipt of contraceptive services

  • Attitudes toward contraception

  • Knowledge of contraceptive methods

  • Premature repeat birth

  • Sexually transmitted infections

  • Contraceptive use/engagement in unprotected sex

Children’s Hospital Los Angeles (Los Angeles, CA)

Program Description: Project AIM: an evidence-based youth development program for teen parents under age 21 receiving case management as part of California’s Adolescent Family Life and Cal-Learn programs. The program consists of six 60-minute individual sessions and three 90-minute group sessions, in addition to ongoing case management and access to referrals for other services. It has been adapted for use in preventing repeat pregnancies among 15-18 year old females by focusing on aspirations and future planning while incorporating content specific to teen mothers (such as contraceptive use, relationship issues, and balancing their roles as adolescents and young mothers).

Targeted Sample: Pregnant and parenting teen mothers receiving case management services through clinic sites, ages 15-18

Expected Total Sample Enrolled: 1,400

Targeted Outcomes:

  • Interval to repeat pregnancy

  • Consistent contraception use

  • Self-sufficiency (employment and earnings)

  • Academic/employment progress

Oklahoma Institute for Child Advocacy (Oklahoma, Illinois, Maryland, California)

Program Description: Power Through Choices: a sexuality education curriculum implemented in foster care group homes, consisting of ten 90-minute sessions for a total of 15 hours of curriculum. Teaches youth how to avoid sexual risk behaviors, pregnancy, and sexually transmitted infections. Topics include anatomy/reproductive health, increasing communication skills, avoiding sexually transmitted infections/HIV, and preventing pregnancy through the use of contraception.

Targeted Sample: Youth in foster care group homes, ages 14-18

Expected Total Sample Enrolled: 1,080

Targeted Outcomes:

  • Incidence of teen pregnancy

  • Consistent use of contraceptives

  • Consistent use of condoms

  • Number of sexual partners

  • Delay to initiation of consensual sex

EngenderHealth (Austin, TX)

Program Description: Gender Matters: a 20-hour program focused on helping teens achieve a sound understanding of healthy gender roles, healthy relationships, and empowerment to delay sexual initiation and increase consistency of condom use. Focuses on concepts of masculinity and femininity and their connections to sexual risk behavior.

Targeted Sample: Youth participating in the Travis County Summer Youth Employment Program, ages 14-15

Expected Total Sample Enrolled: 1,125

Targeted Outcomes:

  • Rate of pregnancy

  • Delay to onset of sexual intercourse

  • Use of most contraceptive methods

  • Consistent and correct use of condoms

  • Balance of power dynamics within intimate relationships

  • Sense of independent self

  • Frequency and quality of intimate partner communication

Live the Life Ministries

Program Description: WAIT Training: an 8-hour abstinence-based curriculum delivered by teachers in schools as a required class in each of the 7th and 8th grades, for a total of 16 hours. The intervention is delivered in a short, intensive period, typically over eight consecutive school days each year. Focuses on educating young people on pregnancy prevention, setting future goals, responsible behavior, and healthy relationships. Emphasizes that young adolescents should postpone sexual activity and that practicing abstinence is the only way to eliminate the risk of pregnancy and STDs, including HIV.

Targeted Sample: Youth in 7th grade in participating schools

Expected Total Sample Enrolled: 1,600

Targeted Outcomes:

  • Teen pregnancy rate

  • Teen STD rates

  • Accountability due to knowledge

  • Levels of toxic relationship conflict

  • Rates of teen sex and age of sexual debut

  • Parent-child relationships

Princeton Center for Leadership Training (New Jersey, North Carolina)

Program Description: TeenPEP: a school-based peer-to-peer program in which trained faculty advisors select youth to become a cohesive team of peer educators and serve as sexual health advocates and role models. These peer educators conduct five 90-minute structured and scripted outreach workshops, under the supervision of faculty advisors, for high school 9th graders. Topics include sexual health information, communication with partners and parents, problem-solving, decision-making, negotiation, refusal skills, and self-management skills.

Targeted Sample: Youth in Grade 9 in participating schools

Expected Total Sample Enrolled: 1,600

Targeted Outcomes:

  • Rate of teen pregnancy

  • Rate of teen births

  • Initiation of sex

  • Behaviors that reduce risk for pregnancy





Of the five evaluation sites where random assignment of “clusters” will be used, three will deliver in-school programs to all eligible students. These are Princeton Center for Leadership Training; Live the Life; and Chicago Public Schools, which is not included in this clearance, since the baseline was already administered there using the previously-approved instrument. We plan to randomly assign 16 schools at each site, half to the program group and half to the control group. We estimate that the evaluation will enroll an average of 100 students from each school, for a total initial sample of 1,600 in each site. Should a school have an appreciably larger population of students than is needed for the target sample size, we will subsample students.

The two remaining cluster random assignment sites involve programs delivered through community organizations. One program (Oklahoma Institute) is to be delivered by non-profit organizations through foster care group homes, the other (EngenderHealth) by a non-profit organization working with a summer youth employment program. In these sites, we will randomly assign the relevant clusters: 40 group homes and 60 job sites, in both cases half to the program group and half to the control group. We estimate that each group home will enroll approximately 27 youth (over several cohorts) and each job site approximately 19 youth, for an expected sample size of about 1,100 at each of these two sites.
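The sketch below illustrates the cluster-level lottery described above, along with subsampling within an oversized cluster. It is illustrative only: the cluster names, roster size, and random seed are hypothetical, and the study will follow its own randomization protocol.

    import random

    def assign_clusters(cluster_ids, seed):
        """Randomly assign half of the clusters (schools, group homes, or
        job sites) to the program group and half to the control group."""
        rng = random.Random(seed)
        shuffled = list(cluster_ids)
        rng.shuffle(shuffled)
        half = len(shuffled) // 2
        return {"program": shuffled[:half], "control": shuffled[half:]}

    # e.g., the 16 schools in one of the in-school sites
    schools = [f"school_{i:02d}" for i in range(1, 17)]
    groups = assign_clusters(schools, seed=2011)

    # Subsampling within an oversized school: draw 100 consented students
    # at random from a hypothetical roster of 250
    roster = [f"student_{i:03d}" for i in range(1, 251)]
    subsample = random.Random(2011).sample(roster, 100)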

In the two programs delivered to participants on an individual basis by case managers or nurse educators (Children’s Hospital of Los Angeles and OhioHealth), we plan to randomly assign individual participants. (In some sites, it may be important to conduct random assignment in a way that ensures the program and control groups are balanced in terms of participants’ gender, age, or other characteristics.) We anticipate that one of these sites will enroll 1,400 youth and the other 600 youth, and we plan to include all of these youth in the respondent universe. As in the other sites, we would subsample only if the population were much larger than anticipated, and in that case we would use a sampling scheme like that described above.
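For the individual-level sites, balance on characteristics such as gender and age can be maintained by randomizing within strata. The sketch below shows one common way to do this; the strata definition and enrollee data are hypothetical examples, not the sites' actual protocol.

    import random
    from collections import defaultdict

    def stratified_assignment(participants, stratum_of, seed):
        """Randomize individuals to program or control within strata so the
        two groups stay balanced on the stratifying characteristics."""
        rng = random.Random(seed)
        strata = defaultdict(list)
        for person in participants:
            strata[stratum_of(person)].append(person)
        assignment = {}
        for members in strata.values():
            rng.shuffle(members)
            for i, person in enumerate(members):
                assignment[person["id"]] = "program" if i % 2 == 0 else "control"
        return assignment

    # Hypothetical enrollees, stratified by gender and age
    youth = [
        {"id": 1, "gender": "F", "age": 16},
        {"id": 2, "gender": "F", "age": 16},
        {"id": 3, "gender": "F", "age": 17},
        {"id": 4, "gender": "M", "age": 16},
    ]
    groups = stratified_assignment(youth, lambda p: (p["gender"], p["age"]), seed=2011)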

Table B.2 summarizes our sample size estimates. Based on our plans to include five sites with cluster random assignment and two with individual-level random assignment, we expect the total sample size will be approximately 9,000.

Table B.2. Expected Sample Sizes

Type of Program                        Number of Sites    Average Sample Size per Site    Total Sample Size by Program Type
Required in-school                            3                     1,600                             4,800
Community-based                               2                     1,100                             2,200
Clinic/service-based (individual)             2                     1,000                             2,000
Total                                         7                       --                              9,000
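As a quick check of the bookkeeping in Table B.2 (illustrative only; the figures are copied from the table above), the subtotals by program type sum to the expected total of approximately 9,000:

    # Verify the Table B.2 totals: number of sites x average sample size per type
    sample_plan = {
        "required in-school": (3, 1_600),
        "community-based": (2, 1_100),
        "clinic/service-based": (2, 1_000),
    }
    subtotals = {k: n * avg for k, (n, avg) in sample_plan.items()}
    print(subtotals)                 # {'required in-school': 4800, ...}
    print(sum(subtotals.values()))   # 9000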



Response rates should be high, but there will still be some attrition. We expect to achieve a 90 percent response rate on the baseline survey. Response rates for the follow-up surveys should be 80 percent or higher on the first follow-up and 70 percent or higher on the second. These rates are comparable to those achieved on the study of Title V abstinence education programs conducted by Mathematica Policy Research.³ Even with such high response rates, however, survey nonresponse can bias impact estimates if the outcomes of survey respondents and nonrespondents differ, or if the types of individuals who respond to the surveys differ between the treatment and control groups. To correct for differences between respondents and nonrespondents on the follow-up surveys, we will construct sample weights so that the baseline characteristics of follow-up survey respondents mirror those of the full sample.
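The statement does not spell out how these weights will be constructed; one standard approach is inverse-probability-of-response weighting based on a response-propensity model, sketched below on simulated data (scikit-learn and NumPy are assumed available, and the covariates and response rate are hypothetical).

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def nonresponse_weights(baseline_covariates, responded):
        """Weight follow-up respondents by the inverse of their estimated
        response propensity, so that the weighted respondents mirror the
        baseline characteristics of the full sample."""
        model = LogisticRegression().fit(baseline_covariates, responded)
        p_respond = model.predict_proba(baseline_covariates)[:, 1]
        weights = np.where(responded == 1, 1.0 / p_respond, 0.0)
        # Rescale so the weights sum to the full sample size
        return weights * len(responded) / weights.sum()

    # Simulated example: 200 sample members, two baseline covariates,
    # roughly an 80 percent follow-up response rate
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    responded = rng.binomial(1, 0.8, size=200)
    weights = nonresponse_weights(X, responded)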

B2. Procedures for Collection of Information

HHS will collect information on youth baseline characteristics and behaviors from approximately 9,000 youth across the seven selected sites (see Table B.1 for distribution). Whenever possible, the assignment to treatment (receipt of one of the approaches to reducing teen pregnancy) or control groups (not receiving such treatment) will take place at the cluster level (school, group home, or worksite) in order to minimize contamination between control and treatment group youth. When there are more youth at a site than required for the sample, youth will be subsampled.

Where random assignment is by cluster, baseline data collection will occur in groups after parental consent is received (or after consent by sample members over 18). Sites will provide the HHS contractor with youth rosters and will assist in obtaining active parental consent to participate in the PPA evaluation. To assist sites in gaining parental consent, HHS developed a set of Frequently Asked Questions. The contractor will prepare a final survey roster of all youth at the site for whom it has received parental consent and who are expected to complete the questionnaire on survey day. Contractor staff will work with sites to determine a date and exact venues for conducting group survey administrations with “consented” youth. Contractor staff will arrive at the site on survey day, with two staff members per survey room. In the survey room(s), contractor staff will use the survey roster to take attendance, determine whether any youth are missing, and exclude any youth not on the roster.

Survey administration then begins with contractor staff handing out pre-identified survey packets to the youth whose names are on the packets, and obtaining youth assent. Each packet will consist of the PPA paper-and-pencil interview (PAPI) questionnaire and a sealable survey return envelope. The questionnaire and envelope will have a label with a unique ID number (no personally identifying information will appear on the questionnaire or return envelope). Youth will self-administer the questionnaire. Questionnaire Part A asks for background information and concludes with a single screening question about sexual experience. Youth with sexual experience will complete Part B1 and those without will complete Part B2.⁴ Two contractor staff members will monitor activities in each survey room. At the end of the interview, youth will place the entire PPA questionnaire Parts A, B1, and B2 (both the used and the unused sections) in the return envelope, seal it, and return it to a contractor staff member. Staff will send the completed questionnaires to the contractor’s office, where the questionnaires will be receipted and checked for completeness and readiness for scanning. All questionnaires that pass the check will be sent to a scanning vendor to be scanned. All scanned data will be electronically transmitted to the contractor.

Two of the sites will administer surveys to individuals rather than to groups. In one site, contractor staff will use an audio computer-assisted self-interview (ACASI) instrument to administer the baseline survey to young mothers in their homes. Once completed, data will be uploaded to a secure website hosted by the contractor. In the second site, the baseline survey will be administered to young mothers on-site in clinics or hospitals when they are attending a prenatal care visit or have delivered. Once completed, surveys will be sent to the contractor for scanning.

If any youth are not available for the survey administration or make-up sessions, contractor staff will contact them and provide a PIN/password for web completion or, if necessary, will interview them by telephone using the PAPI instrument. After such completions, the same receipting and scanning processes as for in-person PAPI completions will take place. Materials for each site are presented in separate files delivered with this justification statement. For each site, an instrumentation set is contained in a separate file, including the baseline questionnaire, the consent form, and the assent form, all tailored to the specific site. No materials are presented for the Chicago Public Schools site, because that site is using the already-approved forms.

B3. Methods to Maximize Response Rates and Deal With Nonresponse

We expect a response rate of better than 90 percent to the baseline survey, for several reasons. Survey administration will occur shortly after active consent is received. This timing will ensure that our contact data are current (no location problems) and that surveys can be administered to most youth in the location where the program takes place (for example, the school or community-based organization). In addition, we expect that site assistance will help maximize the response rate; we will invest in gaining site cooperation, minimizing burden on sites, integrating an effective consent process, and assuring privacy and confidentiality to youth participants. Sites will be given detailed information about the surveys: how they will be administered and on what schedule, what involvement and time will be required of school staff, and how data will be used and protected. Bringing sites into the process while minimizing burden will help assure site support of the PPA data collection.

Participants completing the baseline survey at three sites will receive a gift card.⁵ In two of the sites (Children’s Hospital of Los Angeles and OhioHealth), a gift card will be given because participants are adolescent mothers who have recently given birth and will be completing the baseline survey in their homes or in the hospital, most likely with their newborns present. In addition, in these sites some respondents will not have had any prior connection to the grantee organization, so providing a gift card as a thank-you seems essential to obtaining high response rates and encouraging participation in future rounds of follow-up data collection. In the third site (Oklahoma Institute), participants are youth living in foster care homes who could transition out of the foster care system prior to follow-up, losing their connection to the grantee organization. As in the other sites, providing a gift card as a thank-you seems essential to obtaining high response rates and encouraging participation in future rounds of data collection.

Methods to achieve high response rates at follow-up will be discussed in future information collection requests.

B4. Tests of Procedures or Methods to be Undertaken

We conducted pretests of the original PPA baseline instrument, which serves as the foundation of the site-specific baseline instruments. We recruited pretest participants, and study staff talked directly with all interested teens to explain the pretest and the need to obtain parental consent prior to participation.

Youth were asked to participate in one of five pretest administrations, during which small groups of four or five teens completed the self-administered questionnaire in a group setting and then took part in a one-hour, one-on-one debriefing with a researcher.

The pretest sample represented the two population extremes that we are likely to find in the actual study: youth from high socio-economic backgrounds who were active participants in a peer mentoring program focused on sexual health, and youth from low socio-economic backgrounds who were receiving social support services from a community organization. The administration of the pretest mirrored as closely as possible what will happen during the actual study in a classroom environment.

B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The PPA baseline survey will be administered by HHS’ contracting organization, Mathematica Policy Research.⁶ The same contractor will analyze the data, with support from evaluation colleagues at Child Trends. Individuals whom HHS consulted on the collection and/or analysis of the baseline data are listed below.

Alan Hershey

Mathematica Policy Research, Inc.

P.O. Box 2391

Princeton, NJ 08543

(609) 275-2384


Christopher Trenholm

Mathematica Policy Research, Inc.

P.O. Box 2391

Princeton, NJ 08543

(609) 936-279-6384


Laura Kalb

Mathematica Policy Research, Inc.

955 Massachusetts Avenue, Suite 801

Cambridge, MA 02139

(617) 301-8989


Kristin Moore

Child Trends

4301 Connecticut Ave. NW
Washington, DC 20008-2333
(202) 362-5580


Jennifer Manlove

Child Trends

4301 Connecticut Ave. NW
Washington, DC 20008-2333
(202) 362-5580



TECHNICAL WORK GROUP MEMBERS


Meredith Kelsey

Abt Associates

55 Wheeler St.

Cambridge, MA 02138


Christine Markham

The University of Texas School of Public Health

P.O. Box 20186

Houston, TX 77225

(713) 500-9646


Pat Paluzzi

President

Healthy Teen Network

1501 Saint Paul St., Suite 124

Baltimore, MD 21202

(410) 685-0410



Susan Philliber

Philliber and Associates

16 Main St.

Accord, NY 12404

(845) 626-2126




Michael Resnick

Division of Adolescent Health and Medicine

717 Delaware St. SE, Suite 370

Minneapolis, MN 55414-2959

(612) 624-9111


We have also consulted with:


Stan Koutstaal (and possibly other staff)

Family and Youth Services Bureau

Division of Abstinence Education

U.S. Department of Health and Human Services

370 L’Enfant Promenade, SW

Washington, DC 20447

(202) 401-5457


Seth Chamberlain

Administration for Children and Families

U.S. Department of Health and Human Services

370 L’Enfant Promenade, SW

Washington, DC 20447

(202) 260-2242


Lisa Trivits (and possibly other staff)

Office of the HHS Assistant Secretary for Planning and Evaluation (ASPE)

U.S. Department of Health and Human Services

370 L’Enfant Promenade, SW

Washington, DC 20447

(202) 260-2242


Inquiries regarding statistical aspects of the study design should be directed to:

Amy Farb

Office of Adolescent Health

1101 Wootton Parkway

Rockville, MD 20852

(240) 453-2836


Dr. Farb is the project officer.




¹ The feasible number of evaluation sites has been adjusted from eight (projected in earlier submissions to OMB) to seven because one site tentatively recruited was unable to enlist the requisite number of schools.

² Two sites will administer the baseline survey to individuals: one using a paper-and-pencil instrument (PAPI), the other using an audio computer-assisted self-interview (ACASI) to address literacy concerns. A third site will read the baseline survey aloud to respondents in a group-administered setting (using PAPI).

³ Trenholm, Christopher, Barbara Devaney, Kenneth Fortson, Lisa Quay, Justin Wheeler, and Melissa Clark. “Impacts of Four Title V, Section 510 Abstinence Education Programs.” Final report submitted to the U.S. Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation. Princeton, NJ: Mathematica Policy Research, 2007.

⁴ In three of the sites (CHLA, OhioHealth, and OICA), it is already known that youth are sexually active, so a separate Section B is not needed for non-sexually active youth.

⁵ The amounts of these payments or gift cards vary by site because they were determined by grantees and their local evaluators in applying for the grants they were awarded. Gift card amounts are $10 for the Oklahoma Institute and OhioHealth sites and $20 for Children’s Hospital of Los Angeles, where the baseline will be administered in the respondent’s home.



⁶ In three sites, Mathematica has lead responsibility for administering the baseline survey. In two sites, Mathematica and the local evaluator share that responsibility. In the two remaining sites, the local evaluator has lead responsibility, with support from Mathematica.

