Supporting Justification for OMB Clearance of Evaluation of Pregnancy Prevention Approaches (OMB Control #0970-0360)
Part B: Statistical Methods for Baseline Data Collection
In the PPA evaluation, ACF will identify eight study sites that will implement different pregnancy prevention approaches. In approximately six of these sites, the programs to be tested will be school-based – operated, for example, in high schools or middle schools. In other sites, the programs to be tested will be operated in community-based organizations (CBOs). The study will use a sample of approximately 10,800 teens across these eight sites, a sufficient size to detect policy-relevant impacts of the programs. In each site, youth will be assigned to a treatment group that receives the program of interest or to a control group that does not. To ensure that the behavior of control group youth is not affected, or “contaminated,” by interaction with treatment group youth attending the same school or CBO program, random assignment will generally be done at the organization level (that is, the school or CBO). However, it is possible that at some sites random assignment might be done at the individual level, where risks of contamination are low.
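For illustration only, the sketch below shows a back-of-the-envelope calculation of a per-site minimum detectable impact under school-level (clustered) random assignment. The school counts and enrollment figures follow the design described below; the outcome rate and intraclass correlation are hypothetical placeholders, not the study's actual power assumptions.

```python
# Illustrative per-site minimum detectable impact (MDE) for a binary
# outcome under school-level random assignment. The outcome rate and
# ICC below are hypothetical placeholders, not the study's assumptions.
from math import sqrt
from statistics import NormalDist

z_alpha = NormalDist().inv_cdf(0.975)   # two-sided test at the 5 percent level
z_power = NormalDist().inv_cdf(0.80)    # 80 percent power

schools_per_arm = 8          # 16 schools per site, split evenly
students_per_school = 100    # average enrollment in relevant classes
p = 0.30                     # hypothetical control-group outcome rate
icc = 0.02                   # hypothetical intraclass correlation

# Clustering inflates the variance by the design effect.
deff = 1 + (students_per_school - 1) * icc
n_per_arm = schools_per_arm * students_per_school

# Standard error of the difference in proportions, then the MDE.
se = sqrt(2 * p * (1 - p) * deff / n_per_arm)
mde = (z_alpha + z_power) * se
print(f"Per-site MDE: {mde:.3f} ({mde * 100:.1f} percentage points)")
```

Pooling samples across sites would reduce the minimum detectable impact below this single-site figure; the study's actual power assumptions are not restated here.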
A baseline survey will be conducted with both the program and control groups before the youth in the program group are exposed to the pregnancy prevention programs. Wherever possible, there will be group administration of the self-administered survey; when necessary to increase response rates, this method will be augmented with a web survey and telephone follow-up. We will also collect relevant school records and achievement data (e.g., school attendance, receipt of free or reduced-price lunch), as well as program participation data.
The universe of potential respondents will vary across study sites, depending on the type of program in place at each site. Hence, we first describe the possible types of program structures and the corresponding study design.
Of the eight sites in the evaluation, we estimate that six will have in-school programs delivered to all eligible students and two will have elective programs that could be provided in or out of school (for example, at a CBO). In the former six sites, we expect to conduct random assignment at the school level. Specifically, we plan to randomly assign 16 schools at each site, with half assigned to the program group and half to the control group. We estimate that each school will enroll an average of 100 students in the relevant classes each year; consequently, our anticipated total sample size for each of these sites is 1,600. We will target all 1,600 students for surveys. Should a school enroll appreciably more students, making the total sample size much larger, we will subsample students within that school. Subsampling, if necessary, would be a simple random sample within explicit strata defined by school, grade (when relevant), and gender.
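As an illustration of this subsampling rule, the sketch below draws a simple random sample within each school-by-grade-by-gender stratum. The roster DataFrame and its column names are hypothetical placeholders for the actual roster files.

```python
# Illustrative sketch: simple random subsampling within explicit strata
# defined by school, grade, and gender. Column names are hypothetical.
import pandas as pd

def subsample_roster(roster: pd.DataFrame, per_stratum: int,
                     seed: int = 12345) -> pd.DataFrame:
    """Draw up to `per_stratum` students at random from each
    school-by-grade-by-gender stratum of the roster."""
    return (
        roster
        .groupby(["school", "grade", "gender"], group_keys=False)
        .apply(lambda s: s.sample(n=min(per_stratum, len(s)),
                                  random_state=seed))
    )
```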
For the two elective programs, we expect to randomly assign individual students. (If it is important to the school or program, we can conduct random assignment in a way that ensures the program and control groups are balanced in terms of students’ gender, age, or other characteristics.) We will select sites that can enroll 600 youth within 12 to 18 months, and we plan to include all of these youth in the respondent universe. As in the other sites, we would subsample only if the population were much larger than anticipated, and in that case we would use a sampling scheme like the one described above.
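A minimal sketch of this kind of balanced individual random assignment appears below: students are randomized separately within strata, with half of each stratum assigned to the program group. The stratifying variables and data structure are illustrative assumptions, not the study's specification.

```python
# Illustrative sketch: individual random assignment balanced within strata
# (for example, gender by age). Data structure and field names are
# hypothetical.
import random
from collections import defaultdict

def balanced_assignment(students, stratum_of, seed=12345):
    """students: iterable of dicts with an 'id' key; stratum_of: function
    mapping a student to a stratum label. Returns {id: 'program'|'control'}."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for s in students:
        strata[stratum_of(s)].append(s)
    assignment = {}
    for members in strata.values():
        rng.shuffle(members)
        half = len(members) // 2
        for i, s in enumerate(members):
            assignment[s["id"]] = "program" if i < half else "control"
    return assignment

# Example: balance on gender and single year of age.
students = [{"id": 1, "gender": "F", "age": 14},
            {"id": 2, "gender": "F", "age": 14},
            {"id": 3, "gender": "M", "age": 15},
            {"id": 4, "gender": "M", "age": 15}]
print(balanced_assignment(students, lambda s: (s["gender"], s["age"])))
```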
Table B1.1 summarizes our sample size estimates. Based on our plans to include six sites with school-level random assignment and two with individual-level random assignment, we expect the total sample size will be approximately 10,800.
Table B1.1. Expected Sample Sizes
Type of Program | Number of Sites | Average Sample Size per Site | Total Sample Size by Program Type
Required in-school | 6 | 1,600 | 9,600
Elective in-school or out-of-school | 2 | 600 | 1,200
Total | 8 | | 10,800
We expect to achieve a 90 percent response rate on the baseline survey (and an 80 percent or higher response rate on follow-up surveys). These rates are comparable to those achieved on the study of Title V abstinence education programs conducted by Mathematica Policy Research.1 Even with such high response rates, however, survey nonresponse can bias impact estimates if the outcomes of survey respondents and nonrespondents differ, or if the types of individuals who respond to the surveys differ between the treatment and control groups. To correct for differences between respondents and nonrespondents on follow-up surveys, we will construct sample weights so that the baseline characteristics of follow-up survey respondents mirror those of the full sample.
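As one illustration of such an adjustment, the sketch below implements a simple weighting-class approach: respondents are weighted by the inverse of the response rate within cells defined by baseline characteristics. The data and cell variables here are hypothetical placeholders; the actual weighting model will be specified in future information collection requests.

```python
# Illustrative sketch of a weighting-class nonresponse adjustment. The
# DataFrame and its columns (gender, grade, responded) are hypothetical
# placeholders, not the study's actual variables or weighting model.
import pandas as pd

full_sample = pd.DataFrame({
    "gender":    ["F", "F", "M", "M", "F", "M"],
    "grade":     [9, 9, 9, 9, 10, 10],
    "responded": [True, False, True, True, True, False],
})

# Within each gender-by-grade cell, weight each respondent by the inverse
# of the cell's response rate, so that weighted respondents mirror the
# full sample's baseline composition. Nonrespondents get weight 0.
cell_rate = full_sample.groupby(["gender", "grade"])["responded"].transform("mean")
full_sample["weight"] = 0.0
full_sample.loc[full_sample["responded"], "weight"] = (
    1.0 / cell_rate[full_sample["responded"]]
)
print(full_sample)
```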
ACF will collect information on youth baseline characteristics and behaviors from approximately 10,800 youth across eight selected sites (see Table B1.1 for the distribution). Whenever possible, the assignment to treatment (receipt of one of the approaches to reducing teen pregnancy) or control groups (not receiving such treatment) will take place at the site, school, or classroom level in order to minimize contamination between control and treatment group youth. When there are more youth at a site than anticipated, youth will be subsampled.
Sites will provide the ACF contractor with youth rosters and will assist in obtaining active parental consent to participate in the PPA evaluation. To assist sites in gaining parental consent, ACF developed a set of Frequently Asked Questions (see Attachment G). The contractor will prepare a final survey roster of all youth at the site for whom it has received parental consent and who are expected to complete the questionnaire on survey day. Contractor staff will work with sites to determine a date and exact venues for conducting group survey administrations with consented youth. Contractor staff will arrive at the site on survey day, two staff members per survey room. In the survey room(s), contractor staff will use the survey roster to take attendance, identify any missing youth, and exclude anyone not on the roster.
Survey administration then begins with contractor staff handing out pre-identified survey packets to the youth whose names are on the packets and obtaining youth assent. Each packet will consist of the PPA paper-and-pencil interview (PAPI) questionnaire and a sealable survey return envelope. The questionnaire and envelope will have a label with a unique ID number; no personally identifying information will appear on the questionnaire or return envelope. Youth will self-administer the questionnaire. Questionnaire Part A asks for background information and concludes with a single screening question about sexual experience; youth with sexual experience will complete Part B1, and those without will complete Part B2. Two contractor staff members will monitor activities in each survey room. At the end of the session, youth will place the entire questionnaire (Parts A, B1, and B2, both used and unused sections) in the return envelope, seal it, and return it to a contractor staff member. Staff will send the completed questionnaires to the contractor’s office, where they will be receipted and checked for completeness and scannability. All questionnaires that pass this check will be sent to a scanning vendor, and all scanned data will be electronically transmitted to the contractor.
If any youth are not available for the survey administration or make-up sessions, contractor staff will contact them and provide a PIN/password for web completion or, if necessary, will interview them by telephone using the PAPI instrument. These completions will then go through the same receipting and scanning processes as in-person PAPI completions.
We expect a response rate above 90 percent on the baseline survey because survey administration will occur shortly after active parental consent is received. This timing will ensure that our contact data are current (no location problems) and that surveys can be administered to most youth in the location where the program would take place (for example, the school). In addition, we expect that obtaining sites’ willing assistance will be very important to maximizing the response rate; we will therefore invest significant effort in gaining their cooperation, minimizing burden on sites, integrating an effective consent process, and assuring privacy for youth participants. Sites will be given detailed information about the surveys: how and on what schedule they will be administered, what involvement and time will be required of school staff, and how data will be used and protected. Bringing sites into the process while minimizing their burden will help assure site support for the PPA data collection.
Methods to achieve high response rates at follow-up will be discussed in future information collection requests.
We conducted pretests of the instruments to be used in the evaluation. After recruiting pretest participants, study staff spoke directly with all interested teens to explain the pretest and the need to obtain parental consent prior to participation.
Youth were asked to participate in one of five pretest administrations, during which small groups of four or five teens completed the self-administered questionnaire in a group setting and then participated in a one-hour, one-on-one debriefing with a researcher.
The pretest sample represented the two population extremes we are likely to find in the actual study: youth from high socioeconomic backgrounds who were active participants in a peer mentoring program focused on sexual health, and youth from low socioeconomic backgrounds who were receiving social support services from a community organization. The administration of the pretest mirrored, as closely as possible, what will happen during the actual study in a classroom environment.
Attachment H is a copy of the Pretest Report containing recommendations for changes to the instrument and procedures.
The PPA baseline survey will be administered by ACF’s contracting organization, Mathematica Policy Research. The same contractor will analyze data with support from evaluation colleagues at Child Trends. Individuals whom ACF consulted on the collection and/or analysis of the baseline data include those listed below.
Alan Hershey
Mathematica Policy Research, Inc.
P.O. Box 2391
Princeton, NJ 08543
(609) 275-2384
Christopher Trenholm
Mathematica Policy Research, Inc.
P.O. Box 2391
Princeton, NJ 08543
(609) 936-279-6384
Laura Kalb
Mathematica Policy Research, Inc.
955 Massachusetts Avenue, Suite 801
Cambridge, MA 02139
(617) 301-8989
Kristin Moore
Child Trends
4301 Connecticut Ave. NW
Washington, DC
20008-2333
(202) 362-5580
Jennifer Manlove
Child Trends
4301 Connecticut Ave. NW
Washington, DC
20008-2333
(202) 362-5580
TECHNICAL WORK GROUP MEMBERS
Meredith Kelsey
Abt Associates
55 Wheeler St.
Cambridge, MA 02138
Christine Markham
The University of Texas School of Public Health
P.O. Box 20186
Houston, TX 77225
(713) 500-9646
Pat Paluzzi
President
Healthy Teen Network
1501 Saint Paul St., Suite 124
Baltimore, MD 21202
(410) 685-0410
Susan Philliber
Philliber and Associates
16 Main St.
Accord, NY 12404
(845) 626-2126
Michael Resnick
Division of Adolescent Health and Medicine
University of Minnesota
717 Delaware St. SE, Suite 370
Minneapolis, MN 55414-2959
(612) 624-9111
We have also consulted with:
Stan Koutstaal (and possibly other staff)
Family and Youth Services Bureau
Division of Abstinence Education
U.S. Department of Health and Human Services
370 L’Enfant Promenade, SW
Washington, DC 20447
(202) 260-2242
Lisa Trivits (and possibly other staff)
Office of the Assistant Secretary for Planning and Evaluation (ASPE)
U.S. Department of Health and Human Services
370 L’Enfant Promenade, SW
Washington, DC 20447
(202) 260-2242
Inquiries regarding statistical aspects of the study design should be directed to:
Seth Chamberlain
Office of Planning, Research, and Evaluation
Administration for Children & Families
U.S. Department of Health and Human Services
370 L’Enfant Promenade, SW
Washington, DC 20447
(202) 260-2242
Mr. Chamberlain is the project officer and has overseen the design of the baseline data collection instrument.
1 Trenholm, Christopher, Barbara Devaney, Kenneth Fortson, Lisa Quay, Justin Wheeler, and Melissa Clark. “Impacts of Four Title V, Section 510 Abstinence Education Programs.” Final report submitted to the U.S. Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation. Princeton, NJ: Mathematica Policy Research, 2007.