
Supporting Justification for OMB Clearance of

Evaluation of Pregnancy Prevention Approaches – Baseline Data Collection

(OMB Control #0970-0360)



Part A: Justification for the Collection of Baseline Data



February 2010

The Administration for Children & Families (ACF) of the U.S. Department of Health and Human Services (HHS) is conducting the Evaluation of Adolescent Pregnancy Prevention Approaches (PPA), an eight-year demonstration designed to study the effectiveness of promising policy-relevant strategies to reduce teen pregnancy. We now seek OMB approval for baseline survey data collection, as well as collection of youth participant records and achievement data from schools and other organizations.

A1. Circumstances Making the Collection of Information Necessary

For decades, policymakers and the general public have remained concerned about the prevalence of sexual intercourse among adolescents. Although adolescents today are waiting somewhat longer before having sex than they did in the 1990s, 60 percent of teenage girls and more than 50 percent of teenage boys report having had sexual intercourse by their 18th birthday.1 Approximately one in five adolescents has had sexual intercourse before turning 15.2 Rates of teenage pregnancy declined by 38 percent from 1990 to 2004, and the rate of teen births followed a similar decline3 until recently, when the rate of births rose by 5 percent from 2005 to 2007 for teens aged 15-19.4

The Administration for Children & Families (ACF) is interested in identifying and evaluating promising approaches to reduce teen pregnancy, associated risk behaviors, and their consequences. The baseline data collection described in this ICR, combined with subsequent follow-up data collections, will provide important information to guide policy decisions aimed at addressing this serious concern.

Legal or Administrative Requirements that Necessitate the Collection

Public Law 110-161, which set fiscal year (FY) 2008 appropriations levels, included the following language: “$4,500,000 shall be available from amounts available under section 241 of the Public Health Service Act to carry out evaluations (including longitudinal evaluations) of adolescent pregnancy prevention approaches.” The same language appropriated $4,450,000 in each of FYs 2009 and 2010. These funds have been used for the PPA evaluation.

To accomplish the objective of the appropriations, ACF seeks OMB approval of the baseline survey instrument to be administered to program participants. In addition, ACF seeks approval to collect basic records data on the characteristics of sampled youth, such as age, grade, promotion, eligibility for free or reduced-price school lunch, and attendance, as well as program participation data and academic achievement data, such as standardized test scores and grade point average.

Study Objectives

The objective of the PPA evaluation is to test selected promising approaches to prevent teen pregnancy among middle school- and high school-aged teens. The evaluation will help ACF determine the effectiveness of various approaches in affecting key outcomes related to pregnancy prevention (for example, sexual debut, pregnancy, sexually transmitted disease [STD] infection, and so on). Ultimately, the purpose of the evaluation is to provide stakeholders—including practitioners and federal and other policymakers—with information on a range of approaches that hold promise for preventing teen pregnancy, and, through a subsequent follow-up survey, to assess rigorously the effectiveness of these approaches.

In the PPA evaluation, ACF will identify eight study sites that will implement different pregnancy prevention approaches. In approximately six of these sites, the programs to be tested will be school-based, operated in high schools or middle schools, for example. In the other sites, the programs to be tested will be operated in community-based organizations (CBOs). The study will use a sample of approximately 10,800 teens across these eight sites, a sufficient size to detect policy-relevant impacts of the programs. In each site, youth will be assigned to a treatment group that receives the program of interest, or to a control group that does not. To ensure that the behavior of control group youth is not affected, or "contaminated," by interaction with treatment group youth attending the same school or CBO program, random assignment will generally be done at the organization level (that is, the school or CBO). However, random assignment may be done at the individual level at sites where the risk of contamination is low.

A baseline survey will be conducted with both the program and control groups before the youth in the program group are exposed to the pregnancy prevention programs. Wherever possible, the self-administered survey will be administered in group settings; when necessary to increase response rates, this method will be augmented with a web survey and telephone follow-up. We will also collect relevant records and achievement data (e.g., school attendance, receipt of free or reduced-price lunch, and program participation).

Through the baseline and follow-up surveys (as well as the youth records and achievement data collected), ACF will address the following research questions on program impact:

  • Are the (selected) approaches effective at meeting their immediate objectives (for example, improving knowledge of pregnancy risks)?

  • Are the approaches effective at reducing adolescent pregnancy?

  • What are their effects on related outcomes, such as postponing sexual activity and reducing or preventing sexual risk behaviors and STDs?

  • Do these approaches work better for some groups of adolescents than for others?

ACF is interested in evaluating fairly intensive programs and strategies that can reasonably be expected to produce change. Some programs may thus involve participants over an extended period (for example, curricula covering one or more semesters, sequenced courses provided during different years in high school, or year-long community programs).

Major evaluation activities will include the following:

  • Identifying promising strategies and programs through a review of the literature and interviews with the “field” (for example, researchers, policy experts, and program developers) in order to focus the evaluation on interventions that are of substantial interest to the field and show the most promise for reducing rates of teen sexual activity and pregnancy.

  • Recruiting sites to participate in an evaluation of selected interventions (from among those identified by the field) and providing assistance on evaluation support activities.

  • Collecting data on the research sample at baseline (the focus of this OMB submission) and at two follow-up data collections, scheduled approximately 12 and 36 months after the start of the programs.

  • Analyzing data collected and preparing reports with the results.

ACF is conducting this evaluation through a lead contractor, Mathematica Policy Research, and its subcontractors: Child Trends, The National Campaign to Prevent Teen and Unplanned Pregnancy, the National Abstinence Education Association, and Twin Peaks, LLC.

A2. Purpose and Use of the Information Collection

If this request is approved, the PPA evaluation will collect baseline data on sample members’ characteristics, their sexual activity, prior receipt of services related to reproductive health, their success in school, and information on how they can be contacted later. These data will be obtained from a baseline survey administered to sample youth and from records data available from the programs and/or schools participating in the study.

The data will serve several purposes. Identifying and contact information will help the study team track sample youth throughout the evaluation and locate them for follow-up if they have graduated, moved to another school, dropped out, or are absent from group follow-up data collection. Baseline variables are also important to the analysis in several ways. They will be used to establish baseline equivalence of the treatment and control groups and thus to confirm the integrity of the random assignment process. They will also be used to define subgroups for which impacts will be estimated and to adjust impact estimates for the baseline characteristics of nonrespondents to the follow-up survey. Finally, many baseline variables capture outcomes that will be measured again at follow-up; including their baseline values as covariates in the impact models will improve the precision of the impact estimates.

Baseline data will measure: teens’ demographic and socioeconomic characteristics; knowledge, attitudes, and expectations; dating experience; knowledge, attitudes, and expectations about sexual activity and contraception; stressors and supports; and school and community characteristics (as well as collect contact information). The baseline survey instrument, as well as an outline for collecting school records and performance data, is attached. Attachment A lists the topics to be covered in the baseline instrument, our justification for their inclusion, and how the data from the questions will be used (as covariates, to define subgroups, to determine intermediate outcomes, or to determine behavioral outcomes). A list of the national surveys reviewed in developing the baseline survey instrument for the PPA evaluation is provided in Attachment B.5 Attachment C provides contact information for the persons and federal entities consulted in the drafting and refinement of the baseline survey instrument.

A3. Use of Improved Information Technology and Burden Reduction

The data collection plan reflects sensitivity to issues of efficiency, accuracy, and respondent burden. Where feasible, information will be gathered from existing data sources; the information being requested through surveys is limited to that for which the youth are the best or only source. Improved information technology will be used when appropriate and cost-effective (for example, automated transfer techniques for program and school records). During the baseline data collection, self-administered paper-and-pencil instruments (PAPIs) will be used for all group-based completions. In those instances in which the survey must be administered to individuals, respondents will be provided a PIN/password for web completion or will be administered a telephone survey. The advantages of PAPI over more technologically innovative approaches, such as laptops or personal digital assistants (PDAs), are that it enables respondents to set their own pace; elicits accurate responses to sensitive questions; reduces costs; and simplifies administration logistics, as the majority of interviews will be conducted in a classroom setting. This method is also consistent with other recent youth surveys and evaluations. Studies have shown no difference between PAPI and computer-assisted self-interviewing (CASI) in reports of most measures of male-female sexual activity, including ever having had sexual intercourse, recent sexual activity, number of partners, condom use, and pregnancy.6,7,8,9,10,11 Turner et al.6 found that CASI improved reporting of low-prevalence behaviors such as male-male sex, injection drug use, and sexual contact with intravenous drug users.

A4. Efforts to Identify Duplication and Use of Similar Information

The information collection requirements for the PPA evaluation have been carefully reviewed to determine what information is already available from existing studies and what will need to be collected for the first time. Although the information from existing studies provides value to our understanding of reducing teenage sexual risk behavior, ACF does not believe that it provides sufficient information on a sufficient range of programs to policymakers and stakeholders aiming to reduce this behavior. The data collection for the PPA evaluation is an essential step to providing this information.

A5. Impact on Small Businesses or Other Small Entities

Programs in some sites may be operated by community-based organizations. The data collection plan is designed to minimize burden on such sites by providing staff from Mathematica Policy Research to assist in group data collection. For respondents who do not complete the survey in the group setting, Mathematica will provide passwords for web completion or will conduct a telephone data collection, thus minimizing requirements for extensive “sample pursuit” by site staff.

A6. Consequences of Collecting Information Less Frequently

Baseline data are essential to conducting a rigorous evaluation of pregnancy prevention programs, as called for in the appropriations. In the absence of such data, funding decisions on teen pregnancy prevention programs would continue to be based on insufficient and outdated information about program effectiveness.

A7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

There are no special circumstances for the proposed data collection.

A8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

The 60-day notice was published in the Federal Register on November 9, 2009; the text is provided in Attachment D. To date, no comments or questions have been received.

In Attachment C we provide the names and contact information of the persons consulted in the drafting and refinement of the baseline survey instrument, a list of institutions from which we received input on drafts of the instrument, and a list of members of the Technical Work Group for the evaluation who provided comments on a near-final draft instrument.

A9. Explanation of Any Payment or Gift to Respondents

No payment or gift to youth respondents will be made during the baseline interview.

A10. Assurance of Confidentiality Provided to Respondents

ACF has embedded protections for privacy in the study design. Data collection will occur only if informed consent is provided by a parent or legal guardian if the respondent is a minor, or by respondents themselves if they are 18 or older. The consent form will explain what data are being collected and how they will be used. The form will also state that answers will be kept private, that youths’ participation is voluntary, and that they may refuse to participate at any time. Participants and their parents/guardians will be told that, to the extent allowable by law, individual identifying information will not be released or published; rather, data will be published only in summary form, with no identifying information at the individual level. In addition, our protocol during the self-administration of the paper-and-pencil instrument will provide reassurance that we take the issue of privacy seriously. It will be made clear to respondents that identifying information will be kept separate from questionnaires. The questionnaire and envelope will have a label with a unique ID number; no identifying information will appear on the questionnaire or return envelope. Before turning completed questionnaires in to field staff, respondents will place them in envelopes and seal them. This approach has been shown in research to yield the same reports of sexual activity as computer-assisted surveys in school settings, with a lower incidence of student concerns about privacy. Identifying and contact information will be stored in secure files, separate from survey and other individual-level data.

A copy of the parental consent form for the program participants is presented in Attachment E, and a copy of the student assent form in Attachment F.

A11. Justification for Sensitive Questions

Many of the measures in the baseline survey ask for information of a sensitive nature (Exhibit A11.1 – full references for sources cited in this table may be found at the back of Supporting Statement A), because the programs we will be evaluating are designed specifically to reduce sexual activity and associated risk behaviors among teens. Comprehensive measures of behavior are included because they will provide more accurate representations of teen sexual behavior, and the responses will significantly supplement the knowledge currently available on program effectiveness.

Exhibit A11.1: Summary of Sensitive Questions and their Justification

Topic: Intentions regarding sexual activity (questions 3.12–3.14)
Justification: Intentions regarding engaging in sex and other risk-taking behaviors are extremely strong predictors of subsequent behavior (Buhi and Goodson, 2007). Intentions are strongly related to behavior and will be an important mediator predicting behavior change.

Topic: Sexual activity (questions 3.17, 4.1–4.23)
Justification: Sexual activity is an important outcome for the evaluation, and sexual activity at baseline is a powerful predictor of later outcomes. Having these data at baseline increases the precision of our estimates of impacts on sexual activity at follow-up.

Topic: Drug and alcohol use (questions 5.1–5.12)
Justification: There is a substantial body of literature linking various high-risk behaviors of youth, particularly drug and alcohol use, sexual intercourse, and risky sexual behavior. The effectiveness of various program strategies is expected to differ for youth who are and are not experimenting with or using drugs and alcohol (Tapert et al., 2001; Li et al., 2001; Boyer et al., 1999; Fergusson and Lynskey, 1996; Sen, 2002; Dermen et al., 1998; Santelli et al., 2001).

Topic: Sexual orientation (question 6.5)
Justification: There is mixed evidence linking reported sexual orientation with early or late sexual initiation, risky behavior, and number of partners (Blake et al., 2001; Goodenow et al., 2002; Resnick et al., 1997; Magura et al., 1994; Raj et al., 2000). Nevertheless, we expect to control for baseline differences in this measure given its potential importance across an eight-site study. In addition, for interventions that focus particular attention on gay, lesbian, and bisexual youth, we will use this measure to estimate impacts separately for this subgroup.

Note: Full references for sources cited in this exhibit may be found at the end of Supporting Statement A.



Sensitive questions are drawn from previously successful youth surveys and evaluations (see Attachment B). The items have been carefully selected, and we have been guided by past experience in determining whether the benefits of these measures outweigh concerns about the heightened sensitivity of sample members, parents, and program staff to specific issues. Although these questions are sensitive, they are commonly and successfully asked of youth similar to those who will be in the study, and we have pretested all of these survey questions with a diverse group of teens without any concerns being raised about the questions’ sensitivity. Many of the sensitive items related to sexual activity will be asked only of sample members who report being sexually active.

A12. Estimates of Annualized Burden Hours and Costs

The PPA information collection does not impose a financial burden on youth respondents. Respondents will not incur any burden other than the time spent answering the questions contained in the questionnaires.

Exhibit A12.1 summarizes the reporting burden on study participants. Enrollment will occur over three years, so this burden is based on one-third (3,600) of the expected sample (10,800). Questionnaire response times were estimated from pretests with student respondents and from prior experience. The annual burden for questionnaire response is estimated from the total number of completed questionnaires proposed and the time required to complete the questionnaires. The total annual burden is expected to be 1,864 hours.

The total annual burden cost for the school records, performance, and program participation collection is calculated by multiplying the hourly mean wage of $15.57 (per the latest – May 2008 – National Occupational Employment and Wage Estimates, Bureau of Labor Statistics, Department of Labor website) times eight hours times eight entities (six school districts plus two other organizations). This calculation thus assumes that each district or organization assigns a person as liaison, and each district or organization is asked to provide data on the sample on three occasions, but only once per year.

Exhibit A12.1. Reporting Burden on Study Participants

Baseline Instrument
  Annual number of respondents: 3,600
  Number of responses per respondent: 1
  Average burden hours per response: 0.5
  Total annual burden hours: 1,800
  Average hourly wage of respondents: $0
  Total annual burden cost: $0

School Records, Performance, and Program Participation Data Collection
  Annual number of respondents: 8
  Number of responses per respondent: 1
  Average burden hours per response: 8.0
  Total annual burden hours: 64
  Average hourly wage of respondents: $15.57
  Total annual burden cost: $996.48

Total
  Total annual burden hours: 1,864
  Total annual burden cost: $996.48
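For transparency, the short calculation below simply re-derives the burden figures in Exhibit A12.1 from the assumptions stated above. It is illustrative Python arithmetic only and is not part of the information collection.

```python
# Recomputing the figures in Exhibit A12.1 from the assumptions stated above;
# included only as a transparency cross-check.
annual_respondents = 3_600        # one-third of the 10,800-youth sample enrolled per year
hours_per_survey = 0.5            # estimated from pretests
baseline_hours = annual_respondents * hours_per_survey        # 1,800 hours

entities = 8                      # six school districts plus two other organizations
hours_per_entity = 8.0
records_hours = entities * hours_per_entity                   # 64 hours
hourly_wage = 15.57               # May 2008 BLS national mean hourly wage
records_cost = records_hours * hourly_wage                    # $996.48

total_annual_hours = baseline_hours + records_hours           # 1,864 hours
print(baseline_hours, records_hours, total_annual_hours, round(records_cost, 2))
```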

A13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

These information collection activities do not place any additional cost on respondents.

A14. Annualized Cost to the Federal Government

This clearance request is specifically for collecting data at baseline. The total estimated cost to the government is $1,676,325 for instrument development and data collection. Because baseline data collection will be carried out over three years, as successive sites start up and enroll samples, the estimated annualized cost to the government for baseline data collection is $558,775 per year.

A15. Explanation for Program Changes or Adjustments

No program adjustments are anticipated based on this data collection.

OMB gave approval on November 24, 2008, for outreach discussions with stakeholders, experts in the field, and program developers (OMB Control No. 0970-0360). OMB also gave approval on August 31, 2009 under a generic clearance (0970-0355) to conduct pre-tests of the baseline instrument. ACF’s contractor, Mathematica Policy Research, Inc., conducted the pre-test and took the results into account – as well as advice from experts in the field – in redrafting the instrument.

ACF now seeks OMB approval for the baseline survey and collection of youth school records, performance, and program participation data. These will take place over three years, as successive sites start evaluation sample enrollment and implement their programs. The data will be used for the impact analysis. Approval for follow-up surveys will be requested in a subsequent submission, as will data collection instruments for the process evaluation.

A16. Plans for Tabulation and Publication and Project Time Schedule

1. Analysis Plan

This phase of the PPA demonstration and evaluation involves collecting baseline information that will be used for the impact evaluation during the follow-up data collection.

Before estimating impacts, ACF will conduct two analyses of the data from the baseline survey. First, ACF will use the data to describe the study sample and help define subgroups of policy interest. This step will enable ACF to compare the characteristics of youth in the study with youth nationwide and provide guidance on how the study sample and findings might generalize to a broader policy setting. Second, ACF will assess whether random assignment resulted in similar baseline characteristics of youth, on average, for the treatment and control groups.
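For illustration only, the sketch below shows one way such a baseline-equivalence check could be carried out in Python, comparing treatment and control means with two-sample t-tests on simulated data. The variable names and values are hypothetical and do not reflect the actual baseline file.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Hypothetical baseline file: a treatment indicator plus a few illustrative
# baseline measures (these are not the actual PPA variables).
rng = np.random.default_rng(0)
n = 1_000
df = pd.DataFrame({
    "treatment": rng.integers(0, 2, n),
    "age": rng.integers(12, 18, n).astype(float),
    "ever_had_sex": rng.integers(0, 2, n).astype(float),
})

# Compare treatment and control means for each baseline characteristic; under
# successful random assignment the differences should be small and statistically
# insignificant.
for col in ["age", "ever_had_sex"]:
    t_group = df.loc[df["treatment"] == 1, col]
    c_group = df.loc[df["treatment"] == 0, col]
    t_stat, p_value = stats.ttest_ind(t_group, c_group)
    print(f"{col}: difference = {t_group.mean() - c_group.mean():.3f}, p = {p_value:.3f}")
```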

Pregnancy prevention approaches emphasize different outcomes. Some focus on promoting abstinence; others focus on use of contraceptives and avoiding STDs. The baseline data collected from program participants will ultimately be used to evaluate the effectiveness of these promising approaches with particular emphasis on the outcomes they target, as well as common outcomes across all approaches.

Unbiased impact estimates can be obtained from the difference in the mean outcomes between the treatment and control groups. However, we can improve precision by controlling in our regression model for covariates, especially baseline measures of outcomes. Regression adjustment can also address any differences between the treatment and control groups in baseline characteristics that arose by chance or from survey nonresponse.

The empirical specification for the model will depend on the unit of random assignment, which will depend on the type of program provided at a specific site. As we discuss further in section B1, most sites will use random assignment of entire schools, but some sites will employ random assignment of individuals within the site. With random assignment of students, our model can be expressed as:

(1) y_i = α + β′x_i + δT_i + ε_i,

where y_i is the outcome of interest for student i; x_i is a vector of baseline characteristics for student i, including baseline measures of the key outcomes; T_i is an indicator equal to one if the student is in the treatment group and zero if in the control group; and ε_i is a random error term for student i. The vector of baseline characteristics x_i will include demographic characteristics such as age, gender, and race/ethnicity, as well as baseline measures of key outcomes. The parameter estimate for δ is the estimated impact of the program.
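For illustration only, the sketch below shows how a model of the form in Equation (1) could be estimated in Python with the statsmodels package on simulated data. The variable names, data values, and software choice are illustrative assumptions and are not part of the PPA design.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data for a site with individual-level random assignment;
# variable names and values are illustrative only.
rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "treatment": rng.integers(0, 2, n),          # T_i: 1 = program group, 0 = control
    "age": rng.integers(12, 18, n),              # examples of baseline covariates (x_i)
    "female": rng.integers(0, 2, n),
    "baseline_outcome": rng.normal(0, 1, n),     # baseline measure of the key outcome
})
# Simulated follow-up outcome with a modest treatment effect, for illustration.
df["outcome"] = (0.3 * df["baseline_outcome"] + 0.10 * df["treatment"]
                 + rng.normal(0, 1, n))

# Equation (1): regress the follow-up outcome on the treatment indicator and
# baseline covariates; the coefficient on "treatment" is the impact estimate (delta).
model = smf.ols("outcome ~ treatment + age + female + baseline_outcome", data=df).fit()
print(model.params["treatment"], model.bse["treatment"])
```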

In most sites, schools will be randomly assigned and the estimation must account for the correlation of outcomes between students in the same school, as they may be exposed to similar influences not otherwise captured in the regression model. Therefore, each student cannot be considered statistically independent. We can modify the previous regression model as:

(2) y_is = α + β′x_is + δT_s + μ_s + ε_is.

The general structure of the model is the same, but now y_is is the outcome measure for student i in school s (and similarly for the vector of baseline characteristics x_is and the error term ε_is). The treatment status T_s is now defined by school rather than by individual. Most importantly, the error term in Equation (2) accounts for the clustering of students within schools through the inclusion of the school-level error term μ_s (a school "random effect"). If this term is excluded, the precision of the impact estimates could be seriously overstated. As in Equation (1), the estimated impact of the program is δ.
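A companion sketch, again using simulated data and illustrative variable names, shows how Equation (2) could be estimated with a school random intercept using the statsmodels mixed-model routine.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data for a site in which whole schools are randomly assigned;
# school identifiers, variable names, and values are illustrative only.
rng = np.random.default_rng(2)
n_schools, per_school = 20, 50
school_id = np.repeat(np.arange(n_schools), per_school)
school_treatment = rng.integers(0, 2, n_schools)     # T_s: school-level assignment
school_effect = rng.normal(0, 0.5, n_schools)        # mu_s: school random effect
df = pd.DataFrame({
    "school_id": school_id,
    "treatment": school_treatment[school_id],
    "baseline_outcome": rng.normal(0, 1, n_schools * per_school),
})
df["outcome"] = (0.3 * df["baseline_outcome"] + 0.10 * df["treatment"]
                 + school_effect[school_id]
                 + rng.normal(0, 1, len(df)))

# Equation (2): a mixed model with a school random intercept; omitting the school
# effect would overstate the precision of the impact estimate.
model = smf.mixedlm("outcome ~ treatment + baseline_outcome",
                    data=df, groups=df["school_id"]).fit()
print(model.params["treatment"], model.bse["treatment"])
```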

The specific maximum-likelihood methods for estimating the parameters of the models will depend on the form of the dependent variable. Logistic regression procedures will be specified for binary outcomes (such as whether the student has an STD) and multinomial regression procedures will be specified for categorical outcomes (such as the number of sexual partners).
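The sketch below illustrates, with hypothetical data and variable names, how the binary-outcome case could be estimated with a logistic regression in statsmodels; smf.mnlogit() is the analogous formula-interface call for a categorical outcome.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical binary outcome (for example, an indicator for reporting an STD);
# names and simulated values are illustrative only.
rng = np.random.default_rng(3)
n = 800
df = pd.DataFrame({
    "treatment": rng.integers(0, 2, n),
    "baseline_risk": rng.normal(0, 1, n),
})
logit_p = -1.0 + 0.5 * df["baseline_risk"] - 0.3 * df["treatment"]
df["std_indicator"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

# Logistic regression for a binary outcome; the treatment coefficient is the
# estimated impact on the log-odds scale.
model = smf.logit("std_indicator ~ treatment + baseline_risk", data=df).fit(disp=False)
print(model.params["treatment"])
```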

Random assignment provides an unbiased estimate of the impact on all eligible youth, but some youth may never show up for services or classes. Assuming the program has no effect on youth who never show up, we can make a simple adjustment to calculate the impact on participants by dividing the impact on eligible youth by the participation rate. (However, this adjustment cannot be used in the more likely scenario that youth receive some, but not all, of the intervention.)
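A short numerical illustration of this adjustment, using hypothetical figures, follows.

```python
# Hypothetical numbers for illustration: an impact estimated on all eligible youth
# (the intent-to-treat estimate) and the share of assigned youth who ever attended.
impact_on_eligible = -0.040     # e.g., a 4 percentage point reduction in the outcome
participation_rate = 0.80       # 80 percent of assigned youth ever showed up

# Assuming the program has no effect on youth who never show up, the impact on
# participants is the impact on eligible youth divided by the participation rate.
impact_on_participants = impact_on_eligible / participation_rate
print(impact_on_participants)   # -0.05, i.e., a 5 percentage point reduction
```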

The effects of pregnancy prevention approaches may differ across groups of youth. We will estimate impacts for subgroups of youth by adding to Equations (1) and (2) a term that interacts the treatment indicator with a binary indicator of whether the youth is in the subgroup. The estimated coefficient on this interaction term provides an estimate of the difference in the program effect between the subgroups.
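The sketch below illustrates this subgroup specification with simulated data and a hypothetical subgroup indicator; all names and values are illustrative only.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data with a binary subgroup indicator (for example, whether the
# youth was sexually active at baseline); names and values are illustrative only.
rng = np.random.default_rng(4)
n = 600
df = pd.DataFrame({
    "treatment": rng.integers(0, 2, n),
    "subgroup": rng.integers(0, 2, n),
    "baseline_outcome": rng.normal(0, 1, n),
})
df["outcome"] = (0.3 * df["baseline_outcome"] + 0.10 * df["treatment"]
                 + 0.15 * df["treatment"] * df["subgroup"]
                 + rng.normal(0, 1, n))

# The treatment-by-subgroup interaction term; its coefficient estimates how the
# program effect differs between youth inside and outside the subgroup.
model = smf.ols("outcome ~ treatment * subgroup + baseline_outcome", data=df).fit()
print(model.params["treatment:subgroup"])
```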

2. Time Schedule and Publications

The entire PPA evaluation will be conducted over an eight-year period. ACF began consultation with stakeholders about the design of the study and identification of potential programs and sites in September 2008 and will continue through March 2011. The baseline data collection, for which ACF is currently seeking OMB approval, will take place over a three-year period beginning in September 2010 and ending by May 2013. The 12-month and 36-month follow-up data collections are projected to occur between May 2011 and May 2015. The process evaluation will take place between fall 2010 and spring 2013. No formal publications are planned from the baseline information collection.

A17. Reason(s) Display of OMB Expiration Date Is Inappropriate

All instruments will display the OMB number and the expiration date.

A18. Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary for this information collection.

















SUPPORTING REFERENCES FOR INCLUSION OF SENSITIVE QUESTIONS OR GROUPS OF QUESTIONS

Blake, Susan M., Rebecca Ledsky, Thomas Lehman, Carol Goodenow, Richard Sawyer, and Tim Hack. "Preventing Sexual Risk Behaviors among Gay, Lesbian, and Bisexual Adolescents: The Benefits of Gay-Sensitive HIV Instruction in Schools." American Journal of Public Health, vol. 91, no. 6, 2001, pp. 940-46.

Boyer, Cherrie B., Jeanne M. Tschann, and Mary-Ann Shafer. "Predictors of Risk for Sexually Transmitted Diseases in Ninth Grade Urban High School Students." Journal of Adolescent Research, vol. 14, no. 4, 1999, pp. 448-65.

Buhi, Eric R. and Patricia Goodson. "Predictors of Adolescent Sexual Behavior and Intention: A Theory-Guided Systematic Review." Journal of Adolescent Health : Official Publication of the Society for Adolescent Medicine., vol. 40, no. 1, 2007, pp. 4.

Dermen, K. H., M. L. Cooper, and V. B. Agocha. "Sex-Related Alcohol Expectancies as Moderators of the Relationship between Alcohol use and Risky Sex in Adolescents." Journal of Studies on Alcohol., vol. 59, no. 1, 1998, pp. 71.

DiClemente RJ, Durbin M, Siegel D, Krasnovsky F, Lazarus N, and Comacho T. "Determinants of Condom use among Junior High School Students in a Minority, Inner-City School District." Pediatrics, vol. 89, no. 2, 1992, pp. 197-202.

DiClemente RJ, Lodico M, Grinstead OA, Harper G, Rickman RL, Evans PE, and Coates TJ. "African-American Adolescents Residing in High-Risk Urban Environments do use Condoms: Correlates and Predictors of Condom use among Adolescents in Public Housing Developments." Pediatrics, vol. 98, no. 2, 1996, pp. 269-78.

DiIorio, Colleen, William N. Dudley, Johanna E. Soet, and Frances Mccarty. "Sexual Possibility Situations and Sexual Behaviors among Young Adolescents: The Moderating Role of Protective Factors." Journal of Adolescent Health : Official Publication of the Society for Adolescent Medicine., vol. 35, no. 6, 2004, pp. 528.

Dittus PJ and Jaccard J. "Adolescents' Perceptions of Maternal Disapproval of Sex: Relationship to Sexual Outcomes." The Journal of Adolescent Health : Official Publication of the Society for Adolescent Medicine, vol. 26, no. 4, 2000, pp. 268-78.

Fergusson, David M. and Michael T. Lynskey. "Alcohol Misuse and Adolescent Sexual Behaviors and Risk Taking." Pediatrics, vol. 98, no. 1, 1996, pp. 91.

Goodenow, C., J. Netherland, and L. Szalacha. "AIDS-Related Risk among Adolescent Males Who have Sex with Males, Females, Or both: Evidence from a Statewide Survey." American Journal of Public Health, vol. 92, 2002, pp. 203-210.

Li, Xiaoming, Bonita Stanton, Lesley Cottrell, James Burns, Robert Pack, and Linda Kaljee. "Patterns of Initiation of Sex and Drug-Related Activities among Urban Low-Income African-American Adolescents." Journal of Adolescent Health : Official Publication of the Society for Adolescent Medicine., vol. 28, no. 1, 2001, pp. 46.

Magura, S., J. L. Shapiro, and S. Kang. "Condom use among Criminally-Involved Adolescents." AIDS Care, vol. 6, no. 5, 1994, pp. 595.

Raj, Anita, Jay G. Silverman, and Hortensia Amaro. "The Relationship between Sexual Abuse and Sexual Risk among High School Students: Findings from the 1997 Massachusetts Youth Risk Behavior Survey." Maternal and Child Health Journal, vol. 4, no. 2, 2000, pp. 125-134.

Resnick, M. D., P. S. Bearman, R. W. Blum, K. E. Bauman, K. M. Harris, J. Jones, J. Tabor, T. Beuhring, R. Sieving, M. Shew, L. H. Bearinger, and J. R. Udry. "Protecting Adolescents from Harm: Findings from the National Longitudinal Study on Adolescent Health." JAMA : The Journal of the American Medical Association., vol. 278, no. 10, 1997, pp. 823.

Santelli, John S., Leah Robin, Nancy D. Brener, and Richard Lowry. "Timing of Alcohol and Other Drug use and Sexual Risk Behaviors among Unmarried Adolescents and Young Adults." Family Planning Perspectives, vol. 33, no. 5, 2001.

Sen, Bisakha. "Does Alcohol-use Increase the Risk of Sexual Intercourse among Adolescents? Evidence from the NLSY97." Journal of Health Economics., vol. 21, no. 6, 2002, pp. 1085.

Tapert, Susan F., Gregory A. Aarons, Georganna R. Sedlar, and Sandra A. Brown. "Adolescent Substance use and Sexual Risk-Taking Behavior." Journal of Adolescent Health : Official Publication of the Society for Adolescent Medicine., vol. 28, no. 3, 2001, pp. 181.




1 Abma, J. C., G. M. Martinez, W. D. Mosher, and B. S. Dawson. “Teenagers in the United States: sexual activity, contraceptive use, and childbearing”, Vital and Health Statistics, vol. 23, no. 24, 2004, pp. 1–48.

2 Albert, B., S. Brown, and C. Flannigan, eds. 14 and Younger: The Sexual Behavior of Young Adolescents. Washington, DC: National Campaign to Prevent Teen Pregnancy, 2003.

3 Teen birth rates declined by 34% from 1991–2005. See: Hamilton, B. E., J. A. Martin, and S. J. Ventura. “Births: Preliminary data for 2006.” National Vital Statistics Reports, vol. 56, no. 7. Hyattsville, MD: National Center for Health Statistics, 2007.

4 Hamilton BE, Martin JA, Ventura SJ. Births: Preliminary data for 2007. National vital statistics reports, Web release; vol 57 no 12. Hyattsville, MD: National Center for Health Statistics. Released March 18, 2009.

5 In order to best fit the proposed PAPI survey mode for the targeted age range, nearly all proposed survey items were adapted, to some degree, from those found on these national surveys. Adaptations included modifications in the wording to make questions easier to understand in PAPI administration, and/or modifications in response categories to simplify the options available, or to address more directly the main goal of the baseline survey, which is to support an eventual impact evaluation.

6 Turner, C.F., L. Ku, S.M. Rogers, L.D. Lindberg, J.H. Pleck, and F.L. Sonenstein. “Adolescent Sexual Behavior, Drug Use, and Violence: Increased Reporting with Computer Survey Technology.” Science, vol. 280, 1998, pp. 867–873.

7 Beebe, Timothy J., Patricia A. Harrison, James A. McCrae Jr., Ronald E. Anderson, and Jayne A. Fulkerson. “An Evaluation of Computer-Assisted Self-Interviews in a School Setting.” Public Opinion Quarterly, vol. 62, 1998, pp. 623–632.

8 Beebe, Timothy J., Patricia A. Harrison, Eunkyung Park, James A. McRae, Jr., and James Evans. “The Effects of Data Collection Mode and Disclosure on Adolescent Reporting of Health Behavior.” Social Science Computer Review, vol. 24, no. 4, 2006, pp. 476–488.

9 Brener, Nancy D., Danice K. Eaton, Laura Kann, JoAnne Grunbaum, Lori A. Gross, Tonja M. Kyle, and James G. Ross. “The Association of Survey Setting and Mode with Self-Reported Health Risk Behaviors Among High School Students.” Public Opinion Quarterly, vol. 70, 2006, pp. 354–374.

10 Webb, P.M., G.D. Zimet, J.D. Fortenberry, and M.J. Blythe. “Comparability of a Computer-Assisted Versus Written Method for Collecting Health Behavior Information from Adolescent Patients.” Journal of Adolescent Health, vol. 24, no. 6, 1999, pp. 383–388.

11 Schochet, Peter Z. “An Approach for Addressing the Multiple Testing Problem in Social Policy Impact Evaluations.” Evaluation Review, vol.33, no.6, December 2009.

