Supporting Justification for OMB Clearance of Evaluation of Pregnancy Prevention Approaches (OMB Control #0990-xxx)

Part A: Justification for the Collection of Implementation Data



October 2010



CONTENTS

A1. Circumstances Making the Collection of Information Necessary

A2. Purpose and Use of the Information Collection

A3. Use of Improved Information Technology and Burden Reduction

A4. Efforts to Identify Duplication and Use of Similar Information

A5. Impact on Small Businesses or Other Small Entities

A6. Consequences of Collecting Information Less Frequently

A7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

A8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

A9. Explanation of Any Payment or Gift to Respondents

A10. Assurance of Confidentiality Provided to Respondents

A11. Justification for Sensitive Questions

A12. Estimates of Annualized Burden Hours and Costs

A13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

A14. Annualized Cost to the Federal Government

A15. Explanation for Program Changes or Adjustments

A16. Plans for Tabulation and Publication and Project Time Schedule

1. Analysis Plan

2. Time Schedule and Publications

A17. Reason(s) Display of OMB Expiration Date Is Inappropriate

A18. Exceptions to Certification for Paperwork Reduction Act Submissions



The Office of Adolescent Health (OAH) in the Office of the Assistant Secretary for Health (OASH), U.S. Department of Health and Human Services (HHS), in collaboration with HHS's Administration for Children & Families (ACF), is conducting the Evaluation of Adolescent Pregnancy Prevention Approaches (PPA), an eight-year demonstration designed to study the effectiveness of promising, policy-relevant strategies to reduce teen pregnancy. We now seek OMB approval for implementation study data collection. OMB has previously provided clearance for informal interviews with stakeholders, the first phase of the project (control number 0970-0360).

A1. Circumstances Making the Collection of Information Necessary

For decades, policymakers and the general public have remained concerned about the prevalence of sexual intercourse among adolescents. Although adolescents today are waiting somewhat longer before having sex than they did in the 1990s, 60 percent of teenage girls and more than 50 percent of teenage boys report having had sexual intercourse by their 18th birthday.1 Approximately one in five adolescents has had sexual intercourse before turning 15.2 Rates of teenage pregnancy declined by 38 percent from 1990 to 2004, and the rate of teen births followed a similar decline3 until recently, when the birth rate for teens aged 15–19 rose by 5 percent from 2005 to 2007.4

The Office of Adolescent Health (OAH) and the Administration for Children & Families (ACF) are interested in identifying and evaluating promising approaches to reduce teen pregnancy, associated risk behaviors, and their consequences. The implementation study data collection described in this ICR, combined with baseline and follow-up data collections, will provide important information to guide policy decisions aimed at addressing this serious concern.

This ICR specifically requests clearance to collect information in the following ways:

  • Interviews with program staff, and with community members where programs are implemented, using the Master Topic Guide (see Attachment E);

  • Focus groups with front-line staff, using a Discussion Guide (see Attachment F);

  • Focus groups with youths in both the program and control conditions, using a Discussion Guide (see Attachment G); and

  • Interviews with program staff and community members in the control condition, when appropriate, using a topic guide (see Attachment H).



Legal or Administrative Requirements that Necessitate the Collection

Public Law 110-161, which set fiscal year (FY) 2008 appropriations levels, included the following language: “$4,500,000 shall be available from amounts available under section 241 of the Public Health Service Act to carry out evaluations (including longitudinal evaluations) of adolescent pregnancy prevention approaches.” The same language appropriated $4,450,000 in each of FYs 2009 and 2010. These funds have been used for the PPA evaluation.

To help accomplish the objective of the appropriations, OAH and ACF seek OMB approval of the implementation study protocols.

Study Objectives

The objective of the PPA evaluation is to test selected promising approaches to prevent teen pregnancy among middle school- and high school-aged teens. The evaluation will help OAH and ACF determine the effectiveness of various approaches in affecting key outcomes related to pregnancy prevention (for example, sexual debut, pregnancy, and sexually transmitted disease [STD] infection). Ultimately, the purpose of the evaluation is to provide stakeholders—including practitioners and federal and other policymakers—with information on approaches that hold promise for preventing teen pregnancy, and the effectiveness of these approaches.

In the PPA evaluation, OAH and ACF will identify eight study sites (locations) that will implement different pregnancy prevention approaches. In approximately six of these sites, the programs to be tested are expected to be school-based, operated, for example, in high schools or middle schools. In the other sites, the programs to be tested will be operated in or by community-based organizations (CBOs). The study will enroll a sample of approximately 10,800 teens across these eight sites, a size sufficient to detect policy-relevant impacts of the programs. In each site, youth will be assigned either to a treatment group that receives the program of interest or to a control group that does not. To ensure that the behavior of control group youth is not affected, or "contaminated," by interaction with treatment group youth attending the same school or CBO program, random assignment will generally be done at the organization level (that is, the school or CBO). However, at some sites, random assignment might be done at the individual level, where risks of contamination are low.
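To make the cluster design concrete, below is a minimal sketch of organization-level random assignment in Python. It is an illustration under simplifying assumptions (a single site, a simple 50/50 split, no blocking or matching); the function name and details are hypothetical and do not represent the study's actual assignment procedure.

```python
import random

def assign_organizations(organizations, seed=2010):
    """Hypothetical cluster random assignment: whole organizations
    (schools or CBOs) are assigned to a condition, so all youths in an
    organization share it and contact across conditions is limited."""
    rng = random.Random(seed)  # fixed seed keeps the draw reproducible
    shuffled = list(organizations)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2  # simple 50/50 split; real designs may block or match
    return {org: ("treatment" if i < half else "control")
            for i, org in enumerate(shuffled)}

# Example: four organizations in one illustrative site
print(assign_organizations(["School A", "School B", "CBO C", "CBO D"]))
```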

OAH and ACF are interested in evaluating fairly intensive programs and strategies that can reasonably be expected to produce change. Some programs may thus involve participants over an extended period (for example, curricula covering one or more semesters, sequenced courses provided during different years in high school, or year-long community programs).

Major evaluation activities include the following:

  • Identifying promising strategies and programs to focus the evaluation on interventions of substantial interest to the field that show promise for reducing rates of teen sexual activity and pregnancy.

  • Recruiting sites to participate in an evaluation of selected interventions and providing assistance to sites on evaluation support activities.

  • Collecting data on the research sample at baseline (the focus of a previous OMB submission) and at two follow-up data collections, tentatively scheduled to occur approximately 12 and 36 months after sample members are enrolled.

  • Collecting information on program implementation during the evaluation period – the focus of this OMB submission – from program records and site visits at two points in the program implementation period.

  • Analyzing data collected and preparing reports with the results.

Through the implementation study, OAH and ACF will address four main objectives. First, the study will help us understand how each program is intended to operate in the participating sites and how it is expected to affect youths. What is the plan for each intervention’s implementation? How is the program expected to work?

Second, the study will document the implementation of each program. How was each program actually delivered? What services and activities were offered, how were they carried out, and to what extent did youths participate and become engaged in them? In what context were these services and activities provided? How did these services and activities differ from those of other similar programs in the community?

The third objective is to assess the extent to which program implementation adhered to the program model and site implementation plans. Was each program implemented with fidelity to the developer’s intentions and the site’s implementation plans? Was the quality of program delivery good, to the extent that quality can be assessed?

Finally, the implementation study will describe the contrast between the program as implemented and the “business as usual” counterfactual. How were the activities and services provided by the program similar to and different from those available to control group youths? How did the experiences of program group youths differ from those of control group youths?

Understanding the programs, documenting their implementation and context, and assessing fidelity of implementation will enable us to describe each implemented program and the treatment-control contrast evaluated in each site. This information will help us interpret impact analysis findings and may help explain any unexpected findings, differences in impacts across programs, and differences in impacts across locations or population subgroups.

OAH and ACF are conducting this evaluation through a lead contractor, Mathematica Policy Research, Inc., and its subcontractors: Child Trends, the National Campaign to Prevent Teen and Unplanned Pregnancy, the National Abstinence Education Association, Public Strategies, Inc., and Twin Peaks Partners, LLC.

A2. Purpose and Use of the Information Collection

If this request is approved, the PPA evaluation will collect data on program implementation. This will include information about each site's program design and theory of change, program administration and funding, resources required to implement the program, key program activities and features, dimensions of program delivery and youth participation, adaptations of the programs to fit the context, and fidelity to the curriculum or program guidelines and site plans. Information on these topics will be obtained from existing program documents; from individual and group interviews with program developers, program leaders and staff, participating youths, school representatives, program partners, and other community members knowledgeable about related services for adolescents; and from observation of program activities. Attachment A lists the topics on which information will be collected and the planned sources of information for each topic.

The data will serve two main purposes. First, the information will enable the study team to produce clear, detailed descriptions of each intervention that is evaluated and the counterfactual in each site. This documentation is critical for understanding the meaning of impact estimates. Second, the data will be used to assess fidelity of implementation. This information is essential for determining whether the interventions were implemented well and whether the evaluation provided a good test of each site’s intervention.

A3. Use of Improved Information Technology and Burden Reduction

The data collection plan reflects sensitivity to issues of efficiency, accuracy, and respondent burden. Where feasible, information will be gathered by extracting needed information from existing documents. Protocols for interviews and group discussions during site visits will be customized for each site to focus on information that is relevant for that site and that could not be obtained from documents.

Improved information technology will be used when appropriate. For example, when program documents can be sent electronically, we will not request a hard copy of the documents.

A4. Efforts to Identify Duplication and Use of Similar Information

The information collection requirements for the PPA evaluation have been carefully reviewed to determine what information is already available from existing studies and what will need to be collected for the first time. Although prior studies contribute to our understanding of teenage sexual risk behavior and past efforts to reduce it, OAH and ACF do not believe they provide policymakers and stakeholders with sufficient information on a sufficiently broad range of programs. Furthermore, Congress requires evaluations, including longitudinal evaluations, of adolescent pregnancy prevention approaches. The data collection for the PPA evaluation is an essential step in providing this information.

A5. Impact on Small Businesses or Other Small Entities

Programs in some sites may be operated by or in collaboration with small community-based organizations. The implementation data collection plan is designed to minimize burden on such organizations by focusing interviews with their staff on their direct role in the intervention and its development or planning.

A6. Consequences of Collecting Information Less Frequently

Implementation data are essential to conducting a rigorous evaluation of pregnancy prevention programs, as the appropriations language requires. In the absence of such data, the meaning of estimated program impacts may be uncertain, and future funding and operational decisions about teen pregnancy prevention programs would be based on insufficient information about program implementation issues.

Collecting implementation data less frequently would make it impossible to assess fidelity to program developers’ standards and site implementation plans. Moreover, we would lose the opportunity to document the evolution of site operations during the evaluation and provide lessons based on the experiences of the sites.

A7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

There are no special circumstances for the proposed data collection.

A8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

The 60-day notice was published in the Federal Register on July 12, 2010; the text is found in Attachment B. No comments or questions have been received to date.

In Attachment C, we provide the names and contact information of the persons consulted in drafting and refining the implementation study protocols, a list of institutions from which we received input on drafts of the protocols, and a list of members of the Technical Work Group who attended a review meeting in spring 2010. Plans for the implementation study were presented to this group, and the ensuing discussion contributed to the refinement of the plan presented here.

A9. Explanation of Any Payment or Gift to Respondents

No payment or gift will be made to program staff and community members for being interviewed during site visits. We propose to offer refreshments to staff and youths who participate in focus groups.

A10. Assurance of Confidentiality Provided to Respondents

OAH and ACF have embedded protection of privacy in the study design. A Certificate of Confidentiality has been obtained (as of June 8, 2010; in place through September 30, 2016) from the National Institute of Child Health and Human Development for this study. A description of the implementation component of PPA, for which this package is being submitted, was included in the application for the Certificate of Confidentiality.

Implementation study respondents (program developers, site staff members, and community members) will receive information about privacy protection when arrangements are made for meeting with them, and information about privacy will be repeated as part of the study field staff’s introductory comments during site visits (see Attachment F for an example of these introductory comments). Site visit staff will be informed about privacy procedures during training and will be prepared to describe them and to answer questions raised by local program staff.

Youth who comprise the sample for the PPA study must have parental permission to participate, in addition to providing their own assent for each data collection. The permission form that parents sign when their child is enrolled allows the study team to collect baseline and follow-up data through questionnaires and to invite the child to participate in a focus group to discuss his or her experiences in the program. (This form was approved as part of the Baseline ICR.5) Before completing questionnaires, sample members will also complete an assent form. Youth invited to a focus group as part of the implementation study will be asked at that time to complete another assent form before the focus group is conducted. This form will state that answers will be kept private, that participation is voluntary, that youths may refuse to participate, and that identifying information about them will not be released or published.

The assent form that sample members will be asked to sign before the start of a focus group is included as Attachment D.

A11. Justification for Sensitive Questions

The implementation study protocols do not contain sensitive questions. Interview guides for data collection from staff focus on the components of the pregnancy prevention programs being evaluated and the experiences of staff in implementing them. Focus groups with youth will address their experiences in the program, and not their sexual experiences or other personal behaviors.

A12. Estimates of Annualized Burden Hours and Costs

Exhibit A12.1 summarizes the estimated annual reporting burden on implementation study participants. The burden estimates are based on site visits to eight programs over three years. We expect to conduct two visits to six programs and three visits to two programs; this variation reflects the expectation that some programs will begin implementation earlier than others, allowing more time to chart their development. Interview times were estimated based on prior experience.

Average hourly wages for program staff and community members were estimated from the most recent (May 2008) National Occupational Employment and Wage Estimates, published on the Bureau of Labor Statistics website, U.S. Department of Labor. For youths participating in a focus group discussion, the average hourly wage is assumed to be $0.

The annual burden is estimated from the average total anticipated annual number of respondents, the number of sites, the estimated time required to complete the interviews, and the average hourly wage for respondents. The average total annual burden is expected to be 516 hours, as shown in Exhibit A12.1.



Exhibit A12.1. Annual Reporting Burden on Implementation Study Participants

| Instrument | Annual Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Total Annual Burden Hours | Average Hourly Wage of Respondents | Total Annual Burden Cost |
|---|---|---|---|---|---|---|
| Staff and community member interviews (Master Topic Guide) | 48 | 1 | 1.5 | 72 | $41 | $2,952 |
| Guide for Focus Group Discussion with Frontline Staff | 48 | 1 | 1.5 | 72 | $35 | $2,520 |
| Guide for Focus Group Discussion with Participating Youths | 216 | 1 | 1.5 | 324 | $0 | $0 |
| Guide for Discussion with Control Group Schools about Counterfactual | 48 | 1 | 1 | 48 | $41 | $1,968 |
| Total | | | | 516 | | $7,440 |
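For readers checking the arithmetic, the following minimal Python sketch (with shortened instrument labels) reproduces the totals in Exhibit A12.1: burden hours are respondents times responses per respondent times hours per response, and burden cost is burden hours times the average hourly wage.

```python
# Reproduce the Exhibit A12.1 totals. Rows: (instrument, annual respondents,
# responses per respondent, burden hours per response, average hourly wage).
rows = [
    ("Staff/community interviews",        48, 1, 1.5, 41.0),
    ("Frontline staff focus groups",      48, 1, 1.5, 35.0),
    ("Participating youth focus groups", 216, 1, 1.5,  0.0),
    ("Control group discussions",         48, 1, 1.0, 41.0),
]

total_hours = 0.0
total_cost = 0.0
for name, respondents, responses, hours_per_response, wage in rows:
    burden_hours = respondents * responses * hours_per_response
    total_hours += burden_hours
    total_cost += burden_hours * wage

print(f"Total annual burden hours: {total_hours:.0f}")   # 516
print(f"Total annual burden cost: ${total_cost:,.0f}")   # $7,440
```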



A13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

These information collection activities do not place any additional cost on respondents.

A14. Annualized Cost to the Federal Government

This clearance request is specifically for the implementation study. The total estimated cost to the government for data collection, analysis, and reporting is $888,361. Because the implementation study data collection, analysis, and reporting will be carried out over a period of three years, the estimated annualized cost to the government is $296,120 per year.

A15. Explanation for Program Changes or Adjustments

No program adjustments are anticipated based on this data collection.

OAH and ACF now seek OMB approval for the collection of implementation data during on-site visits. These data will be collected over three years, as successive sites start evaluation sample enrollment and implement their programs. The data will be used for the implementation analysis.

A16. Plans for Tabulation and Publication and Project Time Schedule

1. Analysis Plan

After completing a site visit, each site visitor team will prepare or update a site profile that closely follows the organization of the Master Topic Guide. Staff will use all relevant information from site visit interviews, group discussions and focus groups, direct observations, and program documents and records. Because the profiles will contain a substantial volume of text, the expanded program profile may be presented in a different format to avoid creating a very cumbersome table; for summary purposes, however, a brief version of each site's profile will be prepared in tabular format.

Analyzing largely qualitative data, such as the information that will be recorded in the site profiles, requires creating data structures and using them systematically. We will use Atlas.ti, a qualitative analysis software package, to create and apply a coding system for organizing and categorizing data based on the structure of the research questions and topics. The coding will enable us to retrieve data linked to specific questions and topics and will facilitate analyses of themes across programs and sites. Site visitors will be trained to use the coding structure, and their reliability in coding will be established before they apply the codes to data from their site visits. A hypothetical sketch of this code-and-retrieve workflow follows.
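The sketch below is purely illustrative Python; it does not use Atlas.ti or its actual interface, and the codes and excerpts are invented. It shows the underlying idea: excerpts are tagged with codes drawn from the research topics and then retrieved by code for cross-site theme analysis.

```python
from collections import defaultdict

# Hypothetical codebook derived from the research questions and topics.
codebook = {"fidelity", "staffing", "youth_engagement"}

# Invented coded excerpts: (site, excerpt text, applied codes).
excerpts = [
    ("Site 1", "Facilitators covered all ten sessions.", {"fidelity"}),
    ("Site 2", "Staff turnover delayed the spring cohort.", {"staffing"}),
    ("Site 1", "Students said role-plays held their attention.", {"youth_engagement"}),
]

# Index excerpts by code so evidence for a theme can be pulled across sites.
by_code = defaultdict(list)
for site, text, codes in excerpts:
    assert codes <= codebook, "coders may apply only codes in the codebook"
    for code in codes:
        by_code[code].append((site, text))

# Retrieve all evidence coded "fidelity" across sites.
for site, text in by_code["fidelity"]:
    print(site, "-", text)
```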

The qualitative analyses will entail site-specific and cross-site analyses. For each site, we will construct narrative site summaries, detailed diagrams and timelines to illustrate each site’s program design and theory of change, participation tables to describe participants’ exposure to program services, theme tables to assemble evidence under each topic, an assessment of implementation fidelity, and examinations of differences between the program and counterfactual, across locations, between stakeholders, and over time. Key elements of these analyses for each site, as well as implementation challenges and successes, will be examined in cross-site analyses to draw lessons for practitioners and policy makers.

2. Time Schedule and Publications

The entire PPA evaluation will be conducted over an eight-year period. ACF began consulting with stakeholders about the design of the study and the identification of potential programs and sites in September 2008; consultation will continue through March 2011. The first round of site visits will take place around the time that program operations begin in each site, between late 2010 and fall 2012. The follow-up site visits are projected to occur between May 2011 and May 2013.

We will produce several reporting products: an interim implementation report, after the first round of site visits has been completed in each site; contextual information on implementation and services offered at the intervention and control sites for the final report, after the final round of site visits has been completed in each site; and one or two topical research briefs that convey information on key subtopics of interest to policy and program decision makers.

A17. Reason(s) Display of OMB Expiration Date Is Inappropriate

All protocols will display the OMB number and the expiration date.

A18. Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary for this information collection.



1 Abma, J. C., G. M. Martinez, W. D. Mosher, and B. S. Dawson. "Teenagers in the United States: Sexual Activity, Contraceptive Use, and Childbearing." Vital and Health Statistics, vol. 23, no. 24, 2004, pp. 1–48.

2 Albert, B., S. Brown, and C. Flannigan, eds. 14 and Younger: The Sexual Behavior of Young Adolescents. Washington, DC: National Campaign to Prevent Teen Pregnancy, 2003.

3 Teen birth rates declined by 34 percent from 1991 to 2005. See Hamilton, B. E., J. A. Martin, and S. J. Ventura. "Births: Preliminary Data for 2006." National Vital Statistics Reports, vol. 56, no. 7. Hyattsville, MD: National Center for Health Statistics, 2007.

4 Hamilton, B. E., J. A. Martin, and S. J. Ventura. "Births: Preliminary Data for 2007." National Vital Statistics Reports, vol. 57, no. 12. Hyattsville, MD: National Center for Health Statistics, 2009.

5 The latest version of this form, which includes language related to focus groups, may be found at http://www.reginfo.gov/public/do/PRAViewDocument?ref_nbr=201010-0970-001

