June 2014
Submitted by:
Office of Adolescent Health
U.S. Department of Health and Human Services
1101 Wootton Parkway, Suite 700
Rockville, MD 20852
Project Officer: Amy Farb
The Office of Adolescent Health (OAH) Teen Pregnancy Prevention (TPP) grant program was launched in spring 2010 as a key early piece of the federal government’s ongoing “evidence and innovation” agenda. The program provides approximately $100 million in annual competitive contracts and grants to public and private entities to fund medically accurate and age appropriate programs to reduce teen pregnancy. The program features a “tiered-evidence” grant design that reserves most of the funding for grants to replicate programs with existing evidence of effectiveness (Tier 1). A smaller proportion of funding is reserved to encourage innovation in the field by implementing and rigorously testing promising new programmatic approaches (Tier 2). The first Tier 1 Replication grants were awarded in fall 2010 to 75 state or local organizations, for programming to start in fall 2011.
Consistent with the program’s focus on evidence, OAH has undertaken a range of evaluation activities associated with Tier 1 of the TPP program. Nine grantees are participating in the ongoing federal TPP Replication Study[1], a large-scale, multi-site random assignment evaluation of three different evidence-based teen pregnancy prevention programs. Other grantees are using a portion of their funding to conduct their own “local” program impact and implementation evaluations. All grantees collect data on a uniform set of performance measures[2] and report them to OAH on a semi-annual basis through an online system. The burden associated with these ongoing data collection activities has been previously reviewed and approved by OMB.
With this information collection request, OAH seeks approval for additional data collection instruments to conduct a complementary cost study of selected TPP grantees. The proposed cost study adds a new and distinct component to OAH’s portfolio of evaluation activities. The study has three main components: (1) a cost analysis to determine the cost of implementing select evidence-based teen pregnancy prevention programs; (2) an economic evaluation to determine the economic impact of select evidence-based teen pregnancy prevention programs; and (3) the development of guidance and tools for OAH to use to collect and analyze cost data from potential future grantees in a systematic, standardized way. OAH has contracted with Mathematica Policy Research to conduct this three-year (2013-2016) study.
This information collection request focuses specifically on data collection instruments for the first two study components: (1) the cost analysis and (2) the economic evaluation. Any data collection instruments developed in support of the third study component (guidance and tools for use with future grantees) will be submitted separately in future years of the study.
A1. Circumstances Making the Collection of Information Necessary
High rates of teen pregnancy, sexually transmitted infections, and associated sexual risk behaviors remain a troubling issue in the United States. Nationwide, 24 percent of high school students report having had four or more partners by graduation, and nearly 40 percent of sexually active students had not used a condom during their last sexual intercourse. These behaviors increase the risks of pregnancy and STIs, including HIV. Preliminary national data for 2012 indicate there were approximately 29.4 births per 1,000 females 15 to 19 years of age, a rate higher than in most other industrialized countries. In addition, estimates suggest that adolescents and young adults account for half of all new STI cases in the United States every year.
Although prior studies have identified a range of programs with evidence of effectiveness in reducing these risks, much less is known about program cost and the return on investment. Researchers and policymakers have long recognized that teen pregnancy can have high social, economic, and personal costs, not only for teen parents, but also for their children, families, and communities. For this reason, programs successful in reducing teen pregnancy are generally considered sound public investments, even if they are relatively expensive. However, teen pregnancy prevention programs can vary substantially in intensity and length, ranging from short, one-time clinic or counseling sessions to multicomponent youth development programs delivered over several years. The current research literature says little or nothing about the relative costs of these different programmatic approaches, the main drivers of program costs, or how program costs compare to the effectiveness of the programs.
The TPP grant program was originally authorized under the Consolidated Appropriations Act, 2010 (P.L. 111-117) and currently operates under authority contained in the Consolidated Appropriations Act, 2014. The Act provided approximately $105,000,000 in FY 2014 for making competitive contracts and grants to public and private entities to fund medically accurate and age appropriate programs that reduce teen pregnancy, and for the Federal cost associated with administering and evaluating such grants and contracts. The proposed cost study is a key piece of OAH’s broad and ongoing effort to comprehensively evaluate the TPP program as required by the legislation.
The proposed study has three key objectives (Table A.1). The first is to determine the cost of implementing select evidence-based teen pregnancy prevention programs. To carry out this objective, the study team will collect detailed cost information from a subset of up to 30 OAH TPP Program grantees. For each grantee, the study team will administer three data collection instruments: (1) a Cost Tool used to collect comprehensive information on the cost of implementing each select program (Instrument #1); (2) an Implementation Tool used to collect basic information on the characteristics of the grantee organization and program implementation (Instrument #2); and (3) a Staff Time Use Survey used to collect information on how program staff allocate their time across different program activities (Instrument #3). Data from these instruments will be used to construct estimates of total program costs (including the cost of start-up), cost per program participant, and the distribution of program costs across different programs and activities.
The second study objective is to determine the economic impact of select evidence-based teen pregnancy prevention programs. Among the up to 30 grantees expected to participate in the first component of the study, about half are in the process of conducting program impact evaluations using either randomized controlled trials or rigorous quasi-experimental designs. By combining data from these program impact evaluations with estimates of program costs, the study team will calculate estimates of program cost-effectiveness for a select number of evidence-based teen pregnancy prevention programs. Importantly, the program impact evaluations are being conducted independently of the proposed cost study for which OAH is currently seeking OMB approval. Therefore, to achieve this study objective, the only additional data collection burden involves obtaining data from these ongoing impact evaluations, which the study team will accomplish by administering an Economic Evaluation Form (Instrument #4) to up to 15 OAH TPP Program grantees.
The third study objective is to develop guidance and tools for OAH to use in collecting and analyzing cost data from potential future grantees. The plans for this component of the study are still under development, so there is no additional burden requested as part of the current information collection request.
Table A.1. Overview of Study Objectives and Associated Data Collection Instruments
Objective | Approximate Sample Size | Associated Data Collection Instruments | Covered by Current Information Collection Request?
1. Determine cost of implementing select evidence-based programs | 30 grantees | Cost Tool (Instrument #1); Implementation Tool (Instrument #2); Staff Time Use Survey (Instrument #3) | Yes
2. Determine economic impact of select evidence-based programs | 15 grantees* | Economic Evaluation Form (Instrument #4) | Yes
3. Develop guidance and tools to collect cost data from potential future grantees | To be determined | To be determined | No
*The grantees participating in the economic evaluation (study objective 2) will be a subset of those participating in the cost analysis (study objective 1).
A2. Purpose and Use of the Information
The cost data will be used by OAH to estimate the total and per participant costs of implementing selected evidence-based teen pregnancy prevention programs. These estimates will help OAH better understand how grantees are using program resources to deliver services; how program costs compare to the number of youth served by the programs; and potential factors driving program costs and variation in program costs across grantees. The cost estimates will also help OAH and other organizations prepare for possible future funding opportunities, by providing benchmark cost estimates that may be useful in preparing future budgets.
Data from the economic evaluation will be used by OAH to help estimate the agency’s return on program investments. Through the TPP grant program, the federal government has made a major investment in supporting the dissemination and implementation of evidence-based teen pregnancy prevention programs. By calculating estimates of program cost-effectiveness, the proposed cost study will help OAH determine the overall return on this investment. This economic assessment may help inform future federal policy decisions around support for teen pregnancy prevention programs.
A3. Use of Technology to Reduce Burden
To help minimize the level of burden on participating grantees, all study data collection instruments will be administered by telephone or in electronic format. The study Cost Tool (Instrument #1) will be formatted as an electronic spreadsheet and distributed to grantees via e-mail. Respondents will be instructed to enter the requested cost information directly in the spreadsheet and return the completed file by e-mail. The Staff Time Use Survey (Instrument #3) will be formatted as a web-based survey that respondents can access through the Internet. The web-based format will allow respondents to enter data at their own pace and on their own schedules. To protect personal identifying information, the study team will assign a unique passcode to each respondent. The study Implementation Tool (Instrument #2) and Economic Evaluation Form (Instrument #4) will be administered as semi-structured interviews conducted by telephone.
A4. Efforts to Avoid Duplication
This study is the first and only ongoing effort to systematically collect and analyze cost data for OAH TPP Program grantees. Although OAH currently has information on grantee budgets, these budgets may not reflect full operating costs, and actual cost experience may deviate from the original budget. The data collection proposed for this study is thus essential for OAH and federal policymakers to understand the cost and return on investment of TPP Program grantees.
A5. Methods to Minimize Burden on Small Entities
To the study team’s knowledge, no small entities will be involved in this study. If any of the program providers or their partners are small entities, the study team will reduce the number of Staff Time Use Surveys (Instrument #3) requested of the site.
A6. Consequences of Not Collecting Data
The current cohort of OAH TPP grantees is in the fourth year of its projected five-year grant awards. If the proposed cost data are not collected in 2014-2015, it will be too late for OAH and other federal agencies to learn about the cost experience and return on investment of the current federal TPP Program grantees. In addition, the data will not be available to federal, state, and local agencies interested in efficiently implementing the same or similar programs in the future.
A7. Special Circumstances
There are no special circumstances associated with this information collection.
A8. Federal Register Announcement and Consultation
The 60-day notice to solicit public comments was published in the Federal Register on March 12, 2014, in Volume 79, Number 48, page 14043, and provided a 60-day period for public comments (Attachment A). No public comments were received.
OAH received consultation on the study from members of an external technical working group (TWG). Members of the TWG (see Table A.2) convened for a one-day meeting in Washington, DC, on April 14, 2014, to provide input on the study data collection plans and instruments. The TWG included two representatives from TPP grantee sites.
Table A.2. Members of the Cost Study Technical Working Group
Name | Affiliation
Phaedra Corso | Professor of Health Policy and Management, University of Georgia
Max Crowley | Center for Child and Family Policy, Duke University
Emma García | Economist, Economic Policy Institute
Ron Haskins | Senior Fellow, The Brookings Institution
Saul Hoffman | Professor of Economics, University of Delaware
Andrea Kane | Senior Director of Public Policy, The National Campaign
Lynn Karoly | Senior Economist, RAND
Rebecca Maynard | Professor of Education and Social Policy, University of Pennsylvania
Adam Thomas | McCourt School of Public Policy, Georgetown University
Dawn Truett* | Teen Pregnancy Prevention Program Manager, Inspira Health Network
Bernice Tucker* | Executive Director, Women Accepting Responsibility
*Representatives from current TPP grantee sites.
A9. Payments to Respondents
No payments to respondents are proposed for this information collection.
A10. Assurance of Privacy
The cost data will be reported only in aggregate for each site, without reference to any personal identifying information. For example, reports may show the proportion of total costs attributed to staff salaries, but without naming or referencing individual staff members. The data collection instruments follow a similar approach; for example, they ask sites to report salary information by staff title, not personal name.
All electronic data will be transmitted and stored according to the level of security necessary for the sensitivity and identifiability of the data. The web-based Staff Time Use Survey will be password protected, with a unique username and passcode for each respondent. Responses to all data collection instruments will be stored by the evaluation contractor, Mathematica, on secure network servers, with access limited to project staff on a “need-to-know” basis.
A11. Sensitive Questions
Calculating accurate estimates of program costs requires collecting information on staff salaries and grantee operating costs. The importance of this information will be explained to study respondents, and sites will be asked to report salary information only by staff title, not personal name.
A12. Estimates of Annualized Burden Hours and Costs
Table A.3 summarizes the total estimated reporting burden for the Cost Study of Evidence-Based Teen Pregnancy Prevention Programs. We assume that up to 30 grantees will participate in the cost study and up to 15 grantees will participate in the economic evaluation. Assuming the maximum number of sites, the total annualized burden is estimated to be 715 hours. Figures are estimated as follows:
Cost Tool. A total of 30 surveys is estimated, one per grantee participating in the cost analysis. Each grantee will complete the survey once, and we estimate it will take eight hours.
Implementation Tool. 30 staff members, one per grantee participating in the cost analysis, will each participate in a single one-hour discussion about program implementation.
Staff Time Use Survey. 1,200 responses are anticipated: 20 respondents at each of the 30 grantees participating in the cost analysis, surveyed twice during the one year of data collection. Each response will take no more than 20 minutes.
Economic Evaluation Form. 15 respondents, one per grantee participating in the economic evaluation, will each participate in a single one-hour discussion about the ongoing impact evaluations. As a result of these discussions, each respondent may need to conduct up to two hours of follow-up work to provide the requested information and data on the impact evaluations.
Table A.3. Estimate of Burden and Costs for the Cost Study of Evidence-Based Teen Pregnancy Prevention Programs
Activity/Respondent | Annual Number of Respondents | Number of Responses Per Respondent | Average Burden Per Response (Minutes) | Total Annual Burden Hours | Average Hourly Wage | Total Annualized Cost
Cost Tool | 30 | 1 | 480 | 240 | $30.99 | $7,437.60
Implementation Tool* | 30 | 1 | 60 | 30 | $30.99 | $929.70
Staff Time Use Survey | 600 | 2 | 20 | 400 | $30.99 | $12,396.00
Economic Evaluation Form+ | 15 | 1 | 180 | 45 | $37.71 | $1,696.95
Total | 675 | | | 715 | | $22,460.25
+ The grantees completing the Economic Evaluation Form are a subset of those completing the Cost Tool and Implementation Tool.
Table A.3 also provides the total estimated annualized cost of the burden for the current information collection request, $22,460.25. The Cost Tool, Implementation Tool, and Staff Time Use Survey will be completed by staff at the grantee organizations and their partners. The average hourly wage for these staff ($30.99) is the average hourly wage of “social and community service managers” taken from the U.S. Bureau of Labor Statistics, National Compensation Survey, 2012. The Economic Evaluation Form will be completed by the evaluator at selected grantees. The average hourly wage for these staff ($37.71) is the average hourly wage of “miscellaneous social scientists and related workers” taken from the U.S. Bureau of Labor Statistics, National Compensation Survey, 2012. This proposed information collection does not impose an additional financial burden on respondents other than the time spent answering the questions contained in the instruments.
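For reference, the short Python sketch below reproduces the arithmetic underlying Table A.3 (burden hours equal respondents times responses times minutes divided by 60; annualized cost equals burden hours times the hourly wage). All figures come from the table itself; the script structure is illustrative only.

```python
# Reproduces the burden and cost arithmetic shown in Table A.3.
# Figures (respondents, responses, minutes, wages) are taken from the table;
# the structure of this script is purely illustrative.

instruments = [
    # (name, annual respondents, responses per respondent, minutes per response, hourly wage)
    ("Cost Tool",                 30, 1, 480, 30.99),
    ("Implementation Tool",       30, 1,  60, 30.99),
    ("Staff Time Use Survey",    600, 2,  20, 30.99),
    ("Economic Evaluation Form",  15, 1, 180, 37.71),
]

total_hours = 0.0
total_cost = 0.0
for name, respondents, responses, minutes, wage in instruments:
    hours = respondents * responses * minutes / 60.0  # total annual burden hours
    cost = hours * wage                               # total annualized cost
    total_hours += hours
    total_cost += cost
    print(f"{name:.<28} {hours:6.0f} hours  ${cost:,.2f}")

print(f"{'Total':.<28} {total_hours:6.0f} hours  ${total_cost:,.2f}")
# Expected output: 715 total hours and $22,460.25, matching Table A.3.
```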
A13. Other Cost Burden to Respondents and Record Keepers
There are no start-up costs for respondents.
A14. Annualized Cost to the Federal Government
Data collection will be carried out by the evaluation contractor, Mathematica Policy Research. The total estimated cost to the government is $313,000, which covers the cost of developing the data collection plans and instruments; conducting a pilot test of the data collection instrument with two sites; and administering the data collection instruments to selected sites.
A15. Program Changes or Adjustments
This is a new data collection.
A16. Plans for Tabulation, Publication, and Project Time Schedule
The study will be conducted over a three-year period (2013-2016). Data collection will occur over a 12-month period from fall 2014 through fall 2015. Findings will be presented in a final report scheduled for release in spring 2016. The report will be made available to the public on the OAH website. Findings from the report may also be disseminated through peer-reviewed journal articles, professional conference presentations, and special issue briefs.
The report will present findings from both the cost analysis and economic evaluation:
For each site, four types of analysis will be conducted: (1) estimation of the program start-up costs, (2) estimation of total steady-state costs for implementing a teen pregnancy prevention program, (3) estimation of the cost per program component, and (4) estimation of the cost per participant.
Estimating Program Start-Up Costs. Program start-up cost estimates will be based on grantee expenditures from the first grant year. The estimates will include expenditures from all agencies involved in start-up, including partner agencies where applicable. Averages, median values, and ranges for total program start-up costs will be reported. The findings will note any likely start-up costs not captured in the estimates, as well as any start-up costs that extended into later years of the grant period.
Estimating Total Steady-State Program Costs. For each grantee, an estimate of total costs for one program cycle will be calculated by summing the costs of individual resources applicable to each grantee. These estimates will include expenditures from all agencies involved in implementation, including partner agencies. Analysis may require adjustment of some reported costs. For example, salary costs will be adjusted to reflect national averages. Also, grantees may estimate the value of donated office space using commercial rental rates from a different period. In this case, values will be adjusted to the appropriate time frame using a consumer price inflator, such as the Consumer Price Index. Averages, median values, and ranges for total program costs will be reported.
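As an illustration of the cost aggregation and price adjustment described above, the sketch below sums hypothetical resource costs and re-expresses any value reported for a different period using a CPI ratio. The resource items, dollar amounts, and index values are placeholders, not study data.

```python
# Minimal sketch of the total steady-state cost calculation described above.
# The resource records, salary figures, and CPI index values below are
# hypothetical placeholders, not actual study data.

def inflation_adjust(value, cpi_source_period, cpi_target_period):
    """Re-express a dollar value in the target period using a CPI ratio."""
    return value * (cpi_target_period / cpi_source_period)

# Each record: (resource description, reported cost, CPI for the period the
# value was reported in, CPI for the cost-study period).
reported_resources = [
    ("Program facilitator salary (adjusted to national average)", 52_000, 233.0, 236.7),
    ("Donated office space (valued at commercial rental rates)",  18_000, 229.6, 236.7),
    ("Curriculum materials and supplies",                          6_500, 236.7, 236.7),
]

total_steady_state_cost = sum(
    inflation_adjust(cost, source_cpi, target_cpi)
    for _, cost, source_cpi, target_cpi in reported_resources
)
print(f"Estimated total steady-state cost for one program cycle: ${total_steady_state_cost:,.2f}")
```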
Estimating Costs per Program Component. Estimates of program component costs will be based on staff time allocations and compensation. For each staff position, the percentage of time that staff members report spending on each program component (from the Staff Time Use Survey, Instrument #3) will be multiplied by the compensation for that position. These values will be summed across staff positions to produce the total personnel-related cost per program component.
Nonpersonnel costs that are clearly related to a specific component (for example, travel required to attend a training) will be allocated to the relevant component. Costs that cannot be directly linked to a specific program component will be allocated in the same proportions as personnel-related program component costs.
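The following sketch illustrates the allocation logic described in the preceding two paragraphs: personnel costs are distributed across components using reported time shares, and nonpersonnel costs that cannot be tied to a component are spread in the same proportions. The staff positions, compensation figures, and component names are hypothetical.

```python
# Illustrative allocation of costs to program components, following the logic
# described above. Staff positions, time shares, compensation figures, and
# component names are hypothetical.

# Share of time each position reports spending on each component (from the
# Staff Time Use Survey), and annual compensation per position.
time_shares = {
    "Facilitator":     {"Classroom delivery": 0.70, "Training": 0.20, "Recruitment": 0.10},
    "Program Manager": {"Classroom delivery": 0.30, "Training": 0.30, "Recruitment": 0.40},
}
compensation = {"Facilitator": 48_000, "Program Manager": 62_000}

# Step 1: personnel cost per component = sum over positions of (time share x compensation).
personnel_by_component = {}
for position, shares in time_shares.items():
    for component, share in shares.items():
        personnel_by_component[component] = (
            personnel_by_component.get(component, 0.0) + share * compensation[position]
        )

# Step 2: nonpersonnel costs not tied to a specific component are allocated in
# the same proportions as the personnel-related component costs.
unallocated_nonpersonnel = 15_000
total_personnel = sum(personnel_by_component.values())
total_by_component = {
    component: cost + unallocated_nonpersonnel * (cost / total_personnel)
    for component, cost in personnel_by_component.items()
}

for component, cost in total_by_component.items():
    print(f"{component}: ${cost:,.2f}")
```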
Estimating Costs per Participant. Estimates of costs per youth are critical to analytic comparisons, whether across grantees or program models. These estimates will differ depending on how a participant is defined and whether dosage of services or length of participation is taken into account. As a basic approach, participants will be defined as any youth who was served by the program during the cost study period. Total program costs would then be divided by the number of participants to produce an estimated cost per participant. The resulting estimates of program costs per participant will be reported separately by site.
The economic evaluation will provide estimates of program cost-effectiveness for the up to 15 participating sites that are conducting ongoing impact evaluations. For each site, data from the ongoing impact evaluations will be combined with the cost estimates produced in the first component of the study to generate estimates of program cost-effectiveness, measured as the cost per unit of a common outcome measure. This approach will yield estimates such as the cost of program services per teen pregnancy averted, or the cost of program services per youth delaying the onset of sexual activity for a year.
The estimates will account for differences across grantees in the length of follow-up from program start and/or program end. For example, some grantees may measure outcomes 9 months post program, whereas others may measure outcomes after 12 months. To account for such differences, impact estimates will be extrapolated to a common follow-up length, for example, 6 months for short-term impacts and 12 months for longer-term impacts. Because any extrapolation method will involve assumptions, sensitivity tests of alternative assumptions will also be conducted.
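The sketch below illustrates one way these cost-effectiveness ratios could be computed, using a simple linear extrapolation of a site's impact estimate to a common follow-up length. The impact estimate, costs, and participant count are hypothetical, and linear scaling is only one of the extrapolation assumptions the planned sensitivity tests would examine.

```python
# Illustrative cost-effectiveness calculation. The impact estimate, follow-up
# lengths, participant count, and costs are hypothetical, and the linear
# extrapolation used here is only one possible assumption that the planned
# sensitivity tests would examine.

total_program_cost = 250_000           # steady-state cost plus amortized fixed costs
participants = 500
pregnancies_averted_at_9_months = 4.0  # impact estimate from a site's evaluation
observed_follow_up_months = 9
common_follow_up_months = 12

# Extrapolate the impact to the common follow-up length (linear assumption).
impact_at_common_follow_up = pregnancies_averted_at_9_months * (
    common_follow_up_months / observed_follow_up_months
)

cost_per_participant = total_program_cost / participants
cost_per_pregnancy_averted = total_program_cost / impact_at_common_follow_up

print(f"Cost per participant: ${cost_per_participant:,.2f}")
print(f"Cost per pregnancy averted (12-month basis): ${cost_per_pregnancy_averted:,.2f}")
```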
The estimates will account for both start-up and steady-state costs, as well as fixed and variable costs. Fixed costs are frequently incurred at start-up, but they represent investments in materials or infrastructure that will be used over the life of the project and should be amortized over their usable life (for example, laptop computers purchased specifically for the project). Some start-up costs, however, are variable, reflecting the scale and structure of a program, particularly labor hours spent on planning. These costs should be accounted for separately as start-up costs; they no longer apply once the program is fully implemented. For most sites, evaluation sample enrollment began sometime after an initial planning or pilot period during the first grant year. For this reason, the resulting program impact estimates are naturally compared to the ongoing or variable program costs associated with steady-state program implementation (plus the amortized value of fixed costs). These costs will include staff salaries and any program materials or supplies provided to program participants that are not reusable. Sensitivity analysis will be conducted to test different assumptions about how start-up costs are apportioned.
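The straight-line amortization treatment of fixed start-up purchases described above can be illustrated as follows; the purchase price, useful life, and cycle length in the sketch are hypothetical.

```python
# Straight-line amortization of a fixed start-up purchase, as described above.
# The purchase price, useful life, and cycle length are hypothetical.

def amortized_cost_per_cycle(purchase_price, useful_life_years, cycle_length_years=1.0):
    """Spread a fixed purchase evenly over its usable life and charge one
    program cycle's share to steady-state program costs."""
    annual_share = purchase_price / useful_life_years
    return annual_share * cycle_length_years

# Example: laptops bought at start-up for $9,000 with a three-year usable life
# contribute $3,000 to the cost of a one-year program cycle.
laptops = amortized_cost_per_cycle(purchase_price=9_000, useful_life_years=3)
print(f"Amortized laptop cost charged to one program cycle: ${laptops:,.2f}")
```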
A17. Display of OMB Expiration Date
The expiration date for OMB approval will be displayed on all data collection instruments.
A18. Exceptions to Certification
There are no exceptions to the certification statement.
[1] OMB approval number 0990-0394
[2] OMB approval number 0990-0392