OMB Control No.: 0970-0398

Supporting Justification for OMB Clearance of the Personal Responsibility Education Program (PREP) Multi-Component Evaluation



Part A: Justification for the Collection of Field Data



November 2011









Submitted By:

U.S. Department of Health and Human Services

Administration for Children and Families

Office of Planning, Research and Evaluation

7th Floor, West Aerospace Building

370 L’Enfant Promenade, SW

Washington, D.C. 20447


Project Officers:

Dirk Butler

Clare DiSalvo



















Introduction


The Patient Protection and Affordable Care Act, signed into law in March of 2010, established the Personal Responsibility Education Program (PREP), which funds programs designed to educate adolescents on both abstinence and contraception for the prevention of pregnancy and sexually transmitted infections, including HIV/AIDS, and at least three adulthood preparation subjects.  PREP provides $55.25 million in formula grants to States to “replicate evidence-based effective program models or substantially incorporate elements of effective programs that have been proven on the basis of scientific research to change behavior, which means delaying sexual activity, increasing condom or contraceptive use for sexually active youth, or reducing pregnancy among youth.”


The goal of the PREP Multi-Component Evaluation will be to document how programs funded through the State PREP program are designed and implemented in the field and to assess selected PREP-funded programs’ effectiveness.  The project will include three primary, interconnected components, each of which is a study in its own right. These components are:


  1. a Design and Implementation Study (DIS): a broad descriptive analysis of how States designed and implemented PREP programs,

  2. a Performance Analysis Study (PAS): the collection and analysis of performance management data, and

  3. an Impact and In-Depth Implementation Study (IIS): impact and in-depth implementation evaluations of four to five specific PREP-funded sites.


As part of the third component, ACF now seeks approval for field data collection instruments. The purpose of the field data collection effort is to identify potential sites for inclusion in the “Impact and In-Depth Implementation Study,” which entails random assignment impact evaluations and in-depth implementation evaluations in 4-5 specific sites.


All the measures in the instruments included in this ICR were originally approved under OMB Clearance No. 0970-0360 as part of the Evaluation of Adolescent Pregnancy Prevention Approaches (PPA) coordinated by the Office of Adolescent Health (OAH). ACF will continue to coordinate PREP data collection instrument development with OAH and other offices across HHS that oversee teen pregnancy prevention programming and evaluation (e.g., the HHS Assistant Secretary for Planning and Evaluation (ASPE) and CDC’s Division of Reproductive Health (CDC/DRH)).


A1. Circumstances Making the Collection of Information Necessary


Background


For decades, policymakers and the general public have been concerned about the prevalence of teen pregnancy. According to the Youth Risk Behavior Survey, 46 percent of all high school students, and 62 percent of all seniors, have had sexual intercourse; 21 percent of seniors have had sexual intercourse with four or more persons in their lifetime.1 While condom use has increased over time – 46 percent of sexually active high school students used a condom during last sexual intercourse in 1991, compared with 61 percent in 2009 – use of birth control pills has remained steady: from 1991 to 2009, around 20 percent of sexually active high school students used the birth control pill prior to their last sexual intercourse. In 2008, the last year for which data are final, there were 37 births per 1,000 unmarried teens ages 15-19 and 62 births per 1,000 unmarried teens ages 18-19.2 Rates of sexually transmitted infections (STIs) continue to rise among teens ages 15-19; for example, chlamydia rates increased 2.4% from 2008 to 2009 and syphilis rates increased by 12% during the same time period.3


The Personal Responsibility Education Program (PREP) funds programs designed to educate adolescents on both abstinence and contraception for the prevention of pregnancy and sexually transmitted infections, including HIV/AIDS, and at least three adulthood preparation subjects.  PREP provides $55.25 million in formula grants to States to “replicate evidence-based effective program models or substantially incorporate elements of effective programs that have been proven on the basis of scientific research to change behavior, which means delaying sexual activity, increasing condom or contraceptive use for sexually active youth, or reducing pregnancy among youth.”


Legal or Administrative Requirements that Necessitate the Collection


On March 23, 2010, the President of the United States signed into law the Patient Protection and Affordable Care Act (ACA), H.R. 3590 (Public Law 111-148). In addition to its other requirements, the act amended Title V of the Social Security Act (42 U.S.C. 701 et seq.) to include funding for the Personal Responsibility Education Program (PREP). The PREP Multi-Component Evaluation is a response to the legislative requirement that the Secretary evaluate the programs and activities carried out with funds made available through PREP allotments or grants.


Study Objectives


As stated above, one goal of the PREP Multi-Component Evaluation is to assess the effectiveness of selected PREP-funded programs. This goal will be achieved through an “Impact and In-Depth Implementation Study,” or IIS, which is one component of the PREP Multi-Component Evaluation. The purpose of the information collection instruments submitted through this request is to help the federal government and contract staff select program sites for the IIS.


The IIS will use an experimental design in approximately 4 to 5 sites, with an average of approximately 1,500 research sample members at each site, to test the effectiveness of a range of PREP program models. The impact evaluation is expected to involve a baseline survey and two follow-up surveys (e.g., a short-term follow-up survey at approximately 6 months post-program and a long-term follow-up survey, most likely at some point between 12 and 24 months post-program). A critical component of this impact evaluation will be an in-depth, high-quality implementation data collection and analysis effort for each site evaluated. In addition to measures of fidelity (e.g., adherence, exposure/dosage, quality of service delivery, participant response), this in-depth analysis will document how specific programs implemented key components of the PREP programs (e.g., substantially incorporating elements of effective programs, placing substantial emphasis on abstinence and contraception, and addressing adulthood preparation subjects).


It is not the intent of the random assignment impact evaluation to provide a representative look at the PREP program as a whole. Instead, the findings from the random assignment impact evaluation are intended to fill gaps in the evaluation literature by providing more information about which programs are effective for vulnerable populations, such as foster youth and runaway and homeless youth, and thereby serve the entire field of teen pregnancy prevention.


Current Request


The purpose of this information collection is to help ACF identify and select 4-5 PREP-funded teen pregnancy prevention intervention programs for inclusion in the “Impact and In-Depth Implementation Study,” which entails random assignment impact evaluation and in-depth implementation evaluation in 4-5 sites.


Sites for this study will be selected based on the following criteria: 1) the extent to which a site could support a random assignment impact evaluation (for example, whether the site could generate a sufficiently large sample size and whether there is a strong treatment-counterfactual distinction) and 2) whether the inclusion of the site in the evaluation would address ACF’s key research questions (for example, whether it would test a program model designed to serve a vulnerable population of special interest to ACF, such as foster youth or runaway and homeless youth).


In order to gather information about whether potential sites meet these criteria, ACF seeks approval for the field data collection instruments: the proposed discussion guides to be used in telephone and in-person informal, semi-structured discussions with macro-level (i.e., state-level) PREP program coordinators, program directors, program staff, and school administrators. All the measures in the instruments included in this ICR were originally approved under OMB Clearance No. 0970-0360 as part of the Evaluation of Adolescent Pregnancy Prevention Approaches (PPA) coordinated by the Office of Adolescent Health (OAH) and ACF.


As background, it is important to note that while PREP state grant funds are awarded to and administered by states, the sites selected for impact evaluation will most likely be sub-awardees. Most states are distributing their state grant funding to a number of community-level sub-awardees within their state via a competitive grant process. These sub-awardees may be county health departments, school districts, or local community organizations, for example. Each of these sub-awardees is then responsible for implementing its own PREP-funded teen pregnancy prevention program. While we are requesting a small amount of burden in order to speak with state-level PREP administrators, the bulk of the burden requested for the field instruments is for discussions with sub-awardee-level respondents.


For this field data collection effort, only a relatively small number of states and sub-awardees will be engaged compared to the total number of states receiving PREP funds and the total number of PREP-funded sub-awardees. Forty-six states and the District of Columbia received PREP state grant funds; of this total, we plan to reach out to up to 10 states in order to identify 4-5 sites for the “Impact and In-Depth Implementation Study.” These states will be identified through a review of documents available to ACF and discussions with federal staff. Within each state, we will speak to up to 1 macro-level coordinator, 2 program directors, 4 program staff, and 7 school administrators.


ACF will continue to coordinate PREP data collection instrument development with OAH and other offices across HHS that oversee teen pregnancy prevention programming and evaluation (e.g., the HHS Assistant Secretary for Planning and Evaluation (ASPE) and CDC’s Division of Reproductive Health (CDC/DRH)).


A2. Purpose and Use of the Information Collection


If this request is approved, this information collection will help ACF identify and select teen pregnancy prevention intervention programs from among the PREP-funded grantees/sub-awardees for evaluation. The information gathered will be used by contractor staff to make recommendations to ACF about interventions to be considered for inclusion in the evaluation.


Prior to using the field data collection instruments, the contractor will review the applications submitted by states for PREP funding as well as other administrative documents available to ACF in order to establish what is already known about PREP grantees. Then, using the field data collection instruments, we propose to engage in informal interviews with the following groups of stakeholders:


  • Macro-Level Coordinators—state-level PREP program coordinators or other state-level coordinators;

  • Practitioners—directors or staff of PREP-funded pregnancy prevention programs, including school- and community-based programs as well as local and state agencies, as appropriate; and

  • School Administrators—as appropriate, school administrators or other individuals who coordinate, oversee, or otherwise work with PREP-funded programs within educational settings.


The proposed information collection instruments are included in this package.


A3. Use of Improved Information Technology and Burden Reduction


The data collection plan for each collection activity reflects sensitivity to issues of efficiency, accuracy, and respondent burden. Prior to conducting interviews, the contractor will review existing documents available to ACF, including the applications that states submitted to ACF to receive state grant funds and the RFPs that states made public to make sub-awards, in order to establish what is already known about each grantee. The information being requested through discussions/interviews is limited to that for which the survey participants are the best or only information sources. Protocols for interviews during site visits will be customized for each site to focus on information that is relevant for that site and that could not be obtained from documents.


A4. Efforts to Identify Duplication and Use of Similar Information


The information collection requirements for the evaluations have been carefully reviewed to determine what information is already available from existing studies and what will need to be collected for the first time. Although the information from existing studies provides value to our understanding of reducing teenage sexual risk behavior, ACF does not believe that it provides sufficient information on a sufficient range of program models to policymakers and stakeholders aiming to reduce this behavior. Furthermore, PREP programs will be newly formulated programs: the PREP legislation requires additions to existing evidence-based models and permits substantial incorporation of evidence-based models into single programs. Programs like these have yet to be evaluated. These evaluation data collection efforts are essential to providing this information.


A5. Impact on Small Businesses or Other Small Entities


Programs in some sites may be operated by or in collaboration with small community-based organizations or by small businesses. The data collection plan is designed to minimize burden on such organizations by focusing interviews with their staff on their direct role in the intervention and its development or planning.


A6. Consequences of Collecting Information Less Frequently


During this step of the evaluation, information will be collected only once; thus, no repetition of effort is planned. Not collecting the information at all would substantially limit the value of the investment ACF will make in this study. Identifying interventions of most interest to the field is crucial to ensuring that findings from the study are relevant to federal, state, and local policymakers and program administrators.


A7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5


There are no special circumstances for the proposed data collection efforts.


A8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency


In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on March 16, 2011, Volume 76, Number 51, page 14403, and provided a sixty-day period for public comment. A copy of this notice is attached as Attachment B. During the notice and comment period, one comment was received, which is attached. ACF did not respond to this comment.


Instruments were drafted by staff in the Office of Planning, Research, and Evaluation in ACF, and further developed and reviewed by staff from the Family and Youth Services Bureau in ACF; the Office of Planning, Research, and Evaluation in ACF; and the Office of the HHS Assistant Secretary for Planning and Evaluation (ASPE).


A9. Explanation of Any Payment or Gift to Respondents


No payments to respondents are proposed for this information collection.


A10. Assurance of Confidentiality Provided to Respondents


The only individuals that will be contacted as part of this effort are state-level program administrators, program directors, program staff, and school administrators. As stated in the introduction contained within each instrument, these respondents will be told, “Your responses will be discussed internally among the research team and the funding agency (the Administration for Children and Families) but, to the extent allowable by law, individual identifying information will not be disseminated publicly.”


A11. Justification for Sensitive Questions


There are no personally sensitive questions in this data collection.


A12. Estimates of Annualized Burden Hours and Costs


Estimates of Annualized Burden


The table below summarizes the reporting burden on respondents. Please note that the total burden is reduced from the level originally requested via the Federal Register Notices associated with this information collection request. Based on comments received from OMB, we are rescinding our request for clearance to conduct focus groups with program participants and to conduct short surveys. Therefore, the respondents to be included in this data collection effort are the following: macro-level coordinators (state-level administrators), program directors, program staff, and school administrators. The specific burden levels that we are requesting for each of these groups remain the same.


The burden levels requested for these groups reflect the following strategy: We ultimately need to identify 4-5 specific sites (most likely sub-awardees within states) for inclusion in the impact evaluation. Through review of documents available to ACF, we will begin by identifying up to 10 states that we believe may have sub-awardees who would be good candidates for inclusion in the impact evaluation. Within each state, we will speak to up to 1 macro-level coordinator, 2 program directors, 4 program staff, and 7 school administrators.


Response times were estimated from informal pre-tests with ACF staff and prior experience. The annual burden was estimated from the total number of completed discussions proposed and the time required to complete the discussions. The total annual burden is expected to be 240 hours.


Instrument | Annual Number of Respondents | Number of Responses Per Respondent | Average Burden Hours Per Response | Total Burden Hours | Average Hourly Wage | Total Annual Cost
Discussion Guide for Use with Macro-Level Coordinators | 10 | 1 | 1 | 10 | $33.59 | $335.90
Discussion Guide for Use with Program Directors | 20 | 2 | 2 | 80 | $27.21 | $2,176.80
Discussion Guide for Use with Program Staff | 40 | 1 | 2 | 80 | $23.76 | $1,900.80
Discussion Guide for Use with School Administrators | 70 | 1 | 1 | 70 | $35.54 | $2,487.80
Estimated Annual Burden Sub-total for Field Clearance | – | – | – | 240 hrs | – | $6,901.30



Estimates of Annualized Costs


Respondents will be macro-level coordinators, program directors, program staff, and school administrators.  To compute the total estimated annual cost, the total burden hours were multiplied by the median hourly wage for each category of employees, per the Bureau of Labor Statistics, Occupational Employment Statistics.4 The total estimated annual cost is $6,901.30.
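
For reference, the arithmetic underlying the burden table and the total cost can be reproduced as in the following sketch (illustrative only and not part of the submitted instruments; all figures are taken directly from the table above):

# Illustrative check of the burden table arithmetic:
# burden hours = respondents x responses per respondent x hours per response;
# annual cost = burden hours x average hourly wage.
rows = [
    # (instrument, respondents, responses per respondent, hours per response, hourly wage)
    ("Macro-Level Coordinators", 10, 1, 1, 33.59),
    ("Program Directors",        20, 2, 2, 27.21),
    ("Program Staff",            40, 1, 2, 23.76),
    ("School Administrators",    70, 1, 1, 35.54),
]

total_hours = 0
total_cost = 0.0
for name, respondents, responses, hours, wage in rows:
    burden_hours = respondents * responses * hours
    total_hours += burden_hours
    total_cost += burden_hours * wage

print(total_hours, round(total_cost, 2))  # 240 hours, 6901.3 (i.e., $6,901.30)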


A13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers


Not applicable. These information collection activities do not place any capital cost or cost of maintaining capital requirements on respondents.


A14. Annualized Cost to the Federal Government


Instrument Creation


The field data collection instruments were created by the Office of Planning, Research, and Evaluation (OPRE) in ACF, and further developed and reviewed by staff from the ACF Family and Youth Services Bureau (FYSB), the Office of the HHS Assistant Secretary for Planning and Evaluation (ASPE), and the Office of Adolescent Health (OAH). The cost of federal government employees’ time is estimated at $2,011.


Data Collection Costs


Data collection will be carried out by the evaluation contractor. Experience from the Evaluation of Adolescent Pregnancy Prevention Approaches indicates that the cost of collecting data from the field is approximately $1,035,000 for 8 sites. For the PREP Evaluation, we expect up to 5 sites; thus, the estimated costs will be approximately $646,875 over three years, or $215,625 per year.
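
The per-year figure above follows from a simple proration of the 8-site PPA estimate to 5 sites over three years; a minimal illustrative sketch of that arithmetic (assuming straight per-site scaling, which is how the figures above are derived):

# Illustrative proration of the PPA field data collection estimate to the PREP Evaluation.
ppa_cost_for_8_sites = 1_035_000           # approximate PPA cost for 8 sites
cost_per_site = ppa_cost_for_8_sites / 8   # 129,375 per site
prep_total_cost = cost_per_site * 5        # 646,875 for up to 5 PREP sites
prep_annual_cost = prep_total_cost / 3     # 215,625 per year over three years
print(cost_per_site, prep_total_cost, prep_annual_cost)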


A15. Explanation for Program Changes or Adjustments


This is a new collection for a new evaluation; thus, no program adjustments are anticipated based on this data collection.


It bears mention that all instruments included in this ICR were originally approved under OMB Clearance No. 0970-0360 as part of the Evaluation of Adolescent Pregnancy Prevention Approaches (PPA) coordinated by the Office of Adolescent Health (OAH). (While some measures have been removed because they are not relevant to the PREP Evaluation, all the measures that remain were part of the instruments originally approved as part of the PPA study.)

ACF will continue to coordinate PREP data collection instrument development with offices across HHS that oversee teen pregnancy prevention programming and evaluation (e.g., the HHS Assistant Secretary for Planning and Evaluation (ASPE) and CDC’s Division of Reproductive Health (CDC/DRH)).


A16. Plans for Tabulation and Publication and Project Time Schedule


Analysis Plan


The purpose of the field data collection effort is not to collect data for statistical analysis. Rather, it is to identify sites for inclusion in the third component of the PREP Multi-Component Evaluation, the “Impact and In-Depth Implementation Study.” The selected sites will undergo rigorous random assignment impact evaluations, as well as in-depth, high-quality implementation evaluations. Each impact evaluation will include approximately 1,500 study participants (including both program and control groups) and will entail baseline, short-term, and long-term data collection.

As described above, the instruments submitted as part of this field data collection package will guide informal, semi-structured interviews with state-level administrators, program directors, program staff, and school administrators. The information collected through these interviews – as well as information gathered through a review of administrative documents available to ACF – will allow ACF and the contractor to identify sites for inclusion in the “Impact and In-Depth Implementation Study.”


Time Schedule and Publications


The PREP Multi-Component Evaluation began this fall and will stretch through 2017. We hope to begin reaching out to potential sites for the impact evaluation (using the field instrument clearance) this month, November 2011, and to reach a final decision regarding the sites to be included in the impact evaluation by the spring of 2012. Baseline data collection for the 4-5 sites to be included in the random assignment evaluation will begin in the fall of 2012. (Because several of the sites selected will probably be schools and most schools develop their plans for the fall semester by the end of the spring semester, it is essential to identify sites and establish agreements with them by early spring.)


Below is a schedule of each of the instruments associated with the PREP Multi-Component Evaluation (for each of the three studies that make up the project) and the date that we plan to use each instrument in the field:


Instrument | Date of 30-Day Submission | Date Clearance Needed | Date for Use in Field

Design and Implementation Study
Design survey | October 2011 | December 2011 | January 2012
Implementation survey | October 2012 | February 2013 | March 2013

Performance Analysis Study
Performance measure package | February 2012 | June 2012 | Fall 2012 (though clearance is needed by spring to provide T/TA to reporting agencies)

Impact and In-Depth Implementation Study
Field instrument | Completed July 2011 | October 2011 | November 2011
Administrative data collection instruments | April 2012 | August 2012 | September 2012 (for sites starting fall 2012)
Baseline survey | April 2012 | August 2012 | September 2012 (for sites starting fall 2012)
Implementation study instruments | June 2012 | October 2012 | November 2012 (for sites starting fall 2012)
Short- and long-term follow-up surveys | November 2012 | March 2013 | April 2013 (for sites starting fall 2012)



No publications are planned from this information collection.


A17. Reason(s) Display of OMB Expiration Date is Inappropriate


All instruments will display the expiration date for OMB approval.


A18. Exceptions to Certification for Paperwork Reduction Act Submissions


No exceptions are necessary for this information collection.


1 Centers for Disease Control and Prevention.  “Youth Risk Behavior Surveillance – United States, 2009.”  Surveillance Summaries, [June 4, 2010]. MMWR 2010;59(No. SS-5).

2 Martin JA, Hamilton BE, Sutton PD, Ventura SJ, Mathews TJ, Osterman MJK. Births: Final data for 2008. National vital statistics reports; vol 59 no 1. Hyattsville, MD: National Center for Health Statistics. 2010.

3 Centers for Disease Control and Prevention. Sexually Transmitted Disease Surveillance, 2009. Atlanta, GA: U.S. Department of Health and Human Services; 2010.

4 Average hourly wages for program staff and community members were estimated from the latest (May 2009) National Occupational Employment and Wage Estimates, Bureau of Labor Statistics, Department of Labor website.
