
Supporting Justification for OMB Clearance of the Personal Responsibility Education Program (PREP) Multi-Component Evaluation

OMB Control #0970-0398


Part A: Justification for the Design Survey Data Collection

November 2011






Submitted By:

U.S. Department of Health and Human Services

Administration for Children and Families

Office of Planning, Research and Evaluation

7th Floor, West Aerospace Building

370 L’Enfant Promenade, SW

Washington, D.C. 20447


Project Officers:

Dirk Butler

Clare DiSalvo



Introduction


The Patient Protection and Affordable Care Act, signed into law in March 2010, established the Personal Responsibility Education Program (PREP), which funds programs designed to educate adolescents on both abstinence and contraception for the prevention of pregnancy and sexually transmitted infections, including HIV/AIDS, and on at least three adulthood preparation subjects.1 PREP provides $55.25 million in formula grants to States to “replicate evidence-based effective program models or substantially incorporate elements of effective programs that have been proven on the basis of scientific research to change behavior, which means delaying sexual activity, increasing condom or contraceptive use for sexually active youth, or reducing pregnancy among youth.” The PREP legislation also requires an evaluation.


The goal of the PREP Multi-Component Evaluation will be to document how programs funded through the State PREP program are designed and implemented in the field and to assess selected PREP-funded programs’ effectiveness. The project will include three primary, interconnected components, each of which is a study in its own right. These components are:


  1. a Design and Implementation Study (DIS): a broad descriptive analysis of how states designed and implemented PREP programs,

  2. a Performance Analysis Study (PAS): the collection and analysis of performance management data, and

  3. an Impact and In-depth Implementation Study (IIS): impact and in-depth implementation evaluations of four to five specific PREP-funded sites.


As part of the “Design and Implementation Study,” the broad descriptive study of how states designed and implemented PREP programs, ACF now seeks approval for a “Design Survey” data collection instrument. The purpose of the “Design Survey” data collection effort is to conduct semi-structured interviews with administrators in each of the states that received PREP state grants in order to better understand what key decisions states made regarding the design of their PREP-funded programs and why they made those decisions.


Please note that the burden we are requesting is lower than the burden originally requested in the Federal Register Notices associated with this collection, and lower than the burden indicated in the tables earlier submitted to OMB. This is because we now plan to conduct interviews exclusively with state-level respondents for the “Design Survey” effort. We no longer plan to interview sub-awardee-level respondents as part of this data collection effort. However, during the second wave of data collection associated with this study, the “Implementation Survey,” we will interview a select number of sub-awardee-level respondents.


A1. Circumstances Making the Collection of Information Necessary


Background


For decades, policymakers and the general public have been concerned about the prevalence of teen pregnancy. According to the Youth Risk Behavior Survey, 46 percent of all high school students, and 62 percent of all seniors, have had sexual intercourse; 21 percent of seniors have had sexual intercourse with four or more persons in their lifetime.2 Condom use has increased over time (46 percent of sexually active high school students used a condom during last sexual intercourse in 1991, compared with 61 percent in 2009), while use of birth control pills has remained steady: from 1991 to 2009, around 20 percent of sexually active high school students used the birth control pill prior to their last sexual intercourse. In 2008, the last year for which data are final, there were 37 births per 1,000 unmarried teens ages 15-19 and 62 births per 1,000 unmarried teens ages 18-19.3 Rates of sexually transmitted infections (STIs) continue to rise among teens ages 15-19; for example, chlamydia rates increased 2.4 percent from 2008 to 2009, and syphilis rates increased 12 percent during the same period.4


The Personal Responsibility Education Program (PREP) funds programs designed to educate adolescents on both abstinence and contraception for the prevention of pregnancy and sexually transmitted infections, including HIV/AIDS, and at least three adulthood preparation subjects. PREP provides $55.25 million in formula grants to States to “replicate evidence-based effective program models or substantially incorporate elements of effective programs that have been proven on the basis of scientific research to change behavior, which means delaying sexual activity, increasing condom or contraceptive use for sexually active youth, or reducing pregnancy among youth.”


Legal or Administrative Requirements that Necessitate the Collection


On March 23, 2010, the President signed into law the Patient Protection and Affordable Care Act (ACA), H.R. 3590 (Public Law 111-148). In addition to its other requirements, the Act amended Title V of the Social Security Act (42 U.S.C. 701 et seq.) to include funding for the Personal Responsibility Education Program (PREP). The PREP Multi-Component Evaluation is a response to the legislative requirement that the Secretary evaluate the programs and activities carried out with funds made available through PREP.


Study Objectives


The overall purpose of the “Design and Implementation Study” is to understand and document the design and implementation of PREP programs, via data gathered across States and with selected sub-awardees. It is hoped that this effort will inform our understanding of program design and implementation and provide State-level policy-makers, sub-awardees, Congress, the general public, and ACF with useful information for future decision-making around PREP.


The study will document the general design and implementation of PREP State grant programs. Among other information, ACF is interested in the following: how States and/or sub-awardees elected to replicate evidence-based effective programs or substantially incorporate elements of effective programs; how States and/or sub-awardees placed substantial emphasis on both abstinence and contraception; and how States and/or sub-awardees addressed the required adulthood preparation subjects. The study will use multiple methods of information collection, including reviewing grant documents and discussing the program with federal staff.


The discussion guides submitted as part of this package will be used to collect information from administrators from states operating PREP programs. The data collection modality will be interviews. Interviews will primarily be conducted by phone; however, a few in-person interviews may be conducted if doing so would promote an efficient use of contract resources. (For example, in-person interviews may be conducted if the research team is already on site for another data collection effort related to the PREP Evaluation.)


Current Request


ACF seeks OMB approval for the “Design Survey” data collection instrument.


The data collection instrument is a discussion guide to be used with state administrators in states that received PREP state grant funding (i.e., “PREP State-Level Coordinators”). The survey focuses on issues such as:

how program and policy decisions were made at the state level; the context in which the state is operating; how PREP programs are being integrated with other efforts to reduce teen pregnancy in the state; the relationship between the state and PREP sub-awardees; and how states intend to meet the legislative requirements of the program.


The “Design Survey” will be conducted via interviews with each informant. Interviews will primarily be conducted by phone; however, a few in-person interviews may be conducted if doing so would promote an efficient use of contract resources. (For example, in-person interviews may be conducted if the research team is already on site for another data collection effort related to the PREP Evaluation.) The specific questions asked during each interview will vary, depending on (1) what is already known about the respondent’s program design decisions (e.g., we will not ask questions for which we already have answers based on the contractor’s prior review of program documents and administrative data) and (2) the discretion of the interviewer, who will adapt his or her questions based on the respondent’s answers while still touching on key themes across interviews. All questions asked by the interviewer will remain within the scope of the approved topics and will be drawn from the questions approved by OMB.


The “Design Survey” instrument, therefore, will serve as a pool of possible questions from which the contractor will draw to guide informal, semi-structured interviews. Each interviewer will use a subset of the questions listed in the interview guide. The exact length of interviews will vary, but most interviews are expected to last approximately one hour. (To be clear, one hour is the maximum length of time for the interviews; interviewers will be instructed to bring their interviews to an end within one hour.) A specific protocol will be developed for each interview in advance of the call. This overall approach has the benefit of reducing burden for respondents, because each interview will be tailored to the specific respondent being interviewed. This strategy has worked very successfully in similar past data collection efforts (e.g., implementation studies).


A2. Purpose and Use of the Information Collection


If this request is approved, the data collection is intended to inform ACF about state processes, program models and components, target populations, proposed State implementation processes, community agencies involved in the projects, settings, program intensity, and other information necessary to understand the design of PREP programs. The Design Survey shall document how States planned replications of evidence-based programs or substantially incorporated elements of them; how they selected evidence-based program models; how they planned to adapt evidence-based programs, if necessary; how they planned to place substantial emphasis on both abstinence and contraception; how target populations were selected; and how they planned to incorporate adulthood preparation subjects, in addition to other topics.


In addition, the Design Survey will inform the development of a sampling frame for the next phase of the “Design and Implementation Study,” the “Implementation Survey.” The Implementation Survey will be conducted with all states that received PREP state grant funds, as well as a selected subset of sub-awardees. The information we collect about sub-awardees through the Design Survey (including program type, region, urbanicity, experience, population size served, target population, and setting) will inform the selection of sub-awardees to be included in the Implementation Survey.


As stated above, the overall purpose of the “Design and Implementation Study” – which includes the currently proposed Design Survey as well as the subsequent Implementation Survey (for which we will submit another package) – is to understand and document the design and implementation of PREP programs, via data gathered across States and with selected sub-awardees. It is hoped that this effort will inform our understanding of program design and implementation and provide State-level policymakers and community sub-awardees, Congress and the general public, and ACF with useful information for future decision-making around PREP. To the extent possible, the information collected through this effort will also inform other components of the Personal Responsibility Education Program Multi-Component Evaluation.


A3. Use of Improved Information Technology and Burden Reduction


The data collection plan for this activity reflects sensitivity to issues of efficiency, accuracy, and respondent burden. Prior to conducting interviews with state administrators, the research team will review each state’s application for PREP funds, as well as any other documents available to ACF, such as the RFPs created by states to distribute funding to sub-awardees. Protocols for interviews will then be customized for each state to focus on information that is relevant for that state and that was not obtained from the documents reviewed. In other words, the information requested in interviews will be limited to that for which the administrators are the best or only information sources.


A4. Efforts to Identify Duplication and Use of Similar Information


The information collection requirements for this evaluation have been carefully reviewed to determine what information is already available from existing studies and program documents and what will need to be collected for the first time. Although the information from existing sources provides value to our understanding of PREP programs, ACF does not believe that it provides sufficient information on PREP program design and initial implementation. This evaluation data collection effort is essential to providing this information.


A5. Impact on Small Businesses or Other Small Entities


Because this data collection effort will focus exclusively on state-level PREP administrators, it will not impact small businesses or other small entities.


A6. Consequences of Collecting Information Less Frequently


Information will be collected only once, thus no repetition of effort is planned. Not collecting the information at all would substantially limit our understanding of the PREP program and the value of the investment ACF will make in this study and in the PREP program itself. In the absence of such data, the decisions made by states and sub-awardees regarding the design and initial implementation of PREP programs will be unclear, and future funding and operational decisions about teen pregnancy prevention programs will be based on insufficient information.


In addition, what we learn through the Design Survey will inform the development of the sampling frame for a component of the next phase of the study. A year and a half after the Design Survey interviews are completed, “Implementation Survey” interviews will be conducted with all states, as well as a sample of sub-awardees, to learn about how PREP funded programs were implemented. The information we collect through the Design Survey interviews will inform the selection of sub-awardee programs for the Implementation Survey data collection effort.


A7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5


There are no special circumstances for the proposed data collection efforts.


A8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency


In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. The notice was published on Wednesday, March 30, 2011 (Vol. 76, No. 61, p. 17655).


A copy of this notice is attached. During the notice and comment period, the government did not receive any comments or requests.


A9. Explanation of Any Payment or Gift to Respondents


No payments to respondents are proposed for this information collection.


A10. Assurance of Confidentiality Provided to Respondents


The data collected through the Design Survey efforts will be reported in two ways, and assurances of privacy will reflect those two ways.


First, a summary profile will be created for each state containing factual information about PREP program design decisions – for example, the selected program models, populations to be served, number and location of sub-awardees, and number of total youth to be served. This information, much of which will be publicly available through other sources, will be reported for each state receiving PREP funding in PREP Multi-Component Evaluation reports. Therefore, while we will not attribute such factual information to a specific respondent, it will be attributed to a specific state. Respondents will be made aware of how this factual information will be reported.


Second, the reports will discuss themes emerging from responses regarding “how” and “why” PREP program decisions were made. While these responses may not contain confidential information, they are likely to touch on sensitive topics. In particular, responses will be sensitive if state political or other contextual factors influenced the PREP program design decisions. In project reports, such responses will not be attributed to any one state, but will instead be analyzed and reported as part of overall trends across all states. Respondents will be made aware that responses to these questions will not be attributed to them or their state.


Beyond these two ways in which data will be reported, information will be kept private to the fullest extent of the law.


A11. Justification for Sensitive Questions


There are no sensitive questions in this data collection.


A12. Estimates of Annualized Burden Hours and Costs


Annualized Burden Estimates


We are requesting two years of clearance. The annual burden was estimated from the total number of completed discussions proposed and the time required to complete them. Forty-five states and the District of Columbia received FY 2010 PREP funds. We expect two respondents from each state to be contacted and interviewed over the course of two years, which averages to one respondent per state per year.5 It is important to note that there may be multiple modalities for data collection, though all are expected to be interviews: we expect most data collection to occur by telephone, though in-depth in-person interviews may also take place. Regardless of modality, we will limit burden to one hour per person, whether data collection occurs via telephone or in-person interviews.



Instrument | Annual Number of Respondents | Number of Responses Per Respondent | Average Burden Hours Per Response | Total Burden Hours | Average Hourly Wage | Total Annual Cost
Design Survey: Discussion Guide for Use with PREP State-Level Coordinators and State-Level Staff | 46 | 1 | 1 | 46 | $37.45 (wage for “Social Scientists and Related Workers, All Other”) | $1,723
Total | 46 | - | - | 46 | - | $1,723


Estimates of Annualized Cost


Average hourly wages for State-Level Coordinators and Staff were estimated from the most recent (May 2011) National Occupational Employment and Wage Estimates published by the Bureau of Labor Statistics, U.S. Department of Labor, for the relevant fields (see table). The total annualized cost is estimated at $1,723.



Total Burden Requested


The table below includes burden for discussion guides that were approved for three years on November 6, 2011 and the burden for the current information request (Design Survey).

Instrument | Annual Number of Respondents | Number of Responses Per Respondent | Average Burden Hours Per Response | Total Burden Hours | Average Hourly Wage | Total Annual Cost
Previously Approved
Discussion Guide for Use with Macro-Level Coordinators | 10 | 1 | 1 | 10 | $33.59 | $335.90
Discussion Guide for Use with Program Directors | 20 | 2 | 2 | 80 | $27.21 | $2,176.80
Discussion Guide for Use with Program Staff | 40 | 1 | 2 | 80 | $23.76 | $1,900.80
Discussion Guide for Use with School Administrators | 70 | 1 | 1 | 70 | $35.54 | $2,487.80
Current Request
Design Survey: Discussion Guide for Use with PREP State-Level Coordinators and State-Level Staff | 46 | 1 | 1 | 46 | $37.45 (wage for “Social Scientists and Related Workers, All Other”) | $1,723
Total New | - | - | - | 46 | - | $1,723
Total | - | - | - | 286 | - | $8,624


A13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers


Not applicable. These information collection activities do not place any capital cost or cost of maintaining capital requirements on respondents.


A14. Annualized Cost to the Federal Government


Data Collection Costs


Data collection will be carried out by the evaluation contractor. Based on our experience with other similar information collection activities6 carried out by contractors, total costs to the government are estimated to be approximately $250,000. Because data collection will be carried out over two years, the estimated annualized cost to the government for field data collection is expected to be approximately $125,000 per year. The annual cost to the federal government for the previously approved IC was $215,625, so the total annual cost to the federal government under this IC is $340,625 ($125,000 plus $215,625).


A15. Explanation for Program Changes or Adjustments


This is the first data collection for the “Design and Implementation Study” portion of the PREP Multi-Component Evaluation. This collection adds to a previously approved stage of this multi-component evaluation.


A16. Plans for Tabulation and Publication and Project Time Schedule


Design and Implementation Study Analysis Plan


As a reminder, the Design Survey data collection effort is part of the “Design and Implementation Study,” a broad descriptive study documenting the design and implementation of PREP programs across the country. As part of this study, the contractor has already reviewed all states’ applications for funding, as well as other documents available to ACF, and has engaged in clarifying conversations with federal staff. The next step – which is the focus of this information collection request – will be to conduct “Design Survey” interviews focused on the overall design of states’ PREP programs. A year and a half after these interviews are completed, an “Implementation Survey” is planned, which will entail interviews focused on the implementation of states’ and sub-awardees’ programs.


The report produced from all of these data collection efforts will document the general design and implementation of the PREP state grant programs. The data will be reported in two ways. First, a summary profile will be created for each state containing factual information about PREP program design decisions – for example, the selected program models, populations to be served, number of sub-awardees, and number of total youth to be served. This information will be reported for each state receiving PREP funding. Second, the reports will discuss themes emerging from responses regarding “how” and “why” PREP program decisions were made, the degree to which state PREP program plans have been implemented, what challenges may have emerged during program implementation, and the degree to which states and their sub-awardees were able to address those challenges.


Time Schedule and Publications


Below is a schedule of each of the instruments associated with the PREP Multi-Component Evaluation (for each of the three studies that make up the project) and the date that we plan to use each instrument in the field:


Instrument | Date of 30-Day Submission | Date Clearance Needed | Date for Use in Field
Design and Implementation Study
Design survey | October 2011 | December 2011 | January 2012
Implementation survey | October 2012 | February 2013 | March 2013
Performance Analysis Study
Performance measure package | February 2012 | June 2012 | Fall 2012 (though clearance is needed by spring to provide T/TA to reporting agencies)
Impact and In-Depth Implementation Study
Field instrument | Completed July 2011 | October 2011 | November 2011
Administrative data collection instruments | April 2012 | August 2012 | September 2012 (for sites starting fall 2012)
Baseline survey | April 2012 | August 2012 | September 2012 (for sites starting fall 2012)
Implementation study instruments | June 2012 | October 2012 | November 2012 (for sites starting fall 2012)
Short- and long-term follow-up surveys | November 2012 | March 2013 | April 2013 (for sites starting fall 2012)


ACF anticipates that one report detailing findings from both the Design Survey and a later component of the study, the Implementation Survey, will be produced. The report shall extensively detail how States and sub-awardees designed PREP programs and how these programs were implemented. ACF also anticipates the creation of a brief that communicates the findings in a succinct way to a wide range of stakeholders. The draft brief and final brief shall be produced on the same timeline as the draft report and final report. All products are expected by August 2014.


A17. Reason(s) Display of OMB Expiration Date is Inappropriate


All instruments will display the expiration date for OMB approval.


A18. Exceptions to Certification for Paperwork Reduction Act Submissions


No exceptions are necessary for this information collection.




1 PREP legislation outlines the six adulthood preparation subjects:

(i) Healthy relationships, including marriage and family interactions.

(ii) Adolescent development, such as the development of healthy attitudes and values about adolescent growth and development, body image, racial and ethnic diversity, and other related subjects.

(iii) Financial literacy.

(iv) Parent-child communication.

(v) Educational and career success, such as developing skills for employment preparation, job seeking, independent living, financial self-sufficiency, and workplace productivity.

(vi) Healthy life skills, such as goal-setting, decision making, negotiation, communication and interpersonal skills, and stress management.

2 Centers for Disease Control and Prevention. “Youth Risk Behavior Surveillance – United States, 2009.” Surveillance Summaries, June 4, 2010. MMWR 2010;59(No. SS-5).

3 Martin JA, Hamilton BE, Sutton PD, Ventura SJ, Mathews TJ, Osterman MJK. Births: Final data for 2008. National vital statistics reports; vol 59, no 1. Hyattsville, MD: National Center for Health Statistics. 2010.

4 Centers for Disease Control and Prevention. Sexually Transmitted Disease Surveillance, 2009. Atlanta, GA: U.S. Department of Health and Human Services; 2010.

5 Please note that the burden we are requesting is lower than the burden originally requested in the Federal Register Notices associated with this collection, and lower than the burden indicated in the tables earlier submitted to OMB. This is because we now plan to conduct interviews exclusively with state-level respondents for the “Design Survey” effort. We no longer plan to interview sub-awardee-level respondents as part of this data collection effort. However, during the second wave of data collection associated with this study, the “Implementation Survey,” we will interview a select number of sub-awardee-level respondents. The sample frame used to select sub-awardee-level respondents will be developed, in part, based on the results of the “Design Survey” data collection.

6 The estimate is based on a similar effort in a recent evaluation coordinated by the Office of Planning, Research and Evaluation on Subsidized and Transitional Employment, in which the contractor performed a national scan of the subsidized employment programs created as part of the American Recovery and Reinvestment Act (Pub. L. 111-5). http://www.gpo.gov/fdsys/pkg/PLAW-111publ5/content-detail.html
