

U.S. Department of Health and Human Services

Office of Planning, Research and Evaluation, Administration for Children and Families

Family and Youth Services Bureau

7th Floor West, Aerospace Building

370 L'Enfant Promenade, SW

Washington, DC 20447

Project Officers: Clare DiSalvo, Dirk Butler




Part A: Justification for the Collection of Performance Measures and Baseline Data - Personal Responsibility Education Program (PREP) Multi-Component Evaluation

0970-0398

February 2013








CONTENTS

Part A: Introduction

A.1. Circumstances Making the Collection of Information Necessary
  1. Legal or Administrative Requirements that Necessitate the Collection
  2. Study Objectives
A.2. Purpose and Use of the Information Collection
A.3. Use of Information Technology to Reduce Burden
A.4. Efforts to Identify Duplication and Use of Similar Information
A.5. Impact on Small Businesses
A.6. Consequences of Not Collecting the Information/Collecting Less Frequently
A.7. Special Circumstances
A.8. Federal Register Notice and Consultation Outside the Agency
A.9. Payments to Respondents
A.10. Assurance of Confidentiality
A.11. Justification for Sensitive Questions
A.12. Estimates of the Burden of Data Collection
  1. Annual Burden for Youth Participants
  2. Annual Burden for Grantees, Sub-Awardees, and Sub-Awardee Implementation Sites
  3. Overall Burden
A.13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers
A.14. Annualized Cost to Federal Government
A.15. Explanation for Program Changes or Adjustments
A.16. Plans for Tabulation and Publication and Project Time Schedule
  1. Analysis Plan
  2. Time Schedule and Publications
A.17. Reason(s) Display of OMB Expiration Date is Inappropriate
A.18. Exceptions to Certification for Paperwork Reduction Act Submissions

SUPPORTING REFERENCES FOR INCLUSION OF SENSITIVE QUESTIONS OR GROUPS OF QUESTIONS

TABLES

A2.1. Collection Frequency for PREP Performance Measures Data
A11.1. Summary of Sensitive Questions to Be Included on the Participant Entry and Exit Surveys and Their Justification
A11.2. Summary of Sensitive Questions to Be Included on the IIS Baseline Survey (Instrument 3) and Their Justification
A12.1. Calculations of Burden Hours and Cost for Youth Participants
A12.2. Calculations of Burden Hours and Costs for Grantees, Their Sub-awardees, and Implementation Sites to Collect and Report the Required Performance Measures
A12.3. Calculations of Burden Hours and Costs for Approved and Requested Burden

INSTRUMENTS

INSTRUMENT #1 – PARTICIPANT ENTRY SURVEY (PAS)

INSTRUMENT #2 – PARTICIPANT EXIT SURVEY (PAS)

INSTRUMENT #3 – BASELINE SURVEY (IIS)

INSTRUMENT #4 – PERFORMANCE REPORTING SYSTEM DATA ENTRY FORM

INSTRUMENT #5 – SUB-AWARDEE DATA COLLECTION AND REPORTING

INSTRUMENT #6 – IMPLEMENTATION SITE DATA COLLECTION


ATTACHMENTS

ATTACHMENT A: OVERVIEW OF THE PREP EVALUATION

ATTACHMENT B: ANALYSIS PLAN FOR PREP IMPACT STUDY

ATTACHMENT C: QUESTION BY QUESTION SOURCE TABLE FOR THE BASELINE SURVEY

ATTACHMENT D: SOURCES REFERENCED FOR THE BASELINE QUESTIONNAIRE

ATTACHMENT E: 60-DAY FEDERAL REGISTER NOTICE

ATTACHMENT F: PERSONS CONSULTED ON COLLECTION AND/OR ANALYSIS OF THE PAS AND IIS BASELINE SURVEY

ATTACHMENT G: CONSENT LETTERS AND FORMS AND YOUTH ASSENT FORM

ATTACHMENT H: CONFIDENTIALITY PLEDGE

Part A: Introduction

In March 2010, Congress authorized the Personal Responsibility Education Program (PREP) as part of the Patient Protection and Affordable Care Act (ACA). PREP provides grants to states, tribes, and tribal communities to support evidence-based programs to reduce teen pregnancy and sexually transmitted infections (STIs). The programs are required to provide education on both abstinence and contraceptive use. The programs will also offer information on adulthood preparation subjects such as healthy relationships, adolescent development, financial literacy, parent–child communication, education and employment skills, and healthy life skills. Grantees are encouraged to target their programming to high-risk populations—for example, homeless youth, youth in foster care, pregnant or parenting teens, youth residing in geographic areas with high teen birth rates, and Native American youth.

States and other entities could acquire PREP funding through a formula grants program. Forty-five states, the District of Columbia, the Virgin Islands, Puerto Rico, and the Federated States of Micronesia applied for and received PREP funding.1 Grants to tribes and tribal communities were made through a competitive process, and 16 grantees were awarded funding.

In line with PREP’s emphasis on evidence-based programming, Congress also mandated a federal evaluation of the PREP program. To meet this need, the Family and Youth Services Bureau (FYSB) and the Office of Planning, Research and Evaluation (OPRE) within the Administration for Children and Families (ACF) of the U.S. Department of Health and Human Services (HHS) have contracted with Mathematica Policy Research and its subcontractors to conduct the Personal Responsibility Education Program (PREP) Multi-Component Evaluation, a seven-year evaluation to document how PREP-funded programs are operationalized in the field and to assess their effectiveness in reducing teenage pregnancies, sexual risk behaviors, and STIs.

The evaluation includes three complementary components, each with distinct data collection activities: (1) the Design and Implementation Study (DIS), a broad descriptive analysis of how states are using PREP grant funding to support evidence-based teen pregnancy and STI prevention programs; (2) the Performance Analysis Study (PAS), focused on the collection and analysis of performance management data from state and tribal grantees; and (3) the Impact and In-depth Implementation Study (IIS), designed to assess the impacts and implementation of funded programs in four to five selected PREP sites.

OMB approval for Field Data Collection as part of the Impact and In-depth Implementation Study (IIS) was received on November 6, 2011, and approval was subsequently received for the Design Survey conducted as part of the Design and Implementation Study (DIS) on March 7, 2012 (OMB Control #0970-0398). ACF is now requesting OMB approval for two additional data collection efforts and the associated instruments: (1) collection of PREP performance measures for the Performance Analysis Study (PAS) through participant entry and exit surveys and the Performance Reporting System Data Entry Form; and (2) collection of baseline data for the Impact and In-depth Implementation Study (IIS) through the PREP baseline survey. Attachment A provides an overview of the multiple components of the PREP evaluation, including the components that have received OMB approval and the components included in this ICR.

The multiple components of the PREP Evaluation play a unique role in the mix of current federal evaluation efforts designed to expand the evidence base on teen pregnancy prevention programs. First, unlike other evaluations, the PREP effort will provide information on large-scale (state-wide) replication of evidence-based programs, with particular emphasis on (1) lessons learned from replication among high-risk populations in new settings, such as youth in foster care group homes, youth in the juvenile justice system, or youth living on tribal lands; (2) how and why states, tribes, and localities choose and implement the evidence-based programs most appropriate for their local contexts; and (3) adaptations made to support the unique PREP requirements, such as the inclusion of adulthood preparation subjects. Data from both the Design and Implementation Study (DIS) and Performance Analysis Study (PAS) will help answer these questions about large-scale replication. Second, the evaluation will also offer a unique opportunity to test the effectiveness of four or five program models on various high-risk populations, contributing further to building a more comprehensive evidence base on effective programming. This evidence on program effectiveness will come from the Impact and In-depth Implementation Study (IIS).

A.1. Circumstances Making the Collection of Information Necessary

1. Legal or Administrative Requirements that Necessitate the Collection

On March 23, 2010, the President signed into law the Patient Protection and Affordable Care Act (ACA), H.R. 3590 (Public Law 111-148, Section 2953). In addition to its other requirements, the act amended Title V of the Social Security Act (42 U.S.C. 701 et seq.) to include $55.25 million in formula grants to states to “replicate evidence-based effective program models or substantially incorporate elements of effective programs that have been proven on the basis of scientific research to change behavior, which means delaying sexual activity, increasing condom or contraceptive use for sexually active youth, or reducing pregnancy among youth.” The legislation mandates that the Secretary evaluate the programs and activities carried out with funds made available through PREP. To meet this requirement, FYSB and OPRE within ACF have contracted with Mathematica Policy Research and its subcontractors to conduct the PREP Multi-Component Evaluation. The collection of performance measures, one component of this evaluation and of this request for OMB approval, will support compliance with the GPRA Modernization Act of 2010 (Public Law 111-352).

2. Study Objectives

The objective of the PREP evaluation is to document how PREP-funded programs are operationalized in the field and to assess their effectiveness in reducing teenage pregnancies, sexual risk behaviors, and STIs. The evaluation will expand the evidence base on teen pregnancy prevention programs and serve as a case study on the successes and challenges of replicating, adapting, and scaling up evidence-based programs through federal grant-making to states, tribes, and tribal communities.

As described above, the evaluation has three main components: (1) the Design and Implementation Study; (2) the Performance Analysis Study; and (3) the Impact and In-depth Implementation Study. This submission pertains to the latter two of these three components. We provide more background on these two components below. Attachment A provides an overview of the multiple components of the PREP evaluation.

Performance Analysis Study. The primary purpose of the Performance Analysis Study (PAS) is to collect information from all grantees on the extent to which the federal PREP objectives are being met and to learn from scaling up the replication of evidence-based programs. The PAS can also be used to create a foundation for program improvement efforts based on federal, grantee, and sub-awardee examination of the data. The PAS will not be used to evaluate program effectiveness, which will be estimated only in the four or five sites participating in the IIS component.

The plan for collecting and reporting the performance measures data reflects the multiple layers that states, tribes, and tribal communities are using to support program delivery. For example, some state agencies, tribes, or sub-awardees may directly implement the programs. In other arrangements, state agencies, tribes, or their sub-awardees may deliver programs through partner agencies. Under both scenarios, multiple implementation sites could be used to reach youth.

Ultimately, the grantees will be responsible for ensuring that all performance measures are reported to ACF. The data that the grantees report to ACF will originate from three levels – the grantee, the grantees’ sub-awardees, and the sub-awardees’ implementation sites. For some performance measures, grantees will provide data about activities or decisions that they undertake directly at the grantee level. For other measures, data will come to the grantee from the sub-awardees, because sub-awardees oversee the activities to be documented. For still other measures, sub-awardees will first have to gather data from each implementation site that provides direct programming to youth. In addition, some data will come from the youth themselves, who will be asked to complete entry and exit surveys. The efforts expected at each level and the estimated burden are further explained in Section A.12.

The performance measures data will be reported by PREP grantees through ACF’s PREP reporting system. ACF has contracted with RTI International to create and maintain this system.
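The roll-up described above is, at its core, a simple aggregation of counts from sites to sub-awardees to grantee totals. The Python sketch below is purely illustrative: the structure, field names, and figures are hypothetical and do not reflect the actual PREP reporting system schema.

    from collections import Counter

    # Hypothetical site-level counts, grouped by the sub-awardee that oversees them
    site_reports = {
        "subawardee_1": [Counter(youth_served=120, foster_care=15),
                         Counter(youth_served=80, foster_care=5)],
        "subawardee_2": [Counter(youth_served=200, foster_care=30)],
    }

    def roll_up(reports):
        """Sum site-level counts into a single grantee-level total."""
        total = Counter()
        for site_counters in reports.values():
            for counts in site_counters:
                total.update(counts)
        return total

    # The grantee submits only aggregate counts like these to ACF,
    # never individual-level records (see Section A.10).
    print(roll_up(site_reports))  # Counter({'youth_served': 400, 'foster_care': 50})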

Impact and In-depth Implementation Study. The objective of this component of the evaluation is to assess the impacts and implementation of funded programs in four to five selected PREP sites. The study will help ACF determine the effectiveness of PREP-funded programs in affecting key outcomes related to teen pregnancy, sexually transmitted infections, and associated sexual risk behaviors. It will also provide important information on the successes and challenges sites face in implementing evidence-based teen pregnancy prevention programs and on the quality with which the programs are implemented.

The evaluation team is currently working with ACF to identify four or five select PREP-funded sites to participate in this component of the evaluation. The sites are not meant to be representative of PREP-funded programs as a whole. Rather, site selection is focusing on grantees that (1) are large enough to support an impact and in-depth implementation study, (2) are implementing programs in a way that is amenable to random assignment for the program impact study (discussed below), and (3) address priority gaps in the existing research literature on evidence-based approaches to teen pregnancy prevention. These gaps include the lack of evidence on effective programs for high-risk populations such as youth living in rural areas or youth in the foster care or juvenile justice systems.

In each site, youth will be randomly assigned to a treatment group that receives the program being tested or to a control group that does not. The evaluation team will work collaboratively with site leaders to develop a plan for randomly assigning either individuals or organizations (such as schools, clinics, or group homes) to the treatment or control groups. Random assignment of individuals will be preferred when the risk of cross-over is low or when the program focuses more on individualized services or voluntary group programs; a cluster design is optimal when the risk of cross-over is high or when the program model features group- or community-level components intended to have broad contextual effects on the target population.
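To make the two designs concrete, the sketch below shows a simple 50/50 random assignment. It is illustrative only; the actual assignment procedures (including any stratification) will be developed collaboratively with site leaders, as described above.

    import random

    def randomize(units, seed=2013):
        """Randomly split a list of units into treatment and control halves."""
        rng = random.Random(seed)  # fixed seed makes the assignment reproducible
        shuffled = list(units)
        rng.shuffle(shuffled)
        half = len(shuffled) // 2
        return shuffled[:half], shuffled[half:]  # (treatment, control)

    # Individual random assignment: units are youth
    treatment_youth, control_youth = randomize(range(1, 1201))

    # Cluster design: units are organizations; every youth in a treatment
    # cluster is offered the program
    treatment_sites, control_sites = randomize(["school_a", "school_b",
                                                "clinic_c", "home_d"])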

In each site, ACF expects to recruit and enroll a sample of 1,200 to 1,500 youth (for a total of 6,000 youth across four or five sites). Each site will be analyzed separately, so the relatively large samples of 1,200 to 1,500 youth per site are needed to detect policy-relevant impacts on key behavioral outcomes. ACF does not plan to pool data across sites or compare the effectiveness of one program versus another. The target sample sizes have been determined to support this goal of site-specific analyses. Minimum detectable impacts for the target sample sizes are presented in Supporting Statement B.
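For reference, the sketch below computes a minimum detectable impact using the standard normal approximation for comparing two equal-sized groups on a binary outcome. It is a simplified illustration only; the study's actual calculations in Supporting Statement B account for factors this formula omits, such as covariate adjustment, clustering, and non-response.

    from scipy.stats import norm

    def mde_proportion(n_per_group, p=0.5, alpha=0.05, power=0.80):
        """Minimum detectable impact (in proportion units) for a two-group
        comparison of a binary outcome with prevalence p."""
        z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
        return z * (2 * p * (1 - p) / n_per_group) ** 0.5

    # 1,200 youth split evenly across treatment and control (600 per group)
    print(round(mde_proportion(600), 3))  # 0.081, about 8 percentage points

    # 1,500 youth (750 per group)
    print(round(mde_proportion(750), 3))  # 0.072, about 7 percentage points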

The impact study will involve three rounds of survey data collection: (1) a baseline survey administered to youth shortly before programming begins, (2) a short-term follow-up survey administered approximately 6 months after the end of programming, and (3) a long-term follow-up survey administered approximately 18 months after the end of programming. Wherever possible, there will be a group administration of a self-administered pencil and paper survey. When necessary to increase response rates or accommodate specific populations, this method will be augmented with a web survey and telephone follow-up with hard copy.2 Additional data will be collected for the implementation component of the study through in-person site visits, stakeholder interviews, focus groups, document review, and direct structured observations of program activities.

ACF is currently requesting OMB approval for only the baseline survey (see Instrument 3). Approval for the follow-up surveys and implementation data collection activities will be sought later, after site selection has progressed. Attachment B provides an analysis plan for the impact evaluation that describes what we will collect at baseline, how it will be used, and the information that will be collected at follow-up (to be included in a future ICR).

A.2. Purpose and Use of the Information Collection

Performance Analysis Study. This portion of the information collection request is related to performance management and the Performance Analysis Study, not to the Impact and In-Depth Implementation Study. The purpose of performance management is to track outputs and outcomes over time in order to provide information on how all PREP grantees and the programs that they operate are performing. Through the PAS, grantees will be required to submit data annually on two broad topics – PREP program structure and PREP program delivery.

  • PREP program structure refers to how grant funds are being used, the program models selected, the ways in which grantees and sub-awardees support program implementation, and the characteristics of the youth served.

  • Program delivery refers to the extent to which the intended program dosage was delivered, youths’ attendance and retention, youths’ perceptions of program effectiveness and their experiences in the programs, and challenges experienced implementing the programs.

To understand PREP program structure, grantees will be asked to provide the amount of grant funds allocated for various activities, including direct service provision; the approach to staffing PREP at the grantee level; grantee provision of training, technical assistance, and program monitoring; the number of sub-awardees, their funding, program models, populations, settings, and coverage of adulthood preparation subjects; the number of program facilitators, their training on the program model, and the extent to which they are monitored to ensure program quality; and the characteristics of the youth entering the PREP programs.3 This information will be collected from the grantees (Instrument 4), their sub-awardees (Instrument 5), and the implementation sites involved in the direct delivery of programs (Instrument 6). Sub-awardees and implementation sites will submit their data to grantees, who will then compile this information and submit it to ACF (Instrument 4).

To understand PREP program delivery, grantees will be asked to provide the number of completed program hours for each cohort; the number of youth who ever attended a PREP program, overall and by subpopulation (such as youth in foster care or the juvenile justice system); youths’ attendance and retention4; youths’ perceptions of program effectiveness and program experiences; and the challenges providers face implementing their programs. This information will be collected from sub-awardees (Instrument 5) and the implementation sites involved in the direct delivery of programs (Instrument 6), and submitted to ACF by the grantees (Instrument 4).

The frequency with which performance data will be collected from grantees is summarized in Table A2.1.

Table A2.1. Collection Frequency for PREP Performance Measures Data

Category | Collection Frequency[a]
Demographic Items: Age, Grade, Gender, Ethnicity, Race[3] | Program Entry and Exit
Risk Behaviors and Intentions[3] | Program Entry
Participant Perceptions of Program Effects | Program Exit
Participant Assessments of the Program Experience | Program Exit
Features and Structure: Grantees, Sub-awardees, Programs | Once a Year
Program Fidelity (Dosage) | At Program Sessions
Participant Engagement (Attendance[4], Reach, Retention) | At Program Sessions and Cohort Completion
Staff Perceptions of Quality Challenges and Technical Assistance Needs | Once a Year

[a] “Collection frequency” refers to when grantees, their sub-awardees, and program staff collect the data that will later be compiled and reported to ACF.


ACF will then use the performance measures data to (1) track how grantees are allocating their PREP funds; (2) assess whether PREP objectives are being met (for example, in terms of the populations served); and (3) help drive PREP programs toward continuous improvement of service delivery. In addition, ACF will use this information to fulfill reporting requirements to Congress and the Office of Management and Budget concerning the PREP initiative. ACF also intends to share grantee and sub-awardee level findings with each state to inform their own program improvement efforts.

The Participant Entry Survey (Instrument 1), Participant Exit Survey (Instrument 2), the Performance Reporting System Data Entry Form (Instrument 4), the Sub-awardee Data Collection and Reporting items (Instrument 5), and the Implementation Site Data Collection items (Instrument 6) are attached.

Impact and In-depth Implementation Study. Data collected through the PREP baseline survey (Instrument 3) will serve as a central component of the impact study. Specifically, the data will be used to establish baseline equivalence of the treatment and control groups and thus to confirm the integrity of the random assignment process. Baseline data will also be used to define subgroups for which impacts will be estimated and to adjust impact estimates to account for survey non-response. Many baseline measures will be collected again at follow-up; their baseline values can then be included as covariates in the impact models to improve the precision of impact estimates. Attachment B provides an analysis plan for the impact evaluation that describes what we will collect at baseline, how it will be used, and the information that will be collected at follow-up (to be included in a future ICR).
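To illustrate the covariate-adjustment point, the sketch below fits a simple regression of an outcome on a treatment indicator and baseline measures using simulated data. It is a minimal sketch under assumed variable names; the actual model specifications are those described in Attachment B.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulated stand-in for one IIS site; all values are hypothetical
    rng = np.random.default_rng(0)
    n = 1500
    df = pd.DataFrame({
        "treatment": rng.integers(0, 2, n),       # random assignment indicator
        "baseline_outcome": rng.normal(0, 1, n),  # same measure at baseline
        "age": rng.integers(14, 19, n),
    })
    df["outcome"] = (0.2 * df["treatment"] + 0.6 * df["baseline_outcome"]
                     + rng.normal(0, 1, n))

    # Including the baseline measure as a covariate absorbs outcome variance,
    # shrinking the standard error on the treatment coefficient (the impact)
    fit = smf.ols("outcome ~ treatment + baseline_outcome + age", data=df).fit()
    print(fit.params["treatment"], fit.bse["treatment"])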

Many of the items included on the baseline survey are taken directly from the similar survey OMB has already approved for use in the ongoing Evaluation of Adolescent Pregnancy Prevention Approaches (PPA). ACF received initial OMB approval for the PPA baseline survey on July 26, 2010 (OMB Control Number 0970-0360). In summer 2011, oversight of PPA was transferred to the Office of Adolescent Health (OAH) within the Office of the Secretary, and the project is now tracked with a different OMB Control Number (0990-0382). To date, the PPA baseline survey has been administered to approximately 3,500 adolescents. The PPA survey is also being used as the common starting point for instrument development across all the ongoing federal teen pregnancy prevention evaluations. By drawing on items from the PPA survey, we are thus aligning PREP with other ongoing federal evaluations.

For PREP, the evaluation team worked with ACF to adapt the OMB-approved PPA baseline instrument in two ways. First, certain measures were added to reflect PREP’s authorizing legislation—specifically, the legislation’s focus on general adulthood preparation topics (healthy life skills, education and employment skills, financial literacy, and so on) beyond the primary focus on preventing teen pregnancy, sexually transmitted infections, and associated sexual risk behaviors. Second, additional measures were added to address current ACF priorities in understanding the experiences of especially high-risk populations and how these experiences may shape their developmental trajectories and sexual risk behaviors. To accommodate these new additions to the survey, other measures of lower priority for PREP were dropped from the PPA baseline survey. We did not, however, drop any of the core behavioral outcomes from the PPA survey necessary for assessing program impacts on measures of teen pregnancy, STIs, or associated sexual risk behaviors.5 Attachment C includes a question by question listing of the items proposed for the PREP baseline survey and how they relate to the OMB-approved PPA baseline. A description of the sources referenced in the development of the PREP baseline instrument is found in Attachment D.

A.3. Use of Information Technology to Reduce Burden

Performance Analysis Study. To comply with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and to reduce grantee burden, ACF is streamlining the performance data reporting process and generation of reports by (1) providing common data element definitions across PREP grantees and program models, (2) collecting these data in a uniform manner through the PREP reporting system, and (3) using the PREP reporting system to calculate common performance measures across grantees and program models. Using the PREP reporting system will reduce reporting burden and minimize grantee and sub-awardee costs related to implementing the reporting requirements.

Impact and In-depth Implementation Study. The data collection plan for the IIS baseline survey reflects sensitivity to issues of efficiency, accuracy, and respondent burden. Wherever possible, there will be a group administration of a self-administered pencil and paper survey instrument (PAPI). The advantages of PAPI over more technologically innovative approaches, such as laptops or personal digital assistants (PDAs), are that it enables respondents to set their own pace (allowing for more accurate responses to sensitive questions); reduces costs; and simplifies administration logistics. Studies have shown no difference between PAPI and computer-assisted self-interviewing (CASI) in reports of most measures of male-female sexual activity, including reports such as ever having had sexual intercourse, recent sexual activity, number of partners, condom use, and pregnancy.6,7,8,9,10,11 This method is also consistent with other national youth surveys (for example, the national Youth Risk Behavior Survey) and the ongoing federal PPA project. In those instances in which the survey must be administered outside a group-based setting, respondents will be provided a unique PIN/password for web completion or will be surveyed via telephone.

A.4. Efforts to Identify Duplication and Use of Similar Information

ACF has carefully reviewed the information collection requirements for PREP to avoid duplication with either existing studies or other ongoing federal teen pregnancy prevention evaluations and believes that the PREP Evaluation complements, rather than duplicates, the existing literature and the other ongoing federal teen pregnancy prevention evaluations.

As background, the other federal teen pregnancy prevention-related evaluations currently in the field are (1) the Evaluation of Adolescent Pregnancy Prevention Approaches, sponsored by the Office of Adolescent Health within HHS; (2) the Teen Pregnancy Prevention Replication Study, also sponsored by the Office of Adolescent Health within HHS; and (3) the Evaluation of Community-Based Approaches, sponsored by the Centers for Disease Control and Prevention.

Each of these three evaluations has a specific focus. The Evaluation of Adolescent Pregnancy Prevention Approaches is focused on testing promising and innovative new models for reducing teen pregnancy. The Teen Pregnancy Prevention Replication Study is focused on testing evidence-based models for reducing teen pregnancy (which are being scaled up through the Teen Pregnancy Prevention program administered by the HHS Office of Adolescent Health). And the Evaluation of Community-Based Approaches is focused on testing community saturation models for reducing teen pregnancy.

Although the information from these other federal evaluations will increase our understanding of reducing teenage sexual risk behavior, the focus of the PREP Evaluation is different from the foci of the other three federal evaluations. Specifically, ACF believes that the PREP evaluation complements the other evaluations by providing the following unique opportunities:


  • Opportunity to learn about using a state formula grant to scale up evidence-based programs. The PREP Evaluation will allow us to learn about both the opportunities and the challenges of scaling up evidence-based teen pregnancy prevention programs through a state formula grant process (as opposed to the competitive discretionary grant process being used for the Teen Pregnancy Prevention Program). It is the only federal evaluation to do so.

  • Opportunity to understand the special components of the PREP program. The PREP Evaluation will help us understand the unique components of the programs funded through PREP, such as the adulthood preparation topics that are being incorporated into the teen pregnancy prevention programming funded through PREP. These components are not part of the other teen pregnancy prevention models being evaluated.

  • Opportunity to test programs being implemented with high-risk populations. In the process of recruiting and selecting sites for the impact evaluation component of the PREP Evaluation, we are especially targeting programs that are implemented with high-risk and vulnerable populations, such as foster care youth, homeless youth, and youth in the juvenile justice system (although we are considering a range of programs for the impact evaluation). These high-risk groups, which are a priority population of interest to ACF, are currently underrepresented in the teen pregnancy prevention literature and are not the focus of other ongoing federal teen pregnancy prevention evaluations.

In addition, the evaluation team will also take steps to avoid duplication across the different components of the evaluation. For example, data collected through the PAS Participant Entry Survey are also included in the IIS baseline survey. To avoid duplication of data collection among youth enrolled in programs selected for inclusion in the IIS, these youth will complete only the PREP baseline survey at program entrance. Participant entry data required for submission via the PREP reporting system will be obtained from these baseline surveys.

A.5. Impact on Small Businesses

Programs in some sites may be operated by community-based organizations. The data collection plan is designed to minimize burden on such sites by providing staff from Mathematica Policy Research to manage the group baseline data collection for the IIS. For respondents who do not complete the survey in the group setting, Mathematica will provide unique passwords for web completion or will conduct a telephone data collection, thus minimizing requirements for extensive “sample pursuit” by site staff.

A.6. Consequences of Not Collecting the Information/Collecting Less Frequently

Performance Analysis Study. The Government Performance and Results Act (GPRA) requires federal agencies to report annually on measures of program performance. Therefore, it is essential that grantees report the performance data described in this ICR to ACF on an annual basis. Further, collection and reporting of data for performance measurement is a requirement of all grantees, as stated in the PREP funding opportunity announcement.

Impact and In-depth Implementation Study. Baseline data are essential to conducting a rigorous evaluation of PREP programs supported under Public Law 111-148. Specifically, without these baseline data, we would not be able to monitor whether random assignment was conducted correctly and created two very similar research groups. In addition, we would not be able to estimate impacts for key subgroups or to improve the precision of our impact estimates by including baseline covariates in our statistical models used to estimate program impacts.

A.7. Special Circumstances

There are no special circumstances for the proposed data collection efforts.

A.8. Federal Register Notice and Consultation Outside the Agency

The 60-day Federal Register Notice was published on December 13, 2011. No comments were received. A copy of the 60-day FRN is included in Attachment E. The full study, including expected burden, was described in the 60-day FRN; for that reason, we are requesting that subsequent 60-day FRNs be waived for this study.

ACF consulted with staff of Mathematica Policy Research, Child Trends, and RTI International, the contractors responsible for assisting in development of the PAS performance measures and performance measure reporting system.

The names and contact information of the persons consulted in the drafting and refinement of the baseline survey instrument and analysis are found in Attachment F.

A.9. Payments to Respondents

No payments to respondents are proposed for this information collection.

A.10. Assurance of Confidentiality

Performance Analysis Study. Grantees will enter all PAS performance measure data into a national reporting system that will be developed and maintained by RTI International. The PREP performance measurement reporting system is designed to ensure the security of data that are maintained in the system. Electronic data are stored in a location within the RTI network that provides the appropriate level of security based on the sensitivity or identifiability of the data. Further, all data reported by grantees related to program participants will be aggregated; no personal identifiers or data on individual participants will be submitted to ACF. Reports generated by the system will present data in aggregate form only.

System users designated by the individual grantees will be assigned user names and passwords that will grant them limited access to the PREP reporting system. The database server, located at RTI International, will be accessible only to authorized users. Electronic communications will occur via a secure Internet connection. All transmissions will be encrypted with 128-bit encryption through secure socket layers (SSL) and verified by VeriSign®, the leading SSL certificate authority.

To further ensure data security, all RTI project staff are required to adhere to strict standards and to sign security agreements as a condition of employment on the PREP project. All data files on multi-user systems will be under the control of a database manager, with access limited to project staff on a “need-to-know” basis only.

Participant-level data required for PAS reporting will be gathered by grantees and their sub-awardees. Grantees will then enter this information in aggregated form into the national reporting system. Grantees and sub-awardees will be responsible for ensuring the privacy of participant-level data and for securing institutional review board (IRB) approvals to collect these items, as necessary. Some grantees may need IRB approval based upon their local jurisdiction mandates. Therefore, we are informing grantees that they should determine whether they need IRB approval and follow the proper procedures of their locality. Grantees will be required to inform participants of the measures that are being taken to protect the privacy of their answers.

These data will be reported by grantees only as aggregate counts. There will be no means by which individual responses can be identified by ACF, RTI International, Mathematica Policy Research, or other end-users of the data.

Impact and In-depth Implementation Study. Mathematica Policy Research has secured IRB approval for the Impact and In-depth Implementation Study and will be responsible for securing any additional local IRB approvals for each site prior to information collection, as necessary. Prior to collecting baseline data, we will seek consent from a parent or legal guardian if the respondent is a minor, or from respondents themselves if they are 18 or older (Attachment G). The consent form will explain the data being collected and its use. The form will also state that answers will be kept private and will not be seen by anyone outside the study team, that participation is voluntary, and that respondents may refuse to participate at any time without penalty. Participants and their parents/guardians will be told that, to the extent allowable by law, individual identifying information will not be released or published; rather, data will be published only in summary form with no identifying information at the individual level.

Trained Mathematica field staff will administer the baseline survey in a group setting. All field staff are required to sign a confidentiality pledge (see Attachment H) when hired by Mathematica. On the day of the survey administration, field staff will distribute a student assent form to participants, providing them with a chance to opt out of the baseline data collection, should they want to do so (Attachment G). The survey administration protocol provides reassurance that we take the issue of privacy seriously. Participants will be informed that all of their answers will be kept private, that identifying information will be kept separate from baseline data, and that no one outside of the study team will see their responses.

The questionnaire and envelope will have a label with a unique ID number; no identifying information will appear on the questionnaire or return envelope. Before turning completed questionnaires in to field staff, respondents will place them in blank return envelopes and seal them. This approach has been shown in research to yield the same reports of sexual activity as computer-assisted surveys in school settings, and a lower incidence of student concerns about privacy. Field staff are trained to keep all data collection forms in a secure location and are instructed not to share any materials with anyone outside of the study team. Completed surveys are immediately shipped via FedEx to Mathematica’s Survey Operations Center for receipting. Any forms with identifying information (consent and assent forms) will be shipped separately from the surveys.

All electronic data will be stored in secure files, with identifying information kept in a separate file from survey and other individual-level data. Survey responses will be stored on a secure, password-protected computer shared drive.

A.11. Justification for Sensitive Questions

A key objective of PREP programs is to prevent teen pregnancy through a decrease in sexual activity and/or an increase in contraceptive use. Because this is the primary focus of the programs, some questions for the programs’ performance measures and some questions on the IIS baseline survey are necessarily related to these sensitive issues.

Performance Analysis Study. Table A11.1 provides a list of sensitive questions that will be asked on the participant entry and exit surveys and the justification for their inclusion.


Table A11.1. Summary of Sensitive Questions to Be Included on the Participant Entry and Exit Surveys and Their Justification

Topic | Justification

Participant Entry Survey (Instrument 1)
Sexual orientation (Question 6) | ACF has a strong interest in improving programming that serves lesbian, gay, bisexual, transgendered, and questioning (LGBTQ) youth. This question will allow us to document the proportion of youth served by PREP nationwide who are part of this subpopulation.
Sexual activity, incidence of pregnancy, and contraceptive use (Questions 9-15) | Intentions to engage in sexual activity, the level of sexual activity, incidence of pregnancy, and contraceptive use are all central to the PREP evaluation. Collecting this information will allow us to document the characteristics of the population served by PREP and the degree to which they engage in risky behavior.

Participant Exit Survey (Instrument 2)
Participants’ perceptions of PREP’s effects on their sexual activity and contraceptive use (Questions 8a-8d) | Reducing risky adolescent sexual behavior and increasing contraceptive use for those who are sexually active are among the central goals of PREP-funded programs. Examining whether participating youth consider PREP programs to be effective in achieving these goals is an important element of gauging the success of these programs.



To address concerns about asking younger youth questions about sexual behavior and sexual orientation at program entry (before they have been through the program), grantees will not be required to collect this information from youth in middle schools or from youth younger than age 14 in non-school settings. In addition, grantees will inform program participants that they may refuse to answer any or all of the questions on the entry and exit surveys.

Impact and In-depth Implementation Study. Table A11.2 provides a list of the sensitive questions found on the PREP baseline survey, along with a justification for their inclusion.

Table A11.2. Summary of Sensitive Questions to Be Included on the IIS Baseline Survey (Instrument 3) and Their Justification

Topic | Justification[1]
Sexual orientation (Question 3.4)[2] | ACF has a strong interest in improving programming that serves lesbian, gay, bisexual, transgendered, and questioning (LGBTQ) youth. This question will allow us to document the proportion of youth in the in-depth study sites that are part of this subpopulation. In addition, if sample sizes permit, we will use this information to estimate program impacts separately for these youth.
Sexual activity, incidence of pregnancy and STDs, and contraceptive use (4.12 and 5.1 in B1 and B2; 5.2-5.21 in B1; 6.1-6.7 in B1) | Sexual activity, incidence of pregnancy and STDs, and contraceptive use are all key outcomes for the evaluation, and sexual activity at baseline is a powerful predictor of later outcomes. Having data at baseline increases the precision of our estimates of impacts on sexual activity at follow-up. The majority of these questions are asked only of youth who report being sexually active.
Intentions regarding sexual activity (5.13 in B2) | Intentions regarding engaging in sex and other risk-taking behaviors are extremely strong predictors of subsequent behavior (Buhi and Goodson, 2007). Intentions are strongly related to behavior and will be an important mediator predicting behavior change.
Drug and alcohol use (7.1-7.5 in B1 and B2) | There is a substantial body of literature linking various high-risk behaviors of youth, particularly drug and alcohol use, sexual intercourse, and risky sexual behavior. The effectiveness of various program strategies is expected to differ for youth who are and are not experimenting with or using drugs and alcohol (Tapert et al., 2001; Li et al., 2001; Boyer et al., 1999; Fergusson and Lynskey, 1996; Sen, 2002; Dermen et al., 1998; Santelli et al., 2001).

[1] Full references for sources cited in the table may be found at the end of Supporting Statement A.
[2] Question numbers on the Healthy Families San Angelo (HFSA) baseline survey vary slightly from those on the master survey. Questions regarding sexual activity, incidence of pregnancy and STDs, and contraceptive use are found in 5.1-5.6 and 6.1-6.6 on the HFSA baseline; questions on intentions regarding sexual activity in 5.7; and questions on drug and alcohol use in 7.1-7.5. The question on sexual orientation is not included on the HFSA baseline survey.


Sensitive questions are drawn from previously successful youth surveys and evaluations (see Attachment C). The items have been carefully selected, and we have been guided by past experience in determining whether the benefits of measures outweigh concerns about the heightened sensitivity among sample members, parents, and program staff to specific issues. Although these questions are sensitive, they are commonly and successfully asked of youth similar to those who will be in the PREP study.

In addition, we have designed the baseline survey instrument so that only sexually active youth will receive most of these sensitive questions. The instrument is designed with three parts: Part A, Part B1, and Part B2. All participants will complete Part A. At the end of Part A, they will be directed to complete either Part B1 (for youth who report being sexually active) or Part B2 (for youth who are not sexually active).12 Many of the sensitive items related to sexual activity will be included only in Part B1 and thus asked only of sample members who report being sexually active. This structure has been used successfully in other federally funded teen pregnancy prevention evaluations, such as the Evaluation of the Title V, Section 510 Abstinence Education Program and the Evaluation of Adolescent Pregnancy Prevention Approaches.

A.12. Estimates of the Burden of Data Collection

Tables A12.1 and A12.2 provide the estimated annual reporting burden calculations for the PAS data collection and IIS baseline survey. These are broken out separately as burden for PREP youth participants (Table A12.1) and for PREP state and tribal grantees and their sub-awardees (Table A12.2). Table A12.3 provides a summary of burden hours and costs approved to-date, as well as those requested in this ICR.

1. Annual Burden for Youth Participants

Performance Analysis Study. Table A12.1 presents the hours and cost burden for the participant entry and exit surveys. The number of participants completing these surveys is based on interviews conducted with all state grantees in summer 2012 (ICR approved March 7, 2012, OMB Control No. 0970-0398) and a review of tribal PREP grantee applications. The amount of time it will take youth to complete the entry and exit surveys is estimated from pretest results for each of these instruments with nine youth. The cost of this burden is estimated by assuming that 10 percent of the youth served by the program will be age 18 or older and then assigning a value to their time of $7.25 per hour, the federal minimum wage. The estimate of the proportion of youth served by PREP programs that will be 18 or older is based on the same state grantee interviews and review of tribal PREP grantee applications.

  • Participant entry survey. PREP grantees are expected to serve approximately 207,000 participants over the three-year OMB clearance period, for an average of about 69,000 new participants per year.13 However, grantees will not collect participant entry surveys from PREP program participants during the current grant year, which reduces the estimated number of participants to 58,650 per year. The participant entry survey also will not be administered to middle school youth in school-based settings. Once we exclude those participants and apply a 95 percent response rate to the remaining participants, we anticipate 35,103 respondents to the entry survey each year (36,950 x 0.95 = 35,103).14 Based on pretesting of this instrument, the participant entry survey is estimated to take 5 minutes (0.08333 hour) to complete. The total annual burden for this data collection is estimated to be 35,103 x 0.08333 = 2,925 hours. The annual cost of this burden is estimated to be 2,925 hours x 0.25 (proportion of youth age 18 or older) x $7.25 = $5,300.15

  • Participant exit survey. It is estimated that about 20 percent of the participants will drop out of the program prior to completion, leaving approximately 46,920 participants at the end of the program (58,650 x 0.80 = 46,920).16 Of those, we expect 95 percent, or 44,574 participants, will complete the participant exit survey each year.17 Based on pretesting, the exit survey is estimated to take youth 10 minutes (0.16667 hour) to complete. The total annual burden for this data collection is estimated to be 44,574 x 0.16667 = 7,429 hours. The cost of this burden is estimated to be 7,429 hours x 0.10 (proportion of youth age 18 or older) x $7.25 = $5,386.

Impact and In-Depth Implementation Study. It is expected that 6,000 youth will be enrolled in the evaluation sample across the four to five evaluation sites for IIS. Sample intake will take place over three years, for an average of 2,000 participants per year. The expected response rate for the IIS baseline survey is 95 percent, for an average of 1,900 IIS baseline survey completions per year. Based on previous experience with similar questionnaires, it is estimated that it will take youth 45 minutes (0.75 hour) to complete the baseline survey, on average. The total annual burden for this data collection is estimated to be 1,900 x 0.75 = 1,425 hours. The cost of this burden is estimated to be 1,425 hours x 0.10 (proportion of youth age 18 or older) x $7.25 = $1,037.
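The arithmetic above follows one pattern for all three instruments: respondents times hours per response gives burden hours, and burden hours times the share of youth age 18 or older times the minimum wage gives the cost. Below is a minimal sketch of that pattern (figures taken from the text; note that the text rounds intermediate hour counts, so dollar figures can differ by a few dollars depending on where rounding is applied).

    MIN_WAGE = 7.25  # federal minimum wage used to value youth time

    def youth_burden(respondents, hours_per_response, share_18_plus):
        """Return (annual burden hours, annualized cost in dollars)."""
        hours = round(respondents * hours_per_response)
        cost = round(hours * share_18_plus * MIN_WAGE)
        return hours, cost

    print(youth_burden(35_103, 5 / 60, 0.25))   # (2925, 5302); text reports $5,300
    print(youth_burden(44_574, 10 / 60, 0.10))  # (7429, 5386)
    print(youth_burden(1_900, 0.75, 0.10))      # (1425, 1033); text reports $1,037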


Table A12.1. Calculations of Burden Hours and Cost for Youth Participants

Instrument | Annual Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Total Burden Hours | Total Burden Hours for Youth Age 18 or Older | Hourly Wage Rate | Total Costs

Performance Analysis Study
Instrument 1: Participant entry survey | 35,103 | 1 | 0.08333 | 2,925 | 731 | $7.25 | $5,300
Instrument 2: Participant exit survey | 44,574 | 1 | 0.16667 | 7,429 | 743 | $7.25 | $5,386

Impact and In-Depth Implementation Study
Instrument 3: Baseline survey | 1,900 | 1 | 0.75 | 1,425 | 143 | $7.25 | $1,037

Estimated Annual Burden for Youth Participants | | | | 11,779 | | | $11,723



2. Annual Burden for Grantees, Sub-Awardees, and Sub-Awardee Implementation Sites

Performance Analysis Study. The 65 grantees18 will report performance measure data into a national reporting system developed for the PREP initiative. They will gather this information with the assistance of their sub-awardees (estimated to be 350 across all grantees) and the sub-awardees’ implementation sites (estimated to be 1,400 across all grantees)19. The grantee, sub-awardee, and implementation site data collection efforts described below are record-keeping tasks.

Table A12.2. Calculations of Burden Hours and Costs for Grantees, Their Sub-awardees, and Implementation Sites to Collect and Report the Required Performance Measures

Data Collection Instrument | Type of Respondent | Annual Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Total Burden Hours | Hourly Wage Rate | Total Costs
Instrument 4: Performance Reporting System Data Entry Form | Grantee Administrator | 65 | 1 | 24 | 1,560 | $21.35 | $33,306
Instrument 5: Sub-awardee data collection and reporting | Sub-Awardee Administrator | 350 | 1 | 18.6667 | 6,533 | $20.76 | $135,625
Instrument 6: Implementation site data collection | Site Facilitator | 1,400 | 1 | 8 | 11,200 | $20.76 | $232,512

Estimated Total Annual Burden for Grantees, Sub-awardees, and Implementation Sites | | | | | 19,293 | | $401,443



Total Annual Burden and Cost for Grantees

Once per year, all 65 grantees20 will be required to submit all of the required performance measures into the national system. The time for a designated PREP grantee administrator to aggregate the data across each of the grantee’s sub-awardees and submit all of the required data into the system is estimated to be 16 hours per year per grantee. Grantee administrators will also spend an estimated 8 hours collecting information at the grantee level pertaining to grantee structure, cost, and support for program implementation. The Performance Reporting System Data Entry Form includes all of the required data elements that the grantee will collect, aggregate, and submit into the national system (see Instrument 4). The total annual burden for these activities is estimated to be 65 x 24 = 1,560 hours. The cost burden for this activity is estimated to be 1,560 hours times an hourly wage of $21.35, for a total cost burden of $33,306. This hourly wage rate represents the mean hourly wage rate for all occupations (National Occupational Employment and Wage Estimates, Bureau of Labor Statistics, Department of Labor, May 2010).

Total Annual Burden and Cost for Sub-Awardees

The 350 estimated sub-awardees will conduct multiple activities to support the performance analysis study (see Instrument 5). They will aggregate data from participant-level entry and exit surveys (provided by implementation sites, and estimated, on average, at 5 hours for one sub-awardee administrator to aggregate), aggregate data on attendance and program session hours (provided by implementation sites, and estimated, on average, at 5.5 hours for one sub-awardee administrator to aggregate),21 report to the grantee on implementation challenges and needs for technical assistance (estimated at 10 minutes, or 0.16667 hour, for one sub-awardee administrator to complete), and report to the grantee on sub-awardee structure, cost, and support for program implementation (estimated at 8 hours for one sub-awardee administrator to complete). The total estimated time for sub-awardees is 18 hours and 40 minutes (18.6667 hours). The total annual burden for this data collection activity is estimated to be 350 x 18.6667 = 6,533 hours. The cost burden for this activity is estimated to be 6,533 hours times an hourly wage of $20.76, for a total cost burden of $135,625. This hourly wage rate represents the mean hourly wage rate for community and social service occupations (National Occupational Employment and Wage Estimates, Bureau of Labor Statistics, Department of Labor, May 2010).

Total Annual Burden and Cost for Implementation Sites

The 1,400 estimated program implementation sites will collect program implementation data to support the performance analysis study (see Instrument 6). They will record youth program attendance at sites operating during out-of-school time (estimated at an average of 3 hours for each implementation site facilitator to complete22) and will record the program session hours delivered at each implementation site (estimated at 5 hours for one implementation site facilitator to complete). The total annual burden for this data collection activity is estimated to be 1,400 x 8 = 11,200 hours. The cost burden for this activity is estimated to be 11,200 hours times an hourly wage of $20.76, for a total cost burden of $232,512. This hourly wage rate represents the mean hourly wage rate for community and social service occupations (National Occupational Employment and Wage Estimates, Bureau of Labor Statistics, Department of Labor, May 2010).
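The staff-level burden figures in this subsection and the two preceding ones follow the same pattern: respondents times hours each gives burden hours, and burden hours times the occupation’s mean hourly wage gives the cost. A short sketch reproducing the totals reported in Table A12.2 (all figures from the text):

    staff_burden = [
        # (label, respondents, hours per respondent, hourly wage)
        ("Grantee administrators",       65, 24.0,    21.35),
        ("Sub-awardee administrators",  350, 18.6667, 20.76),
        ("Site facilitators",         1_400, 8.0,     20.76),
    ]

    total_hours = total_cost = 0
    for label, n, hours_each, wage in staff_burden:
        hours = round(n * hours_each)
        cost = round(hours * wage)
        print(f"{label}: {hours:,} hours, ${cost:,}")
        total_hours += hours
        total_cost += cost

    print(f"Total: {total_hours:,} hours, ${total_cost:,}")  # 19,293 hours, $401,443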

Impact and In-Depth Implementation Study. There is no grantee, sub-awardee, or implementation site burden associated with administration of the baseline survey. Data collectors from Mathematica Policy Research will be responsible for the baseline survey data collection.

3. Overall Burden

Table A12.3 details the overall burden approved and requested for data collection associated with the PREP Multi-Component Evaluation. A total of 270 hours[23] (and a cost of $8,023) has been approved thus far under the prior two ICRs for this project. A total of 31,072 hours (and a cost of $413,166) is requested in this ICR. If approved, the total annual burden for this project (i.e., the prior burden summed with the requested burden) will be 31,342 hours (and a cost of $421,189).

Table A12.3. Calculations of Burden Hours and Costs for Approved and Requested Burden

Data Collection Instrument | Type of Respondent | Annual Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Total Burden Hours | Total Burden Hours for Youth Age 18 or Older | Hourly Wage Rate | Total Costs

Collection of Field Data (Approved November 6, 2011)
Discussion Guide for Use with Macro-Level Coordinators | Macro-Level Coordinators | 10 | 1 | 1 | 10 | N/A | $33.59 | $333.90
Discussion Guide for Use with Program Directors | Program Directors | 20 | 2 | 2 | 80 | N/A | $27.21 | $2,176.80
Discussion Guide for Use with Program Staff | Program Staff | 40 | 1 | 2 | 80 | N/A | $23.76 | $1,900.80
Discussion Guide for Use with School Administrators | School Administrators | 70 | 1 | 1 | 70 | N/A | $35.54 | $2,487.80

Design Survey Data Collection (Approved March 7, 2012)
Design Survey: Discussion Guide for Use with PREP State-Level Coordinators and State-Level Staff | State-Level Coordinators and State-Level Staff | 30[24] | 1 | 1 | 30 | N/A | $37.45 | $1,124

Subtotal: Burden Approved To-Date | | | | | 270 | | | $8,023

Performance Measures and Baseline Data (Currently Requested)
Instrument 1: Participant entry survey | Participant | 35,103 | 1 | 0.08333 | 2,925 | 731 | $7.25 | $5,300
Instrument 2: Participant exit survey | Participant | 44,574 | 1 | 0.16667 | 7,429 | 743 | $7.25 | $5,386
Instrument 3: Baseline survey | Participant | 1,900 | 1 | 0.75 | 1,425 | 143 | $7.25 | $1,037
Instrument 4: Performance Reporting System Data Entry Form | Grantee Administrator | 65 | 1 | 24 | 1,560 | N/A | $21.35 | $33,306
Instrument 5: Sub-awardee data collection and reporting | Sub-Awardee Administrator | 350 | 1 | 18.6667 | 6,533 | N/A | $20.76 | $135,625
Instrument 6: Implementation site data collection | Site Facilitator | 1,400 | 1 | 8 | 11,200 | N/A | $20.76 | $232,512

Subtotal: Burden Currently Requested | | | | | 31,072 | | | $413,166

Estimated Total Annual Burden | | | | | 31,342 | | | $421,189



A13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

These information collection activities do not impose any capital costs or maintenance costs on respondents. ACF will provide grantees with access to the PREP reporting system that will be used for reporting required PAS data and generating associated reports.

A.14. Annualized Cost to Federal Government

Costs for previously approved data collection. On November 6, 2011, OMB approved field data collection. Annualized costs for that effort are $215,625. On March 7, 2012, OMB approved data collection for the Design Survey. Annualized costs for that effort are $83,333.[25]

Performance Analysis Study. The estimated cost for completing the PAS is $1,081,866 over five and a half years. The cost over the three years for which clearance is requested is $590,109. The annual cost to the federal government is therefore estimated to be $196,703 ($590,109/3).

Impact and In-depth Implementation Study. The total cost for the baseline data collection is $1,148,275. Because baseline data collection will be carried out over three years as successive sites start up and enroll sample members, the estimated annualized cost to the government for baseline data collection is $382,758 ($1,148,275/3).

If this proposed ICR is approved, the total annual cost to the federal government for this and all previously approved collections as part of the PREP Multi-Component Study is $878,419.
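As a check on this arithmetic, the following sketch (illustrative only, not part of the ICR) sums the annualized components reported above to reproduce the $878,419 total.

```python
# Illustrative check of the total annual cost to the federal government,
# summing the annualized figures reported in Section A.14.
field_data = 215_625        # field data collection, approved November 6, 2011
design_survey = 83_333      # Design Survey, approved March 7, 2012 (annualized over three years)
pas = 590_109 / 3           # Performance Analysis Study: $196,703 per year
baseline = 1_148_275 / 3    # IIS baseline data collection: $382,758 per year

total = field_data + design_survey + pas + baseline
print(f"Total annual cost: ${total:,.0f}")  # ~$878,419
```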

A.15. Explanation for Program Changes or Adjustments

On March 7, 2012, OMB approved the Design Survey under the DIS (OMB Control No. 0970-0398). We now seek approval for the data collections associated with the Performance Analysis Study and for the collection of baseline data under the Impact and In-depth Implementation Study. This request will increase the total burden requested for the PREP Evaluation under OMB Control No. 0970-0398.

A16. Plans for Tabulation and Publication and Project Time Schedule

1. Analysis Plan

This phase of the PREP Evaluation involves collecting performance measure data that will be used to monitor and analyze grantee performance. It also involves collecting baseline information that will be used for the impact evaluation during the follow-up data collection.

Performance Analysis Study. A major objective of performance measure analysis will be to construct, for Congress, a picture of PREP implementation. A basic set of statistics will be constructed across all grantees. These statistics, for example, will answer questions for the PREP program as a whole, such as:

  • What programs were implemented, and for how many youth?

  • What are the characteristics of the population served?

  • To what extent were members of vulnerable populations served?

  • How fully did programs deliver their program models?

  • How many youth participated in most program sessions or activities?

  • How many entities are involved at the sub-awardee level in delivering PREP programs?

  • How do grantees allocate their resources?

  • How do participants feel about the programs, and how do they perceive the programs' effects on them?

  • What challenges do grantees and their partners see in implementing PREP programs on a large scale?

Answers to questions like these will be constructed by combining data across all grantees, and also separately for state grantees and tribal grantees. These answers will help ACF understand whether, overall, PREP objectives are being met. Using the performance data for accountability requires constructing indicators for many of the same measures, but separately for each grantee and even sub-awardee. Indicators at the grantee level help fulfill federal responsibilities to hold grantees accountable for performance. Indicators at the sub-awardee level will help grantees in their efforts to hold accountable those to whom they are providing resources for PREP implementation. The structure of the data will also allow for examining several of these questions by program model to better understand successes and challenges implementing the various programmatic approaches.

The results of the performance measures analysis will help ACF and grantees pinpoint areas for possible improvement of program implementation. For example, ACF will be able to determine which grantees deliver their complete program content and hours to a high percentage of participant cohorts, and for which program models that is true. Grantees will be able to determine from performance data which of the program models they implement are succeeding in delivering complete content, or in getting participants to complete at least 75 percent of the program sessions. ACF will be able to generate statistics showing how programs serving vulnerable populations compare to programs serving more general teen populations with regard to participant completion, participants' assessments, and perceived effects. ACF will learn which implementation challenges are most evident to grantees and their sub-awardees, and which are seen as topics for technical assistance. Over time, data can demonstrate which grantees and sub-awardees are improving with respect to elements of program delivery and which areas of technical assistance require ongoing attention.
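To make the completion tabulation concrete, the sketch below computes the share of participants completing at least 75 percent of sessions, by program model. It is a hypothetical example; the file and column names are illustrative, not the actual PREP reporting system schema.

```python
import pandas as pd

# Hypothetical participant-level performance data; column names are
# illustrative, not the actual PREP reporting system schema.
df = pd.read_csv("participants.csv")

# Flag participants who attended at least 75 percent of offered sessions.
df["completed_75"] = df["sessions_attended"] >= 0.75 * df["sessions_offered"]

# Share of participants meeting the 75 percent threshold, by program model.
# The same tabulation could be run by grantee or by sub-awardee.
completion = df.groupby("program_model")["completed_75"].mean()
print(completion.sort_values(ascending=False))
```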

Impact and In-depth Implementation Study. Data from the baseline survey will be used for two initial purposes. First, ACF will use the data to describe the study sample. This step will enable ACF to compare the characteristics of youth in the study with youth nationwide and provide guidance on how the study sample and findings might generalize to a broader policy setting. Second, ACF will assess whether random assignment resulted in similar baseline characteristics of youth, on average, for the treatment and control groups.
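For illustration, a baseline equivalence check of this kind is commonly run as a set of treatment-control comparisons of means. The sketch below is a minimal example, assuming a hypothetical participant-level file (baseline_survey.csv) with a random assignment indicator and illustrative baseline measures; it is not the study's actual analysis code.

```python
import pandas as pd
from scipy import stats

# Hypothetical baseline file: one row per youth, a 0/1 treatment indicator,
# and baseline measures (column names are illustrative only).
df = pd.read_csv("baseline_survey.csv")
measures = ["age", "ever_had_sex", "risk_index"]

for m in measures:
    treat = df.loc[df["treatment"] == 1, m].dropna()
    control = df.loc[df["treatment"] == 0, m].dropna()
    # Welch two-sample t-test of the treatment-control difference in means
    t, p = stats.ttest_ind(treat, control, equal_var=False)
    print(f"{m}: treatment mean={treat.mean():.3f}, "
          f"control mean={control.mean():.3f}, p={p:.3f}")
```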

Ultimately, the baseline data will also be used in estimating program impacts on youth outcomes. The program impact estimates will rely primarily on data from the two planned follow-up surveys, which ACF will submit for OMB approval later, after site selection has progressed. Attachment B provides a description of the analysis plan for the PREP impact evaluation. With a random assignment design, unbiased impact estimates can be obtained by comparing mean outcomes for the treatment and control groups based on follow-up data alone. However, we can improve the precision of the impact estimates by controlling in our regression model for baseline covariates, especially baseline measures of outcomes. Regression adjustment can also address any differences between the treatment and control groups in baseline characteristics that arose by chance or from survey nonresponse. Baseline data will also be used for subgroup analysis, to assess whether program impacts vary by baseline characteristics such as prior sexual experience or other risk characteristics.
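A minimal sketch of the regression adjustment described above follows (Attachment B contains the actual analysis plan). The analysis file and column names are hypothetical, and a continuous outcome is assumed for simplicity.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file: follow-up outcome, 0/1 treatment indicator,
# and baseline covariates, including a baseline measure of the outcome.
df = pd.read_csv("analysis_file.csv")

# Unadjusted impact: difference in mean outcomes by treatment status.
unadjusted = smf.ols("outcome ~ treatment", data=df).fit()

# Regression-adjusted impact: controlling for baseline covariates improves
# precision and absorbs chance baseline differences between groups.
adjusted = smf.ols(
    "outcome ~ treatment + baseline_outcome + age + female", data=df
).fit()

print("Unadjusted impact estimate:", unadjusted.params["treatment"])
print("Adjusted impact estimate:  ", adjusted.params["treatment"])
```

Because treatment status is randomly assigned, both estimates are unbiased; the adjusted model simply yields smaller standard errors when the covariates predict the outcome.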

2. Time Schedule and Publications

The PREP evaluation will be conducted over a seven-year period. This request is for a three-year period, and subsequent packages will be submitted as necessary for new collections or to extend collection periods. Below is a schedule of the data collection efforts for the Performance Analysis Study and the IIS Baseline Survey:

Performance Analysis Study. The performance analysis reporting schedule is designed to complement the timing of grantees' program implementation and the availability of the tools that support the data collection. In winter 2013, grantees will provide limited data on PREP program structure and delivery for the September 2011 to August 2012 grant period, such as grantee organization, program models, and resource use. In fall 2013, grantees will again report limited data on program structure and delivery, for the September 2012 to August 2013 grant period. These first two rounds of reporting will not include characteristics of the individual youth served; youths' perceptions of program effectiveness or program experiences; or any data that require detailed recording of participants' enrollment, attendance, and retention or of delivered program hours. Grantees will implement data collection for these measures during the 2013 to 2014 program year. In fall 2014, and annually thereafter, grantees will provide full data on PREP program structure and delivery, beginning with the September 2013 to August 2014 grant period.

While grantees will provide data to ACF once each year, the analytical results based on their reported data will be compiled into reports twice each year. With the program year ending in August, grantees could be expected to report performance measurement data in October of each year, allowing time for collection of data from sub-awardees. Analysis of the performance data could then proceed in two stages. Stage 1, to be completed within four months of data receipt, will focus on generating national statistics for reporting to Congress. Stage 2, to be completed within eight months of data receipt, will involve more detailed and exploratory analyses by grantee, sub-awardee, and program model. The exact timing of both stages will depend on the quality of data submitted to the ACF data system. Improvement in data quality over time, driven in part by technical assistance to grantees, could accelerate this schedule for producing results.

Impact and In-depth Implementation Study. For the IIS, ACF expects one or more sites to begin enrolling sample members and administering baseline surveys in March 2013. Other sites may begin later, and because ACF plans to analyze each site separately (discussed in Section A.3), it is acceptable for the data collection schedule to vary across sites. The current project schedule assumes that all sites will begin enrolling members and administering baseline surveys by September 2013. To generate sufficient sample sizes for the impact study, the project schedule allows for sample enrollment to continue for up to three years after the initial sites have started—that is, through September 2015. No separate publications are planned for the baseline survey data.

A17. Reason(s) Display of OMB Expiration Date is Inappropriate

All instruments, consent and assent forms and letters will display the OMB Control Number and expiration date.

A18. Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary for this information collection.


SUPPORTING REFERENCES FOR INCLUSION OF SENSITIVE
QUESTIONS OR GROUPS OF QUESTIONS

Blake, Susan M., Rebecca Ledsky, Thomas Lehman, Carol Goodenow, Richard Sawyer, and Tim Hack. "Preventing Sexual Risk Behaviors among Gay, Lesbian, and Bisexual Adolescents: The Benefits of Gay-Sensitive HIV Instruction in Schools." American Journal of Public Health, vol. 91, no. 6, 2001, pp. 940-946.

Boyer, Cherrie B., Jeanne M. Tschann, and Mary-Ann Shafer. "Predictors of Risk for Sexually Transmitted Diseases in Ninth Grade Urban High School Students." Journal of Adolescent Research, vol. 14, no. 4, 1999, pp. 448-465.

Buhi, Eric R., and Patricia Goodson. "Predictors of Adolescent Sexual Behavior and Intention: A Theory-Guided Systematic Review." Journal of Adolescent Health, vol. 40, no. 1, 2007, p. 4.

Davis, E.C., and L.V. Friel. "Adolescent Sexuality: Disentangling the Effects of Family Structure and Family Context." Journal of Marriage & Family, vol. 63, no. 3, 2001, pp. 669-681.

Dermen, K.H., M.L. Cooper, and V.B. Agocha. "Sex-Related Alcohol Expectancies as Moderators of the Relationship between Alcohol Use and Risky Sex in Adolescents." Journal of Studies on Alcohol, vol. 59, no. 1, 1998, p. 71.

Fergusson, David M., and Michael T. Lynskey. "Alcohol Misuse and Adolescent Sexual Behaviors and Risk Taking." Pediatrics, vol. 98, no. 1, 1996, p. 91.

Goodenow, C., J. Netherland, and L. Szalacha. "AIDS-Related Risk among Adolescent Males Who Have Sex with Males, Females, or Both: Evidence from a Statewide Survey." American Journal of Public Health, vol. 92, 2002, pp. 203-210.

Li, Xiaoming, Bonita Stanton, Lesley Cottrell, James Burns, Robert Pack, and Linda Kaljee. "Patterns of Initiation of Sex and Drug-Related Activities among Urban Low-Income African-American Adolescents." Journal of Adolescent Health, vol. 28, no. 1, 2001, p. 46.

Magura, S., J.L. Shapiro, and S. Kang. "Condom Use among Criminally-Involved Adolescents." AIDS Care, vol. 6, no. 5, 1994, p. 595.

Raj, Anita, Jay G. Silverman, and Hortensia Amaro. "The Relationship between Sexual Abuse and Sexual Risk among High School Students: Findings from the 1997 Massachusetts Youth Risk Behavior Survey." Maternal and Child Health Journal, vol. 4, no. 2, 2000, pp. 125-134.

Resnick, M.D., P.S. Bearman, R.W. Blum, K.E. Bauman, K.M. Harris, J. Jones, J. Tabor, T. Beuhring, R. Sieving, M. Shew, L.H. Bearinger, and J.R. Udry. "Protecting Adolescents from Harm: Findings from the National Longitudinal Study on Adolescent Health." JAMA, vol. 278, no. 10, 1997, p. 823.

Santelli, John S., Leah Robin, Nancy D. Brener, and Richard Lowry. "Timing of Alcohol and Other Drug Use and Sexual Risk Behaviors among Unmarried Adolescents and Young Adults." Family Planning Perspectives, vol. 33, no. 5, 2001.

Sen, Bisakha. "Does Alcohol-Use Increase the Risk of Sexual Intercourse among Adolescents? Evidence from the NLSY97." Journal of Health Economics, vol. 21, no. 6, 2002, p. 1085.

Tapert, Susan F., Gregory A. Aarons, Georganna R. Sedlar, and Sandra A. Brown. "Adolescent Substance Use and Sexual Risk-Taking Behavior." Journal of Adolescent Health, vol. 28, no. 3, 2001, p. 181.

Upchurch, D.M., and Y. Kusunoki. "Associations Between Forced Sex, Sexual and Protective Practices, and STDs Among a National Sample of Adolescent Girls." Women's Health Issues, vol. 14, no. 3, 2004, pp. 75-84.



1 Five states did not apply for or returned PREP funding: Florida, Indiana, North Dakota, Texas, and Virginia.

2 Trained interviewers will read the survey aloud to respondents over the phone, and the interviewers will record the respondent’s answers on a hard copy (PAPI) survey.

3 Middle school youth in school-based settings will not complete an entry survey. The characteristics of the youth entering PREP programs will be available for youth 13 and younger in non-school settings, and for all youth 14 and older in school-based and non-school settings. As part of the performance measures plan, middle school youth are not required to answer the entrance survey questions on sexual risk behavior. The remaining measures on the participant entrance survey – demographic measures – will be collected from middle school youth at program exit, as originally planned.

4 Attendance will not be collected for youth participating in PREP programs during the school day. ACF assumes that attendance in these programs will be high.

5 Additional items may be dropped from the baseline survey, depending on site sensitivities. For example, in the Healthy Families San Angelo site (HFSA) where all participants are young mothers, items asking whether they have ever had sex or ever had a baby will be removed from the survey.

6 Turner, C.F., L. Ku, S.M. Rogers, L.D. Lindberg, J.H. Pleck, and F.L. Sonenstein. “Adolescent Sexual Behavior, Drug Use, and Violence: Increased Reporting with Computer Survey Technology.” Science, vol. 280, 1998, pp. 867–873.

7 Beebe, Timothy J., Patricia A. Harrison, James A. McCrae Jr., Ronald E. Anderson, and Jayne A. Fulkerson. “An Evaluation of Computer-Assisted Self-Interviews in a School Setting.” Public Opinion Quarterly, vol. 62, 1998, pp. 623–632.

8 Beebe, Timothy J., Patricia A. Harrison, Eunkyung Park, James A. McRae, Jr., and James Evans. “The Effects of Data Collection Mode and Disclosure on Adolescent Reporting and Health Behavior.” Social Science Review, vol. 24, no. 4, 2006, pp. 476–488.

9 Brener, Nancy D., Danice K. Eaton, Laura Kann, JoAnne Grunbaum, Lori A. Gorss, Tonja M. Kyle, and James G. Ross. “The Association of Survey Setting and Mode with Self-Reported Health Risk Behaviors Among High School Students.” Public Opinion Quarterly, vol. 70, 2006, pp. 354–374.

10 Webb, P.M., G.D. Zimet, J.D. Fortenberry, and M.J. Blythe. “Comparability of a Computer-Assisted Versus Written Method for Collecting Health Behavior Information from Adolescent Patients.” Journal of Adolescent Health, vol. 24, no. 6, 1999, pp. 383–388.

11 Schochet, Peter Z. “An Approach for Addressing the Multiple Testing Problem in Social Policy Impact Evaluations.” Evaluation Review, vol.33, no.6, December 2009.

12 Because all participants in the Healthy Families San Angelo (HFSA) site are young mothers, the survey consists of only one part, which includes questions for sexually active youth (Parts A and B1 combined).

13 The three-year period for which we are requesting clearance covers the remaining years of the PREP program.

14 This figure excludes those youth participating in programs at impact study sites who will complete only an IIS Baseline Survey at program entry. The baseline survey will include the items on the entry survey.

15 We assume that 25 percent of the sample, not 10 percent, will be 18 or older because middle school youth in school settings are now removed from the sample.

16 Based on our review of state PREP plans and other documents, we estimate that 60 percent of youth served in PREP programs will be in school-based programs and that 40 percent will be served in out-of-school programs. We assume that 90 percent of youth in school-based PREP programs will complete the program and that 65 percent of youth in out-of-school PREP programs will complete the program. These assumptions yield an overall program completion rate of 80 percent (0.60 x 90 percent + 0.40 x 65 percent = 80 percent).



17 We are currently requesting clearance for three years; over that period, we expect that 140,760 youth will complete the programs and that 133,722 (95 percent of program completers) will complete a participant exit survey.

18 The 65 grantees include 49 grants to states and territories (45 states, the District of Columbia, the Virgin Islands, Puerto Rico, and the Federated States of Micronesia) and 16 grants made to tribes and tribal communities.

19 Our initial estimates were compiled from grantees' 2011 planning documents, in which they estimated how many youth they intended to serve, the number of sub-awards they would make, and the number of expected program implementation sites. However, through the Design Survey interviews conducted with grantees in summer 2012 (ICR approved March 7, 2012, OMB Control No. 0970-0398), after programs had begun, we gained a more accurate understanding of how many youth grantees expect to serve over the entire grant period, the number of sub-awards actually made, and the number of and variation in implementation sites. These interviews revealed that there will be 350 sub-awardees and 1,400 implementation sites.

20 As mentioned previously, the 65 grantees include 49 grants to states and territories (45 states, the District of Columbia, the Virgin Islands, Puerto Rico, and the Federated States of Micronesia) and 16 grants made to tribes and tribal communities.

21 These estimated burden hours are being reduced to reflect the lower number of expected participants completing surveys and attending programs, and that attendance data will not be processed for PREP programs operating during the school day.

22 These estimated burden hours are being reduced to reflect the lower number of expected participants attending programs, and that attendance data will not be processed for PREP programs operating during the school day.

23 The burden for the second package approved was originally annualized over two years. Since the current request is for three years, burden for all packages has been annualized over three years.

24 The burden for this instrument was originally annualized over two years. Since all other instruments in this table have been annualized over three years, the burden for this instrument has been adjusted so that it is annualized over three years, in order for its burden to be summed with the other instruments.

25 Annual costs for the ICR approving the Design Survey data collection were reported as $125,000. However, reported costs were calculated over two years. The figure reported in this ICR – $83,333 – is the annualized cost, that is, the cost calculated over three years.
