
Personal Responsibility Education Program (PREP) Multi-Component Evaluation

OMB: 0970-0398



U.S. Department of Health
and Human Services

Office of Planning, Research and Evaluation & Family and Youth Services Bureau,

Administration for Children and Families

7th floor West Aerospace Building

370 L'Enfant Promenade, SW

Washington, DC 20447

Project Officers: Clare DiSalvo, Dirk Butler



PART B: Statistical Methods for the Collection of Performance Measures, Implementation and Outcome Data - Personal Responsibility Education Program (PREP) Multi-Component Evaluation

0970-0398

Draft

June 2013







CONTENTS

PART B. INTRODUCTION

B1. Respondent Universe and Sampling Methods

B2. Procedures for Collection of Information

B3. Methods to Maximize Response Rates and Deal with Non-Response

B4. Test of Procedures or Methods to be Undertaken

B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

TABLES

B1.1 Annualized Respondent Universe and Expected Response Rates for the Performance Analysis Study

B1.2 Minimum Detectable Impacts with a Sample of 1,500 Youth



INSTRUMENTS

INSTRUMENT #1: PARTICIPANT ENTRY SURVEY (PAS)

INSTRUMENT #2: PARTICIPANT EXIT SURVEY (PAS)

INSTRUMENT #3: PERFORMANCE REPORTING SYSTEM DATA ENTRY FORM (PAS)

INSTRUMENT #4: IMPLEMENTATION SITE DATA COLLECTION PROTOCOL (PAS)

INSTRUMENT #5: MASTER FOLLOW-UP SURVEY (IIS)

INSTRUMENT #6: HFSA FOLLOW-UP SURVEY (IIS)

INSTRUMENT #7: MASTER LIST OF TOPICS FOR STAFF INTERVIEWS (IIS)

INSTRUMENT #8: STAFF SURVEY (IIS)

INSTRUMENT #9: TOPIC GUIDE FOR FOCUS GROUP DISCUSSION WITH PARTICIPATING YOUTH (IIS)

INSTRUMENT #10: PROGRAM ATTENDANCE DATA COLLECTION PROTOCOL (IIS)





ATTACHMENTS

ATTACHMENT A: OVERVIEW OF THE PREP EVALUATION

ATTACHMENT B: ANALYSIS PLAN FOR PREP IIS STUDY

ATTACHMENT C: QUESTION BY QUESTION SOURCE TABLE FOR THE FOLLOW-UP SURVEY

ATTACHMENT D: SOURCES REFERENCED FOR THE FOLLOW-UP SURVEY

ATTACHMENT E: QUESTION BY QUESTION SOURCE TABLE FOR THE STAFF SURVEY

ATTACHMENT F: SOURCES REFERENCED FOR THE STAFF SURVEY

ATTACHMENT G: 60-DAY FEDERAL REGISTER NOTICE

ATTACHMENT H: PERSONS CONSULTED ON COLLECTION OF THE PAS AND IIS DATA

ATTACHMENT I: CONSENT LETTERS AND FORMS

ATTACHMENT J: ADVANCE LETTERS AND PROMPTS



PART B. INTRODUCTION

B1. Respondent Universe and Sampling Methods

In March 2010, Congress authorized the Personal Responsibility Education Program (PREP) as part of the Patient Protection and Affordable Care Act (ACA). PREP provides grants to states, tribes, tribal communities, and local organizations to support evidence-based programs to reduce teen pregnancy and sexually transmitted infections (STIs). As described in Part A, Congress mandated a federal evaluation of PREP when it authorized the program. In response, the Administration for Children and Families within the U.S. Department of Health and Human Services launched the PREP evaluation.

The evaluation includes the following three components: (1) the Design and Implementation Study (DIS), a broad descriptive analysis of how states are using PREP grant funding to support evidence-based teen pregnancy and STI prevention programs; (2) the Performance Analysis Study (PAS), focused on the collection and analysis of performance management data from state grantees, tribal grantees, and competitive PREP (CPREP) grantees; and (3) the Impact and In-Depth Implementation Study (IIS), designed to assess the impacts and implementation of funded programs in four to five selected PREP sites. Attachment A provides an overview of the multiple components of the PREP evaluation, including the components that have received OMB approval and the components included in this ICR.

In this submission, ACF is requesting OMB approval for data collection activities associated with two of the three PREP components: PAS and IIS. Clearance for performance measure data collection for state and tribal PREP grantees under the PAS component was received on March 12, 2013 (OMB Control # 0970-0398). In this submission, ACF is requesting clearance to extend the performance measure data collection plan that was cleared by OMB for state and tribal grantees to CPREP grantees. On March 12, 2013, ACF also received clearance for the baseline survey to be conducted as part of the IIS component of the PREP evaluation (OMB Control # 0970-0398). In this submission, ACF is requesting clearance for follow-up surveys to be conducted as part of the IIS impact analysis. ACF is also requesting clearance for instruments that will be used as part of the IIS implementation analysis.

Performance Analysis Study. As was the case with PAS data collection for state and tribal grantees, CPREP grantees will be asked to have participants complete brief surveys at program entry and exit (Instruments 1 and 2). The respondent universe for these surveys will be all youth participating in programming supported by CPREP grants. These surveys will provide data on the demographic and behavioral characteristics of program participants, participants’ perceptions of program effects, and their responses to the program. In addition to the items collected through the participant entry and exit surveys, CPREP grantees will report on measures of program attendance, reach, and dosage; implementation challenges; and measures of program structure and cost, summarizing this information annually in a national reporting system (Instruments 3 and 4). CPREP grantees will provide some of this information directly and will collect some data from their implementation sites. ACF estimates that CPREP grantees will be supporting programming in approximately 300 sites.

To reduce the burden associated with CPREP performance measure data collection, ACF is following the same strategy used with state and tribal grantees, an approach approved by OMB on March 12, 2013 (OMB Control # 0970-0398). In particular, as with other PREP grantees, ACF will not ask CPREP grantees to administer participant entry surveys to middle school youth in school-based settings.1 (These youth will still be asked to complete exit surveys.) In addition, ACF will not ask CPREP grantees to provide attendance data in school settings offering programming during the school day, since most youth in school-based programs will attend school (and thus these sessions) every day. Furthermore, ACF will reduce burden by not collecting data for the first grant year or for half of the second grant year.

CPREP grantees will administer participant entry surveys to program participants at the time they are enrolled in the program. An estimated 6,201 new participants eligible to take the entry survey will enroll each year. Of those, 5,891 (95 percent) are expected to complete the PAS entry survey annually. Exit surveys will be administered to all participants who are still in the program at completion. Approximately 80 percent of youth who enroll in these programs are expected to complete the program.2 Of the estimated 8,057 participants eligible for the exit survey each year, 7,654 (95 percent) are expected to complete it.3 Because the PAS participant surveys will be administered to all participants who are active at the time of entry and exit, with the exception of the entry survey exclusions mentioned above, no sampling is required for the PAS component of the evaluation. Table B1.1 presents the respondent universe and expected response rates for each respondent population; a brief sketch of the underlying arithmetic follows the table.

Table B1.1. Annualized Respondent Universe and Expected Response Rates for the Performance Analysis Study

Data Collection | Type of Respondent | Number of Respondents | Expected Response Rate | Total Expected Responses
Instrument 1: Participant Entry Survey | Youth Participant | 6,201 | 95% | 5,891 (footnote 4)
Instrument 2: Participant Exit Survey | Youth Participant | 8,057 | 95% | 7,654 (footnote 5)
Instrument 3: Performance Reporting System Data Entry Form | Grantee Administrator | 37 | 100% | 37
Instrument 4: Implementation Site Data Collection Protocol | Site Facilitator | 300 | 100% | 300
Estimated Totals | | 14,595 | | 13,882
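
The expected-response figures in Table B1.1 follow directly from the universe sizes, the 95 percent response-rate assumption, and the program-completion assumptions described in footnote 2. The following is a minimal sketch of that arithmetic; the variable names are illustrative and are not part of any PREP instrument or reporting system:

```python
# Reproduces the annualized burden arithmetic behind Table B1.1.
# Inputs are taken from the text and footnote 2; names are illustrative.

entry_universe = 6_201        # new participants eligible for the entry survey per year
exit_universe = 8_057         # participants eligible for the exit survey per year
response_rate = 0.95          # expected response rate for both participant surveys

# Footnote 2: overall completion rate is a weighted average of school-based
# (60% of youth, 90% completion) and out-of-school (40% of youth, 65%) programs.
completion_rate = 0.60 * 0.90 + 0.40 * 0.65   # = 0.80

expected_entry_responses = round(entry_universe * response_rate)   # 5,891
expected_exit_responses = round(exit_universe * response_rate)     # 7,654

print(f"completion rate: {completion_rate:.0%}")          # 80%
print(f"entry responses: {expected_entry_responses:,}")   # 5,891
print(f"exit responses: {expected_exit_responses:,}")     # 7,654
```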

Impact and In-Depth Implementation Study. From the universe of PREP grantees, ACF will select four or five program sites to participate in the IIS component of the PREP evaluation. The selected sites are not meant to be representative of PREP-funded programs as a whole. Rather, site selection is focusing on programs that (1) are large enough to support a rigorous evaluation of program effectiveness, (2) are implementing programs in a way that is amenable to random assignment, and (3) address priority gaps in the existing research literature on evidence-based approaches to teen pregnancy prevention. These gaps include evidence on effective programs for high-risk populations such as pregnant and parenting teens, or under-studied youth populations, such as youth living in rural areas.

IIS Impact Analysis. In each site, ACF expects to recruit and enroll a sample of 1,200 to 1,500 youth (for a total of 6,000 youth across four or five sites).6 Each site will be analyzed separately, so the relatively large samples of 1,200 to 1,500 youth per site are needed to detect policy-relevant impacts on key behavioral outcomes. Table B1.2 reports minimum detectable impacts on two illustrative outcomes—one with 50 percent prevalence (such as the proportion of high-risk teens who have had sex in the past three months) and one with 20 percent prevalence (such as the proportion of high-risk teens who have been pregnant, gotten someone pregnant, or had an STI). Separate estimates are presented assuming either (1) random assignment of individuals to treatment and control groups or (2) random assignment of clusters of individuals (such as schools, clinics, or group homes). Separate estimates are also presented for analyses of full-sample versus subgroup impacts. The table reports minimum detectable impacts for an assumed sample of 1,500 youth per site; a sketch of the underlying calculation for the individual random assignment row follows the table notes. However, smaller samples of 1,200 youth per site might, in some instances, be sufficient—for example, if the main research questions are limited to full-sample impacts, not subgroup analyses.

Sample enrollment began in one site in May 2013. Other sites are expected to begin sample enrollment later in 2013. All eligible youth will be considered for enrollment until reaching the target sample of 1,200 to 1,500 youth per site. ACF does not expect to conduct any sampling of youth prior to enrollment.

Table B1.2. Minimum Detectable Impacts with a Sample of 1,500 Youth

Percentage point impacts for illustrative binary outcomes:

Random Assignment Design | Recent Sexual Activity (Mean 50%): Full Sample | Recent Sexual Activity (Mean 50%): 50% Subgroup | Pregnancy or STI (Mean 20%): Full Sample | Pregnancy or STI (Mean 20%): 50% Subgroup
Individual Random Assignment | 7.0 | 9.0 | 5.6 | 7.9
Cluster Random Assignment | 9.1 | 11.4 | 7.3 | 9.2

Notes: Sample size of 1,500 youth refers to program and control groups combined. Figures assume that the sample is evenly divided between the program and control groups, a response rate of 75 percent, and that covariates explain 30 percent of the variance at the individual level. The figures also assume a two-tailed t-test with 80 percent power and a 95 percent confidence interval. For sites with cluster random assignment, the figures further assume a total of 16 clusters (evenly divided between the program and control groups), an intra-class correlation (ICC) of 0.01, and that covariates explain 30 percent of the variance at the cluster level.
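
The full-sample figures in the individual random assignment row can be approximated from the assumptions stated in the notes using the standard minimum detectable impact formula, MDI = (z_alpha/2 + z_power) * SE. Below is a minimal sketch of that calculation; it is illustrative only, and the subgroup and cluster figures require further adjustments (halved samples, the design effect, and cluster-level degrees of freedom) that are not shown here:

```python
# Approximates the full-sample minimum detectable impacts (MDIs) in the
# individual random assignment row of Table B1.2 (normal approximation).
from math import sqrt
from scipy.stats import norm

def mdi_individual(prevalence, n_enrolled, response_rate=0.75,
                   r_squared=0.30, alpha=0.05, power=0.80):
    """MDI for a binary outcome under individual random assignment,
    with the sample split evenly between program and control groups."""
    multiplier = norm.ppf(1 - alpha / 2) + norm.ppf(power)  # about 2.80
    n_analysis = n_enrolled * response_rate                 # respondents analyzed
    outcome_variance = prevalence * (1 - prevalence)        # Bernoulli variance
    # Standard error of a covariate-adjusted difference in means with an
    # even treatment/control split: sqrt(var * (1 - R^2) * 4 / n).
    se = sqrt(outcome_variance * (1 - r_squared) * 4 / n_analysis)
    return multiplier * se

for p in (0.50, 0.20):
    print(f"prevalence {p:.0%}: MDI = {mdi_individual(p, 1_500):.3f}")
# prevalence 50%: MDI = 0.070  -> 7.0 percentage points
# prevalence 20%: MDI = 0.056  -> 5.6 percentage points
```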

IIS In-Depth Implementation Analysis. The IIS implementation data collection will take place in each of the IIS sites. Within each site, interviews will be conducted with diverse staff and community members who play substantive roles in program implementation, who are knowledgeable about the origins and operations of their program, and who can discuss any challenges encountered and how they were resolved. In addition, all frontline staff and supervisors will be asked to complete an online staff survey about program implementation and the support they receive for it. Finally, focus group discussions will be held with 8-12 participating youth per group; participants will be selected randomly from among youth who agree to take part.

B2. Procedures for Collection of Information

Performance Analysis Study. Each CPREP grantee and its implementation sites will make decisions regarding procedures for collecting the participant entry and exit surveys (Instruments 1 and 2). Some grantees have elected to work with local evaluators that will administer the surveys for performance measure purposes; the local evaluators could decide to use paper-and-pencil or web-based surveys. For grantees not working with local evaluators, program staff at the implementation sites will likely administer the entry and exit surveys using paper and pencil in group or individual settings. CPREP grantees will inform their program participants that participation is voluntary and that they may refuse to answer any or all of the questions in the entry and exit surveys. The response rate for both surveys is expected to be 95 percent.

CPREP grantees will report separately on levels of participant attendance, reach, dosage, and retention. Data on these measures will be collected by implementation site facilitators (Instrument 4). Administrative data on program features and structure, allocation of funds, fidelity to evidence-based program models, and staff perceptions of quality challenges will be collected by grantees through their administrators. CPREP grantees will prepare and submit their final data sets in aggregate form to ACF through the PREP reporting system. The Performance Reporting System Data Entry Form (Instrument 3) contains the list of all data elements CPREP grantees will report, collected from among their implementation sites. Because collecting and reporting data for performance measures is a funding requirement of the CPREP grants, the CPREP grantee response rate is expected to be 100 percent.

The timing of the PAS participant survey data collections will be customized for each site depending upon the start and end dates of each cohort of participants. Administrative performance measurement data will be submitted annually by CPREP grantees following the end of each grant year. These procedures are identical for CPREP grantees (addressed in this ICR) and PREP grantees (addressed in an ICR approved on March 12, 2013; OMB Control Number 0970-0398).

Impact and In-Depth Implementation Study. In each of the four or five sites selected for the IIS component of the PREP evaluation, all eligible youth for whom consent is obtained will be enrolled in the study. Each site will be responsible for providing the evaluation team with a list of eligible youth. The evaluation team will then work collaboratively with each site to identify youth for the study and obtain active written consent from the responsible parent or guardian for youth under age 18 and from the youth themselves for those age 18 or older. The evaluation team will then prepare a final roster of youth at each site for whom it has consent. Evaluation consent forms were included in an ICR approved on March 12, 2013 (OMB Control Number 0970-0398).

Follow-up Surveys for the IIS Impact Analysis. The follow-up survey will be administered to all consented sample members 8-12 months after random assignment and then roughly one year later. The evaluation team will work individually with each site to determine the best mode and procedures for survey administration. As discussed in Part A of this ICR, wherever possible, there will be a group administration of a self-administered paper-and-pencil instrument (PAPI). When necessary to increase response rates or accommodate specific populations, this method will be augmented with or replaced by a computer-assisted telephone interview (CATI) follow-up or a telephone follow-up with hard copy.7,8 In HFSA, for instance, the survey will be administered by telephone because the program serves youth individually in their homes.

For group administration, the evaluation team will begin by handing out pre-identified survey packets to the youth whose names are on the packets, and obtaining youth assent. Each packet will consist of the PREP follow-up survey and a sealable return envelope. The survey will have a label with a unique ID number (no personally identifying information will appear on the survey or return envelope). Youth will self-administer the survey. The instrument has three parts (Part A, Part B1, and Part B2) to avoid asking youth who are not sexually experienced detailed questions about their sexual activities. Part A of the survey asks for background information and concludes with a single screening question about sexual experience. Youth with sexual experience will complete Part B1 and those without will complete Part B2. Two members of the evaluation team will monitor activities in each survey room. At the end of the survey administration, youth will place the entire survey in the return envelope, seal it, and return it to a member of the evaluation team. Completed surveys will be immediately shipped via FedEx to Mathematica’s Survey Operations Center for receipting, and then checked for completeness. Any forms with identifying information (assent forms) will be shipped separately from the surveys. All surveys that pass the check will be sent to a vendor for scanning. All scanned data will be electronically transmitted back to the evaluation team.

For youth who do not attend group administrations or when group administration is not feasible, the evaluation team will work collaboratively with each site to determine the best alternative mode of survey administration. Two options will be considered: individual administration of a PAPI survey over the telephone when small numbers of respondents cannot attend group administration, or individual administration through CATI when most or all respondents would not find group administration feasible. The HFSA survey, for example, will be administered as a CATI instrument because the program serves youth in their homes and the evaluation team will not have an opportunity for group administration.

Site Visits for the IIS Implementation Analysis. The IIS implementation analysis will include two rounds of site visits: one conducted early in the implementation period and the second later in the implementation period. The specific timing of site visits will be determined after sites are selected and specific implementation plans are known. The key activity during these visits will be interviews with state and local program staff. The master list of topics for staff interviews during site visits (Instrument 7) identifies the information that will be gathered from these staff to document the program context, implementation plans, the implementing organization and partner organizations, implementation systems, youth participation and engagement, and actual service delivery. Preparation for site visits will involve using the master list of topics to develop discussion guides customized to each site, ensuring that site visitors collect the needed information in an efficient, consistent way from the appropriate respondents.

Staff Survey for the IIS Implementation Analysis. All program staff and their supervisors will be invited to complete the staff survey (Instrument 8). This survey will be administered using Opinio, a web-based online survey application. Staff will be asked to complete the survey twice, around the time of the two site visits. The evaluation team will send a link to the web-based survey via email to relevant PREP program staff. This will allow staff to complete the survey at a time most convenient for them and will support efficient follow-up via email to achieve a high response rate. The private format will also encourage open and honest responses about program implementation successes and challenges. Opinio will deliver the survey data in an electronic format that can be cleaned efficiently, assessed for missing data, and analyzed descriptively.

Participant Focus Groups for the IIS Implementation Analysis. Focus groups (Instrument 9) will be conducted with a subset of program participants during site visits. The objective of the focus groups will be to explore participants’ perspectives on the availability, quality, and value of program services. The focus groups will be used to learn about participants’ motivations for enrolling in the program, their participation and response to incentives offered, their experiences with each of the core services offered, their perceptions of the benefits of participation, and their overall satisfaction with program services.

Two rounds of focus groups will be conducted, corresponding with the two rounds of site visits. During each round, up to four focus groups of 8 to 10 youth each will be conducted in each of the four or five IIS sites. Focus group participants will be selected randomly from among youth who have completed the core program services.

Program Attendance Data for the IIS Implementation Analysis. Program attendance data will be collected for every participant at each IIS site. If a site maintains attendance information in its existing administrative records system, it will extract these data from that system and provide them to the evaluation team. If a site does not already collect this attendance information, site staff will record participant attendance on a form provided by the evaluation team (Instrument 10). Attendance data will allow the evaluation team to document the proportion of program services actually delivered to participants. They will also allow the team to conduct exploratory analyses of the impact of these programs on the youth who actually received the core services.
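
One standard way to estimate impacts on youth who actually received the core services is the Bloom (1984) no-show adjustment, which divides the intent-to-treat impact by the program group's service-receipt rate. The sketch below illustrates that adjustment with hypothetical numbers; it is offered as an example of this class of analysis, not as the evaluation team's specified method (the actual analysis plan is in Attachment B):

```python
# Illustrative Bloom (1984) no-show adjustment. Assumes no-shows experience
# zero impact and the control group has no access to the program.
# All numbers below are hypothetical.

def bloom_adjusted_impact(itt_impact, receipt_rate):
    """Rescale an intent-to-treat (ITT) impact estimate by the share of the
    program group that actually received the core services."""
    if not 0.0 < receipt_rate <= 1.0:
        raise ValueError("receipt_rate must be in (0, 1]")
    return itt_impact / receipt_rate

# Hypothetical example: a 5 percentage point ITT impact when 80 percent of
# the program group attended enough sessions to count as served.
print(bloom_adjusted_impact(0.05, 0.80))  # 0.0625 -> 6.25 percentage points
```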

B3. Methods to Maximize Response Rates and Deal with Non-Response

Performance Analysis Study. Response rates for PAS participants will be maximized by administering entry surveys at enrollment and exit surveys during the final program sessions. Where feasible, exit surveys will be administered individually to program exiters who are absent from the final sessions in which the surveys are completed.

To reduce CPREP grantee burden and maximize CPREP grantee response rates, ACF is streamlining the PAS administrative data reporting process by providing common data element definitions across PREP program models and collecting these data in a uniform manner through the PREP reporting instrument (see Instrument 3). Because the submission of performance measures data is a grant requirement, ACF does not expect problems with non-response, except in cases when waivers are extended for the sensitive questions on the participant entry and exit surveys.

The methods described for the CPREP grantees are identical to those approved for the PREP grantees by the OMB on March 12, 2013 (OMB Control Number 0970-0398).

Impact and In-Depth Implementation Study.

IIS Impact Analysis. ACF expects to achieve a response rate of 80 percent for the first follow-up survey and 75 percent for the second follow-up. These completion rates are achievable for several reasons. The first follow-up survey will be administered 8-12 months after random assignment, so contact information will still be relatively current, which should minimize problems locating sample members. In many school-based sites, youth will be enrolled at follow-up in the same schools they attended at baseline, simplifying locating efforts and improving response rates.

In addition, we expect that obtaining the site’s willing assistance will be very important to maximizing the response rate; we will invest significant effort in gaining their cooperation from the beginning of the study, minimizing burden on sites and assuring privacy to the youth participants. Sites will be given detailed information about the surveys, how they will be administered and on what schedule, what involvement and time will be required of school or organization staff, and how data will be used and protected. Bringing sites into the process while minimizing burden will ensure site support of the PREP data collections. By applying identical methods for maximizing the response rates of the treatment and control groups, the evaluation team does not anticipate differences in response rates across research groups.

Prior to follow-up survey administration in the school-based sites, the evaluation team will work closely with school contacts to locate respondents in their new classrooms. Evaluation team members will ask schools to post reminders and make announcements prior to and on the day of the survey administration to maximize attendance. On the day of the survey administration, contractor staff will take attendance prior to beginning administration and immediately follow up with the school contact regarding any unexpected absentees. Sample members who have transferred schools or moved out of the area will be tracked and given the option to complete the survey by telephone.

In sites where group-based administration is not possible, such as HFSA, an advance letter will be sent to sample members, notifying them of the data collection and providing them with the information necessary to complete the survey over the phone. Additional telephone, email, and text prompts to youth and parents will be conducted as needed (Attachment J).

Additionally, gift cards will be provided to respondents to encourage participation in the survey. For group administration, a $15 gift card will be provided to participants completing the first follow-up survey and a $20 gift card to participants completing the second follow-up survey. For participants who complete the survey by telephone, a $20 gift card will be provided for the first follow-up survey and a $25 gift card for the second. The larger telephone amounts are offered because completion outside of group administration requires greater initiative and cooperation on the part of the respondent, as well as additional time outside of school or the respondent's ordinary day.

The evaluation team anticipates high response rates to the follow-up surveys. Even so, the team will take steps to understand the nature of any non-response and to account for the threat it may pose to the validity of the study's impact estimates. Using data from the baseline survey, evaluation team members will first test for statistically significant differences across demographic and baseline outcome variables between respondents and nonrespondents. Any such differences will be controlled for in the analyses by using non-response weights. The team will also test for differences between the research groups in their baseline characteristics and control for these differences using covariates when estimating program impacts (see Attachment B).
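
As a concrete illustration of the weighting step, one conventional approach models survey response as a function of baseline characteristics and weights each respondent by the inverse of the predicted response probability. The sketch below shows this approach with simulated data; the covariates, the logistic specification, and all values are illustrative assumptions, not the evaluation team's documented procedure:

```python
# Illustrative inverse-probability non-response weights: model response on
# baseline covariates, then weight respondents by 1 / predicted probability.
# Covariates and data are simulated for illustration only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1_500  # enrolled sample in one site

# Hypothetical baseline covariates and a response indicator (~80% response).
age = rng.integers(14, 19, size=n)
female = rng.integers(0, 2, size=n)
responded = rng.binomial(1, 0.80, size=n)

X = sm.add_constant(np.column_stack([age, female]).astype(float))
response_model = sm.Logit(responded, X).fit(disp=False)

p_response = response_model.predict(X)
weights = np.where(responded == 1, 1.0 / p_response, 0.0)

# Weighted respondent analyses then approximate the full randomized sample.
print("mean weight among respondents:", round(weights[responded == 1].mean(), 3))
```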

IIS In-Depth Implementation Analysis. To ensure high response rates to data collection efforts associated with the IIS implementation analysis, site visits will be planned well in advance so that all identified respondents can participate in individual or group interviews, as appropriate. To increase participation in focus groups, youth who are selected to participate will be offered a $25 gift card for participating. To ensure that attendance data are recorded completely and accurately, the evaluation team will routinely review attendance information provided by sites and follow up with program staff if information is incomplete.

B4. Test of Procedures or Methods to be Undertaken

Performance Analysis Study. Cognitive pretesting with nine youth ages 13 to 18 has been conducted for both the PAS entry and exit surveys. The cognitive pretest sample included males and females and included youth from a mix of racial and ethnic backgrounds. Survey questions have been revised based on the results of these tests.

To ensure that the PREP reporting system functions as intended and in a user-friendly manner for grantees’ entry of administrative performance measurement data, the system was tested by an internal team of PAS reviewers who designed the system specifications, independent of the system developer. This team has reviewed all data import/entry, reporting, calculation, and extract functions of the system to ensure that grantee end-users will find the system to be efficient and user-friendly.

Impact and In-Depth Implementation Study. As discussed in Part A of this information collection request, the follow-up survey is very similar to the baseline survey, which recently received OMB clearance, and to the PPA baseline and follow-up surveys. The OMB-approved PPA baseline and follow-up surveys were pre-tested prior to receiving OMB clearance and have been administered to approximately 5,000 and 2,500 adolescents, respectively, as of March 2013. New items added specifically for PREP were generally drawn from established sources (see Attachments C and D).

Attachment E provides a question-by-question source table for the IIS staff survey. Many of the items are drawn from established sources and have been tested and refined through those survey efforts. Attachment F provides an annotated list of the established sources.

B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

Performance Analysis Study. Data for the PAS will be collected by CPREP grantees. In some cases, grantees will have engaged local evaluators who will assist them in performance measure data collection. CPREP grantees will report these data in aggregate form into the PREP reporting system, which will be maintained by ACF's contractor, RTI International. RTI International will provide data extracts from this reporting system to ACF's evaluation contractor, Mathematica Policy Research. The data extract files will include data collected by CPREP grantees, as requested in this ICR, as well as PAS data collected by PREP grantees (approved on March 12, 2013; OMB Control Number 0970-0398). Mathematica and its subcontractor, Child Trends, will use these extract files to analyze PREP performance data and to generate performance measurement reports for ACF.

Impact and In-Depth Implementation Study. Follow-up survey data for the Impact Study will be collected and analyzed by ACF’s prime contracting organization, Mathematica Policy Research. The PREP Implementation Study site visits will be conducted by ACF’s contracting organization, Mathematica Policy Research, and its subcontractors, Twin Peaks and Child Trends.

Attachment H lists the individuals whom ACF consulted on the collection of the PAS or IIS instruments.

1 As described in Part A, middle school youth will not be asked sensitive questions concerning their sexual activity as part of performance measure data collection. These sensitive questions represent a key component of the entry surveys. Since these questions will be dropped for middle school youth, ACF has concluded that the entry surveys for these youth are of less value than other PAS data collection elements for understanding the population served by PREP. For this reason, as with state and tribal PREP grantees, ACF is dropping entry surveys for these youth as the best method to reduce burden while still collecting the most valuable and relevant data.

2 Based on a review of CPREP plans and other documents, 60 percent of the youth served in CPREP programs are estimated to be in school-based programs and 40 percent in out-of-school programs. We assume that 90 percent of youth in school-based CPREP programs and 65 percent of youth in out-of-school CPREP programs will complete the program. These assumptions yield an overall program completion rate of 80 percent.

3 There will be more exit surveys than entry surveys because middle school students in school-based settings will not complete entry surveys; however, they will complete exit surveys.

4 This is the annualized number of participant entry survey responses. Table A12.1 (in Supporting Statement A) shows the total number of participant entry survey responses: 17,673.

5 This is the annualized number of participant exit survey responses. Table A12.1 (in Supporting Statement A) shows the total number of participant exit survey responses: 22,961.

6 Some youth or their parents will not consent to be part of the PREP evaluation. These sample size estimates are for youth who complete the consent process to be part of the evaluation.

7 Trained interviewers will read the survey aloud to respondents over the phone, and the interviewers will record the respondent’s answers on a hard copy survey.

8 We assume an 80 percent response rate for the IIS first follow-up survey and a 75 percent response rate for the IIS second follow-up survey.
