
Alternative Supporting Statement for Information Collections Designed for

Research, Public Health Surveillance, and Program Evaluation Purposes


Sexual Risk Avoidance Education Program Performance Analysis Study

(SRAE PAS)



OMB Information Collection Request

0970-0536





Supporting Statement

Part B



December 2023









Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officers:

Caryn Blitz, Selma Caal



Part B


B1. Objectives

Study Objectives

The objective of the Sexual Risk Avoidance Education (SRAE) Performance Analysis Study (PAS) is to document how SRAE-funded programs are operationalized in the field and assess program outcomes. The Family and Youth Services Bureau (FYSB) and Office of Planning, Research, and Evaluation (OPRE) in the Administration for Children and Families (ACF), at the Department of Health and Human Services (HHS), seek approval for the continued collection of performance measures data to (1) monitor the extent to which the programs meet SRAE implementation objectives and advance toward expected outcomes; (2) inform program improvement efforts; and (3) update ACF, grantees, and others on the program’s status and progress.


Generalizability of Results

This study is intended to present an internally valid description of the SRAE Program, not to promote statistical generalization to other programs or populations. The study will continue to include information on all SRAE grantees, subrecipient program providers, and participants who respond to data collection.


Appropriateness of Study Design and Methods for Planned Uses

The SRAE PAS is designed to describe the implementation and outcomes of the SRAE Program. The performance measures data collected through this descriptive study will continue to provide necessary information to ACF and grantees to effectively manage the programs. Entry and exit surveys of youth participating in SRAE are necessary to collect information on the demographic and behavioral characteristics of program participants, their responses to the program, and their perceptions of program effects. Administrative data from grantees and their subrecipient program providers are needed to understand the structure and features of SRAE programs, participant numbers, implementation supports, and staff perceptions of quality challenges and needs for technical assistance. Because these are performance measures, data are required on the universe of grantees, programs, and participants. As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information.


B2. Methods and Design

Target Population

The target population for the SRAE PAS includes all 192 SRAE grantees, their subrecipient program providers, and youth participants. The grantees include 119 General Departmental SRAE (GDSRAE), 39 State SRAE (SSRAE), and 34 Competitive SRAE (CSRAE) grantees. Based on performance measures data from earlier rounds of data collection, the number of subrecipients is estimated at 741 across all grantees. Grantees are expected to serve approximately 1,464,945 participants over the three-year Office of Management and Budget (OMB) clearance period, for an average of about 488,315 new participants per year. Program participants are youth ages 10–20. The data collection instruments to be used by each target population are as follows:

  • Instrument 1: Participant Entry Survey – youth participants

  • Instrument 2: Participant Exit Survey – youth participants

  • Instrument 3: Performance Reporting System Data Entry Form – grantees

  • Instrument 4: Subrecipient Data Collection and Reporting Form – subrecipient program providers

Sampling and Site Selection

The SRAE PAS will continue to include all SRAE grantees, subrecipient program providers, and participants. ACF will use the performance measures data to monitor and report on progress in implementing SRAE programs. In addition, the information will be used to support continuous quality improvement by (1) the program office, to improve the SRAE Program overall, and (2) grantees, to improve their own program(s). All SRAE grantees must be included in the study so that (1) the measures reflect the complete scope of the SRAE Program and (2) the data can be used to promote program improvement among all grantees.


B3. Design of Data Collection Instruments

Development of Data Collection Instruments

The SRAE performance measures and data collection instruments were developed through a deliberative process over two years and subsequently revised after data collection began. Several documents informed the development of the data collection instruments, including: (1) data collection instruments used to collect performance measures information for other adolescent pregnancy prevention grant programs; (2) performance progress reports used to monitor previous FYSB grant programs that promoted refraining from nonmarital sex; and (3) documentation of measures used in other relevant data collections, including survey items from the Youth Risk Behavior Survey, the National Longitudinal Study of Adolescent to Adult Health, and the National Youth in Transition Database. In addition, ACF consulted with FYSB program staff (grantees’ project officers), select SRAE grantees, and ACF leadership to obtain their feedback on the proposed measures, processes, and instruments.


Cognitive pretesting with nine youth ages 12 to 18 was conducted for the Participant Entry and Exit Surveys (Instruments 1 and 2). The cognitive pretest sample included males and females, as well as youth from a mix of racial and ethnic backgrounds. The survey questions were revised based on the results of these pretests. These revisions were previously reviewed and approved by OMB and are reflected in the current submission.

During the initial years of SRAE performance measures data collection, the instruments were revised several times to (1) address comments from SRAE grantees on the data collection instruments, (2) streamline the instruments to ask only those questions necessary to achieve the objectives of the SRAE PAS, and (3) include—and later revise—measures related to effects of the COVID-19 pandemic on SRAE program operations. Some of these revisions resulted in the creation of alternate versions of the entry and exit survey instruments to meet the needs of different types of grantees and populations of youth.1 These changes were previously reviewed and approved by OMB and are reflected in the current submission.

Each of the four data collection instruments addresses the study objectives described in Section B1 above. Instruments 1 and 2 capture information on the characteristics of the youth participating in the program and their perceptions of program effects, and Instruments 3 and 4 capture information on grantees’ and subrecipients’ implementation of SRAE programs. Both types of data will be used to monitor program implementation and outcomes, inform program improvement, and provide status and progress updates.


B4. Collection of Data and Quality Control

Instruments 1–2: Participant Entry and Exit Surveys. As in earlier rounds of SRAE performance measures data collection, each grantee and its subrecipients will make decisions regarding procedures for collecting the participant entry and exit surveys. Some grantees may elect to work with local evaluators who will administer the performance measures surveys. Grantees and local evaluators may decide to use paper-and-pencil or web-based surveys in group or individual settings. Grantees will inform their individual program participants that participation is voluntary and that they may refuse to answer any or all of the questions in the entry and exit questionnaires.

Instruments 3–4: Performance Reporting System Data Entry Form and Subrecipient Data Collection and Reporting Form. Grantees will continue to report separately on levels of participant attendance, reach, and dosage (see Figure 1). Data on these measures will continue to be collected by subrecipient program providers (Instrument 4). Administrative data on program features and structure, allocation of funds, and staff perceptions of quality challenges will continue to be collected by grantees and subrecipients (Instruments 3 and 4). Grantees will continue to prepare and submit their final data to ACF through the SRAE Performance Measures Data Portal. The Performance Reporting System Data Entry Form (Instrument 3) contains the list of all data elements grantees will submit, collected from among their subrecipients.

The timing of participant survey data collections will be customized for each site depending upon the start and end dates of each cohort of participants. Administrative performance measurement data will continue to be submitted to ACF once a year, and participant information will continue to be submitted to ACF twice a year.

Experiences to Date

SRAE grantees have been collecting performance measures data since January 2020, including participant entry and exit surveys since September 2020. Over this period, a number of circumstances have affected data collection.

  • SRAE grantees were not required to collect and submit data prior to 2020, which may have impacted initial data quality.

  • Since initial approval in 2020, there have been several rounds of revisions to the measures, including revisions requested by the previous administration, which delayed obtaining a uniform set of performance measures, and later revisions due to the COVID-19 pandemic.

  • Substantially fewer surveys have been administered than initially anticipated, due in part to the COVID-19 pandemic. Pandemic-related disruptions to data collection and SRAE programming included halting programming (and data collection) entirely when sites shut down due to the pandemic; shifting to virtual programming, which made data collection more difficult (and sometimes required a change in the mode of data collection, which took time to develop); and staffing challenges, when some programming or data collection staff were unavailable because of the pandemic. Since then, participation has increased. The number of youth completing the participant entry and exit surveys more than doubled from the 2020–2021 program year to the 2021–2022 program year, and we expect the recovery to continue.

With performance measures now required as part of the funding award process and a final uniform set of performance measures in place, this information collection now yields uniform data of great utility to the government. We are working to address administration issues experienced by grantees through targeted training and technical assistance (T/TA).

  • ACF’s contractor will continue to provide training and technical assistance to ensure that grantees and program providers understand the measures, instruments, and data collection processes. Later rounds of technical assistance will reflect data quality checks conducted on earlier data and focus on areas of potential error. For example, outliers in the measure of hours delivered suggest that some grantees have misunderstood what this measure is intended to capture, so the contractor highlighted this issue during a recent webinar.
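The hours-delivered example above can be illustrated with a simple range check. The sketch below is illustrative only: the field name, record structure, and plausibility threshold are hypothetical and do not represent the actual checks applied in the SRAE Performance Measures Data Portal or the contractor's T/TA materials.

```python
# Hypothetical illustration of an outlier check on reported hours delivered.
# The field name and threshold are assumptions for this sketch, not actual
# SRAE Performance Measures Data Portal rules.

def flag_hours_outliers(records, max_plausible_hours=200):
    """Return records whose reported hours delivered fall outside a plausible range."""
    flagged = []
    for record in records:
        hours = record.get("hours_delivered")
        if hours is None or hours < 0 or hours > max_plausible_hours:
            flagged.append(record)
    return flagged

# Example: a program reporting 5,000 hours in one year would be flagged for
# follow-up during training and technical assistance.
reports = [
    {"program_id": "P-001", "hours_delivered": 5000},
    {"program_id": "P-002", "hours_delivered": 36},
]
print(flag_hours_outliers(reports))
```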


B5. Response Rates and Potential Nonresponse Bias

Response Rates

Table B5.1 describes the respondents and expected response rates associated with the performance measures data collection.

Instruments 1–2: Participant Entry and Exit Surveys. For the entry survey, the goal is for 95 percent of youth participants, or approximately 463,899 participants (488,315 x 0.95), to complete the participant entry survey each year.2 Based on our experience with similar performance analysis studies, we estimate that about 20 percent of participants will drop out of the program prior to completion, leaving approximately 390,652 (488,315 x 0.80) participants at the end of the program annually. Of those, we expect 95 percent, or approximately 371,119 participants, to complete the participant exit survey each year. (These estimates are based on our experience with similar performance analysis studies rather than the first years of the SRAE PAS, because disruptions related to the COVID-19 pandemic, which depressed responses in those years, are expected to abate.)
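The expected-response arithmetic above can be restated in a short sketch. The figures are taken directly from the text; the calculation below simply reproduces the totals reported in Table B5.1.

```python
# Reproduces the expected-response calculations described above.
# All figures are taken from the text; results are rounded to whole participants.

annual_participants = 488_315      # expected new participants per year
entry_response_rate = 0.95         # target entry survey response rate
retention_rate = 0.80              # 20 percent expected to drop out before program exit
exit_response_rate = 0.95          # target exit survey response rate

entry_responses = round(annual_participants * entry_response_rate)   # 463,899
exit_eligible = round(annual_participants * retention_rate)          # 390,652
exit_responses = round(exit_eligible * exit_response_rate)           # 371,119

print(entry_responses, exit_eligible, exit_responses)
```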

As in earlier rounds of SRAE performance measures data collection, response rates for participant surveys will be maximized by administering entry surveys to all participants at enrollment and exit surveys during final program sessions. Participants who are absent when the exit surveys are administered will be asked to complete them as soon as possible afterward, individually or in small groups.

Instruments 3–4: Performance Reporting System Data Entry Form and Subrecipient Data Collection and Reporting Form. Because collecting and submitting data for performance measures is a funding requirement of all SRAE grants, the grantee and subrecipient response rates are expected to be 100 percent.

To reduce grantee burden and maximize grantee response rates, ACF will provide common data element definitions across program models and obtain these data in a uniform manner through the SRAE Performance Measures Data Portal developed for earlier rounds of SRAE performance measures data submission (see Instruments 3–4). Because the submission of performance measures data is a grant requirement, except in cases when waivers are extended for the sensitive questions on the participant entry and exit surveys, ACF does not expect problems with nonresponse.

Table B5.1. Annual Respondent Universe and Expected Response Rates for the Study of Performance Measures

Data Collection | Type of Respondent | Number of Potential Respondents | Expected Response Rate | Total Expected Responses
Instrument 1: Participant Entry Survey | Youth participant | 488,315 | 95% | 463,899
Instrument 2: Participant Exit Survey | Youth participant | 390,652 | 95% | 371,119
Instrument 3: Performance Reporting System Data Entry Form | Grantee Administrator | 192 | 100% | 192
Instrument 4: Subrecipient Data Collection and Reporting Form | Subrecipient Administrator | 741 | 100% | 741


Nonresponse

As in earlier rounds of SRAE performance measures data collection, analyses will be conducted based on respondents’ submission of data for a given measure, with no imputation or weighting adjustments to address missing data. Because we expect high response rates to all components of the data collection, we do not plan any nonresponse bias analysis.


B6. Production of Estimates and Projections

The performance measures data will continue to be used primarily internally by ACF and SRAE grantees, but will also be used to inform other stakeholders. For example, performance measures data will inform ACF’s annual reporting to OMB on the progress of the SRAE Program, and end-of-cohort reports will be made available to the public.

The performance measures data will continue to be submitted by all SRAE grantees, program providers, and youth participants. The analyses will include computation of statistics such as percentages and means based on the respondents; we will not produce estimates intended to apply to any broader population.


B7. Data Handling and Analysis

Data Handling

Performance measures data will continue to be collected by grantees and their subrecipients. In some cases, grantees may have engaged local evaluators to assist them with data collection.

Grantees will continue to enter performance measures data into the SRAE Performance Measures Data Portal maintained by ACF’s contractor. The entry screens of the Portal include a series of automated validity checks to identify some types of errors as the data are entered. Error messages will continue to alert grantees to inconsistencies between data elements, values beyond the expected range, and similar issues, and provide grantees an opportunity to correct such errors before the data are submitted. For example, an error message will appear if the number of facilitators reported to have received training or been observed for quality monitoring during the period exceeds the total number of facilitators reported for the period. The system will also continue to conduct automated checks to ensure that the full set of performance measures is entered (e.g., the grantee entered survey data for each program that served youth).
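As an illustration of the kind of cross-field validity check described above, the following sketch flags a report in which the number of facilitators trained or observed exceeds the total number of facilitators. The field names are hypothetical assumptions; the actual checks are built into the Portal’s entry screens.

```python
# Hypothetical sketch of a cross-field validity check of the kind described
# above. Field names are assumptions, not actual Portal data elements.

def validate_facilitator_counts(entry):
    """Return error messages for inconsistent facilitator counts in one report."""
    errors = []
    total = entry.get("facilitators_total", 0)
    if entry.get("facilitators_trained", 0) > total:
        errors.append("Facilitators trained exceeds total facilitators reported.")
    if entry.get("facilitators_observed", 0) > total:
        errors.append("Facilitators observed exceeds total facilitators reported.")
    return errors

# Example: reporting 12 trained facilitators against 10 total would trigger an
# error message before the data could be submitted.
print(validate_facilitator_counts(
    {"facilitators_total": 10, "facilitators_trained": 12, "facilitators_observed": 4}
))
```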

Once the data are submitted, ACF’s contractor will conduct additional quality checks to identify remaining issues, as in earlier rounds of SRAE performance measures data collection. Cases with unresolved data issues may be omitted from analyses that rely on the problematic data elements. If suspect data are included in any tabulations, caveats will be included alongside the reported data.


Data Analysis

As in earlier rounds of SRAE performance measures data collection, the contractor will analyze SRAE performance data to generate performance measurement reports for ACF and other audiences. Core analyses will include computing means and sums of continuous numeric measures (such as the number of participants served) and producing frequency distributions of categorical and character variables (such as the program models implemented). Cross-tabulations will be used to explore potential relationships between variables, such as whether perceptions of program effects differ by participants’ age or other characteristics. We will also examine changes in the performance measures data over time. Analyses will continue to be conducted separately for each of the SRAE Program’s three funding streams (GDSRAE, SSRAE, and CSRAE). Where feasible, we will obtain comparable measures for the youth population nationwide for comparison.
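A minimal sketch of these core analyses is shown below, using pandas with hypothetical variable names and values rather than actual SRAE data elements.

```python
# Minimal sketch of the core analyses described above, using hypothetical data.
import pandas as pd

df = pd.DataFrame({
    "funding_stream": ["GDSRAE", "SSRAE", "CSRAE", "GDSRAE"],
    "participants_served": [120, 85, 200, 150],
    "program_model": ["Model A", "Model B", "Model A", "Model C"],
    "age_group": ["Middle school", "High school", "High school", "Middle school"],
    "perceived_effect": ["Agree", "Strongly agree", "Agree", "Disagree"],
})

# Means and sums of continuous measures, reported separately by funding stream
print(df.groupby("funding_stream")["participants_served"].agg(["mean", "sum"]))

# Frequency distribution of a categorical measure
print(df["program_model"].value_counts())

# Cross-tabulation of perceived program effects by age group
print(pd.crosstab(df["age_group"], df["perceived_effect"]))
```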


Data Use

ACF uses information from the performance measures to fulfill reporting requirements to OMB concerning the SRAE initiative. In addition, the performance measures data will be used to develop end-of-cohort reports and briefs, which will synthesize performance measures data across years. These reports will include summaries of the data collection and analysis methods as well as appropriate caveats and data limitations. Each year, the contractor will prepare a brief public-facing fact sheet that highlights key findings based on the performance measures for the previous reporting year. The first report and fact sheet are expected to be available in late 2023.


B8. Contact Person(s)



Attachments

  • Instrument 1: Participant Entry Survey for high school and older youth

  • Instrument 1a: Participant Entry Survey for middle school youth

  • Instrument 1b: Participant Entry Survey for high school and older youth in programs with impact evaluations

  • Instrument 2: Participant Exit Survey for high school and older youth

  • Instrument 2a: Participant Exit Survey for middle school youth

  • Instrument 3: Performance Reporting System Data Entry Form

  • Instrument 4: Subrecipient Data Collection and Reporting Form

1 Grantees use the original versions of the entry survey (Instrument 1) and the exit survey (Instrument 2) with high school and older youth. Alternative versions (Instruments 1a and 2a) were developed for use with middle school youth; these versions exclude some of the more sensitive items. Later, additional alternate versions of the entry survey were developed (Instrument 1b for high school and older youth and Instrument 1c for middle school youth) for use by grantees participating in impact evaluation studies; these versions include a limited number of questions to reduce the burden associated with grantees’ participation in other studies, which often include extensive surveys. Previously there was a version of the exit survey for programs conducting impact studies, but its use was discontinued prior to the last OMB approval in 2022.

2 Response rates for 2021–2022 were lower than 95 percent; however, we expect response rates to continue to improve, so the estimated burden is based on the target response rate of 95 percent.
