Supporting Statement for
“Office of Adolescent Health and Administration on Children, Youth and Families Teen Pregnancy Prevention Performance Measure Collection”
A. Background and Justification
A.1. Need and Legal Basis
This document provides a Supporting Statement to accompany a request for approval of the collection of performance measures for teen pregnancy prevention programs administered by the Office of Adolescent Health (OAH) and the Administration on Children, Youth and Families (ACYF). The purpose of this collection is to gather data that will allow OAH and ACYF to monitor the progress of program grantees and to report to Congress on the performance of the programs.
The program administered by OAH, the Teen Pregnancy Prevention (TPP) program, was originally authorized under the Consolidated Appropriations Act, 2010 (P.L. 111-117) and currently operates under authority contained in the Consolidated Appropriations Act, 2012. The Act provides $105,000,000 in FY 2012 for making competitive contracts and grants to public and private entities to fund medically accurate and age-appropriate programs that reduce teen pregnancy, and for the Federal costs associated with administering and evaluating such grants and contracts. The program administered by ACYF uses funds available through the Personal Responsibility Education Program Innovative Strategies (PREIS), authorized by the Patient Protection and Affordable Care Act, 2010 (P.L. 111-148). The Act authorized ACYF to award $10 million in grants to entities to implement innovative youth pregnancy prevention strategies.
Grants for teen pregnancy prevention under both TPP and PREIS were awarded for a five-year project period. TPP funded a total of 94 grantees, and PREIS a total of 13 (Exhibit 1). Of the 94 TPP grantees, 75 are “Tier 1” grants, which replicate programs that have already been proven effective in reducing teenage pregnancy. Of the Tier 1 grantees, 59 are funded at levels between $400,000 and $1 million a year (A/B), and 16 are funded at levels between $1 million and $4 million a year (C/D). A total of 23 different evidence-based programs are being implemented by the 75 Tier 1 grantees. Interventions for these programs vary widely in duration (from 1 day to 4 years), setting (schools, clinics, or community-based settings), populations served (middle school students, high school students, parents of teens), and content (e.g., youth development programs or sex education programs).
The remaining 19 TPP grants (“Tier 2”) and the 13 PREIS grants are research and demonstration grants to develop, refine, and test additional models and innovative strategies. Tier 2 and PREIS grantees are funded at levels of between $400,000 and $1 million per year. Tier 2 and PREIS grants focus on areas with high teen pregnancy rates and high-risk, vulnerable, and culturally underrepresented youth populations, including youth in foster care, runaway and homeless youth, pregnant and parenting teens, youth living in areas with high teen birth rates, delinquent youth, and youth who are disconnected from usual service delivery systems.
The Tier 1 C/D and the Tier 2 and PREIS grantees are all required to conduct independent rigorous evaluations, but the Tier 1 A/B grantees are not.
Exhibit 1: Summary of TPP and PREIS grants

| Grant type | Agency | Description | Independent rigorous evaluation | # of grants |
|---|---|---|---|---|
| TPP grants | | | | |
| Tier 1 A/B | OAH | Replication grants funded at <$1 million/year | No | 59 |
| Tier 1 C/D | OAH | Replication grants funded at >$1 million/year | Yes | 16 |
| Tier 2 | OAH | Research and demonstration grants | Yes | 19 |
| PREIS grants | ACYF | Research and demonstration grants | Yes | 13 |
| TOTAL | | | | 107 |
The performance measure collection is important to OAH and ACYF because it will provide the agencies with data both to manage the TPP and PREIS programs effectively and to comply with the accountability and federal performance requirements of the Government Performance and Results Act of 1993 (P.L. 103-62). Moreover, collecting and reporting data for the performance measures is a funding requirement for the grants, as stated in the funding opportunity announcement.
Measures to assess changes in participant behaviors (e.g., sexual activity, contraceptive use, condom use) or intentions (intention to have sex, use contraception, or use condoms) require a comparison group for meaningful interpretation. Only those grantees with rigorous evaluations will have data on both program participants and a comparison group; therefore, they will be the only grantees to report data on these performance measures. All grantees will, however, be required to report on measures of participants’ perceptions of the impact the program has had on their sexual activity, condom use, and contraceptive use. For grantees with rigorous evaluations, these questions will be part of the evaluation instruments they are already using; for grantees without rigorous evaluations, collecting these data will require that they administer a brief questionnaire to all program participants.
Another difference in performance measures between grantees with rigorous evaluations and those without is that OAH and ACYF expect all grantees with rigorous evaluations to disseminate what they learn from their evaluations through publications and presentations. This is not an expectation for grantees that do not have a rigorous evaluation.
Performance measures also vary slightly based on whether grants are replication grants (Tier 1) or research and demonstration grants (Tier 2 and PREIS). Notably, an important objective for Tier 2/PREIS grantees is that, if their interventions prove effective, they will package them so that they can be replicated in the future. Because Tier 1 grantees are implementing programs that have already been packaged for replication, this is not a relevant measure for them.
The performance measures to be reported by grantees are summarized in Exhibit 2. As shown, a few of the measures will be used by OAH and ACYF only for purposes of managing the programs, not as performance measures to report to Congress. Most of the measures, however, will be used for both purposes.
Exhibit 2: Measures to be reported by grantees*

| Measure category | Data source | Measures collected, reported to OAH/ACYF | Measures reported to Congress |
|---|---|---|---|
| Participant-level measures** | | | |
| Behaviors and intentions (rigorous outcome data): 7 measures | Grantees’ evaluations | C/D and Tier 2/PREIS | C/D and Tier 2/PREIS (3 of the 7 measures) |
| Perception of program impact (post-test only data): 3 measures | Grantees’ evaluations / questionnaire administered to program participants | A/B, C/D, and Tier 2/PREIS | A/B, C/D, and Tier 2/PREIS |
| Grantee/intervention-level measures | | | |
| Soundness of evaluations: 2 measures | Assessment by Federal evaluation TA contractor | C/D and Tier 2/PREIS | C/D and Tier 2/PREIS |
| Dissemination: 5 measures | Administrative records of grantees | A/B, C/D, and Tier 2/PREIS (2 measures C/D and Tier 2/PREIS only; 1 measure Tier 2/PREIS only) | 2 of the 5 measures (1 by all grantee types, 1 by Tier 2/PREIS only) |
| Reach and retention: 6 measures | Administrative records of grantees | A/B, C/D, and Tier 2/PREIS | A/B, C/D, and Tier 2/PREIS (3 of the 6 measures) |
| Dosage: 4 measures | Grantee attendance records | A/B, C/D, and Tier 2/PREIS | Not reported to Congress |
| Fidelity: 5 measures | Fidelity monitoring logs (2 measures); observation forms (2 measures); fidelity process form (1 measure) | A/B, C/D, and Tier 2/PREIS | A/B, C/D, and Tier 2/PREIS (the 2 observation-form measures) |
* The three types of grantees represent different funding levels, resources and grant requirements. Therefore, as the table demonstrates, data will be reported by grantee type.
** Rigorous outcome data are only available for a subset of A/B grantees through the Federal TPP Replication Evaluation, which begins this fall. These data will be incorporated into our reporting beginning in 2015.
A.2. Information Users
The proposed data collection activities will provide OAH and ACYF leadership and program officers with information that will help them to more effectively manage the TPP and PREIS programs, respectively. The data will also be made available to members of Congress, the Office of Management and Budget, and the public at large to assess program performance.
A.3. Use of Information Technology and Burden Reduction
Grantees will enter performance measure data into a multi-use, Web-based reporting system. The Web-based system reduces burden for respondents by programming in skip patterns, so that grantees see only the questions that are relevant to them. The system will automatically perform necessary calculations for respondents and will validate responses. A branching mode of presentation will allow respondents to go directly to the sections they need, without having to work through the system in a linear progression. The system will also automatically produce a clean data set, which will save time on preparation of the data for analysis.
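To illustrate the kind of burden-reducing logic described above, the sketch below shows how a reporting form might apply a skip pattern and a simple validation rule. This is a hypothetical example only; the section names, field names, and rules are illustrative assumptions, not the specification of the actual OAH/ACYF reporting system.

```python
# Hypothetical illustration of skip-pattern and validation logic in a web-based
# reporting form. Section and field names are examples only, not the actual
# OAH/ACYF system specification.

def visible_sections(grantee_type: str) -> list[str]:
    """Return only the form sections relevant to a grantee type (a skip pattern)."""
    sections = ["reach_and_retention", "dissemination", "dosage", "fidelity", "perceived_impact"]
    if grantee_type in ("Tier 1 C/D", "Tier 2/PREIS"):
        # Only grantees with rigorous evaluations report behavior and intention measures.
        sections.append("behaviors_and_intentions")
    return sections

def validate_reach(entered_total: int, counts_by_group: dict[str, int]) -> list[str]:
    """Flag a response when subgroup counts do not sum to the reported total."""
    errors = []
    if sum(counts_by_group.values()) != entered_total:
        errors.append("Subgroup counts do not sum to the reported total number of participants.")
    return errors

print(visible_sections("Tier 1 A/B"))                    # no behaviors_and_intentions section
print(validate_reach(120, {"female": 70, "male": 45}))   # flags a mismatch (70 + 45 = 115)
```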
A.4. Efforts to Identify Duplication and Use of Similar Information
The OAH/ACYF performance measures data collection is the only data collection that will provide information on the performance of the TPP and PREIS programs. The data collection will make use of existing data to the extent possible. For example, Tier 1 C/D, Tier 2, and PREIS grantees will already be conducting rigorous evaluations of their programs. These grantees will use findings from their evaluations to report on behavioral participant-level measures. Most of the additional measures will already be collected by grantees as part of their routine administrative records (e.g., numbers of publications, numbers of participants).
The perceived impact measures will need to be collected specifically for purposes of performance measurement. Grantees that do not have a rigorous evaluation cannot directly assess the impact of the program on key program outcomes such as sexual activity and condom or contraceptive use. These measures of perceived program impact are, therefore, critical, as they are the only measure of the program’s possible influence on these key program outcomes. Many grantees (including Tier 1 A/B grantees) will already be collecting data from program participants at program end; these grantees can integrate the questions related to perceived impact into their existing questionnaires. Grantees that were not planning to collect data from program participants at program end will need to add a data collection to collect this information.
All of the demographic, perceived impact and behavior and intention measures in the proposed collection have been previously approved by OMB through collection OS 0990-0382, “Evaluation of Pregnancy Prevention Approaches”.
A.5. Impact on Small Businesses
No small businesses will be involved in the collection of data in this study.
A.6. Consequences of Not Collecting the Information/Collecting Less Frequently
GPRA requires that government agencies report on their performance measures annually. Therefore, it is essential that grantees report on these performance measures annually to OAH and ACYF. In addition, collection and reporting of performance measure data is a requirement of all TPP and PREIS grantees as stated in the Funding Opportunity Announcement.
A.7. Special Circumstances
There are no special circumstances associated with this information collection.
A.8. Federal Register Comments and Persons Consulted Outside the Office of Adolescent Health
A 60-day notice was published in the Federal Register on June 9, 2011, in Volume 76, Number 111, page 33760, and provided a 60-day period for public comments (Appendix A). No public comments were received.
OAH and ACYF consulted with staff of RTI International, the contractor responsible for assisting OAH and ACYF in developing the performance measures and performance measure reporting system, and a panel consisting of experts in the fields of performance measurement, teen pregnancy prevention, and evidence-based practice. In addition, OAH presented information on the performance measures to TPP and PREIS grantees and their evaluators at two conferences, and solicited their input. OAH also consulted and received feedback from other Federal staff working in the area of teen pregnancy prevention from ASPE, ACF, and CDC.
A list of the individuals on the expert panel who provided input regarding the performance measures is found in Exhibit 3.
Exhibit 3. Persons Consulted Outside the Agency

Expert Work Group
A.9. Payments to Respondents
There will be no payments to staff of grantee organizations completing the performance measure reporting form. For data collected from participants, many grantees will roll the questions into questionnaires they are already using for evaluation purposes. The agency agrees to provide a description of the form, type, and amount of incentives offered to participants by grantees. This description will be added to the public record in relation to this information collection request.
A.10. Assurance of Confidentiality
Respondents are told that we will keep their data private to the extent allowable by law. They are not being guaranteed confidentiality.
The Web-based reporting system will be designed to ensure the security of the data obtained. Electronic data are stored in a location within the RTI network that provides the appropriate level of security based on the sensitivity or identifiability of the data. No personal identifiers will be used in the reporting of any data.
Individual users designated by the grantees will be assigned user names and passwords that will grant them access to Hatteras, a web-based data collection system. Hatteras will guide the user through a series of questions to collect information that will be stored in a secure Microsoft SQL Server database using a relational table structure, facilitating efficient data retrieval and analysis. The database server, located at RTI, will be accessible only to the statisticians and analysts assigned to this project. Electronic communications will occur via a secure Internet connection. All transmissions will be encrypted with 128-bit encryption through secure socket layers (SSL) and verified by VeriSign®, the leading SSL certificate authority.
To ensure data security, all RTI project staff are required to adhere to strict standards and to sign agreements as a condition of employment on the process evaluation. Survey responses will be stored on a secure, password-protected computer shared drive. All data files on multi-user systems will be under the control of a database manager, with access limited to project staff on a “need-to-know” basis only. No respondent identifiers will be contained in reports generated by RTI, and results will only present data in aggregate form.
A.11. Sensitive Questions
The primary objective of the TPP and PREIS programs is to prevent teen pregnancy. The programs do this by promoting a decrease in sexual activity and/or an increase in contraceptive use. Because this is the primary focus of the programs, questions for the programs’ performance measures are necessarily related to these sensitive issues. Grantees with a rigorous evaluation would already be asking program participants (and adolescents in a comparison group) about sexual activity and contraceptive use as part of their evaluations. The only sensitive questions that adolescents will be asked specifically for this data collection are four questions about their perception of the program’s impact on their behaviors. These questions are:
1. Would you say that being in [NAME OF PROGRAM] has made you more or less likely to have sexual intercourse in the next year?
2. Would you say that being in [NAME OF PROGRAM] has made you more or less likely to abstain from sexual intercourse in the next year?
3. If you were to have sexual intercourse in the next year, would you say that being in [NAME OF PROGRAM] has made you more or less likely to use any of these methods of birth control?
Condoms
Birth control pills
The shot (Depo Provera)
The patch
The ring (NuvaRing)
IUD (Mirena or Paragard)
Implant (Implanon)
4. If you were to have sexual intercourse in the next year, would you say that being in [NAME OF PROGRAM] has made you more or less likely to use a condom?
OAH will consider, on a case-by-case basis, waiving the requirement to collect these data if the grantee can provide adequate justification, such as a very young client population (sixth grade or under) or, in the case of a school-based project, opposition from a school board or district.
In addition, grantees will inform their individual respondents that their participation is voluntary and that they may refuse to answer any or all of the questions in the instrument. Participants will also be informed of the measures to protect the privacy of their answers.
Grantees will also be reporting data on sensitive issues to OAH. For grantees that have a rigorous evaluation, the sensitive topics they will report on include the proportion of youth who are sexually active, using contraception, and using condoms, and the proportions who intend to have sex, use contraception, and use condoms. In addition, all grantees (including those that do not have rigorous evaluations) will report on the four questions above about participants’ perception of the program’s impact. All of these data will be reported in aggregate, however, and there will be no means to identify responses by individuals.
A.12 Burden Estimate (Total Hours & Wages)
A.12A Estimated Annualized Burden Hours
The total annual burden is estimated to be 2,942 hours for grantees (to collect, summarize, and report the data for the performance measures), and 4,212 hours [1] for program participants (to respond to the survey questions about perceived impact of the program). The burden for the perceived impact questions includes youth in 7th grade or higher, all remaining youth in the A/B grantee sample, and remaining treatment youth in the rigorous evaluation grantee samples [2]. The measures of soundness of evaluation plans and soundness of evaluations will be reported by an OAH contractor providing evaluation technical assistance to grantees with rigorous evaluations (Tier 1 C/D and Tier 2/PREIS). The contractor will be assessing the grantees’ evaluation plans (in year 1) and evaluations (in years 2-4) and rating them as either adequate or inadequate as part of its regular work. As a result, there is no additional burden for these ratings to be used as performance measures.
Of the 107 grantees, 6 had estimated extremely large service populations. One has since revised its service plan and will not be serving 6,000 or more youth, leaving 5 grantees with very large service populations. All five grantees will use a random sampling procedure to reduce overall burden while retaining statistical rigor. See Table A.12.1 (below).
Table A.12.1 Estimated Sample Respondents Among Largest Service Group Grantees

| Grantee | Estimated Service Group | Sample Respondents | Of Sample Respondents, Those Receiving Treatment Condition |
|---|---|---|---|
| 1 | 20,000 | 2,000 | 1,000 |
| 2 | 15,000 | 6,000 | 3,000 |
| 3 | 6,000 | 6,000 | 3,000 |
| 4 | 9,000 | 9,000 | 4,500 |
| 5 | 7,000 | 7,000 (no evaluation) | 3,500 (subsample to report on) |
Average burden hours for grantees
All of the data except the soundness of evaluation plans and soundness of evaluations will be reported by the grantees. Because reporting requirements are slightly different for Tier 1 A/B grantees as compared to Tier 1 C/D and Tier 2/PREIS grantees, we calculate the average burden to each separately. With the exception of participant-level measures, grantees will be collecting all of the data required for the performance measures as part of their administrative record-keeping, so the only additional burden to grantees for reporting the performance measures is the time it takes them to assemble the necessary data and enter it into the reporting forms.
Reach and retention. Grantees will report semi-annually on measures of reach and retention. The reach data indicate how many participants the program is reaching, broken down by background characteristics. These data will be based on basic demographic information that grantees collect on program participants when they enroll in the program. Grantees will also report semi-annually on the number of partners they are working with, the number of partners retained, and the number of facilitators trained. Grantees will be collecting these data for their own administrative purposes, and many will have their own systems in place to track the data. For grantees that have their own system, the performance measure reporting system will provide a mechanism to import the data directly from the grantees’ systems. Grantees that do not have their own system will be able to enter the data directly into the reporting system. We estimate that it will take each grantee approximately 4 hours to summarize and report these data each time, for a total of 8 hours per year.
Dissemination and dosage. Grantees will report annually on measures of dissemination and dosage. We estimate that this will take each grantee approximately 1 hour to summarize and report these data each year.
Fidelity. Grantees will be collecting several types of data related to fidelity as part of their ongoing administration of their programs. These include measures of adherence and quality, based on observations of a sample of sessions; a measure of adherence based on self-assessment forms completed by session facilitators; and a process measure of fidelity assessing the extent to which grantees have the necessary processes in place to ensure fidelity, to be completed by the grantee staff. The collection of these fidelity data was a requirement stated to grantees in the funding opportunity announcement to which they responded, so that is not a burden imposed by the performance measures data collection—only the actual reporting of these data. We estimate that it will take grantees approximately 2 hours to summarize and report these data each year.
Perceived impact. For the perceived impact questions, grantees will need to administer the questionnaires to program participants and enter the data into a database, which they will then upload into the web portal. Questionnaires will have a total of 10 questions—4 related to perceptions of impact, and 6 related to demographic characteristics of the respondents (age, grade, sex, ethnicity, race, and language spoken at home).
Administration of questionnaires. Many grantees will be administering post-test questionnaires for their own evaluation purposes, and the perceived impact questions could easily be integrated into these questionnaires, with little to no additional burden for administration. However, because some grantees may not be planning to administer post-tests, we conservatively estimate that all grantees will have to administer the perceived impact questions independently of any other data collection. The number of participants each grantee plans to serve in a year varies widely, from approximately 60 to approximately 20,000 [3]. The average is close to 1,000 per year. The way in which grantees deliver their programs also varies widely, but many are classroom based. We estimate that, if there are 25 participants per class, grantees would be working with an average of approximately 40 classes a year (1,000 participants / 25 per class = 40 classes). Each grantee would therefore be administering the questionnaire an average of 40 times per year. We estimate that it would take the class facilitator approximately 10 minutes per class to administer the questionnaire, for a total of 6 hours and 40 minutes per grantee per year.
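The administration estimate above reduces to simple arithmetic; the short sketch below reproduces it using the stated assumptions (approximately 1,000 participants per grantee per year, 25 participants per class, and 10 minutes of facilitator time per administration). It is illustrative only.

```python
# Administration burden per grantee, using the assumptions stated in the paragraph above.
participants_per_year = 1_000
class_size = 25
minutes_per_administration = 10

classes_per_year = participants_per_year / class_size          # 40 classes
total_minutes = classes_per_year * minutes_per_administration  # 400 minutes
print(f"{classes_per_year:.0f} classes, {total_minutes / 60:.2f} hours per grantee per year")
# -> 40 classes, 6.67 hours (about 6 hours and 40 minutes)
```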
Data entry. Assuming an average of 1,000 participants per year, we estimate that it would take grantees approximately 8 hours to enter the data from 1,000 questionnaires containing 10 questions each.
Uploading the data. We estimate that it would take each grantee 1 hour to upload the data.
The burden to Tier 1 C/D and Tier 2/PREIS grantees is the same as for the Tier 1 A/B grantees, except that the Tier 1 C/D and Tier 2/PREIS grantees will be reporting on seven additional participant-level measures, in addition to the three related to perceived impact.
Other participant-level measures. Collection of the data and data entry are not an additional burden, because the grantees are collecting and entering these data as part of their evaluations. However, the time to create a dataset that contains only the variables needed for the performance measures in the required format is an additional burden. Grantees (or their evaluators) will be able to produce the required information with simple programming statements, so we estimate that it will take no more than 1 hour for each grantee to upload these data.
Average burden hours to program participants
The estimate of burden to participants is based on the number of participants that grantees expect to serve over the course of 4 years of program implementation and the estimated amount of time it will take participants to respond to the 10 questions that will be asked of all program participants (4 about the perceived impact of the program and 6 about demographic characteristics). The 107 grantees project that they will reach approximately 204,000 participants over the course of 4 years, for an average of 51,000 per year. We expect that it will take each participant approximately 5 minutes (0.08 hours) to respond to the 10 questions.
Estimated annualized burden hours
Calculation of the total estimated annualized burden hours is shown in Exhibit 4. Burden to participants is calculated as described above (50,547 program participants per year, 1 response per year, and 5 minutes per response). The total burden to participants is 4,212 hours.
For grantees, as calculated above, we estimate that it will take each of the 107 grantees 4 hours to report their data related to reach (Appendix D), and they will report these data twice a year. The total burden for reporting these data is thus 856 hours. The rest of the data will be collected from grantees only once a year. As calculated above, we estimate that it will take the 59 Tier 1 A/B grantees 18.67 hours each (rounded to 19 hours in Exhibit 4) to complete this form, for a total burden of 1,121 hours (Appendix E), and that it will take the 48 Tier 1 C/D and Tier 2/PREIS grantees 20.67 hours each (rounded to 21 hours in Exhibit 4) to complete this form, for a total burden of 1,008 hours (Appendix F). The total burden to participants and grantees is 7,197 hours.
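The totals in Exhibit 4 follow directly from multiplying the number of respondents, responses per respondent, and hours per response for each form. The sketch below reproduces that arithmetic as a check; it is illustrative only, and the variable names are ours rather than part of the reporting system.

```python
# Illustrative check of the Exhibit 4 burden arithmetic (not part of the reporting system).
rows = [
    # (form, respondents, responses per respondent, hours per response)
    ("Perceived impact questions", 50_547, 1, 5 / 60),
    ("Reporting form for reach", 107, 2, 4),
    ("Tier 1 A/B reporting form", 59, 1, 19),                    # 18.67 hours rounded to 19
    ("Tier 1 C/D and Tier 2/PREIS reporting form", 48, 1, 21),   # 20.67 hours rounded to 21
]

total = 0
for form, respondents, responses, hours in rows:
    burden = round(respondents * responses * hours)
    total += burden
    print(f"{form}: {burden:,} hours")

print(f"Total annual burden: {total:,} hours")  # approximately 7,197 hours
```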
Exhibit 4. Estimated Annualized Burden Hours

| Forms | Type of Respondent | Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Total Burden Hours |
|---|---|---|---|---|---|
| Perceived impact questions | Youth participating in programs | 50,547 | 1 | 5/60 | 4,212 |
| Reporting form for reach | Grantee program staff | 107 | 2 | 4 | 856 |
| Tier 1 A/B performance measure reporting form | Grantee program staff—Tier 1 A/B | 59 [4] | 1 | 19 | 1,121 |
| Tier 1 C/D and Tier 2/PREIS performance measure reporting form | Grantee program staff—Tier 1 C/D and Tier 2/PREIS | 48 [5] | 1 | 21 | 1,008 |
| Total | | 50,654 | | | 7,197 |
Exhibit 5. Estimated 1-Year Annualized Cost to Respondents

| Forms | Type of Respondent | Number of Respondents | Total Burden Hours | Hourly Wage Rate | Total Respondent Costs |
|---|---|---|---|---|---|
| Perceived impact questions | Youth 18 or older | 2,800 | 233 | $7.25 | $1,689.25 |
| Reporting form for reach | Grantee program staff | 107 | 856 | $30.00 | $25,680 |
| Performance measure reporting form | Grantee program staff—Tier 1 A/B | 59 | 1,573 | $30.00 | $47,190 |
| Performance measure reporting form | Grantee program staff—Tier 1 C/D and Tier 2/PREIS | 48 | 1,376 | $30.00 | $41,280 |
| Total | | | | | $115,839.25 |
A.13 Capital Costs (Maintenance of Capital Costs)
There are no capital costs associated with this study.
A.14 Cost to Federal Government
With the expected extended period of performance, the cost estimate for the completion of this contract will be $979,000 over 3 years. This total cost covers all activities related to the development of the performance measures, including activities not included in this OMB application. This is the cost estimated by the contractor, RTI International, and includes the estimated cost of coordination with OAH, RTI IRB and OMB applications, development of performance measures, development of the data reporting system, training and technical assistance to the grantees and OAH/ACYF staff in the use of the data reporting system, and data analysis and reporting. Annual cost to the federal government is estimated to be $326,333 ($979,000 /3).
A.15 Program or Burden Changes
There is no change in burden requested, as this is a new information collection.
A.16. Tabulation of Data and Schedule
RTI will have completed all data collection instruments and the reporting system, and will have obtained OMB approval, by May 2012. Grantees will collect data throughout the first year of program implementation and report the data at the program’s required reporting periods (May 31 and November 30). RTI will then analyze the data and prepare a written report summarizing findings. Data will be broken down by type of grantee (i.e., Tier 1, Tier 2, or PREIS). Participant-level data will also be analyzed according to key characteristics (e.g., gender, race/ethnicity, and age).
The key events and reports to be prepared are listed in Exhibit 7.
Exhibit 7. Time Schedule for the Entire Project

| Task/Activity | Date* |
|---|---|
| Develop performance measures | September 2010-May 2011 |
| Develop data collection instruments and reporting system | May-August 2011 |
| Receive OMB approval | May 2012 |
| Train grantees in use of data collection instruments and reporting system | September 2011 |
| Grantees collect data Year 2 | May 2012-September 2012 |
| Grantees enter data in reporting system Year 2 | September 2012 |
| Analyze data and prepare report Year 2 | September 2012-March 2013 |
| Grantees collect data Year 3 | September 2012-September 2013 |
| Grantees enter data in reporting system Year 3 | September 2013 |
| Analyze data and prepare report Year 3 | September 2013-March 2014 |
*Dates are based on the expected 3-year period of performance
A.17. Display of Expiration Date for OMB Approval
The expiration date for OMB will be displayed on all data collection instruments.
A.18. Exceptions to Certification Statement.
There are no exceptions to the certification statement.
B.1. Respondent Universe and Sampling Methods
The respondent universe will include all 107 grantees who received funding in September 2011. There will be no sampling of grantees.
The universe of potential participants will vary across grantees, depending on their funding and the type of program they are implementing. Tier 1 A/B grantees recruit as many eligible participants as they can afford to serve. Tier 1 C/D, Tier 2, and PREIS grantees have evaluation components and therefore recruit enough eligible participants to reach at least an 80% power level for their random assignment evaluations. Grantees recruit schools by first obtaining district-level support and then agreement from school principals. This often involves presentations to school boards, presentations to parent groups, and meetings with school staff. In community-based settings, the community-based organization is usually the grant recipient itself, so site recruitment is not necessary. Eligible youth, as they present themselves at the clinic, church, or other site, are asked whether they would like to receive the program. If they agree, the consent process is started.
Some evaluation grantees will have random assignment at the cluster level (schools or other groupings) and others will involve random assignment at the individual level. Random assignment will occur at the time of sample enrollment (for most grantees, after baseline surveys are administered). Follow-up data collections will target all youth who were randomly assigned at baseline to the program or control group.
The grantees expect to achieve a response rate of 90% or more for demographics and perceived impact measures, as these measures are collected at enrollment and on the last day the program is delivered. Grantees with short-term follow-up surveys (3 to 12 months post program delivery) expect to achieve a response rate of 85% or more on the behavior and intention measures. Those with longer-term follow-up surveys (18 to 24 months) expect to achieve a response rate of 80% or more on these measures. Reasons for projecting these response rates are explained in Section B.3.
B.2. Procedures for the Collection of Information
Individual-level data will be collected from all consenting youth participating in the program for 102 grantees and from a sample of consenting youth for the remaining 5 grantees (these five grantees are serving 6,000 or more youth). Grantees will obtain parental consent and youth assent in accordance with IRB approval for each participant prior to data collection. Whether parental consent is active or passive is dictated by the grantees’ IRBs. For some grantees, parental consent is not required for all participants. Federal regulations permit the IRB to approve research without parent permission “if the IRB determines that a research protocol is designed for conditions or for a subject population for which permission is not a reasonable requirement to protect the subjects, provided an appropriate mechanism for protecting the children who will participate as subjects in the research is substituted and provided further that the waiver is not inconsistent with federal, state or local law”. For example, youth in one grantee site are under the legal guardianship of the state foster care system, so a caseworker, lawyer, or other identified legal representative will provide consent for those youth to participate. At other grantee sites, some youth are 18 or older; parental consent is not required for these participants, so active consent will be obtained directly from those youth. Staff at the sites in which grantees are operating, such as school staff in school settings and health educators and clinic staff in clinic settings, will assist the grantees in obtaining consent/assent. Because many grantees have just begun enrolling youth, we are unable to determine consent rates for most grantees at this time.
For the grantees with evaluations, locating some sample members for follow-up will be required. Youth in school-based sites may have changed classrooms or schools over the life of the evaluation. Youth in other grantee sites may have moved. Prior to collecting behavior and intention data, grantees will work to locate sample members in their new classrooms or schools, or obtain any available updates to contact information in accordance with their design plans and IRB approval. Many of the grantees with evaluations will be collecting additional contact data at various points throughout the study through emails, phone calls, and postcards asking youth for this information.
Examples of parental consent forms and youth assent forms are provided in Appendices C and D.
For Tier 1 A/B grantees, individual-level data will pertain only to demographic characteristics and the perceived impact of the program until the Federal evaluation data on the behavior and intention measures for a subset of the A/B grantees are available; for Tier 1 C/D and Tier 2/PREIS programs, these data will also include data on the behaviors and intentions measures.
Data collection procedures for program participants will vary depending on the methods employed by the grantee and its IRB approval, and will likely include paper-and-pencil, optical scan, and web-based methods. For Tier 1 C/D grantees and Tier 2/PREIS grantees, the methods used for the collection of individual-level data may be dictated by the procedures used for evaluation purposes. Grantees will prepare a dataset that contains only the variables needed for the performance measures (see Appendix F, Tier 1 C/D and Tier 2/PREIS performance measure reporting form), which they will upload into a web portal. RTI will merge the datasets from all of the grantees. For the perceived impact measures, RTI will create a table for each measure, presenting the proportion of youth who report that they are either less likely or much less likely to have sex as a result of the program, and either more likely or much more likely to use condoms or contraception, stratified by key demographic variables. For the behavior and intention variables, RTI will calculate the difference of differences between the intervention and comparison groups across all grantees and create tables summarizing these results, stratified by key demographic variables.
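For context, the “difference of differences” contrast described above compares the change in an outcome for the intervention group with the corresponding change for the comparison group. The sketch below illustrates the calculation on made-up values; it is not RTI’s analysis code, and the actual estimator (including any stratification or weighting across grantees) would follow each grantee’s evaluation design.

```python
# Illustrative difference-of-differences calculation on hypothetical proportions
# (not RTI's analysis code; the values are made up for the example).

def difference_of_differences(treat_pre, treat_post, comp_pre, comp_post):
    """Change in the intervention group minus change in the comparison group."""
    return (treat_post - treat_pre) - (comp_post - comp_pre)

# Example: proportion of youth reporting condom use at last intercourse.
effect = difference_of_differences(treat_pre=0.40, treat_post=0.55,
                                   comp_pre=0.41, comp_post=0.46)
print(f"Estimated difference of differences: {effect:+.2f}")  # -> +0.10
```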
Some of the grantee/intervention-level measures will be collected by aggregating data obtained from the staff implementing the program. These data include those pertaining to reach and retention, dosage, and fidelity (see Exhibit 2). The dissemination measures will be tracked and summarized from the grantees’ records. As with the individual measures, the grantee/intervention-level measures will be entered into the web portal, in some cases aggregated by demographic characteristics.
Data on the soundness of evaluation plans and evaluation implementation will be provided by another OAH contractor, which is providing technical assistance to the grantees’ evaluators.
Once RTI receives the data, RTI will further analyze them by type of grantee. RTI will provide a summary of each measure in an annual report.
All grantee materials that participants will see, including consent/assent forms, data collection instruments, and recruiting materials, will include all required PRA information: the OMB number and expiration date, the PRA burden statement, the purpose of the collection and use of the data, the estimated time to respond, and a Privacy Act statement. Our web-based reporting system will include all of the required PRA information as well. Finally, all consents, instruments, and other materials used in our Federal evaluations that include these grantees will display both OMB control numbers and both expiration dates (one for the OMB-cleared evaluation study and one for the collection of performance measures once the request is cleared by OMB).
B.3. Methods to Maximize Response Rates and Deal with Non-response
Because completion of the performance measures is a funding requirement for these cooperative agreements, we expect a 100% grantee response rate. This requirement was stipulated in the Funding Opportunity Announcement and was explained during the first annual conference, which all grantees attended. All grantees will be trained in the web-based data collection procedures to facilitate their responses.
We expect the grantees to achieve youth response rates of 90% or better on demographic and perceived impact measures, 85% or better on short-term follow-up surveys, and 80% or better on long-term follow-up surveys. The grantees can expect such response rates because demographic data are collected at enrollment and the perception questions are administered on the last day of the program. The short-term surveys are collected 6 to 12 months after program delivery, so contact data are quite current. Many grantees will administer the surveys in the same location where the program took place (for example, the school). The grantees have invested significant effort in gaining the cooperation of their program sites from the beginning of implementation, minimizing burden and assuring privacy for their youth. Grantees will give their sites detailed information about the data collection, how it will be administered and on what schedule, what involvement and time will be required of site staff, and how data will be used and protected.
Grantees will work with their sites to maximize attendance on days when data are being collected and to locate youth who have moved. Some grantees are using incentives to encourage participation in the program and in data collection efforts. Many will hold make-up sessions to capture any initial non-respondents. Finally, grantees with evaluations will take steps to understand the nature of any non-response.
B.4. Tests of Procedures or Methods to be Undertaken
There is not enough time before grantees start collecting the performance measure data to perform a formal pilot of the measures. Some of the researchers on this contract’s expert panel have used the measures in the past with various age groups in various settings and have provided us with feedback. Additionally, some of the grantees have begun pilot testing their own instruments, including some of these questions, and have also provided feedback that was used to develop this final set of proposed measures. RTI will perform cognitive testing with 9 or fewer youth ages 10-19, and we will solicit feedback on the measures from the grantees after they collect the first year of data. We will consider necessary revisions of the measures at that time.
B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
The agency official responsible for receiving and approving contract deliverables is:
Amy Farb
240-453-2836
[email protected]
Office of Adolescent Health/DHHS
1101 Wootton Parkway, Suite 700
Rockville, MD 20852
The persons who designed the data collection are:
Barri Burrus
919-597-5109
RTI International
3040 E. Cornwallis
Research Triangle Park, NC 27709
Ellen Wilson
919-316-3337
RTI International
3040 E. Cornwallis
Research Triangle Park, NC 27709
Ina Wallace
919-541-6967
RTI International
3040 E. Cornwallis
Research Triangle Park, NC 27709
The persons who will collect the data are:
The 107 OAH/ACYF grantees
Appendix A
Federal Register Notice to the Public
Appendix B
RTI Institutional Review Board Notice
Activity does not require IRB approval
Appendix C
Example of A/B Grantee Consent
Appendix D
Example of Evaluation Grantee Consent
[1] This number reflects a change from the Federal Register Notice (from 8,333 to 4,212) based on updated information from the grantees.
[2] Excluded from the perceived impact burden are youth in 6th grade or below and control youth, as they will not be administered these questions. Also excluded from this burden are the 16 grantees included in the two Federal evaluations of the TPP program, as the burden to collect these data is included in those OMB submissions (OS 0990-0382 and OS 0990-NEW). Finally, the five grantees working with large numbers of youth will report on a subsample. Four of these five grantees are conducting rigorous evaluations and will report on those evaluation samples. The remaining grantee was asked to subsample at 50%, given its estimated service population size.
[3] The majority of grantees are serving fewer than 3,000 youth total. Five grantees are working with larger numbers of youth (estimated at 6,000, 7,000, 9,000, 15,000, and 20,000). As described above, these grantees will subsample their data collection efforts to minimize burden while maintaining statistical rigor.
[4] These respondents are already represented in the 107 respondents in the line above.
[5] These respondents are already represented in the 107 respondents two lines above.