

Part A: Justification for the Collection of 12-Month Follow-Up Survey Data - Pregnancy Assistance Fund Study

OMB Control Number 0990-0424

April 2015 (revised July 2015, second revision December 2015)


Submitted to:

Office of Adolescent Health
Office of the Director
U.S. Department of Health and Human Services

1101 Wootton Parkway, Suite 700

Rockville, MD 20852

Project Officer:

Submitted by:

Mathematica Policy Research

P.O. Box 2393

Princeton, NJ 08543-2393

Telephone: (609) 799-3535

Facsimile: (609) 799-0005

Project Director: Amy Farb

Part A: Justification for the Collection of 12-Month Follow-Up Survey Data - Pregnancy Assistance Fund Study

OMB Control Number 0990-0424

April 2015 (revised July 2015, second revision December 2015)






CONTENTS

Part A: Introduction

A.1. Circumstances Making the Collection of Information Necessary

1. Legal or Administrative Requirements that Necessitate the Collection

2. Study Objectives

A.2. Purpose and Use of the Information Collection

A.3. Use of Information Technology to Reduce Burden

A.4. Efforts to Identify Duplication and Use of Similar Information

A.5. Impact on Small Businesses

A.6. Consequences of Not Collecting the Information/Collecting Less Frequently

A.7. Special Circumstances

A.8. Federal Register Notice and Consultation Outside the Agency

A.9. Payments to Respondents

A.10. Assurance of Confidentiality

A.11. Justification for Sensitive Questions

A.12. Estimates of the Burden of Data Collection

A.13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

A.14. Annualized Cost to Federal Government

A.15. Explanation for Program Changes or Adjustments

A.16. Plans for Tabulation and Publication and Project Time Schedule

1. Analysis Plan

2. Time Schedule and Publications

A.17. Reason(s) Display of OMB Expiration Date is Inappropriate

A.18. Exceptions to Certification for Paperwork Reduction Act Submissions

TABLES

A.11.1. Summary of Sensitive Questions

A.12.1. Calculations of Burden Hours and Cost for Youth

A.12.2. Calculations of Annual Burden Hours and Costs to Date

A.16.1. Timeline for Use of 12-Month Follow-Up Survey

ATTACHMENTS

ATTACHMENT A: OVERVIEW OF THE PAF EVALUATION

ATTACHMENT B: QUESTION BY QUESTION SOURCE LIST FOR THE 12-MONTH FOLLOW-UP SURVEY

ATTACHMENT C: SOURCES REFERENCED FOR THE 12-MONTH FOLLOW-UP SURVEY

ATTACHMENT D: PERSONS CONSULTED ON INSTRUMENT DEVELOPMENT AND/OR ANALYSIS OF THE PAF 12-MONTH FOLLOW-UP SURVEY

ATTACHMENT E: CONFIDENTIALITY PLEDGE

ATTACHMENT F: ANALYSIS PLAN

ATTACHMENT G: PRETEST MEMO

ATTACHMENT H: PAF 12-MONTH FOLLOW-UP 60-DAY FRN




INSTRUMENTS



Instrument 1: PAF 12-MONTH FOLLOW-UP SURVEY: CALIFORNIA

Instrument 2: PAF 12-MONTH FOLLOW-UP SURVEY: TEXAS


Part A: Introduction

In March 2010, Congress authorized the Pregnancy Assistance Fund Competitive Grants Program as part of the Patient Protection and Affordable Care Act (ACA). The grants program is a key element of the federal strategy to support youth and young adults who are having or raising a child. Administered by the Office of Adolescent Health (OAH), the grants program funded a second cohort of 17 grantees—states, tribes, and tribal entities—in summer 2013 to develop and implement programs focused on an array of outcomes, including increasing access to and completion of secondary and postsecondary education, improving child and maternal health, reducing the likelihood of repeat teen pregnancies, increasing parenting and co-parenting skills, decreasing intimate partner violence, and raising awareness of available resources. To promote positive outcomes, grantees may implement a wide variety of services for expectant and parenting youth, women, fathers, and their families. OAH’s continued investment in programs for expectant and parenting youth has led to its request for a rigorous impact and implementation study of such programs, and it has contracted with Mathematica Policy Research to conduct the Pregnancy Assistance Fund (PAF) Study.

Preliminary PAF Study efforts, including study design and instrument development, were conducted through a Feasibility and Design Study (FADS). The purpose of the FADS was to design rigorous impact evaluations in three sites that serve pregnant and parenting youth (including Pregnancy Assistance Fund grantees), develop data collection materials for all aspects of an evaluation, and conduct telephone interviews with grantees about their program design decisions and early implementation experiences. Information collected through the FADS was also used to inform funding agencies about the structure and components of programs for expectant and parenting youth and their families, and to lay the groundwork for the five-year PAF Study.

The objective of the FADS was to establish a foundation for the PAF Study’s rigorous impact and implementation evaluation. Specifically, the FADS: (1) assessed design options for implementation and impact evaluation, (2) documented how programs are operationalized in the field, (3) identified and entered into agreements with three sites for the evaluation, (4) provided assistance to sites to support a rigorous evaluation framework, (5) developed all evaluation instruments and obtained clearance for sample enrollment and baseline data collection, and (6) piloted baseline data collection. Attachment A provides an overview of the components of the PAF Study, which the FADS work supported, and describes the three sites: experimental design studies in California and Texas and a quasi-experimental study relying on extant administrative records in Washington, DC.

Previous Information Clearance Requests Approved by OMB.

  • August 30, 2014 – OMB approved the instruments associated with two data collection efforts: (1) telephone interviews with the 17 Pregnancy Assistance Fund grantees funded in 2013; and (2) collection of baseline data for the impact study in two sites through a baseline survey (OMB Control # 0990-0424).

As of the end of November 2015, all 17 Pregnancy Assistance Fund grantees had completed their interviews, 639 youth had completed the baseline survey in California, and 130 youth had completed the baseline survey in Texas. At that point, the baseline survey response rate was 99 percent of all youth who had provided study consent in California and 100 percent of all youth who had provided study consent in Texas. In California, all study enrollment and baseline surveys should be completed by the end of 2016. In Texas, all study enrollment and baseline surveys should be completed by summer 2017.

Current Information Clearance Request. In this submission, OAH is requesting a revision to the existing approval to add the 12-month follow-up survey instruments to be used in the two impact sites: (1) Pregnancy Assistance Fund 12-Month Follow-Up Survey – California (Instrument 1), and (2) Pregnancy Assistance Fund 12-Month Follow-Up Survey – Texas (Instrument 2). These surveys are very similar to the baseline survey approved for this evaluation, and the two follow-up instruments are nearly identical to each other, differing only in minor ways that reflect differences between the interventions. The California survey contains additional items to measure changes in youth resiliency, a primary focus of the program in California. The Texas survey does not contain such resiliency items, but does contain items measuring parenting and relationship skills, a focus of the program in Texas.

A.1. Circumstances Making the Collection of Information Necessary

1. Legal or Administrative Requirements that Necessitate the Collection

On March 23, 2010, the President signed into law the Patient Protection and Affordable Care Act (ACA), H.R. 3590 (Public Law 111-148, Sections 10211-10214). In addition to its other requirements, the act authorizes $25 million for each of fiscal years 2010 through 2019 and authorizes the Secretary of HHS, in collaboration and coordination with the Secretary of Education, to “establish a Pregnancy Assistance Fund to be administered by the Secretary, for the purpose of awarding competitive grants to States to assist expectant and parenting youth and women.”1

The Office of Management and Budget has requested an evaluation of programs for expectant and parenting youth, including Pregnancy Assistance Fund grantees (per conversations with OAH Director, Evelyn Kappeler), recognizing that there is a unique opportunity to contribute to the field by designing a rigorous evaluation of such programs that can overcome previous challenges.

2. Study Objectives

Using experimental and quasi-experimental designs, the PAF Study will test the effectiveness of services in improving outcomes related to subsequent pregnancy, education, health, sexual behavior, and parenting. During the FADS, the study team identified and worked with three programs to decide which service components will be evaluated, which participants will be included, and which outcomes will be measured. In addition, the FADS team worked with two of the three program sites to develop a plan for random assignment at either the individual or group (cluster) level. Finally, the FADS team worked with the selected sites to design a process for collecting study data, including evaluation consent, a baseline survey, and two follow-up surveys in two of the three sites.

The three programs selected for the impact evaluation will also participate in a more in-depth implementation study.2 The in-depth implementation study will take a detailed look at program operations along four key aspects: (1) inputs required for implementation to succeed and be sustained, (2) contextual factors that influence implementation, (3) quality of program implementation, and (4) participants’ responsiveness to services.

There are three sites participating in the PAF Study. Two of these sites (California and Texas) will be randomized controlled trials with primary data collection through surveys of youth. The third site, in Washington, DC, is a pilot to test the use of a quasi-experimental design that relies on administrative data provided through data use agreements with three local public agencies: DC Public Schools, DC Human Services, and the DC Department of Health. Youth in DC will not be surveyed; however, the site will participate in data collection for the in-depth implementation study. One of these sites, California, is a current Pregnancy Assistance Fund grantee. The other two sites, one in Texas and the other in Washington, DC, are not Pregnancy Assistance Fund grantees. These three sites are described in depth in Attachment A, Overview of the PAF Study.

OAH acknowledges the limitations of studying three programs, only one of which is a Pregnancy Assistance Fund grantee. OAH does not intend to use the results of these three separate program evaluations to generalize to the effectiveness of similar programs nationally. Each of the three selected programs offers a different approach for serving pregnant and parenting teens, and each approach is similar to approaches used across the country. However, because each site was purposefully selected for its ability to support the design of a rigorous impact evaluation, the results cannot be generalized to the broader population of similar programs. Still, the results will begin to fill a gap in the knowledge base. These three separate evaluations will provide some foundational knowledge, albeit limited, on “what works” for pregnant and parenting teens, evidence that can be expanded in the future through replication studies in other contexts and settings.

OAH is currently requesting OMB approval (through OMB Control #0990-0424) for the collection of 12-month follow-up survey data in the two random assignment impact sites – California and Texas.



A.2. Purpose and Use of the Information Collection

Data collected on the PAF Study 12-month follow-up surveys (Instruments 1 and 2) will be used to measure youth outcomes, with the ultimate purpose of measuring program impacts. The 12-month follow-up data collection for which approval is now sought will focus on four primary types of outcomes related to the objectives of OAH’s funding priorities for the Pregnancy Assistance Fund. The first are sexual risk outcomes, including the extent and nature of sexual activity, use of contraception (if sexually active), and pregnancy. The second are outcomes related to educational attainment and completion, such as whether youth are currently enrolled in secondary or postsecondary programs and whether they have earned a diploma and/or other types of certificates of completion. Third, the surveys are designed to measure maternal and child health outcomes, such as access to and use of health care. Finally, the surveys will measure parenting skills. The California survey (Instrument 1) includes items that measure youth resilience, a focus of that program. The Texas survey (Instrument 2) includes additional items relating to parenting and relationships, a focus of that program. In addition, the surveys include a small number of questions that identify socio-demographic or other characteristics of youth in the study sample, which may be used for descriptive purposes or to define subgroups for analysis.

Follow-up data will be used to address the following research questions on program impact:

  • Are the programmatic approaches effective, compared to business as usual, at meeting their immediate objectives (for example, improving contraception use, delaying a subsequent pregnancy, and improving educational attainment)?

  • What are their effects on related outcomes, such as access to and use of health care and parenting skills?

  • Do these approaches work better for some groups of youth than for others?3


A.3. Use of Information Technology to Reduce Burden

The data collection plan for the 12-month follow-up survey is the same across the two sites (California and Texas) and reflects sensitivity to issues of efficiency, accuracy, and respondent burden. We will offer two modes for completing the 12-month follow-up survey: a web survey that is smartphone compatible and computer-assisted telephone interviewing (CATI). We will send email and text messages containing links to the web survey and a toll-free telephone number that respondents can use if they prefer to complete the survey by telephone or have any issues with the web survey.

The web survey will be designed so that respondents can complete the 12-month follow-up survey using a computer, tablet, or smartphone. Respondents will be provided a unique PIN and password to access the survey from their device. We will also provide them with a toll-free number to call should they prefer to complete the survey by telephone or have any issues with the web survey. The web survey will also include a link to email the project team with questions or issues.

Web surveys have been an attractive option for surveys of young adults. The Evaluation of the Impact of the YouthBuild Program (YouthBuild) includes a similar population (16- to 24-year-old high school dropouts) who potentially have the same access and literacy issues as PAF Study respondents. At the 12-month follow-up survey for YouthBuild, more than 25 percent of respondents completed the survey online. At the 30-month follow-up, the percentage completing the survey online increased to over 30 percent.

For those who do not call in or complete the web survey within one month, we will make outbound calls from Mathematica’s Survey Operations Center (SOC). When a respondent is reached, a SOC telephone interviewer will use CATI to complete the survey. If a respondent is not reached, the SOC telephone interviewer will leave a message whenever possible and provide a toll-free number that respondents can use to call in and complete the CATI survey. CATI has been successfully used as a follow-up survey method on other federal studies, such as the Personal Responsibility Education Program (PREP) study (ACF) and the Pregnancy Prevention Approaches (PPA) study (OAH), with response rates of over 80 percent among those contacted. It is also being used successfully to administer the baseline survey for this evaluation.


A.4. Efforts to Identify Duplication and Use of Similar Information

OAH has carefully reviewed the information collection requirements for the PAF Study to avoid duplication with existing and ongoing studies of programs to support expectant and parenting youth, in particular those that are federally funded. The PAF Study will contribute to a very slim knowledge base on effective approaches for improving outcomes for expectant and parenting youth. In the past few decades, many social policy and evaluation efforts have focused on the prevention of teen and unplanned pregnancy. When prevention efforts are absent or unsuccessful, we must consider how to support young people facing these daunting challenges. The PAF Study is unique in that it will contribute impact and implementation findings on three very distinct program models, described in detail in Attachment A.

A.5. Impact on Small Businesses

No small businesses are expected to be impacted. Programs in some sites may be operated by non-profit community-based organizations. The data collection plan is designed to minimize burden on such sites by using staff from Mathematica Policy Research to track respondents and collect the data. Mathematica has extensive experience successfully locating youth. For example, for the PREP site that uses similar modes of data collection (HFSA), we have reached 93 percent of all respondents without the aid of program staff, thus minimizing “sample pursuit” by program staff.

A.6. Consequences of Not Collecting the Information/Collecting Less Frequently

Outcome data are essential to conducting a rigorous evaluation of programs for expectant and parenting teens. Without outcome data, we cannot estimate program effectiveness. Twelve months after program enrollment is an optimal time to examine outcomes: it will be about a year after the teen has given birth, and many treatment group members will still be able to recall their treatment experiences.

A.7. Special Circumstances

There are no special circumstances for the proposed data collection efforts.

A.8. Federal Register Notice and Consultation Outside the Agency

A 60-day Federal Register Notice was published in the Federal Register on May 7, 2015, Vol. 80, No. 88, pp. 26281-26282 (see Attachment H). There were no public comments.

Development of the baseline and 12-month follow-up surveys began with OAH consulting Mathematica researchers, who gathered relevant questions from OMB-approved surveys, such as the Personal Responsibility Education Program Multi-Component Evaluation (PREP) and the Evaluation of Adolescent Pregnancy Prevention Approaches (PPA).4 The Mathematica researchers then refined the instruments to reflect the specific design and outcomes of the PAF Study. The Mathematica staff consulted have experience working with similar populations and designing surveys that measure the outcomes of interest in the PAF Study. Mathematica worked closely with OAH during this design phase, meeting regularly to review modifications to the instruments.

Mathematica staff consulted with local evaluators and program staff in both the California and Texas program sites to ensure that the survey questions aligned with the outcomes of their specific programs. The California local evaluators requested additional items to measure changes in youth resiliency, a primary focus of the program in California. The Texas survey does not contain such resiliency items, but does contain items measuring parenting and relationship skills, a focus of the program in Texas. The names and contact information of the persons consulted in the drafting and refinement of the survey and the analysis of survey items are found in Attachment D. Mathematica will again consult with the local evaluators during the design of the 24-month follow-up survey, which will be included in a separate request.

A.9. Payments to Respondents

A $25 gift card will be provided to survey respondents in appreciation of their continued participation in the study. This is consistent with the amount provided to baseline survey respondents in this study, and this amount for 12-month follow-up survey completion was described in the approved consent forms for this study. These gift cards are important because many of our respondents are members of a hard-to-reach population, expectant and parenting teens. In addition, our surveys include sensitive questions and thus impose some additional burden on respondents. Research has shown that respondent payments are effective at increasing response rates for populations similar to participants in the California and Texas programs.5,6,7 Research also suggests that providing an incentive for earlier surveys may contribute to higher response rates for subsequent surveys.8 Therefore, providing a modest gift of appreciation at the 12-month follow-up may reduce attrition for a planned second follow-up data collection a year later.


A.10. Assurance of Confidentiality

Mathematica Policy Research has secured IRB approval for the Texas and California sites. New England IRB has approved the Texas site design and baseline data collection plans, and the California Office of the Protection of Human Subjects has approved the design and baseline data collection plan for the California site. Mathematica will secure any additional IRB approvals for the 12-month follow-up survey in each site.

Prior to collecting baseline data, the evaluation team is seeking evaluation consent from the youth themselves (where applicable), and otherwise from a parent or guardian. For the follow-up surveys, the evaluation team will seek assent from respondents before data are collected. The consent and assent forms were included in the earlier OMB package that included the baseline survey and that has already received OMB approval (OMB Control # 0990-0424). The assent forms state that answers will be kept private and not seen by anyone outside of the study team, that participation is voluntary, and that respondents may refuse to participate at any time without penalty. Participants will be told that, to the extent allowable by law, individual identifying information will not be released or published; rather, data will be published only in summary form with no identifying information at the individual level.

Web-based surveys will be administered via a secure website, and participants must enter a User ID and password that will be provided to them via mail, text, or email. Computer-assisted telephone interviews (CATI) will be administered by phone by trained Mathematica interviewers. All interviewers are required to sign a confidentiality pledge when hired by Mathematica. The survey administration protocol, whether surveys are completed via the web or CATI, provides reassurance that the evaluation team takes the issue of privacy seriously. Participants will be informed that all of their answers will be kept private, that identifying information will be kept separate from their answers, and that no one outside of the study team will see their responses. No identifying information will be attached to the data from any completed survey, whether completed on the web or using CATI; only a unique study ID number will be linked with the responses.

All electronic data will be stored in secure files, with identifying information kept in a separate file from survey and other individual-level data. Survey responses will be stored on a secure, password-protected computer shared drive. Mathematica’s Confidentiality Pledge, signed by all staff, is included in Attachment E.

A.11. Justification for Sensitive Questions

Table A.11.1 provides a list of the sensitive questions found on the PAF 12-month follow-up surveys, along with a justification for their inclusion. Sensitive questions are drawn from previously successful youth surveys and evaluations (see Attachments B and C). The items have been carefully selected, and we have been guided by experience in determining whether the benefits of the measures outweigh concerns about the heightened sensitivity among sample members, parents, and program staff to specific issues. Although these questions are sensitive, they are commonly and successfully asked of youth similar to those who will be in the PAF Study.

Table A.11.1. Summary of Sensitive Questions to be Included on the 12-Month Follow-Up Survey and Their Justification

Topic: Sexual activity, incidence of pregnancy, and contraceptive use (items 6.1 - 6.12)
Justification: Sexual activity, incidence of pregnancy, and contraceptive use are all key outcomes for the evaluation.

Topic: Drug and alcohol use and violence (items 3.1 - 3.6)
Justification: There is a substantial body of literature linking various high-risk behaviors of youth, particularly drug and alcohol use, sexual intercourse, and risky sexual behavior. The effectiveness of various program strategies is expected to differ for youth who are and are not experimenting with or using drugs and alcohol (Tapert et al., 2001; Li et al., 2001; Boyer et al., 1999; Fergusson and Lynskey, 1996; Sen, 2002; Dermen et al., 1998; Santelli et al., 2001).

A.12. Estimates of the Burden of Data Collection

OAH is requesting three years of clearance for the PAF Study 12-month follow-up survey. Table A.12.1 provides the estimated annual reporting burden for study participants as a result of the 12-month follow-up survey for youth. There is no associated burden for program staff. Table A.12.2 provides a summary of burden hours and costs approved to date, as well as those requested in this ICR.

It is expected that 2,020 young women will be enrolled in the evaluation sample across the two random assignment evaluation sites: 1,300 in California and 720 in Texas. Sample intake will take place over two years in California and up to three years in Texas. In California, 92 percent of the youth recruited for the study have agreed to participate. Therefore, in order to achieve a sample size of 1,300 youth, we will recruit approximately 1,400 youth. In Texas, our consent rate has been 100 percent of all youth recruited for the study. To achieve a sample size of 720 youth, we expect to recruit 720 youth.

The expected response rate for the 12-month follow-up survey is 85 percent,9 for a total of 1,717 completed surveys, and an average of 572 per year. Based on experience with similar questionnaires and our pretest (see Attachment G), it is estimated that it will take youth 35 minutes (35/60 hour) to complete the 12-month follow-up survey, on average. The total annual burden for this data collection is estimated to be 572 x 35/60 = 333.67 hours. The cost of this burden is estimated to be 333.67 hours x 0.60 (proportion of youth age 18 or older at the 12-month follow-up) x $7.25 = $1,451.46.
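As a convenience, the following minimal sketch (in Python) reproduces the burden arithmetic above. The inputs (1,300 and 720 enrollees, an 85 percent response rate, a three-year clearance period, a 35-minute survey, 60 percent of respondents age 18 or older, and a $7.25 hourly wage) are taken from this section, and the rounding mirrors the figures reported in Tables A.12.1 and A.12.2.

enrolled = 1300 + 720                                # California + Texas evaluation samples
completes = round(enrolled * 0.85)                   # 1,717 completed 12-month surveys
annual_completes = round(completes / 3)              # 572 completed surveys per clearance year
annual_hours = round(annual_completes * 35 / 60, 2)  # 333.67 annual burden hours
annual_cost = round(annual_hours * 0.60 * 7.25, 2)   # $1,451.46 annualized cost (age 18+ share at $7.25/hour)
print(completes, annual_completes, annual_hours, annual_cost)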

Table A.12.1. Calculations of Burden Hours and Cost for Youth Participants for the 12-Month Follow-Up Survey

Impact Study

Instrument: 12-month follow-up survey of impact study participants
Type of respondent: Participating program females and control group females
Total number of respondents: 1,717
Annual number of respondents: 572
Number of responses per respondent: 1
Average burden hours per response: 35/60
Total annual burden hours: 333.67
Total annual burden hours for youth age 18 or older: 200.20
Hourly wage rate: $7.25
Total costs: $1,451.46

Estimated Annual Burden for Youth Participants: 333.67 hours; $1,451.46

Note: We assume that 85 percent of the enrolled sample of 2,020 across the two sites will complete the 12-month follow-up survey. Based on the average age at study enrollment thus far, we assume that 60 percent of the sample will be 18 or older at the time of 12-month follow-up survey administration.



Table A.12.2 details the overall burden requested for data collection associated with the PAF Study. A total of 349 hours (and a cost of $934.75) was approved in the prior ICR for this project. A total of 333.67 hours (and a cost of $1,451.46) is requested in this ICR.

Table A.12.2. Calculations of Annual Burden Hours and Costs to Date

Design and Implementation Analysis (Approved August 30, 2014)

Data collection instrument: Grantee interview protocol
Type of respondent: Grantee administrator
Annual number of respondents: 6
Number of responses per respondent: 1
Average burden hours per response: 2
Total burden hours: 12
Total burden hours for youth age 18 or older: N/A
Hourly wage rate: $37.45
Total costs: $449

Impact Study (Approved August 30, 2014)

Data collection instrument: Baseline survey of impact study participants
Type of respondent: Participating program females and control group females
Annual number of respondents: 673
Number of responses per respondent: 1
Average burden hours per response: 0.5
Total burden hours: 337
Total burden hours for youth age 18 or older: 67
Hourly wage rate: $7.25
Total costs: $485.75

Subtotal, burden approved to date: 349 hours; $934.75

Impact Study (Requested in this ICR)

Data collection instrument: 12-month follow-up survey of impact study participants
Type of respondent: Participating program females and control group females
Annual number of respondents: 572
Number of responses per respondent: 1
Average burden hours per response: 35/60
Total burden hours: 333.67
Total burden hours for youth age 18 or older: 200.20
Hourly wage rate: $7.25
Total costs: $1,451.46

Estimated Total Annual Burden: 682.67 hours; $2,386.21



A.13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

These information collection activities do not impose any capital costs or record-keeping maintenance costs on respondents.

A.14. Annualized Cost to Federal Government

Data collection will be carried out by Mathematica Policy Research, under contract with OAH to conduct the PAF Study, with a sub-contract to Decision Information Resources to conduct locating and field follow-up, as needed. The total cost for collecting the 12-month follow-up survey is $1,684,934, and the annual cost is $561,645.

A.15. Explanation for Program Changes or Adjustments

OMB gave approval on August 30, 2014, for the PAF baseline survey and telephone interviews across 17 Pregnancy Assistance Fund grantees (OMB Control # 0990-0424). We now seek a revision to the existing approval to add the data collection associated with the 12-month follow-up survey instruments. This request is an adjustment increasing the total burden from 349 hours to 682.67 hours.

A.16. Plans for Tabulation and Publication and Project Time Schedule

1. Analysis Plan

Program impacts will be analyzed separately for each site using survey data collected at baseline and first follow-up (12 months after baseline). Impact analysis will begin after the completion of 12-month follow-up data collection for each site. Regression-adjusted impact estimates will be computed for each primary outcome in each site, drawing on baseline and follow-up data. The set of primary analyses for each site will be limited to a small set of key outcomes, including measures of sexual risk behavior, subsequent pregnancy, educational attainment, and maternal and child well-being. To support these analyses, the follow-up surveys include items that measure these key outcomes. Exploratory subgroup analyses will also be performed using characteristics captured in the baseline survey data, including prior sexual experience and other risk factors. See Attachment F for more detail on the planned analyses.
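To illustrate the general form of a regression-adjusted impact estimate, the sketch below (in Python, using the statsmodels package) regresses a 12-month outcome on a treatment indicator and baseline covariates. It is only an illustration of the approach described above and in Attachment F; the variable names, the data file, and the use of cluster-robust standard errors (relevant where random assignment is at the group level) are hypothetical placeholders, not the study’s actual specification.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file with one record per youth (illustrative name only).
df = pd.read_csv("paf_12month_analysis_file.csv")

# Follow-up outcome regressed on a 0/1 treatment indicator plus baseline covariates
# (all column names are illustrative).
model = smf.ols(
    "repeat_pregnancy_12mo ~ treatment + baseline_sexual_experience + age_at_enrollment",
    data=df,
)

# Cluster-robust standard errors would be appropriate if assignment was at the group (cluster) level.
results = model.fit(cov_type="cluster", cov_kwds={"groups": df["cluster_id"]})

# The coefficient on the treatment indicator is the regression-adjusted impact estimate.
print(results.params["treatment"], results.pvalues["treatment"])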

2. Time Schedule and Publications

OAH expects that the PAF Study will be conducted over five years, beginning in September 2014. This request is for a three-year period, and subsequent packages will be submitted as necessary for new collections or to extend collection periods. Below is the schedule of data collection efforts for the 12-month follow-up survey, the focus of this ICR:

Table A.16.1. Timeline for Use of 12-Month Follow-Up Survey

Instrument 1: 12-month follow-up survey - California
Date of 60-day submission: May 2015
Date of 30-day submission: July 2015
Date clearance needed: September 2015
Date for use in field: January 2016

Instrument 2: 12-month follow-up survey - Texas
Date of 60-day submission: May 2015
Date of 30-day submission: July 2015
Date clearance needed: September 2015
Date for use in field: June 2016

One of the random assignment sites (California) began enrolling study participants in December 2014, and the 12-month follow-up survey will begin in January 2016. The second random assignment site (Texas) began enrolling in June 2015, and the 12-month follow-up survey will begin there in June 2016. Because OAH plans to analyze each site separately, it is acceptable for the data collection schedule to vary across sites. Sample enrollment in each site is rolling, and the start and end date of sample enrollment varies. In California, the last sample members are expected to be enrolled in December 2016, and therefore 12-month follow-up data collection will end in early 2018. OAH expects to produce a 12-month impact report for California in 2018. In Texas, the last sample members are expected to be enrolled in summer 2017, and therefore 12-month follow-up data collection will end in fall 2018. OAH expects to produce a 12-month impact report for Texas in 2019. Analysis and reporting of the 12-month follow-up data for each site will be funded under two separate CLINs, which is expected to extend the contract through 2019.

A.17. Reason(s) Display of OMB Expiration Date is Inappropriate

All instruments, and consent and assent forms, will display the OMB Control Number and expiration date.

A.18. Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary for this information collection.

SUPPORTING REFERENCES FOR INCLUSION OF SENSITIVE
QUESTIONS OR GROUPS OF QUESTIONS

Boyer, Cherrie B., Jeanne M. Tschann, and Mary-Ann Shafer. "Predictors of Risk for Sexually Transmitted Diseases in Ninth Grade Urban High School Students." Journal of Adolescent Research, vol. 14, no. 4, 1999, pp. 448-465.

Dermen, K. H., M. L. Cooper, and V. B. Agocha. "Sex-Related Alcohol Expectancies as Moderators of the Relationship between Alcohol Use and Risky Sex in Adolescents." Journal of Studies on Alcohol, vol. 59, no. 1, 1998, p. 71.

Fergusson, David M., and Michael T. Lynskey. "Alcohol Misuse and Adolescent Sexual Behaviors and Risk Taking." Pediatrics, vol. 98, no. 1, 1996, p. 91.

Li, Xiaoming, Bonita Stanton, Lesley Cottrell, James Burns, Robert Pack, and Linda Kaljee. "Patterns of Initiation of Sex and Drug-Related Activities among Urban Low-Income African-American Adolescents." Journal of Adolescent Health, vol. 28, no. 1, 2001, p. 46.

Santelli, John S., Leah Robin, Nancy D. Brener, and Richard Lowry. "Timing of Alcohol and Other Drug Use and Sexual Risk Behaviors among Unmarried Adolescents and Young Adults." Family Planning Perspectives, vol. 33, no. 5, 2001.

Sen, Bisakha. "Does Alcohol-Use Increase the Risk of Sexual Intercourse among Adolescents? Evidence from the NLSY97." Journal of Health Economics, vol. 21, no. 6, 2002, p. 1085.

Tapert, Susan F., Gregory A. Aarons, Georganna R. Sedlar, and Sandra A. Brown. "Adolescent Substance Use and Sexual Risk-Taking Behavior." Journal of Adolescent Health, vol. 28, no. 3, 2001, p. 181.




2 A separate ICR for the implementation study data collection activities was approved on May 18, 2015 (OMB Control Number 0990-0428).

3 We do not expect to have the power to rigorously examine impacts among subgroups. Such analyses will be considered exploratory and are intended to help us better understand whether the programs may be having uniform or differential impacts among subgroups identified by pre-intervention measures.

4 ACF received initial OMB approval for the PPA baseline survey on July 26, 2010 (OMB Control Number 0970-0360). In summer 2011, oversight of PPA was transferred to the Office of Adolescent Health (OAH) within the Office of the Assistant Secretary, and the project is now tracked with a different OMB Control Number (0990-0382). The OMB Control Number for the Teen Pregnancy Prevention Replication Study is 0990-0394. OMB approval for the PREP follow-up survey was received on May 8, 2013 (OMB Control Number 0970-0398).



5 Berlin, Martha, Leyla Mohadjer, Joseph Waksberg, Andrew Kolstad, Irwin Kirsch, D. Rock, and Kentaro Yamamoto. 1992. An experiment in monetary incentives. In JSM proceedings, 393–98. Alexandria, VA: American Statistical Association.

6 James, Jeannine M., and Richard Bolstein. 1990. The effect of monetary incentives and follow-up mailings on the response rate and response quality in mail surveys. Public Opinion Quarterly 54 (3): 346–61.

7 Singer, Eleanor, and Richard A. Kulka. 2002. Paying respondents for survey participation. In Studies of welfare populations: Data collection and research issues, eds. Michele Ver Ploeg, Robert A. Moffitt, and Constance F. Citro, 105–28. Washington, DC: National Academy Press.

8 Singer, Eleanor, John Van Hoewyk, and Mary P. Maher. 1998. Does the payment of incentives create expectation effects? Public Opinion Quarterly 62:152–64.

9 At the time of the original submission, we indicated a 90 percent response rate based on the response rates from a very similar evaluation of a highly mobile population of young parents using similar data collection modes. Over the past year of follow-up data collection in that evaluation, the response rates have dipped to 85 percent. We are therefore updating our expected response rate to 85 percent.

