Part A: Justification for the Collection of In-depth Implementation Study Data - Positive Adolescent Futures (PAF) Study (0990-0428)


February 2015

Revised December 2016


Submitted to:

Office of Adolescent Health
Office of the Director
U.S. Department of Health and Human Services

1101 Wootton Parkway, Suite 700

Rockville, MD 20852

Project Officer: Amy Farb

Submitted by:

Mathematica Policy Research

P.O. Box 2393

Princeton, NJ 08543-2393

Telephone: (609) 799-3535

Facsimile: (609) 799-0005

Project Director: Susan Zief





CONTENTS

Part A: Introduction

A.1. Circumstances Making the Collection of Information Necessary
  1. Legal or Administrative Requirements that Necessitate the Collection
  2. Study Objectives
A.2. Purpose and Use of the Information Collection
A.3. Use of Information Technology to Reduce Burden
A.4. Efforts to Identify Duplication and Use of Similar Information
A.5. Impact on Small Businesses
A.6. Consequences of Not Collecting the Information/Collecting Less Frequently
A.7. Special Circumstances
A.8. Federal Register Notice and Consultation Outside the Agency
A.9. Payments to Respondents
A.10. Assurance of Confidentiality
A.11. Justification for Sensitive Questions
A.12. Estimates of the Burden of Data Collection
  1. Annual Burden for Program Staff
  2. Annual Burden for Youth Participants
  3. Overall Burden
A.13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers
A.14. Annualized Cost to Federal Government
A.15. Explanation for Program Changes or Adjustments
A.16. Plans for Tabulation and Publication and Project Time Schedule
  1. Analysis Plan
  2. Time Schedule and Publications
A.17. Reason(s) Display of OMB Expiration Date is Inappropriate
A.18. Exceptions to Certification for Paperwork Reduction Act Submissions

TABLES

A12.1 Calculations of Burden Hours and Cost for Staff
A12.2 Calculations of Burden Hours and Cost for Youth Participants
A12.3 Calculations of Annual Burden Hours and Costs

ATTACHMENTS

ATTACHMENT A: OVERVIEW OF THE PAF EVALUATION

ATTACHMENT B: PERSONS CONSULTED ON INSTRUMENT DEVELOPMENT

ATTACHMENT C: CONSENT LETTERS AND FORMS AND YOUTH ASSENT FORM FOR FOCUS GROUPS

ATTACHMENT D: CONFIDENTIALITY PLEDGE

ATTACHMENT E: QUESTION by QUESTION SOURCES FOR THE STAFF SURVEY

ATTACHMENT F: 60-DAY FEDERAL REGISTER NOTICE



INSTRUMENTS

INSTRUMENT #1: MASTER LIST OF TOPICS FOR IMPLEMENTATION STUDY

INSTRUMENT #2: MASTER TOPIC GUIDE FOR SMALL GROUP INTERVIEWS WITH STAFF

INSTRUMENT #3: STAFF SURVEY

INSTRUMENT #4: TOPIC GUIDE FOR FOCUS GROUP DISCUSSION WITH PARTICIPATING YOUTH

INSTRUMENT #5: PROTOCOL TO COLLECT ATTENDANCE AND CONTENT COVERAGE DATA

Part A: Introduction

In March 2010, Congress authorized the Pregnancy Assistance Fund Competitive Grants Program as part of the Patient Protection and Affordable Care Act (ACA). The grants program is a key element of the federal strategy to support youth and young adults who are having or raising a child. Administered by the Office of Adolescent Health (OAH), the grants program funded a second cohort of 17 grantees—states, tribes, and tribal entities—in summer 2013 to develop and implement programs focused on an array of outcomes, including increasing access to and completion of secondary and postsecondary education, improving child and maternal health, reducing the likelihood of repeat teen pregnancies, increasing parenting and co-parenting skills, decreasing intimate partner violence, and raising awareness of available resources. To promote positive outcomes, grantees may implement a wide variety of services for expectant and parenting youth, women, fathers, and their families. OAH's continued investment in programs for expectant and parenting youth has led to its request for a rigorous impact and implementation study of such programs, and it has contracted with Mathematica Policy Research to conduct the Positive Adolescent Futures (PAF) Study.

Preliminary PAF Study efforts, including study design and instrument development, will be conducted through a Feasibility and Design Study (FADS). The purpose of the FADS is to design rigorous impact evaluations in three sites that serve pregnant and parenting youth (including Pregnancy Assistance Fund grantees), develop data collection materials for all aspects of an evaluation, and conduct telephone interviews with grantees about their program design decisions and early implementation experiences. Information collected through the FADS will also help funding agencies shape the structure and components of programs for expectant and parenting youth and their families, and will lay the groundwork for the five-year PAF Study.

The objective of the FADS is to establish a foundation for the PAF Study's rigorous impact and implementation evaluation. Specifically, the FADS will: (1) assess design options for the implementation and impact evaluation, (2) document how programs are operationalized in the field, (3) identify and enter into agreements with three sites for the evaluation, (4) provide assistance to sites to support a rigorous evaluation framework, (5) develop all evaluation instruments and obtain clearance, and (6) pilot baseline data collection. Attachment A provides an overview of the components of the PAF Study, which the FADS work is supporting.



Current Information Collection Request. With this new ICR, OAH is requesting OMB approval for the following instruments related to the PAF In-Depth Implementation Study:

  1. The Master List of Topics for Staff Interviews (Instrument 1)

  2. The Topic Guide for Group Discussions with Front-line Staff (Instrument 2)

  3. The Staff Survey (Instrument 3)

  4. The Topic Guide for Focus Group Discussions with Participating Youth (Instrument 4)

  5. Protocol for Collecting Attendance and Content Coverage Data (Instrument 5)

The data collected from these instruments will provide a detailed understanding of program implementation in the three rigorous impact study sites.

A.1. Circumstances Making the Collection of Information Necessary

1. Legal or Administrative Requirements that Necessitate the Collection

On March 23, 2010, the President signed into law the Patient Protection and Affordable Care Act (ACA), H.R. 3590 (Public Law 111-148, Sections 10211-10214). In addition to its other requirements, the act authorizes $25 million for each of fiscal years 2010 through 2019 and authorizes the Secretary of HHS, in collaboration and coordination with the Secretary of Education, to “establish a Pregnancy Assistance Fund to be administered by the Secretary, for the purpose of awarding competitive grants to States to assist expectant and parenting youth and women.”1

The Office of Management and Budget has requested an evaluation of programs for expectant and parenting youth, including Pregnancy Assistance Fund grantees (per conversations with OAH Director Evelyn Kappeler), recognizing that there is a unique opportunity to contribute to the field by designing a rigorous evaluation of such programs that can overcome previous challenges.

2. Study Objectives

There is currently little rigorous program evaluation published in the expectant and parenting youth literature. This is due, in part, to the lack of federal funding to evaluate programs until very recently. Additionally, there are methodological difficulties inherent in conducting evaluations of programs for these youth. For example, the sample sizes available for evaluation within any one program are generally small. In addition, low program enrollment and low retention rates reflect the complex social profiles and needs of this population.

Within OAH there is a unique opportunity to contribute to the field by using the Feasibility and Design Study (FADS) contract to scan the field for sites where rigorous evaluation is possible. For example, Pregnancy Assistance Fund grants are made to states and tribal entities; the grantees are implementing programs across large geographic areas. Many grants are supporting existing programs that have a demonstrated history of recruiting, engaging, and retaining expectant and parenting youth for the intended program duration.

The objective of the FADS is to establish a foundation for the PAF rigorous impact and implementation evaluations. Specifically, the FADS will: (1) assess design options for implementation and impact evaluation, (2) document how programs are operationalized in the field, (3) identify and enter into agreements with three sites for the evaluation, (4) provide assistance to sites to support a rigorous evaluation framework, (5) develop all evaluation instruments and obtain clearance, and (6) pilot baseline data collection.



Impact and In-depth Implementation Study. Using experimental and quasi-experimental designs, the PAF Study will test the effectiveness of services in improving outcomes related to subsequent pregnancies, education, health, sexual behavior, and parenting. During the FADS, the study team will identify and work with three programs to decide which service components will be evaluated, which participants will be included, and which outcomes will be measured. In addition, the FADS team will work with program sites to develop a plan for random assignment at either the individual or group (cluster) level. Finally, the FADS team will work with the selected sites to design a process for collecting study data, including evaluation consent, a baseline survey, and two follow-up surveys.

The three programs selected for the impact evaluation will also participate in a more in-depth implementation study. The in-depth implementation study will take a detailed look at program operations along four key dimensions: (1) inputs required for implementation to succeed and be sustained, (2) contextual factors that influence implementation, (3) quality of program implementation, and (4) participants' responsiveness to services.

There are three sites participating in the PAF Study. Two of these sites (California and Texas) will be randomized controlled trials with primary data collection through surveys of youth. The third site, in Washington, DC, will use a quasi-experimental design and rely on administrative data provided through data use agreements with three local public agencies: DC Public Schools, DC Human Services, and the DC Department of Health. Youth in DC will not be surveyed; however, the site will participate in data collection for the in-depth implementation study.

OAH is currently requesting OMB approval for the collection of data on program implementation for the in-depth implementation study in each of the three study sites. These sites are described in depth in Attachment A, Overview of the PAF Study.



A.2. Purpose and Use of the Information Collection

The in-depth implementation study will collect and analyze data to contextualize the analysis of program impacts. Data will be obtained from the following sources: (1) individual and group discussions with program developers, program leaders and front-line staff, program partners, and other stakeholders (Instruments 1 and 2); (2) a paper-and-pencil survey of front-line staff and supervisors (Instrument 3); (3) group and individual interviews with participating youth (Instrument 4); and (4) a protocol for recording attendance and content coverage (Instrument 5). Through these data collection efforts, the study will document the program context in each site, the planned intervention, the implementing organization, other organizational partners participating in implementation, implementation systems, youths' program dosage, and youths' experiences and satisfaction with the programs.

The data will serve two main purposes. First, the information will enable the study team to produce clear, detailed descriptions of each intervention that is evaluated and the counterfactual in each site. This documentation is critical for understanding the meaning of impact estimates. Second, the data will be used to assess fidelity of implementation and the quality of program delivery. This information is essential for determining whether the interventions were implemented well and whether the evaluation provided a good test of each site’s intervention.

A.3. Use of Information Technology to Reduce Burden

For program attendance and content coverage data, sites will be able to either submit an extract from their existing information systems or use a spreadsheet that has been developed by Mathematica to facilitate data entry (Instrument 5), whichever method is least burdensome to them. The spreadsheet has been designed based on experience from prior studies with similar types of programs and delivery methods, such as the PREP Multi-Component Evaluation. As such, it is flexible and easy to use, while ensuring that high-quality data are collected.
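
To illustrate the kind of record the protocol is designed to capture, the sketch below (Python) shows a hypothetical extract layout. The column names are illustrative assumptions, not the actual Instrument 5 fields, which are defined by Mathematica's spreadsheet.

    # Illustrative only: a hypothetical attendance and content coverage record.
    # Actual Instrument 5 column names and formats may differ.
    import csv
    import io

    # Records carry a study-assigned ID rather than personally identifying
    # information, so attendance can later be linked to outcome data.
    extract = io.StringIO(
        "study_id,session_date,session_number,content_covered,attended\n"
        "S-0042,2015-05-04,3,parenting and co-parenting skills,Y\n"
        "S-0017,2015-05-04,3,parenting and co-parenting skills,N\n"
    )
    for row in csv.DictReader(extract):
        print(row["study_id"], row["session_number"], row["attended"])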

A.4. Efforts to Identify Duplication and Use of Similar Information

OAH has carefully reviewed the information collection requirements for the PAF Study to avoid duplication with existing and ongoing studies of programs to support expectant and parenting youth, in particular those that are federally funded. The PAF Study will contribute to a very slim knowledge base on effective approaches for improving outcomes for expectant and parenting youth. In the past few decades, many social policy efforts have focused on the prevention of teen and unplanned pregnancy. When prevention efforts are absent or have failed, we must consider how to support young people facing these daunting challenges. The evidence base for doing so is slim. The PAF evaluation will add three effectiveness studies to this literature and will provide a detailed description of grantees' programmatic approaches.

The PAF Study is also unique in that it will contribute information on both impacts and implementation for three distinct program models.

A.5. Impact on Small Businesses

No small businesses will be involved. Programs in some sites may be operated by non-profit community-based organizations. The data collection plan is designed to minimize burden on such sites by providing staff from Mathematica Policy Research to collect data during the site visits and through follow-up telephone interviews as needed.

A.6. Consequences of Not Collecting the Information/Collecting Less Frequently

Implementation data are essential for understanding the results of a rigorous evaluation of pregnancy prevention programs. Data collection early in program implementation is crucial for documenting site implementation plans and early program experiences, while data collection late in program implementation is essential for learning about actual service delivery and unplanned adaptations, fidelity to plans, participant engagement, and changes in program context during the evaluation period. Without implementation data, we lose the opportunity to document the evolution of program implementation during the evaluation and provide lessons based on the experiences of the sites. Collecting implementation data less frequently would either make it impossible to assess fidelity of program implementation or require reliance on program documents and respondent recall to document program implementation plans.

A.7. Special Circumstances

There are no special circumstances for the proposed data collection efforts.

A.8. Federal Register Notice and Consultation Outside the Agency

The 60-day Federal Register Notice was posted on November 20, 2014. No public comments were received. A copy of the 60-day Federal Register Notice is found in Attachment F.

The names and contact information of the persons consulted in the drafting and refinement of the in-depth implementation study instruments are found in Attachment B.

A.9. Payments to Respondents

For youth who participate in a focus group or interview, a $25 gift card will be provided as a token of appreciation for the time commitment associated with their participation. In previous studies, providing a gift card as a thank you has been essential for obtaining a strong youth response rate for focus groups.


A.10. Assurance of Confidentiality

Site and state staff participating in group or individual interviews will receive information about privacy protection when arrangements are made for meeting with them, and information about privacy will be repeated as part of the study field staff’s introductory comments during site visits. Site visit staff will be informed about privacy procedures during training and will be prepared to describe them and to answer questions raised by local program staff.

There will be a separate consent process for participation in youth focus groups and semi-structured interviews. Youth under age 18 will need a signed parental consent form, as well as youth assent, for participation in a focus group. Youth 18 or older must provide consent to participate in a focus group. Copies of these forms are included as Attachment C. Consent and assent forms state that answers will be kept private, that youths' participation is voluntary, that they may refuse to participate, and that identifying information about them will not be released or published. The focus group consent forms also include additional language explaining the unique confidentiality risks associated with participation in a group interview.

All program attendance and content coverage data will be transmitted with a unique identifier rather than personally identifying information. The unique identifier is necessary to support combining the program attendance data with outcome data. All electronic data will be stored in secure files.

For administration of hard copy staff surveys, site visitors will provide respondents with a chance to opt out of the staff survey, should they want to do so. The questionnaire will be distributed in a sealed envelope, and the questionnaire and distribution envelope will have a label with a unique staff ID number. No identifying information will appear on the questionnaire or the return envelope.

Staff are trained to keep all data collection forms in a secure location and are instructed not to share any materials with anyone outside of the study team. Surveys completed at the time of the site visit will be collected by site visitors and brought back to the Mathematica office. Surveys completed later will be mailed back to Mathematica in postage-paid envelopes.

All electronic data will be stored in secure files, with identifying information kept in a separate file from survey and other individual-level data. Survey responses will be stored on a secure, password-protected computer shared drive. Mathematica's Confidentiality Pledge, signed by all staff, is included in Attachment D.

A.11. Justification for Sensitive Questions

There are no sensitive questions in the in-depth implementation study instruments. The questions focus on program experiences and context, and do not ask participants about their sexual activity or other risk-taking behavior.

A.12. Estimates of the Burden of Data Collection

OAH is requesting three years of clearance for the implementation study data collection activities. Tables A12.1 and A12.2 provide the estimated annual reporting burden calculations, broken out separately for staff (Table A12.1) and for youth participants (Table A12.2). Table A12.3 provides a summary of the annual burden hours and costs for this new ICR.

  1. Annual Burden for Program Staff

It is expected that across the three evaluation sites, there will be a total of 25 program administrators and 80 case managers/home visitors. Each program administrator will be interviewed twice, once at each site visit, for a total annual burden of (25/3 years) x 2 one-hour interviews, or about 16 hours. We anticipate that about half of the case managers/home visitors (40) will volunteer for focus groups. These staff focus groups will average 1 hour in length and will be conducted only once (during the second site visit). Annual burden hours are estimated at (40/3 years) x 1 = 13. All program administrators and case managers will be asked to complete a 35-minute survey. Annual burden hours are estimated to be (105/3) x 0.6 = 21 hours. Administrative data on program attendance and content coverage will be collected from approximately 6 program administrators across the two experimental design sites (California and Texas). The annual number of respondents is estimated to be (6/3) = 2; they will provide data once per month and will spend about 0.5 hours per month compiling the data. Annual burden hours are estimated to be 2 x 12 x 0.5 = 12. Across all implementation study data collections, we estimate 62 total annual hours of burden. Assuming a wage rate of $20.76, the cost of this burden is estimated to be 62 hours x $20.76 = $1,287.12. This hourly wage rate represents the mean hourly wage for community and social service occupations (National Occupational Employment and Wage Estimates, Bureau of Labor Statistics, Department of Labor, May 2010).










Table A12.1. Calculations of Burden Hours and Cost for Staff

In-Depth Implementation Study

Instrument | Type of Respondent | Total Number of Respondents | Annual Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Total Burden Hours | Average Hourly Wage | Total Annual Cost
Semi-structured interview | Program administrators | 25 | 8 | 2 | 1 | 16 | $20.76 | $332.16
Staff focus group | Case managers | 40 | 13 | 1 | 1 | 13 | $20.76 | $269.88
Staff survey | Program administrators and case managers | 105 | 35 | 1 | 0.6 | 21 | $20.76 | $435.96
Program attendance and content coverage protocol | Program administrators | 6 | 2 | 12 | 0.5 | 12 | $20.76 | $249.12
Estimated Annual Burden Hours for Program Staff | | | | | | 62 | | $1,287.12
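
As a simple check on the arithmetic above, the following sketch (Python, illustrative only) recomputes the Table A12.1 burden hours and costs from the figures given in the text:

    # Recomputes Table A12.1: annual respondents x responses per respondent x
    # hours per response, costed at the $20.76 mean hourly wage cited above.
    WAGE = 20.76
    rows = {
        "Semi-structured interview": (8, 2, 1.0),
        "Staff focus group": (13, 1, 1.0),
        "Staff survey": (35, 1, 0.6),
        "Attendance/content coverage protocol": (2, 12, 0.5),
    }
    total_hours = sum(n * r * h for n, r, h in rows.values())
    print(total_hours)                    # 62.0 annual burden hours
    print(round(total_hours * WAGE, 2))   # 1287.12 dollars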



2. Annual Burden for Youth Participants

It is expected that about ten percent of the 2,000 youth participating in the programs (200) will be available and interested in participating in either a focus group or an individual interview at the time of the site visits. The focus group or interview is expected to take 1.5 hours, yielding an annual burden estimate of (200/3 years) x 1.5 hours = 100 hours. It is estimated that 20 percent of the annual number of respondents ((200/3) x 0.20, or about 13 youth) will be aged 18 or older. At a wage rate of $7.25 per hour, the approximately 20 annual burden hours for these youth yield an annual cost estimate of $145 (20 hours x $7.25).


Table A12.2. Calculations of Burden Hours and Cost for Youth Participants

In-Depth Implementation Study

Instrument | Type of Respondent | Total Number of Respondents | Annual Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Total Burden Hours | Total Burden Hours for Youth Age 18 or Older | Hourly Wage Rate | Total Costs
Youth focus group | Program participants | 100 | 33 | 1 | 1.5 | 50 | 10 | $7.25 | $72.50
Youth semi-structured interview | Program participants | 100 | 33 | 1 | 1.5 | 50 | 10 | $7.25 | $72.50
Estimated Annual Burden for Youth Participants | | | | | | 100 | 20 | | $145
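
The corresponding check for Table A12.2, again as an illustrative sketch:

    # Recomputes Table A12.2: youth are split evenly between focus groups and
    # semi-structured interviews (100 total respondents each over 3 years).
    annual_per_instrument = 33          # about (100 / 3), rounded
    hours_per_instrument = round(annual_per_instrument * 1.5)   # about 50
    adult_hours_per_instrument = 10     # 20 percent of respondents, aged 18+
    total_cost = 2 * adult_hours_per_instrument * 7.25
    print(2 * hours_per_instrument)     # 100 annual burden hours
    print(total_cost)                   # 145.0 dollars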


3. Overall Burden

Table A12.3 details the overall burden requested in this ICR for the PAF Study In-Depth Implementation Component. A total of 162 hours (at a cost of $1,432.12) is requested in this ICR.

Table A12.3. Calculations of Annual Burden Hours and Costs

PAF Study In-Depth Implementation Study

Data Collection Instrument | Type of Respondent | Annual Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Total Burden Hours | Total Burden Hours for Youth Age 18 or Older | Hourly Wage Rate | Total Costs
Semi-structured interview | Program administrators | 8 | 2 | 1 | 16 | N/A | $20.76 | $332.16
Staff focus group | Case managers | 13 | 1 | 1 | 13 | N/A | $20.76 | $269.88
Staff survey | Program administrators and case managers | 35 | 1 | 0.6 | 21 | N/A | $20.76 | $435.96
Program attendance and content coverage protocol | Program administrators | 2 | 12 | 0.5 | 12 | N/A | $20.76 | $249.12
Youth focus group | Program participants | 33 | 1 | 1.5 | 50 | 10 | $7.25 | $72.50
Youth semi-structured interviews | Program participants | 33 | 1 | 1.5 | 50 | 10 | $7.25 | $72.50
Estimated Total Annual Burden | | | | | 162 | | | $1,432.12
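
A final sketch, verifying that the Table A12.3 row entries sum to the requested totals:

    # Sums the Table A12.3 rows: 162 annual burden hours and $1,432.12.
    hours = [16, 13, 21, 12, 50, 50]
    costs = [332.16, 269.88, 435.96, 249.12, 72.50, 72.50]
    print(sum(hours))             # 162
    print(round(sum(costs), 2))   # 1432.12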


A.13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

These information collection activities do not impose any capital costs or operation and maintenance costs on respondents or record keepers.

A.14. Annualized Cost to Federal Government

Data collection will be carried out by Mathematica Policy Research, under contract with OAH to conduct the PAF Study. The cost for collecting the implementation study data is $230,000.00, and the annual cost is $76,666.67.

A.15. Explanation for Program Changes or Adjustments

There are no program changes or adjustments.

A.16. Plans for Tabulation and Publication and Project Time Schedule

1. Analysis Plan

The instruments included in this OMB package for the in-depth implementation study will yield data that will be analyzed using qualitative and quantitative methods to describe program implementation, assess the program’s overall quality, and examine fidelity to the program model and experience with program implementation. A thorough understanding of program implementation will provide context for interpreting program impacts, while a greater understanding of how programs can be implemented with high quality is expected to inform the next generation of programming.

The research team will create a coding scheme consisting of a hierarchy of conceptual categories and classifications linked to the evaluation research questions, dimensions of implementation, and program logic models. Team members will then use software (Atlas.ti) to assign codes to specific text in the electronic file of site visit notes and other documents. Coding the qualitative data in this way will enable the team to access data on a specific topic quickly and to organize information in different ways to facilitate the identification of themes and compile the evidence supporting them. As data collection proceeds, the coding scheme will be refined to better align it with both themes and topics that emerge from the data and with the research questions (Ritchie and Spencer, 2002).2 To facilitate analyses of patterns and themes across sites, we will also code key site-level characteristics, such as type of program model and characteristics of the youths served.
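
To make the coding approach concrete, the sketch below (Python) shows one way a hierarchical coding scheme keyed to the four implementation dimensions might be represented. The child codes are hypothetical examples, and the structure is a conceptual illustration rather than Atlas.ti's actual format.

    # Conceptual sketch of a hierarchical coding scheme. Top-level codes map to
    # the four implementation dimensions named earlier; child codes are
    # hypothetical examples of the kinds of subtopics that might be coded.
    coding_scheme = {
        "inputs": ["staffing", "training", "funding"],
        "context": ["community resources", "policy environment"],
        "quality": ["fidelity to the model", "adaptations"],
        "responsiveness": ["attendance", "engagement", "satisfaction"],
    }

    def child_codes(dimension: str) -> list:
        """Return the child codes filed under a top-level dimension."""
        return coding_scheme.get(dimension, [])

    print(child_codes("quality"))   # ['fidelity to the model', 'adaptations']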

After all the qualitative data have been coded, we will use the software to retrieve data on the research questions and subtopics to identify themes and triangulate across data sources and individual respondents. Much of the meaning of the data will be discerned through descriptive analyses, both qualitative and quantitative, that organize data thematically; create summary statistics that characterize overall experiences in each site, as well as variations across and within sites; and examine themes and topics from multiple perspectives, highlighting the similarities and differences among them (Patton, 2002).3 We will also explore relationships across themes (for example, relationships between the types of implementation challenges sites face and their staffing patterns and partnership arrangements).

2. Time Schedule and Publications

OAH expects that the PAF Study will be conducted over five years, beginning in September 2014. This request is for a three-year period, and subsequent packages will be submitted as necessary for new collections or to extend collection periods. Below is a schedule of the data collection efforts for the in-depth implementation study, the focus of this ICR:

In-depth Implementation Study

Instrument | Date of 30-Day Submission | Date Clearance Needed | Date for Use in Field
Master list of topics | February 2015 | April 2015 | April 2015
Master interview guide for staff | February 2015 | April 2015 | April 2015
Staff survey | February 2015 | April 2015 | April 2015
Focus groups for program youth | February 2015 | April 2015 | April 2015
Program observation template | February 2015 | April 2015 | April 2015


One of the random assignment sites (California) began enrolling study participants in December 2014, and implementation study activities will begin in April 2015. The second random assignment site (Texas) will begin enrolling in spring 2015, with implementation study activities beginning in summer 2015. In the quasi-experimental site (Washington, DC), implementation data collection activities will occur in April 2015. Because OAH plans to analyze each site separately, it is acceptable for the data collection schedule to vary across sites. The timing of site visits will be determined after sites are confirmed and specific implementation plans are known, but the goal is to conduct the first site visit early in the implementation period for most sites and to conduct a second visit later in the implementation period to allow for program maturation and to help capture variations in youth experiences over time. The timelines for the staff survey and the focus groups will coincide with the site visits.

We will produce site-specific implementation reports in 2016 that convey information that policy and program decision makers need on key subtopics of interest.

A.17. Reason(s) Display of OMB Expiration Date is Inappropriate

All instruments, and consent and assent forms, will display the OMB Control Number and expiration date.

A.18. Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary for this information collection.




2 Ritchie, J., and Spencer, L. (2002). Qualitative data analysis for applied policy research. In A.M. Huberman and M.B. Miles (Eds.), The qualitative researcher's companion. Thousand Oaks, CA: Sage Publications.

3 Patton, M.Q. (2002). Qualitative research and evaluation methods (3rd ed.). Thousand Oaks, CA: Sage Publications.



