

U.S. Department of Health
and Human Services

Office of Planning, Research and Evaluation & Family and Youth Services Bureau, Administration for Children and Families

7th Floor West, Aerospace Building

370 L'Enfant Promenade, SW

Washington, DC 20447

Project Officers: Clare DiSalvo, Dirk Butler




Part A: Justification for the Collection of Implementation Survey Data - Personal Responsibility Education Program (PREP) Multi-Component Evaluation

0970-0398

Draft

August 2014








CONTENTS

Part A: Justification for Implementation Data Collection
A.1. Circumstances making the collection of information necessary
  1. The Personal Responsibility Education Program (PREP)
  2. The PREP Evaluation
  3. The Design and Implementation Study (DIS) – Implementation Survey
A.2. Purpose and use of the information collected
A.3. Use of information technology to reduce burden
A.4. Efforts to identify duplication and use of similar information
A.5. Impact on small businesses
A.6. Consequences of not collecting the information/collecting less frequently
A.7. Special circumstances
A.8. Federal Register notice and consultation outside the agency
A.9. Payments to respondents
A.10. Assurances of confidentiality
A.11. Justification for sensitive questions
A.12. Estimates of the burden of data collection
  1. Estimate of Annual Burden
  2. Estimates of Annual Costs
  3. Overall Burden
A.13. Estimates of other total annual cost burden to respondents and record keepers
A.14. Annualized cost to federal government
A.15. Explanation for program changes or adjustments
A.16. Plans for tabulation and publication and project time schedule
  1. Analysis Plan
  2. Time Schedule and Publications
A.17. Reason(s) display of OMB expiration date is inappropriate
A.18. Exceptions to certification for Paperwork Reduction Act submissions
References



TABLES

Table A.1: PREP Evaluation Components
Table A.2: Implementation Study Semi-Structured Interview Topics by Respondent Type
Table A.3: Estimate of the Burden and Cost for the Grantees and Implementation Sites for Implementation Study Data Collection
Table A.4: Estimate of Burden and Cost for the PREP Evaluation – Approved and Requested Burden
Table A.5: Schedule for the Implementation Survey



ATTACHMENTS

ATTACHMENT A: 60-Day Federal Register Notice

ATTACHMENT B: PREP Evaluation Description



INSTRUMENTS

INSTRUMENT #1: Implementation Survey Interview Topic Guide

INSTRUMENT #2: Emails Introducing the Implementation Survey to Interview Respondents


Part A: Justification for Implementation Data Collection

The Family and Youth Services Bureau (FYSB) and the Office of Planning, Research and Evaluation (OPRE) within the Administration for Children and Families (ACF) in the U.S. Department of Health and Human Services (HHS) have contracted with Mathematica Policy Research and its subcontractors to conduct the Personal Responsibility Education Program Multi-Component Evaluation. The purpose of the evaluation is to assess the implementation, outcomes, and impact of programs implemented as part of the Personal Responsibility Education Program (PREP). This package requests clearance for a second round of data collection conducted for the evaluation’s Design and Implementation Study (DIS). For more information on data collection related to previously approved activities, see the Information Collection Requests (ICRs) under OMB Control # 0970-0398.

A.1. Circumstances making the collection of information necessary

1. The Personal Responsibility Education Program (PREP)

On March 23, 2010, the President signed into law the Patient Protection and Affordable Care Act (ACA), H.R. 3590 (Public Law 111-148, Section 2953), which authorized the Personal Responsibility Education Program (PREP). PREP provides grants to states, tribes, tribal communities, and local organizations to support evidence-based programs to reduce teen pregnancy and sexually transmitted infections (STIs). The programs are required to provide education on both abstinence and contraceptive use. The programs also offer information on adulthood preparation subjects such as healthy relationships, adolescent development, financial literacy, parent–child communication, education and employment skills, and healthy life skills. Grantees are encouraged to target their programming to high-risk populations—for example, homeless youth, youth in foster care, pregnant or parenting teens, youth residing in geographic areas with high teen birth rates, and Native American youth. States and territories acquire PREP funding through formula grants (state PREP), and local organizations and tribes obtain it via a competitive grant process (Competitive PREP and Tribal PREP). The program is administered by the Family and Youth Services Bureau (FYSB), within the Administration for Children and Families (ACF), in the U.S. Department of Health and Human Services (HHS).

2. The PREP Evaluation

As part of the ACA, Congress mandated a federal evaluation of the PREP program. To meet this mandate, FYSB and OPRE within ACF have contracted with Mathematica Policy Research and its subcontractors (“the study team”) to conduct the PREP Multi-Component Evaluation (the PREP Evaluation), a seven-year evaluation to document how PREP-funded programs are operationalized in the field, collect and analyze performance measure data from PREP grantees, and assess the effectiveness of selected PREP-funded programs in reducing teenage pregnancies, sexual risk behaviors, and STIs. The PREP Evaluation contains three complementary components. The purposes and objectives of each component are described in Table A.1.

Table A.1. PREP Evaluation Components

Evaluation Component

Purpose and Objectives

Design and Implementation Study

Overall

  • Provide a broad descriptive analysis of how states are using PREP grant funding to develop and support evidence-based teen pregnancy and STI prevention programs.

Design Survey

  • Conduct telephone interviews with all PREP state grantees prior to or at the start of program implementation.

  • Describe states’ plans to implement evidence-based programming under PREP, including the reasons why key program design decisions were made and states’ goals for program implementation.

Implementation Survey

  • Conduct semi-structured telephone interviews with multiple respondents in four states one year into implementation to examine the structures in place to support evidence-based PREP programs.

  • Develop a detailed description of four states’ efforts to support the implementation of evidence-based programs with quality and fidelity.

Performance Analysis Study

  • Develop PREP program performance measures.

  • Collect and analyze program performance measures from all PREP grantees to understand whether program objectives are being met and whether technical assistance is needed to support program improvement.

Impact and In-Depth Implementation Study

Overall

  • Assess the impacts and implementation of funded programs in four selected PREP sites.

Impact Study

  • Determine whether the selected PREP-funded programs are effective at reducing teen pregnancy, STIs, and sexual risk behaviors.

  • Provide sound, scientific evidence about program effectiveness within the context of this large-scale replication effort.

Implementation Study

  • Identify factors that affect large-scale replication of the selected program models.

  • Assess the quality of delivery and fidelity to the selected program models.


The Office of Management and Budget (OMB) has previously approved four information collection requests (ICRs) related to the PREP Evaluation:

  • November 6, 2011: OMB approved “Field Data Collection” as part of the Impact and In-Depth Implementation Study, which involved collecting data on various program models and assessing the feasibility of conducting a rigorous evaluation (OMB Control # 0970-0398).

  • March 7, 2012: OMB approved the “Design Survey” conducted as part of the Design and Implementation Study, which involved interviewing state administrators about key decisions they made about the design of their PREP programs (OMB Control #0970-0398).

  • March 12, 2013: OMB approved the instruments associated with two data collection efforts: (1) collection of PREP performance measures from state and tribal PREP grantees for the Performance Analysis Study through participant entry and exit surveys and the Performance Reporting System Data Entry; and (2) collection of baseline data for the Impact and In-Depth Implementation Study through a baseline survey (OMB Control # 0970-0398).

  • November 8, 2013: OMB approved (1) the data collections associated with the Performance Analysis Study for Competitive PREP (CPREP) grantees and (2) the collection of youth follow-up data, staff interviews, a staff survey, and youth focus groups under the Impact and In-Depth Implementation Study (OMB Control # 0970-0398).

3. The Design and Implementation Study (DIS) - Implementation Survey

In this submission, ACF requests OMB approval for an instrument used to collect data for the Implementation Survey phase of the PREP Evaluation’s DIS component. The Design Survey phase of the DIS was implemented across all state PREP grantees. The Implementation Survey phase will provide a detailed description of how a subset of four states have created structures and supports to assist program providers in implementing evidence-based programs with quality and fidelity to their designs. During the Design Survey interviews, nearly all PREP grantees identified maintaining program fidelity as one of their primary objectives, and nearly all grantees reported plans to provide training and technical assistance and to monitor program implementation (Zief et al. 2013). The Implementation Survey will examine (a) the different structures and practices that states have put in place to support the successful implementation of evidence-based programs with fidelity, (b) the extent to which these structures and practices vary across state contexts, and (c) lessons learned from supporting statewide program implementation.

The instrument used to conduct Implementation Survey data collection—the Implementation Survey Interview Topic Guide (Instrument 1)—is attached to this submission. Also attached are the 60-Day Federal Register Notice requesting comments on Implementation Survey data collection (Attachment A), a summary of the PREP Evaluation (Attachment B), and the emails that will be used to introduce the survey to respondents and to schedule interviews (Instrument 2).

A.2. Purpose and use of the information collected


To achieve the Implementation Survey goals, the study team will conduct semi-structured telephone interviews with staff involved in PREP program implementation at multiple levels within four states. Based on program structure and staffing information collected during Design Survey interviews, we expect that interview respondents will include:

  • State grantee lead staff. State PREP funds are provided to a state agency, such as a department of health. Interviews with these state agency administrators will focus on the states’ overall goals and plans for assuring and maintaining program quality and fidelity across providers, and the state’s role in administering and/or overseeing these activities.

  • Training and technical assistance staff. During Design Survey interviews, states indicated that they either provide training and technical assistance to program providers directly or contract with another organization (or organizations) to do so. Whether these staff are state or subcontractor employees, the study team will interview them about the goals and activities of training and technical assistance for PREP programs across the state.

  • Evaluators. As with training and technical assistance, states either evaluate their PREP programs or contract with an outside organization to do so. Interviews with these respondents will focus on how the state and its program providers use evaluation data for program monitoring and continuous improvement purposes.

  • Program providers. The study team will speak with key staff among program providers to learn about staffing and organizational details that support PREP implementation as well as the use of and perceived effectiveness of the implementation structures and supports that are in place for the PREP program in the state.

Speaking with respondents from across these groups will ensure that the data collected represent the range of perspectives and positions involved in supporting implementation quality and fidelity. Further, it will ensure that the study team understands not only how service delivery and administrative processes are intended to work, but also how they actually work.

We anticipate that the study team will interview an average of 8 respondents per state, for a total of 32 respondents across the 4 selected states. While the specific respondents in each state will likely vary, we expect the eight respondents in each state to include four state-level staff—one state grantee respondent, two training and technical assistance respondents, and one evaluator respondent—and one manager from each of four program providers within the state.

The study team will use the Implementation Survey Interview Topic Guide (attached Instrument 1) to guide the Implementation Survey interviews. The Topic Guide is informed by the principles of implementation science—the study of how evidence-based or evidence-informed programs and practices are translated, replicated, and scaled up in diverse, “real world” service delivery settings. The Topic Guide is structured largely around the definitions and elements of implementation stages and implementation drivers laid out by the National Implementation Research Network (NIRN). NIRN has reviewed and summarized findings from a large body of implementation literature to identify the practices and supports that are common among successfully implemented programs and interventions that produce their intended outcomes. The Topic Guide is organized into eight constructs:

  1. PREP Implementation Structure and Planning. The Design Survey captured information about the initial design and implementation plans for all the PREP grantees. The objective for the Implementation Survey is to understand how the implementation structure may have changed in the four study states and why, as well as lessons learned about the implementation structure that could inform future initiatives to provide evidence-based programs state-wide. Questions in this construct will also gather information on the means for and perceived success in assessing the PREP model fit for program providers, and the readiness of provider organizations, their partners, and communities to support the PREP program.

  2. Implementation Support: Training and Technical Assistance. Questions in this construct will focus first on capturing details of the implementation support structure and adequacy of funding for supports within each of the four states. The questions will then delve into the timing, content, modes, and frequency of training and technical assistance to support the PREP program. Additionally, questions will gather information on how training and technical assistance needs are identified and initiated, and what follow-up occurs to further refine implementation and decision-making.

  3. Implementation Drivers: Staff Competency Drivers. While training and technical assistance build and support staff capacity to deliver evidence-based programs as intended, additional drivers that are specific to the implementation staff themselves also contribute to the competency with which a program is implemented. Questions in this construct will examine the criteria and process for staff selection; the degree of staff turnover or retention and strategies to support consistency in service delivery staff; the means through which program expectations are communicated to staff and staff receptiveness to the program; and, lastly, the means, focus, and frequency of staff supervision and performance assessment.

  4. Implementation Drivers: Organization Drivers. The capacity and support that exists within each organization for the program also contributes to the quality and fidelity with which the program can be implemented. Questions in this construct will examine: (1) decision support data systems—what data are collected, with what quality assurances, and how the data are used in implementation; (2) the degree of facilitative administration—what organization leaders are willing to do or change to support implementation; (3) communication and feedback loops that involve all levels of staff and the range of organizational partners in implementation refinement and decision-making; and, (4) strategies, successes, and challenges in systems interventions—how partners, communities, and other service systems are brought on board to support implementation and how responsive they are to changes that support implementation.

  5. Fidelity Assessment and Monitoring. While PREP grantees are not held to specific benchmarks in maintaining fidelity, the majority of state grantees indicated that fidelity monitoring would be an important emphasis in their programs. Questions will focus on the means through which expectations about adherence to the service model are communicated to programs and staff; the means, focus, and frequency of monitoring service delivery; and the adaptations that have been made to PREP programs and why.

  6. Evaluation Capacity. Questions in this construct will gather information about the evaluation capacity in each of the four states, as well as how information from evaluation is communicated and used to inform PREP implementation. They will also assess the type and frequency of PREP program evaluations, and ask about key findings from completed program evaluations (if available) and the extent to which state grantees use evaluation results to improve program models.

  7. Sustainability. Questions in this construct will examine the extent to which plans are underway to preserve the service delivery and funding structures to continue the PREP program and whether these pursuits are coordinated across implementation partners.

  8. Perceptions and Lessons Learned about PREP Implementation. The Implementation Survey will end with several summary questions to gather each respondent’s perspective on the successes and challenges of implementing PREP and on the effectiveness of the support structures.

The specific questions asked by the study team during the semi-structured interviews will vary by respondent type, but all questions will remain within the scope of the constructs discussed above and detailed in attached Instrument 1: Implementation Survey Interview Topic Guide. Table A.2 summarizes the interview topics by respondent type. Each interview will last an average of one hour.

Table A.2. Implementation Study Semi-Structured Interview Topics by Respondent Type

Respondent types: state grantee lead staff; training and technical assistance provider staff; evaluator staff; program provider managers. Topics are tailored to each respondent’s role; the respondent-specific question sets appear in Instrument 1.

Implementation Structure and Planning
  • Implementation structure
  • Model fit for service providers
  • Lessons from PREP planning

Implementation Support: Training and Technical Assistance (TA)
  • Implementation support structure
  • Funding for training and TA
  • Training to support PREP implementation
  • Ongoing support and TA

Implementation Drivers: Competency Drivers
  • Staff selection
  • Staff turnover and retention
  • Staff expectations and receptiveness
  • Staff supervision and performance assessment

Implementation Drivers: Organizational Drivers
  • Decision-support data systems
  • Facilitative administration
  • Communication and feedback loops
  • Systems interventions

Fidelity Assessment and Monitoring
  • Program modifications or adaptations
  • Adherence to service model
  • Monitoring service delivery

Evaluation Capacity
  • Evaluation capacity

Sustainability
  • Sustainability

Perceptions and Lessons Learned about PREP Implementation
  • Perceptions and lessons learned

A.3. Use of information technology to reduce burden


Because the Implementation Survey consists of semi-structured telephone interviews with a small number of respondents, it is not conducive to the use of information technology for data collection. The study team will, however, use email to introduce the survey to respondents and to schedule interviews (see Instrument 2).



A.4. Efforts to identify duplication and use of similar information


We have carefully reviewed the information collection requirements for PREP to avoid duplication with either existing studies or other ongoing federal teen pregnancy prevention evaluations and believe that the PREP Evaluation complements, but does not duplicate, the existing literature and other ongoing federal teen pregnancy prevention evaluations.

There are three other federal teen pregnancy prevention-related evaluations currently in the field, each with a very specific focus. They are: (1) the Evaluation of Adolescent Pregnancy Prevention Approaches, sponsored by the Office of Adolescent Health within HHS, which focuses on testing promising and innovative new models for reducing teen pregnancy (OMB Control # 0970-0360); (2) the Teen Pregnancy Prevention Replication Study, also sponsored by the Office of Adolescent Health within HHS, which focuses on the testing of evidence-based models for reducing teen pregnancy (which are being scaled up through the Teen Pregnancy Prevention Program administered by the HHS Office of Adolescent Health) (OMB Control # 0990-0375); and (3) the Community-Wide Initiatives Study (OMB Control # 0920-0952), sponsored by the Centers for Disease Control and Prevention, which focuses on testing community saturation models for reducing teen pregnancy.

We believe that the PREP Evaluation complements these other evaluations by adding much-needed information on the replication of evidence-based programs, with particular emphasis on (1) replication among high-risk populations and in new settings and (2) how states and localities choose and implement evidence-based programs most appropriate for their local contexts. Further, the evaluation provides a unique opportunity to document and test adaptations to existing evidence-based program models—for example, through the incorporation of adulthood preparation education or through the natural adaptation that will arise as states choose and implement programs in different ways. Finally, the evaluation will offer lessons on the successes and challenges in scaling up and disseminating evidence-based programming on a very broad scale, with implications not only for the field of teen pregnancy prevention research but also for other areas of social services research.

The Implementation Survey phase of the PREP Evaluation’s Design and Implementation Study component will provide policymakers and practitioners with critical information about approaches to supporting and maintaining high-quality implementation of evidence-based programs. This is not currently being assessed as part of the three evaluations mentioned above or through any of the other components of the PREP Evaluation. The Implementation Study phase of the ongoing Impact and In-Depth Implementation Study component of the PREP Evaluation is targeted toward understanding implementation of and fidelity to a specific PREP model within each of four states that are the focus of the Impact Study phase. The Implementation Survey differs from this work in that the survey is focused on understanding the state infrastructure to support implementation quality and fidelity for a variety of program models within each selected state (not the specific implementation of one model). The study team does not plan to select any states for the Implementation Survey that are participating in the Impact and In-Depth Implementation Study; however, if one of these states helps us to achieve important variation along one of the key dimensions of state selection, the study team might invite the state to participate in the Implementation Survey. Should this unlikely scenario arise, the study team will make every effort not to interview the same state-level respondents who are interviewed for the Impact and In-Depth Implementation Study.   Further, the study team will ensure that program providers that are already involved in the In-Depth Implementation Study will not also be selected for the Implementation Survey in order to minimize the burden placed on them.

The study team will use extant data collected through multiple PREP Evaluation components to reduce respondent burden for the Implementation Survey. Prior to conducting the interviews, the study team will review (a) data collected from state PREP grantees during the Design Survey phase of the DIS (including grantee information from states’ applications for PREP funds and Design Survey interview responses), and (b) performance measures data collected for the Performance Analysis Study. The study team will customize the Implementation Survey topic guides so that interviews ask only for information that has not already been collected for the evaluation and is not otherwise available from public sources. Respondents will be asked only those questions relevant to their role in PREP program quality assurance, monitoring, evaluation, and technical assistance to PREP providers. No respondent will be asked the same question more than once.


A.5. Impact on small businesses


PREP program providers in some states may be small, community-based organizations. We will minimize burden on such sites by requesting an interview of just one respondent, conducting the interview over the phone, and ensuring that the interview can be completed in one hour or less.

A.6. Consequences of not collecting the information/collecting less frequently


Implementation Survey data will be collected only once. These data are essential for developing an in-depth understanding of how states support and monitor evidence-based programs to ensure that they are implemented with quality and fidelity. Not collecting these data would make it impossible to accurately understand and assess states’ quality assurance structures. The lessons learned from this data collection effort can inform the structures and supports needed to support large-scale, successful implementation of evidence-based programs. No other component of the PREP evaluation will provide such evidence.

A.7. Special circumstances


There are no special circumstances for the proposed data collection efforts.


A.8. Federal Register notice and consultation outside the agency


In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and OMB regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on October 22, 2013, Volume 78, Number 204, pages 62637-62638 and provided a 60-day period for public comment. During the notice and comment period, no comments were received. A copy of the 60-day FRN is included as Attachment A.

The names and contact information of the persons consulted in the drafting and refinement of the interview topic guide and analysis plan for the Implementation Survey are:

Clare DiSalvo

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services

370 L'Enfant Promenade, SW

7th Floor West

Washington, DC  20447

(202) 401-4537


Dirk Butler

Family and Youth Services Bureau

Division of Abstinence Education

U.S. Department of Health and Human Services

370 L’Enfant Promenade, SW

Washington, DC 20447

(202) 260-2242

Robert Wood

Mathematica Policy Research

P.O. Box 2393

Princeton, NJ 08543-2393

(609) 936-2776


Susan Zief

Mathematica Policy Research

P.O. Box 2393

Princeton, NJ 08543-2393

(609) 275-2291


Gretchen Kirby

Mathematica Policy Research

1100 1st Street, NE, 12th Floor
Washington, DC 20002-4221

(202) 484-3470


Diane Paulsell

Mathematica Policy Research

P.O. Box 2393

Princeton, NJ 08543-2393

(609) 275-2297

Jessica Ziegler

Mathematica Policy Research, Inc.

955 Massachusetts Ave., Suite 801

Cambridge, MA 02139

(617) 715-9939



A.9. Payments to respondents


No payments to respondents are proposed for this information collection.


A.10. Assurances of confidentiality


The Implementation Survey will not collect or report any personally identifiable information. Nonetheless, the study team will adhere to a set of strict standards to ensure that data and respondent privacy are protected. Respondents will receive information about privacy protection when arrangements are made for speaking with them, and information about privacy will be repeated as part of the study team’s introductory comments during interviews. The study team will be informed about privacy procedures during training and will be prepared to describe them and to answer questions raised by respondents.

The study team will ensure respondent privacy in all Implementation Survey publications. An Implementation Survey report will summarize the commonalities and differences in how the four states approach program fidelity and quality monitoring and technical assistance. The report will primarily describe themes emerging across the states, without naming particular states, programs, or people. However, when necessary to convey a key point or finding, the report will highlight examples from individual states and sites. The state and sites included in these examples will only be identified by name when the example discusses a best practice, and the study team will obtain their approval of all such references. The report will also contain brief summaries of each state’s implementation support and monitoring approach that will identify states by name. These summaries will only include facts about program monitoring and support, will not impart respondent or study team opinions, and will not compare states to one another.

Individual interview respondents will not be identified by name in any Implementation Survey publications. They will be notified of the reporting approaches discussed above as part of the privacy information provided by the study team prior to interviews (see the introduction of attached Instrument 1: Implementation Survey Interview Topic Guide).

A.11. Justification for sensitive questions


The Implementation Survey Interview Topic Guide does not contain topics of a sensitive or personal nature. No personal information will be requested from respondents interviewed for the Implementation Survey. The interviews will focus on respondents’ knowledge, experiences, and role in PREP program monitoring and technical assistance. Nonetheless, respondents will be informed that they do not have to respond to any questions that they do not feel comfortable answering (see the introduction of attached Instrument 1: Implementation Survey Interview Topic Guide).

A.12. Estimates of the burden of data collection


We are requesting three years of clearance for the Implementation Survey data collection.

1. Estimate of Annual Burden – Current ICR

The annual estimated burden for the data collection included in this request for clearance is about 12 hours (see Table A.3). This is the total burden for the semi-structured telephone interviews with 32 state-level and provider-level respondents, annualized over the three-year clearance period. Each interview is expected to last one hour.

2. Estimates of Annual Costs – Current ICR

The annual estimated cost to respondents for the Implementation Survey data collection is $361.08 (see Table A.3). The total annual cost for state-level interview respondents is $232.08 (6 annual hours * $38.68). This hourly wage rate represents the mean hourly wage rate for social scientists and related workers (National Occupational Employment and Wage Estimates, Bureau of Labor Statistics, Department of Labor, May 2013). The total annual cost for provider-level respondents is $129.00 (6 annual hours * $21.50). This hourly wage rate represents the mean hourly wage rate for community and social service occupations (National Occupational Employment and Wage Estimates, Bureau of Labor Statistics, Department of Labor, May 2013).



Table A.3. Estimate of the Burden and Cost for the Grantees and Implementation Sites for Implementation Study Data Collection

State-level Respondents: 16 total respondents; 6 annual respondents(a); 1 response per respondent; 1 hour average burden per response; 6 total annual burden hours(a); $38.68 average hourly wage; $232.08 total annualized cost(a)

Provider-level Respondents: 16 total respondents; 6 annual respondents(a); 1 response per respondent; 1 hour average burden per response; 6 total annual burden hours(a); $21.50 average hourly wage; $129.00 total annualized cost(a)

Total: 12 total annual burden hours(a); $361.08 total annualized cost(a)

(a) All burden estimates are annualized over three years.
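
The annualized figures in Table A.3 follow directly from the respondent counts, interview length, and wage rates cited above. The short sketch below simply reproduces that arithmetic for reference; it is an illustration only, and rounding the annual number of respondents up to a whole person is an assumption made to match the table values.

```python
# Illustrative reproduction of the Table A.3 annualized burden and cost figures.
# Counts, hours, and wage rates are taken from the text above; rounding annual
# respondents up to whole persons is an assumption consistent with the table.
import math

CLEARANCE_YEARS = 3
respondent_groups = [
    # (label, total respondents, responses each, hours per response, hourly wage)
    ("State-level respondents", 16, 1, 1.0, 38.68),
    ("Provider-level respondents", 16, 1, 1.0, 21.50),
]

total_hours = 0.0
total_cost = 0.0
for label, n, responses, hours_per_response, wage in respondent_groups:
    annual_respondents = math.ceil(n / CLEARANCE_YEARS)                 # 16 / 3 -> 6
    annual_hours = annual_respondents * responses * hours_per_response  # 6 hours
    annual_cost = annual_hours * wage
    total_hours += annual_hours
    total_cost += annual_cost
    print(f"{label}: {annual_hours:.0f} annual hours, ${annual_cost:.2f}")

print(f"Total: {total_hours:.0f} annual burden hours, ${total_cost:.2f}")
# Expected output: 6 hours/$232.08, 6 hours/$129.00, and a total of 12 hours/$361.08.
```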

3. Overall Burden under OMB #0970-0398

Table A.4 details the overall burden approved and requested for data collection associated with the PREP Multi-Component Evaluation. A total of 36,931 annual burden hours (at an annualized cost of $463,824) has been approved thus far under the prior four ICRs for this project. A total of 12 annual burden hours (at an annualized cost of $361.08) is requested in this ICR. If approved, the total annual burden for this project (i.e., the prior annual burden plus the requested annual burden) will be 36,943 hours (at an annualized cost of $464,185.08). Note: Data collection is complete for some of the instruments below, while other instruments are still in use in the field; see Table A.4 for details on which instruments are no longer in use and which remain in use.

Table A.4. Estimate of Burden and Cost for the PREP Evaluation – Approved and Requested Burden

Columns: Data Collection Instrument | Type of Respondent | Total Number of Respondents | Number of Responses per Respondent(a) | Average Burden Hours per Response | Total Burden Hours | Annual Burden Hours | Annual Burden Hours for Age 18 or Older | Hourly Wage Rate | Total Annualized Cost

Field Data Collection for Impact and In-Depth Implementation Study (Approved November 6, 2011; Data Collection Completed)
Discussion Guide for Use with Macro-Level Coordinators | Macro-Level Coordinators | 30 | 1 | 1 | 30 | 10 | N/A | $33.59 | $333.90
Discussion Guide for Use with Program Directors | Program Directors | 60 | 2 | 2 | 240 | 80 | N/A | $27.21 | $2,176.80
Discussion Guide for Use with Program Staff | Program Staff | 120 | 1 | 2 | 240 | 80 | N/A | $23.76 | $1,900.80
Discussion Guide for Use with School Administrators | School Administrators | 210 | 1 | 1 | 210 | 70 | N/A | $35.54 | $2,487.80

Design and Implementation Study (Approved March 7, 2012; Data Collection Completed)
Design Survey: Discussion Guide for Use with PREP State-Level Coordinators and State-Level Staff | State-Level Coordinators and State-Level Staff | 90 | 1 | 1 | 90 | 30 | N/A | $37.45 | $1,123.50

Performance Analysis Study and Baseline Data (Approved March 12, 2013; Instruments Still in Use)
Participant Entry Survey | PREP State and Tribal Participants | 105,309 | 1 | 0.08333 | 8,775 | 2,925 | 731 | $7.25 | $5,300.00
Participant Exit Survey | PREP State and Tribal Participants | 133,722 | 1 | 0.16667 | 22,287 | 7,429 | 743 | $7.25 | $5,386.00
Baseline Survey | PREP State and Tribal Participants | 5,700 | 1 | 0.75 | 4,275 | 1,425 | 143 | $7.25 | $1,037.00
Performance Reporting System Data Entry Form | PREP State and Tribal Grantee Administrators | 195 | 1 | 24 | 4,680 | 1,560 | N/A | $21.35 | $33,306.00
Sub-awardee Data Collection and Reporting | PREP State and Tribal Sub-Awardee Administrator | 1,050 | 1 | 18.6667 | 19,600 | 6,533 | N/A | $20.76 | $135,625.00
Implementation Site Data Collection | PREP State and Tribal Site Facilitator | 4,200 | 1 | 8 | 33,600 | 11,200 | N/A | $20.76 | $232,512.00

Impact and In-Depth Implementation Study (Approved November 8, 2013; Instruments Still in Use)
Participant Entry Survey | CPREP Participants | 17,673 | 1 | 0.08333 | 1,473 | 491 | 123 | $7.25 | $892
Participant Exit Survey | CPREP Participants | 22,961 | 1 | 0.16667 | 3,827 | 1,276 | 128 | $7.25 | $928
Performance Reporting Data System Entry Form | CPREP Grantees | 37 | 2 | 19 | 1,406 | 469 | N/A | $20.76 | $9,736
Implementation Site Data Collection Protocol | CPREP Implementation Sites | 300 | 2 | 6 | 3,600 | 1,200 | N/A | $20.76 | $24,912
First Follow-Up Survey | Participants | 4,800 | 1 | 0.75 | 3,600 | 1,200 | 120 | $7.25 | $870.00
Second Follow-Up Survey | Participants | 2,250 | 1 | 0.75 | 1,688 | 563 | 56 | $7.25 | $406.00
Focus Group Discussion Guide | Participants | 320 | 1 | 1.5 | 480 | 160 | 16 | $7.25 | $117.00
Master List of Topics for Staff Interviews | State, Grantee, Subawardee and Implementation Site Staff | 160 | 2 | 1 | 320 | 107 | N/A | $20.76 | $2,221.00
Staff Survey | Implementation Site Staff | 100 | 2 | 0.5 | 100 | 33 | N/A | $20.76 | $685.00
Program Attendance | Implementation Site Staff | 90 | 12 | 0.25 | 270 | 90 | N/A | $20.76 | $1,868.00

Subtotal (Burden Approved To-Date): 36,931 annual burden hours; $463,824 total annualized cost

Design and Implementation Study (Requested with Current ICR)
Implementation Survey Interview Topic Guide | State-level Respondents | 16 | 1 | 1 | 16 | 6 | N/A | $38.68 | $232.08
Implementation Survey Interview Topic Guide | Provider-level Respondents | 16 | 1 | 1 | 16 | 6 | N/A | $21.50 | $129.00

Subtotal (Burden Requested with Current ICR): 12 annual burden hours; $361.08 total annualized cost

Total (Annual Burden, Approved To-Date and Requested): 36,943 annual burden hours; $464,185.08 total annualized cost

(a) Number of responses over the three-year period.






A.13. Estimates of other total annual cost burden to respondents and record keepers


These information collection activities place no capital or start-up costs, and no ongoing operation or maintenance costs, on respondents or record keepers.

A.14. Annualized cost to federal government


The estimated total cost to the federal government for completion of the PREP Evaluation—across all data collection activities—is $7,935,964, an annualized cost of $2,645,321. This includes the total estimated cost for completion of the Implementation Survey phase of the DIS, which is $400,000, or $133,333 per year annualized across three years. PREP Evaluation costs include OMB applications, development of data collection instruments, data collection, data analysis, and reporting.

A.15. Explanation for program changes or adjustments


OMB has previously approved four ICRs related to the PREP Evaluation under OMB Control # 0970-0398; these are listed in section A.1 above. This request will increase the total approved burden for the PREP Evaluation under OMB Control # 0970-0398.

A.16. Plans for tabulation and publication and project time schedule


This section details the analysis plan and time schedule and publications related to the Implementation Survey phase of the DIS. For information related to previously approved activities, see ICRs under OMB #0970-0398.

1. Analysis Plan


The Implementation Survey will use the information collected from across respondents to conduct an in-depth analysis of how states support the PREP program to promote quality and fidelity in implementation. The analysis plan consists of three steps: (1) code the qualitative interview data; (2) organize results in a series of tables for further synthesis; and (3) identify themes in the data within and across states and program providers.

Code the qualitative data. First, the study team will create a coding scheme that closely follows the eight constructs of the Topic Guide and the subtopics contained within each of them. The study team will then use a qualitative software package (NVivo or Atlas.ti) to assign codes to each question response in the electronic files of interview notes. Coding the qualitative data in this way will enable the team to access data on a specific topic quickly and to organize information in different ways, facilitating the identification of themes and the compilation of the evidence supporting them. For thematic data, such as perceptions of structures, successes, and challenges, the coding scheme will be refined to better align it with the themes and topics that emerge from the data and with the research questions (Ritchie and Spencer, 2002).

Organize results in a series of tables. Next, the study team will organize key descriptive data—such as information on program implementation structure (e.g., program model), support structure (e.g., training frequency, duration, and topics), and staff qualifications—into tables. This will ensure that data about program implementation and support are documented in a standardized way that allows for systematic analysis across sites.

Identify themes and patterns in the data. After all the data have been coded and organized, the study team will use the software to retrieve data on the research questions and subtopics to identify themes and triangulate across data sources and individual respondents. Much of the meaning of the data will be discerned through qualitative descriptive analyses that organize data thematically, and that examine themes and topics from multiple perspectives and highlight the similarities and differences among them (Patton, 2002). The study team will also explore relationships across themes (for example, relationships between the types of implementation challenges sites face and their staffing patterns and partnership arrangements).
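
To make the tabulation and theming steps concrete, the sketch below shows one way coded interview excerpts could be grouped by construct and state for cross-site comparison. It is a hypothetical illustration only: the construct codes, states, and excerpt text are invented placeholders, and the sketch does not represent the study team’s actual NVivo or Atlas.ti workflow.

```python
# Hypothetical sketch of the tabulation step: grouping coded interview excerpts
# by construct and state so themes can be compared across sites. The construct
# codes mirror the Topic Guide; the excerpt records are invented placeholders,
# not actual interview data.
from collections import defaultdict

coded_excerpts = [
    # (state, respondent type, construct code, note)
    ("State A", "State grantee lead", "training_ta", "Quarterly booster trainings for facilitators"),
    ("State A", "Provider manager", "fidelity_monitoring", "Observation checklist used twice per cycle"),
    ("State B", "Evaluator", "fidelity_monitoring", "Fidelity logs reviewed monthly"),
]

# Build a matrix keyed by (construct, state) for cross-site comparison tables.
matrix = defaultdict(list)
for state, respondent, construct, note in coded_excerpts:
    matrix[(construct, state)].append(f"{respondent}: {note}")

# Print one cell of the comparison table per (construct, state) pair.
for (construct, state), notes in sorted(matrix.items()):
    print(construct, "|", state)
    for note in notes:
        print("   -", note)
```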

2. Time Schedule and Publications


Findings from the Implementation Survey will be presented in a report that summarizes the commonalities and differences in how the survey states approach program fidelity and quality monitoring and technical assistance, and that contains brief profiles of each state’s implementation support and monitoring approach. Table A.5 shows the schedule for the Implementation Survey.

Table A.5. Schedule for the Implementation Survey

Conduct telephone interviews: Fall 2014 – Winter 2015

Analysis: Spring 2015

Reporting: Summer 2015


A.17. Reason(s) display of OMB expiration date is inappropriate


No exemption from displaying the expiration date is requested. The OMB approval number and expiration date will be displayed or cited on all instruments and forms completed as part of the data collection.


A.18. Exceptions to certification for Paperwork Reduction Act submissions


No exceptions are necessary for this information collection.

References

National Implementation Research Network (NIRN). Research and resources. Chapel Hill, NC: FPG Child Development Institute, University of North Carolina at Chapel Hill. Retrieved June 9, 2014, from http://nirn.fpg.unc.edu/resource-search.

Patton, M.Q. (2002). Qualitative research and evaluation methods (3rd ed.). Thousand Oaks, CA: Sage Publications.

Ritchie, J., and Spencer, L. (2002). Qualitative data analysis for applied policy research. In A.M. Huberman and M.B. Miles (Eds.), The qualitative researcher’s companion. Thousand Oaks, CA: Sage Publications.

Zief, S., Shapiro, R., and Strong, D. (2013). The Personal Responsibility Education Program: Launching a nationwide adolescent pregnancy prevention effort. Final report. Princeton, NJ: Mathematica Policy Research.



