OMB Control Number: 0990-0469

Part A: Supporting Statement Justification for the Collection

Cross-site Study Data for Improving Implementation Evaluation among Office of Population Affairs (OPA) Teen Pregnancy Prevention (TPP) Grantees to inform National Implementations (IMAGIN)


April 2019





Point of Contact: Tara Rice

Office of Population Affairs
Office of the Assistant Secretary for Health
U.S. Department of Health and Human Services
1101 Wootton Parkway, Suite 700


CONTENTS

PART A: INTRODUCTION

A.1. Circumstances Making the Collection of Information Necessary

1. Legal or Administrative Requirements that Necessitate the Collection

2. Study Objectives

A.2. Purpose and Use of the Information Collection

A.3. Use of Information Technology to Reduce Burden

A.4. Efforts to Identify Duplication and Use of Similar Information

A.5. Impact on Small Businesses

A.6. Consequences of Not Collecting the Information/Collecting Less Frequently

A.7. Special Circumstances

A.8. Federal Register Notice and Consultation Outside the Agency

A.9. Payments to Respondents

A.10. Assurance of Confidentiality

A.11. Justification for Sensitive Questions

A.12.A. Estimates of the Burden of Data Collection

A.12.B. Estimates of Annualized Burden Costs

A.13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

A.14. Annualized Cost to Federal Government

A.15. Explanation for Program Changes or Adjustments

A.16. Plans for Tabulation and Publication and Project Time Schedule

1. Analysis Plan

2. Time Schedule and Publications

A.17. Reason(s) Display of OMB Expiration Date is Inappropriate

A.18. Exceptions to Certification for Paperwork Reduction Act Submissions

SUPPORTING REFERENCES



TABLES

Table A.2.1. Summary and timeline of cross-site study activities

Table A.12.A.1. Calculations of Annual Burden Hours

Table A.12.B.1. Estimated Annualized Burden Costs

Table A.16.1. Timeline for Use of Data Collection Instruments




ATTACHMENTS

ATTACHMENT A: EMAIL INVITATION FOR GRANTEE LEADERSHIP INTERVIEW

ATTACHMENT B: EMAIL INVITATION TO BEGIN SITE VISIT PLANNING

ATTACHMENT C: FRONTLINE STAFF SURVEY INVITATION AND REMINDER EMAILS

ATTACHMENT D: PERSONS CONSULTED ON INSTRUMENT DEVELOPMENT AND/OR ANALYSIS

ATTACHMENT E: CONFIDENTIALITY PLEDGE

ATTACHMENT F: 60-DAY FEDERAL REGISTER NOTICE


INSTRUMENTS

INSTRUMENT 1: GRANTEE PROGRAM LEADERSHIP STAFF INTERVIEW GUIDE (INITIAL AND FOLLOW-UP)

INSTRUMENT 2: KEY PROGRAM STAFF INTERVIEW GUIDE

INSTRUMENT 3: COMMUNITY STAKEHOLDER INTERVIEW GUIDE

INSTRUMENT 4: FRONTLINE STAFF SURVEY




PART A: INTRODUCTION

The consequences of adolescent sexual activity remain a critical social and economic issue in the United States. Although births to teen mothers have dropped sharply over the past 25 years, the teen birthrate remains higher in the United States than in other industrialized countries and varies widely across geographic regions and racial and ethnic groups (Martin et al. 2017). Adolescents and young adults also account for half of all sexually transmitted infection (STI) cases each year (Centers for Disease Control and Prevention [CDC] 2017), and rates of STIs continue to rise (CDC 2018). Because adolescent sexual activity often occurs outside stable, long-term relationships, it is frequently linked to risk behaviors, such as alcohol and substance use, teen dating violence, and sexual assault, and to specific vulnerabilities, such as living in foster care or involvement with the juvenile justice system.

As part of the government's ongoing efforts to support youth in making healthy decisions about their relationships and behaviors, Congress authorized the Teen Pregnancy Prevention (TPP) program in 2010. Administered by the Office of Population Affairs (OPA) (formerly the Office of Adolescent Health, which merged into OPA effective June 2019) of the Office of the Assistant Secretary for Health (OASH) at the U.S. Department of Health and Human Services (HHS), TPP provides funding to local organizations to implement evidence-based programs (Tier 1) and promising new programs (Tier 2). The TPP Tier 1 grantees (funded in 2010 and in 2015) focused on replicating and testing specific curricula; however, the evaluations of these programs raised more questions than they answered about how to design and deliver high-quality, effective TPP programs.

With a new TPP funding opportunity announcement in spring 2018, OPA is addressing this knowledge gap. The first phase (Phase I) of these new grants has a specific focus on program improvement and formative evaluation, to help grantees identify and integrate elements of effective programs and to facilitate readiness for implementation and summative evaluation. Summative evaluation will occur in Phase II for a subset of Phase I grantees that show readiness for summative evaluation. In 2018, OPA awarded two-year Phase I grants to 14 Tier 2 grantees (TPP18). An additional 30 two-year Phase I grants to support evidence-based programming (Tier 1) are expected in 2019 (TPP19). Grantees are expected to develop or select programs that integrate sexual risk avoidance (SRA) and/or sexual risk reduction (SRR) approaches, as well as address youth, family, and systems-level protective factors.

OPA's contract, Improving Implementation Evaluation among OPA TPP Grantees to Inform National Implementation (IMAGIN), supports the TPP program improvement agenda. The study will support grantees in two goals: (1) assessing the quality and feasibility of their programming and using data to improve program delivery through continuous quality improvement, and (2) providing important information on the design and implementation of effective TPP programs to grantees and relevant efforts across the nation.

With this new information collection request (ICR), OPA seeks approval for data collection activities for a cross-site study to address the second goal above. These activities, to be conducted with grantees and their partners, will include (1) interviews with leadership staff, such as grantee directors; (2) discussions with key program staff; (3) discussions with community stakeholders; and (4) a frontline staff survey. The cross-site study will document and describe the process, challenges, and successes related to getting programs ready for implementation and summative evaluation. The study will also assess how a multiphase grant effort supported grantees in implementing and preparing for rigorous evaluation of their program models.

A.1. Circumstances Making the Collection of Information Necessary

1. Legal or Administrative Requirements that Necessitate the Collection

As part of the government's ongoing efforts to support youth in making healthy decisions about their relationships and behaviors, Congress authorized the Teen Pregnancy Prevention (TPP) program in 2010. Administered by the Office of Population Affairs of the Office of the Assistant Secretary for Health (OASH) at HHS, TPP provides funding to local organizations to implement evidence-based programs (Tier 1) and promising new programs (Tier 2). Section 301 of the Public Health Service Act (42 U.S.C. 241) authorizes studies relating to the causes, diagnosis, treatment, control, and prevention of diseases; P.L. 115-245 authorizes TPP program funding.

2. Study Objectives

Current literature on the factors related to successful implementation suggests that for social service programs to achieve their intended outcomes, the interventions must be designed well, executed and implemented as intended, and operate in a local environment or context that enables their success (Fixsen et al. 2005). Drawing on the conceptual framework for IMAGIN, the cross-site team will examine three specific indicators of implementation readiness: 1) the readiness of the program model, 2) the readiness of the organization implementing the program, and 3) the degree to which there is promising evidence to support the program’s implementation and evaluation in its local context (Figure 1).

Figure 1. Conceptual framework for IMAGIN

Note: CQI = continuous quality improvement.

Using a thematic and holistic approach and relying on data from several sources, the cross-site study will address specific research questions to meet four key objectives:

  1. Understand how ready for implementation the proposed program models are. The cross-site study team will document and describe the elements related to the design of the proposed programs, and explore how ready the programs are for implementation and summative evaluation. We will examine each program's theory of change, content, structure, and approach and how its design incorporates protective factors and other elements related to optimal health. Our research questions will explore the targeted outcomes and the process used to develop or select a program that is the right fit for the target community. We will also look at the degree to which the program's materials have been developed, refined, and standardized for use, whether the program's content and methodology are trauma-informed and medically accurate, and the types of guidance and benchmarks that have been developed to ensure implementation with fidelity. For example, the study team will look at how well the model specifies the type of staff required to implement the model, including their qualifications, the staff-to-participant ratio, and training requirements.

  2. Understand the grantee organization’s readiness for implementation and evaluation. For a program to be fully implemented and meet its desired outcomes, the implementing organization must have support from leaders, the staff and infrastructure to deliver the program as intended, and systems must be in place to monitor and improve program delivery. Therefore, the cross-site team will examine the different factors that affect readiness at the organizational level, such as: (1) the degree to which organization leaders believe in and support the program; (2) how the grantee plans to recruit, select, and train staff to meet program needs; (3) the degree to which processes for reaching and engaging participants have been defined; (4) the specific supports in place for staff, such as supervision and coaching; and (5) the types of data the grantee will collect to inform and improve performance and delivery on a consistent basis (known as continuous quality improvement, or CQI). We will also examine the grantee’s plans for formative evaluation, and how the grantee intends to use the Phase I grant structure to prepare for full implementation and evaluation. Finally, the team will explore whether and how the organization engaged with local stakeholders, partners, and families in the target community to get their input in planning for implementation.

  3. Document and describe evidence emerging from early implementation and formative evaluation. Once grantees begin implementing (and in some cases, formatively testing) their programs during Phase I, the study team will examine what they learn through this experience of initial implementation and how it prepares them for full implementation and summative evaluation. As part of this examination, we will incorporate questions on promising evidence to support the program and the rationale for a summative evaluation, such as community need and demand for the program’s services, the degree to which the program is being implemented with fidelity, and any data on youth outcomes that the grantee collects as part of its CQI process or formative evaluation. We will also explore staff perspectives on the program and each organization’s level of preparation and readiness, learn about any gaps and challenges grantees identified during early implementation, and identify the steps they took or are planning to take to address these for full implementation and summative evaluation.

  4. Define and disseminate lessons learned and guidance for the field as they relate to implementation readiness. The study team will examine common characteristics, features, challenges, and successes related to implementation readiness, and develop specific themes, lessons, and concrete tips and guidance for grantees operating in a variety of contexts. For example, there may be particular lessons and challenges related to implementation and evaluation readiness for grantees serving foster care youth, or those who are implementing programs in urban school districts.

OPA seeks Office of Management and Budget (OMB) approval for the data collection activities (Instruments 1–4) listed below to inform the objectives of the cross-site study. A three-year clearance is needed because the TPP19 grantees are expected to receive funding in September 2019. Data collection activities for the cross-site study will begin toward the end of the first year of the grant period for TPP18 grantees (fall 2019) and TPP19 grantees (summer 2020).

  • Discussion Guide for Interviews with Leadership Staff (Instrument 1). The discussion guide for interviews with leadership staff (directors or managers) consists of a series of topics, aligned with the cross-site study’s main objectives and research questions. The list of topics will guide initial and follow-up semi-structured discussions with leadership staff to elicit input related to program and organizational readiness, and will be tailored according to each grantee’s particular context. The initial interviews with leadership staff will last 90 minutes. Follow-up interviews with leadership staff will last 60 minutes.

  • Discussion Guide for Interviews with Key Program Staff (Instrument 2). The discussion guide for interviews with program staff (supervisors and frontline staff) consists of a series of topics aligned with the study objectives and research questions. It will guide semi-structured discussions with staff engaged in day-to-day operations and delivery of the programs, to elicit their input on program and organizational readiness, and the local context and suitability of the program. The topics will be tailored based on each grantee’s particular context and progress towards readiness. Interviews will last up to one hour and will be conducted during planned site visits to a subset of grantees. See Section B.1 for details on the selection of grantees for site visits.

  • Discussion Guide for Interviews with Community Stakeholders (Instrument 3). The discussion guide for interviews with key community stakeholders (such as local leaders) consists of a series of topics aligned with the study objectives and research questions, and will guide semi-structured discussions to elicit input related to the program's fit and suitability for the community's needs. The interviews will last up to 45 minutes and will be conducted during planned site visits to a subset of grantees.

  • Frontline Staff Survey (Instrument 4). The 30-minute web-based survey is designed to collect information on staff backgrounds and roles, the training and preparation they received to deliver the program, their experiences with early implementation and data collection for evaluation, and key lessons related to program and organizational readiness.

A.2. Purpose and Use of the Information Collection

This ICR describes the data collection activities for the cross-site study that will document grantees' preparation for program implementation and summative evaluation. The study will provide OPA and the field with a better understanding of the important factors and supports needed to develop and put into operation SRA/SRR program models for full implementation and rigorous evaluation.

Study Design

The cross-site study will include 44 grantees receiving TPP Tier 1 and Tier 2 grants, in two cohorts: OPA has funded 14 TPP18 Tier 2 grantees, and 30 TPP19 Tier 1 grantees are expected to receive funding next year, bringing the total number of grantees to 44. Three sources of data will be collected for the cross-site study using the four instruments described below: (1) phone interviews with grantee leadership staff; (2) site visits that include in-person interviews with key program staff and community stakeholders; and (3) an online frontline staff survey. An overview of the study's activities is below (Table A.2.1).

Discussions with grantee leadership staff. The study team will conduct phone interviews with grantee leadership, beginning in fall 2019 for TPP18 grantees and summer 2020 for TPP19 grantees, pending OMB approval (Instrument 1). A designated member of the cross-site study team will reach out to grantee leadership staff over email (Attachment A) to introduce and explain the study, and invite respondents to schedule a time for an initial 90-minute discussion. Prior to conducting the interviews, the team will assess existing grantee materials and reports provided by OPA, to tailor the topic guide and define specific topics relevant for the grantee's context. Each interview will be conducted by a two-member team (a lead interviewer and a note-taker). The study team will follow up to conduct a second, 60-minute interview with grantee leadership staff for all 44 grantees in year two of their grant period, to give them an opportunity to reflect on lessons learned and recommendations for future grantees.

Discussions with program staff and community stakeholders. After the initial discussions with leaders, the study team will visit a select number of grantees for two days each. The cross-site study team will select up to 14 grantees across the TPP18 and TPP19 cohorts for site visits, targeting a mix of grantees at different readiness stages and with varied implementation contexts. The study team will work with OPA and project officers to develop criteria and indicators of readiness for site selection that ensure an appropriate mix of grantees. More details on site selection are included in B.1. We will begin our visits in late fall 2019 (pending OMB approval), prioritizing a subset of three or four TPP18 grantees that are approved by OPA to begin full implementation in the second year of their Phase I grant. We will visit an additional four TPP18 grantees, and up to an additional seven TPP19 grantees, in summer and fall 2020, to give them time to reflect on their challenges, successes, and lessons. The number of visits that can be conducted will ultimately depend on grantees' progress toward implementation readiness and their status at the end of their first year.

Two-person teams led by a site study leader or other senior project staff will conduct each visit. Prior to the visit, the cross-site study team will coordinate with the program staff (and local evaluator, if any) to identify the relevant interview respondents for each grantee and decide on the best time for the site visit based on staff convenience and schedules (Attachment B). During the visit, site visitors will conduct individual and small-group interviews (where similar staff roles permit) with the following types of staff (Instruments 2 and 3):

  • Program managers or supervisors and frontline staff with major responsibility for the program’s day-to-day operations

  • Key community-based stakeholders, such as staff at partner agencies or community leaders with knowledge of or involvement in the program or services

Based on a review of grantee applications, evaluation plans, and quarterly reports submitted to OPA, the study team will begin to tailor the protocols, removing topics that are not relevant and incorporating grantee-specific information, goals, and status as grantees make progress over time. The study team expects that by or before the end of the first year for each cohort, it will have a better understanding of grantee plans and goals for the second year of the grant, and will tailor the topic guides in fall 2019 and spring 2020, prior to the site visits. Depending on each grantee's stage of readiness, we will want to understand the activities and processes it follows to achieve readiness, and what specific lessons can be drawn from that experience.

Survey of frontline staff for each grantee. For all grantees, the cross-site study team will survey the frontline staff delivering the programs (Instrument 4). It is expected that up to eight respondents from each grantee will take the survey, though this could vary based on the scope of the program and number of staff. The 30-minute web-based survey is designed to collect information on staff backgrounds and roles, the training and preparation they received to deliver the program, their experiences with early implementation and data collection for evaluation, and key lessons related to program and organizational readiness. The survey draws on similar surveys from other projects and incorporates specific closed- and open-ended questions aligned with and designed to address the cross-site study research questions. Frontline program staff will receive an email with study background information and details on how to log in and complete the survey at a time that is convenient for them (Attachment C).

Table A.2.1. Summary and timeline of cross-site study activities

Study activity | Timeline
OPA awards funding to TPP18 grantees | September 2018
Submit OMB package for IMAGIN cross-site study | February 2019
OPA awards funding to TPP19 grantees | July 2019
Grantee leadership interviews (TPP18) | October 2019, July 2020
Site visits and interviews (TPP18) | October–November 2019
Frontline staff survey (TPP18) | October–November 2019
Grantee leadership interviews (TPP19) | May–June 2020, March–April 2021
Site visits and interviews (TPP18 and TPP19) | July–December 2020
Frontline staff survey (TPP19) | September–October 2020
Analysis and reporting | April–September 2021

Note: The actual start date of data collection depends on OMB approval.

The cross-site study will meet the needs of OPA by providing important information related to the factors that affect implementation and evaluation readiness, and the steps necessary to ensure optimal implementation of SRA/SRR programs. The information obtained through the cross-site study can be used to inform decisions related to future government investments in SRA/SRR programs seeking to address teen pregnancy prevention and for the field of youth-serving professionals at large.

A.3. Use of Information Technology to Reduce Burden

In planning the interviews and site visits, the study team will use email to communicate with grantee leadership and program staff and to schedule data collection activities, minimizing burden to the extent possible.

Additionally, if necessary to minimize burden and collect comprehensive information, the study team will conduct small-group interviews, rather than individual interviews, when staff roles and schedules align. Each group interview will include staff at the same or similar levels. For example, one group interview may be held with two or three frontline workers, such as caseworkers or outreach specialists. A separate group discussion may be held with supervisors of frontline staff. If there is only one staff member in a particular level or staff schedules do not align, however, an individual interview will be conducted. Group interviews will allow the study team to reduce the length of time spent at the site, while still obtaining comprehensive and in-depth information from staff with a range of experiences.

The frontline staff survey is a web-based survey. Web-based surveys can decrease respondent burden and improve data quality. Unlike paper instruments, in which respondents must follow question routes themselves, the web-based application will include built-in skips that route respondents to the next appropriate question based on their answers, automatically skipping any questions that are not relevant to them. Additionally, data checks can be programmed into the survey to catch out-of-range and conflicting responses, thereby ensuring a cleaner dataset for analysis.
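As an illustration of these two features, the sketch below (Python, for illustration only; the question identifiers, answer sets, and routing rules are hypothetical, not actual survey content) shows how programmed skips and range checks might operate.

    def next_question(current, answer):
        """Return the next question ID, applying a built-in skip."""
        if current == "Q1_delivered_sessions":
            # Respondents who have not yet delivered sessions are routed
            # past the follow-up about session counts, straight to Q3.
            return "Q2_session_count" if answer == "yes" else "Q3_training"
        return "end"

    def check_range(question, answer, valid_answers):
        """Flag out-of-range responses before they enter the dataset."""
        if answer not in valid_answers[question]:
            raise ValueError(f"{question}: '{answer}' is out of range")
        return answer

    valid = {"Q1_delivered_sessions": {"yes", "no"}}
    check_range("Q1_delivered_sessions", "no", valid)
    print(next_question("Q1_delivered_sessions", "no"))  # -> Q3_training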

A.4. Efforts to Identify Duplication and Use of Similar Information

The information collection requirements for this cross-site study have been carefully reviewed to determine what information is already available from existing studies and program documents and what must be collected for the first time. Although the information from existing sources improves our understanding of intervention design and implementation, OPA does not believe that it provides enough information on the complexities of program implementation and the relationship between core components of implementation and youth outcomes.

A.5. Impact on Small Businesses

No small businesses will be involved in the data collection. Programs in some sites may be operated by nonprofit community-based organizations. The data collection plan is designed to minimize burden on such sites by having staff from Mathematica Policy Research collect data during the site visits and through follow-up telephone interviews as needed.

A.6. Consequences of Not Collecting the Information/Collecting Less Frequently

Rigorous evaluation of innovative initiatives is crucial to building evidence of what works and how best to allocate scarce government resources. This cross-site study represents an important opportunity for OPA to gain a better understanding of the implementation factors and supports needed to develop and put into operation SRA or SRR program models that can be rigorously evaluated.

Not collecting information for the study would limit the government’s ability to document the kinds of activities implemented and how those activities can be successfully implemented with Federal funds, as well as to measure the effectiveness of innovative approaches or programs. Data from this initial information collection offer an opportunity to determine whether the cost and time associated with this phase produce high-quality program models and rigorous evaluation designs.

The interviews and frontline staff surveys are a one-time collection effort. If these interviews and surveys are not conducted, the evaluation team will be limited in its ability to examine themes seen in the document reviews.

A.7. Special Circumstances

There are no special circumstances for the proposed data collection efforts.

A.8. Federal Register Notice and Consultation Outside the Agency

A 60-day Federal Register Notice was published in the Federal Register on February 8, 2019 (Vol. 84, No. 27, pp. 2887–2888; see Attachment F). No public comments were received.

The names and contact information of the persons consulted in the drafting and refinement of the instruments are in Attachment D.

A.9. Payments to Respondents

No incentives for respondents are proposed for this information collection.

A.10. Assurance of Confidentiality

Information collected will be kept private to the extent permitted by law. Respondents will be informed of all planned uses of the data, that their participation is voluntary, and that their information will be kept private to the extent permitted by law. Participants will also be informed that interviews will be recorded. Study staff will be trained on privacy procedures and will be prepared to describe them and to answer questions raised by participants. All field staff and phone interviewers will be required to sign a confidentiality pledge when hired by Mathematica; a blank example of this pledge is provided (Attachment E).

Mathematica has established security plans for handling data during all phases of the data collection. The plans include a secure server infrastructure for online data collection of the web-based frontline staff survey (Instrument 4), which features HTTPS-encrypted data communication, user authentication, firewalls, and multiple layers of servers to minimize vulnerability to security breaches. Hosting the survey on an HTTPS site ensures that data are transmitted using 128-bit encryption; transmissions intercepted by unauthorized users cannot be read as plain text. This security measure is in addition to standard user PIN and password authentication that precludes unauthorized users from accessing the web application. Any personally identifiable information used to contact respondents will be stored in secure files, separate from survey and other individual-level data. Once the respondent's submitted survey is deemed complete, contact information will be deleted from the secure server. In addition, we will use a unique identifier for each program name, stored separately, for analysis.

A.11. Justification for Sensitive Questions

The study will collect demographic information, including race, from frontline staff members through the online staff survey, because programs are delivered in a range of contexts. It is important to understand who the staff delivering programs and services are, beyond their position titles and roles. For example, it would be useful to understand whether and how the staff reflect the target population they serve. The survey question is:


Which of the following best describes you?

MARK ALL THAT APPLY

1 □ American Indian or Alaska Native

2 □ Asian

3 □ Black or African American

4 □ Native Hawaiian or Other Pacific Islander

5 □ White

6 □ Prefer not to say

A.12.A. Estimates of the Burden of Data Collection

OPA is requesting three years of clearance for the IMAGIN cross-site study. Table A.12.A.1 provides the estimated annual reporting burden for study participants as a result of the leadership staff interviews, program staff interviews, community stakeholder interviews, and the frontline staff survey.

  1. Annual Burden for Grantee Leadership Staff. It is expected that a total of 44 grantees will receive funding: 14 Tier 2 grantees in 2018 and 30 Tier 1 grantees in 2019. The study team will conduct one initial 90-minute interview with a member of each grantee's leadership staff. Each grantee director will be interviewed over the phone, for an annual burden of approximately 23 hours (15 annual respondents × 90/60 hours, rounded up; see Table A.12.A.1).

The study team plans to conduct one follow-up interview with leadership staff at all 44 grantees; it will last 1 hour and be conducted over the phone. The annual burden will be 44/3 years, or approximately 15 hours annually.

  2. Annual Burden for Program Staff. Within each of the 14 grantees selected for a site visit, we will conduct one-hour interviews with up to 10 staff members: eight frontline staff and two program supervisors. These interviews will take place during site visits, for a total annual burden of (14 × 10)/3 years, or approximately 47 hours annually. Additionally, we will administer a 30-minute (30/60 hour) frontline staff survey to up to eight frontline staff from each grantee, for a total of up to 352 respondents. The total annual burden hours for this effort will be [352 × (30/60)]/3 years, or approximately 59 hours annually.

  3. Annual Burden for Community Stakeholders. For each of the 14 grantees selected for a site visit, we will conduct a 45-minute (45/60 hour) interview with two community stakeholders, for a total of 28 respondents. The total annual burden will be [28 × (45/60)]/3 years, or 7 hours annually.
Table A.12.A.1. Calculations of Annual Burden Hours

Instrument | Type of respondent | Total Number of Respondents | Annual Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Annual Burden Hours
1. Grantee Program Leadership Staff Interview (initial) | Leadership staff | 44 | 15 | 1 | 90/60 | 23
1. Grantee Program Leadership Staff Interview (follow-up) | Leadership staff | 44 | 15 | 1 | 1 | 15
2. Key Program Staff Interview | Frontline staff and supervisors | 140 | 47 | 1 | 1 | 47
3. Community Stakeholder Interview | Community stakeholders | 28 | 9 | 1 | 45/60 | 7
4. Frontline Staff Survey | Frontline program staff | 352 | 117 | 1 | 30/60 | 59
Estimated Annual Burden Total | | | 203 | | | 151
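The figures in Table A.12.A.1 can be reproduced with a short script. The sketch below (Python, for illustration only; the round-half-up convention is our assumption, inferred from the published figures) recomputes the annual respondent and annual burden hour columns.

    import math

    def half_up(x):
        """Round to the nearest integer, with halves rounded up."""
        return math.floor(x + 0.5)

    CLEARANCE_YEARS = 3

    # (instrument, total respondents, responses per respondent, hours per response)
    rows = [
        ("Leadership interview, initial",   44, 1, 90 / 60),
        ("Leadership interview, follow-up", 44, 1, 1.0),
        ("Key program staff interview",    140, 1, 1.0),
        ("Community stakeholder interview", 28, 1, 45 / 60),
        ("Frontline staff survey",         352, 1, 30 / 60),
    ]

    total_resp = total_hours = 0
    for name, respondents, responses, hours in rows:
        annual_resp = half_up(respondents / CLEARANCE_YEARS)
        annual_hours = half_up(annual_resp * responses * hours)
        total_resp += annual_resp
        total_hours += annual_hours
        print(f"{name:34s} {annual_resp:4d} annual respondents {annual_hours:4d} annual hours")

    print(f"Totals: {total_resp} annual respondents, {total_hours} annual hours")  # 203 and 151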

A.12.B. Estimates of Annualized Burden Costs

We estimate the average hourly wage for community stakeholders, frontline staff, and frontline staff supervisors at the grantee organizations to be $23.69, the average hourly wage for "community and social service occupations" as determined by the U.S. Bureau of Labor Statistics Occupational Employment and Wage Statistics (U.S. Department of Labor 2018). For grantee program leadership staff, we estimate the hourly wage to be $37.86, the 90th percentile hourly wage for "community and social service occupations." The estimated annual cost burden is $4,115.65 (Table A.12.B.1).
Table A.12.B.1. Estimated Annualized Burden Costs

Instrument | Type of respondent | Annual Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Annual Burden Hours | Annual Burden Hours for Youth Age 18 or Older | Hourly Wage Rate | Total Annual Costs
1. Grantee Program Leadership Staff Interview (initial) | Leadership staff | 15 | 1 | 90/60 | 23 | 23 | $37.86 | $870.78
1. Grantee Program Leadership Staff Interview (follow-up) | Leadership staff | 15 | 1 | 1 | 15 | 15 | $37.86 | $567.90
2. Key Program Staff Interview | Frontline staff and supervisors | 47 | 1 | 1 | 47 | 47 | $23.69 | $1,113.43
3. Community Stakeholder Interview | Community stakeholders | 9 | 1 | 45/60 | 7 | 7 | $23.69 | $165.83
4. Frontline Staff Survey | Frontline program staff | 117 | 1 | 30/60 | 59 | 59 | $23.69 | $1,397.71
Estimated Annual Total | | | | | 151 | 151 | | $4,115.65
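Similarly, a brief illustrative script (Python; hours and wages taken directly from Tables A.12.A.1 and A.12.B.1) reproduces the annualized cost figures.

    WAGE_STAFF = 23.69        # average hourly wage, community and social service occupations
    WAGE_LEADERSHIP = 37.86   # 90th percentile hourly wage, same occupation group

    costs = [
        ("Leadership interview, initial",   23, WAGE_LEADERSHIP),
        ("Leadership interview, follow-up", 15, WAGE_LEADERSHIP),
        ("Key program staff interview",     47, WAGE_STAFF),
        ("Community stakeholder interview",  7, WAGE_STAFF),
        ("Frontline staff survey",          59, WAGE_STAFF),
    ]

    total = sum(hours * wage for _, hours, wage in costs)
    for name, hours, wage in costs:
        print(f"{name:34s} {hours:3d} h x ${wage:.2f} = ${hours * wage:,.2f}")
    print(f"Estimated annual cost total: ${total:,.2f}")  # $4,115.65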



A.13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

The proposed information collection activities do not impose any capital costs or costs of maintaining requirements on respondents.



A.14. Annualized Cost to Federal Government

Data collection and analysis will be carried out by Mathematica, under contract with OPA to conduct the IMAGIN Study. OPA staff will not be involved in either data collection or data analysis; thus, there are no Agency labor or resources involved in conducting this study. The total cost to the federal government for the data collection activities under this current request will be $477,867 over 3 years, or an annualized cost of $159,289.

A.15. Explanation for Program Changes or Adjustments

This is a new data collection.

A.16. Plans for Tabulation and Publication and Project Time Schedule

1. Analysis Plan

The cross-site study team will analyze all relevant information from program materials, interviews with grantee leaders and program staff, and the frontline staff survey. We will explore key factors affecting implementation readiness in order to understand the facilitators of and challenges to program implementation in different contexts, and provide actionable findings and lessons that inform ongoing program improvements, refinement, and planning for summative evaluation.

Analyzing mostly qualitative data requires creating data structures and using them systematically. We will use a qualitative analysis software package (NVivo) to develop a preliminary codebook for organizing and categorizing the data to align with the IMAGIN conceptual framework(s), research questions, and common components of the grantee programs. The coding will enable us to retrieve and examine data linked to specific questions and topics, and will facilitate analyses of themes across multiple grantees. The cross-site study team will initially code a small subset of interview transcripts at the same time to become familiar with the codebook, tailor and refine codes to fit the context of the grantee programs, and ensure reliability. We will then divide the remaining data among the team members who will conduct the analysis. Once the data are coded, we will generate code reports from NVivo for each grantee, analyze these to identify themes and patterns within and across grantees, and document findings on the research questions.

In addition to qualitative analyses, we will conduct descriptive quantitative analyses of the survey data from frontline staff. These systematic analyses will yield more details on frontline staff’s experiences, motivations, and perceptions and grantees’ readiness to implement and evaluate their programs. We expect these analyses to complement and augment the information we obtain in the interviews and through other data sources. We will examine findings across the quantitative and qualitative data sources and explore possible reasons for any misalignment. Similarly, we will use the qualitative data to better understand and explain patterns in the quantitative data, such as variations in metrics of performance.
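As an illustration of these descriptive analyses, the sketch below (Python with pandas; the variable names and sample values are hypothetical, not actual survey items) shows the kind of tabulations we expect to produce.

    import pandas as pd

    # Stand-in for the cleaned survey extract (values are illustrative only)
    survey = pd.DataFrame({
        "grantee_id": ["G01", "G01", "G02", "G02", "G03"],
        "role": ["facilitator", "outreach", "facilitator", "facilitator", "outreach"],
        "hours_of_training": [12, 8, 20, 16, 10],
    })

    # Descriptive statistics on training received across all respondents
    print(survey["hours_of_training"].describe())

    # Tabulations by grantee and by role support comparison against the
    # qualitative themes coded in NVivo
    print(survey.groupby("grantee_id")["hours_of_training"].mean())
    print(survey["role"].value_counts(normalize=True))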

Once the analyses are complete, we will develop a final implementation report that will focus on key findings about the factors that helped or hindered the process of implementation readiness in different contexts, along with actionable lessons for grantees, researchers, and funders. The report will spotlight successful or challenging cases that tell the stories of different program trajectories, and will share concrete takeaways for the reader. Aside from the specific products drawing on the cross-site study, these analyses will also play a critical role in developing targeted tools, resources, and practitioner-focused guides or briefs related to implementation readiness. They can also be used to inform future requests for applications for similar grant programs.

2. Time Schedule and Publications

OPA expects that the IMAGIN study will be conducted over three years, beginning in September 2018. This request is for a three-year period beginning in October 2019, pending OMB approval. A schedule of the data collection efforts for the cross-site study follows (Tables A.2.1 and A.16.1).

Table A.16.1. Timeline for Use of Data Collection Instruments

Instrument | Date of 60-Day Submission | Date of 30-Day Submission | Date Clearance Needed | Date for Use in Field
Instrument 1: Leadership Staff Interviews | February 2019 | April 2019 | October 2019 | October 2019
Instrument 2: Key Program Staff Interviews | February 2019 | April 2019 | October 2019 | October 2019
Instrument 3: Community Stakeholder Interviews | February 2019 | April 2019 | October 2019 | October 2019
Instrument 4: Frontline Staff Survey | February 2019 | April 2019 | October 2019 | October 2019

A.17. Reason(s) Display of OMB Expiration Date is Inappropriate

All instruments will display the OMB Control Number and expiration date.

A.18. Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary for this information collection.



SUPPORTING REFERENCES

Centers for Disease Control and Prevention. "Sexually Transmitted Disease Surveillance 2016." Atlanta, GA: U.S. Department of Health and Human Services, 2017.

Centers for Disease Control and Prevention. "New CDC Analysis Shows Steep and Sustained Increases in STDs in Recent Years." Atlanta, GA: CDC, August 28, 2018. Available at https://www.cdc.gov/nchhstp/newsroom/2018/press-release-2018-std-prevention-conference.html. Accessed August 30, 2018.

Fixsen, D., S. Naoom, K. Blase, R. Friedman, and F. Wallace. Implementation Research: A Synthesis of the Literature. Tampa, FL: University of South Florida, 2005.

Martin, J.A., B.E. Hamilton, M.J.K. Osterman, A.K. Driscoll, and T.J. Matthews. "Births: Final Data for 2015." National Vital Statistics Reports, vol. 66, no. 1. Hyattsville, MD: National Center for Health Statistics, 2017.

U.S. Department of Labor, Bureau of Labor Statistics. "National Occupational Employment and Wage Estimates, May 2017." Available at https://www.bls.gov/oes/current/oes210000.htm. Accessed January 22, 2019.
