Teen Pregnancy Prevention Replication Evaluation: Replication Study

OMB: 0990-0397

Supporting Justification for OMB Clearance of Teen Pregnancy Prevention Replication Evaluation



Part B: Statistical Methods for Implementation Data Collection





September 2011



(Updated June 2012)



B1. Respondent Universe and Sampling Methods

For the TPP evaluation, HHS has selected three program models, representing different approaches to the prevention of teen pregnancy, and will select up to three replications of each model. Of the nine replications selected, five will be school-based and four will operate in community settings (clinics; social service or other public and private agencies and organizations; churches).

The program models were selected purposively. ASPE/OAH selected evidence-based program models that had been chosen for replication by more than five grantees and that were of particular policy interest: (1) a model culturally relevant to Latino youth; (2) a model implemented in teen, school, and reproductive health clinics; and (3) the most widely used classroom-based model, which had only ever been evaluated by its developer. The selection of sites was also purposive, but a “best case scenario” was not a criterion. We needed to identify, from among five or six grantees, those that could provide an adequate sample of study participants within a two-year period and that seemed able to meet the demands of a rigorous evaluation. (Most of the Tier 1 grantees were not required to conduct a rigorous evaluation on their own; only the largest Tier 1 grants carried this requirement.)

To be awarded a grant, all potential candidates had to demonstrate prior experience with youth and the ability to mount a program. OAH committed to ensuring fidelity of implementation, so all replications selected for the evaluation can be expected to implement with reasonable fidelity; the evaluation therefore represents tests of well-implemented programs rather than of the results we would expect if we sampled randomly from all those implementing a program model nationally.

The total sample of youth for the study is approximately 8,550, a sample sufficient to detect policy-relevant impacts of individual program replications. For each replication (which can occur across multiple locations), youth will be assigned to a treatment group that receives the intervention or to a control group that does not. Selection of the unit of randomization will be driven by (a) the setting in which the replication is implemented, (b) the need to minimize disruption of the program’s normal operation, and (c) the desire to minimize contamination across groups to the greatest extent possible.

For school-based studies, random assignment of schools is the most desirable option, but it requires a level of resources that the TPP grantees who were candidates for the evaluation do not have. Specifically, many of the grantees do not have access to the number of schools a school-level random assignment study requires, either because there are not enough schools in the locality the grantee is serving or because funds are insufficient to serve youth in so many different schools. Given these resource constraints, and because all the interventions will be delivered by trained program staff rather than school staff (reducing, though not eliminating, potential contamination), classes (e.g., health or wellness classes) will be the unit of random assignment within each school in the three replications of Reducing the Risk. In the three replications of the ¡Cuídate! program model, where the program may be delivered after school or as a “pullout” from a regular class, individual youth will be randomly assigned. In community-based settings, it would be impossible to randomly assign organizations (even if grant resources were adequate for such a design), since the settings within a locality are quite heterogeneous (YMCAs, social service or child welfare offices, other youth-serving locations); in these settings as well, individual youth will be randomly assigned.
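To make the two assignment schemes concrete, the sketch below illustrates class-level random assignment within schools (the design planned for the Reducing the Risk replications) and individual-level random assignment (the design planned for the ¡Cuídate! replications and the community-based sites). It is a minimal illustration in Python, not the evaluation contractor's actual procedure; the function names, the fixed random seed, and the 1:1 allocation ratio are assumptions made for exposition.

# Illustrative sketch only; not the evaluation contractor's procedure.
# The function names, fixed seed, and 1:1 allocation are assumptions.
import random

rng = random.Random(20120601)  # fixed seed makes the assignment reproducible


def assign_classes_within_schools(classes_by_school):
    """Cluster random assignment of classes within each school
    (as planned for the Reducing the Risk replications).

    classes_by_school maps a school ID to its list of participating
    health or wellness classes.
    """
    assignment = {}
    for school, classes in classes_by_school.items():
        shuffled = list(classes)
        rng.shuffle(shuffled)
        n_treat = len(shuffled) // 2  # if the count is odd, the extra class goes to control
        for i, cls in enumerate(shuffled):
            assignment[(school, cls)] = "treatment" if i < n_treat else "control"
    return assignment


def assign_individuals(youth_ids):
    """Individual random assignment (as planned for the ¡Cuídate!
    replications and the community-based sites)."""
    shuffled = list(youth_ids)
    rng.shuffle(shuffled)
    n_treat = len(shuffled) // 2
    return {youth: ("treatment" if i < n_treat else "control")
            for i, youth in enumerate(shuffled)}


# Hypothetical usage with made-up identifiers:
print(assign_classes_within_schools(
    {"School A": ["Health 1", "Health 2", "Wellness 1", "Wellness 2"]}))
print(assign_individuals(["Y001", "Y002", "Y003", "Y004", "Y005"]))

Blocking the class-level assignment within each school, rather than pooling classes across schools, keeps the treatment and control groups balanced on school-level characteristics; this matters because schools differ in scheduling, student populations, and local context.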

Within each replication site, implementation study data will be collected from staff and community members whose varying roles and responsibilities make them knowledgeable about the origins and operations of the program and the challenges it has encountered. Focus groups will also be held with 8 to 12 participating youths per group; participants will have different levels of involvement in the program (or may be drawn from the control group) and must agree to take part in the discussion. Youths will not be randomly selected for the focus groups. Up to 369 participants will be included in the implementation study annually.

Because the sample of youth for the impact study will be built over a minimum of two years (and may extend into a third year), interviews and focus groups will be conducted at each site once in each of up to three years, so that we can understand the experience of youth in the impact study. In each site, we will plan the visits and interviews to occur while the program is being implemented. The widest window will be for Safer Sex, which operates continuously; next is Reducing the Risk, which has an 8- to 16-week window, depending on whether a school uses block scheduling; and the narrowest is ¡Cuídate!, which has at most an eight-week window (it may be implemented in more than one cycle per year in one or more sites).

At each site, at each of three time points, we will conduct focus group discussions with youth participating in the program. In smaller replication sites, we will conduct individual interviews with frontline staff whenever possible, but will conduct group interviews if the grantee feels these will be less disruptive to program operations. In larger sites, we anticipate that there will always be a mix of individual interviews and small-group interviews, so that we can include the perspectives of all frontline staff rather than a selected sample.

B2. Procedures for Collection of Information

Which topics will be relevant and important, and who is best positioned to provide information on each aspect of program implementation, will vary from site to site. For that reason, we established an overall topic guide (Attachment A) to organize data collection and documentation of each site’s implementation. The master topic guide identifies the information that will be gathered to:

  • document program plans (background, readiness and preparation, program design and theory of change, and local context);

  • describe program implementation (funding, infrastructure, staffing, training and technical assistance, outreach and recruitment, enrollment, and key program features);

  • assess implementation fidelity and service quality (fidelity benchmarks, quality indicators) and youth responsiveness (attendance, interaction and engagement, satisfaction);

  • document implementation challenges and the strategies for dealing with them; and

  • describe the control condition (i.e., the services provided to youth in the control group).

These topics will be explored through eight main data sources: program documents; replication site documents and records; evaluation team notes from the site selection and monitoring process; OAH program officers’ reports; performance and fidelity data required as a condition of the grant and collected by grantee staff and/or local evaluators; interviews with key informants during site visits and telephone discussions (including focus groups with participants and frontline staff); observation of program activities; and the baseline and follow-up surveys of the evaluation sample (addressed in other OMB submissions). Although the general topic guide will be tailored to the circumstances and design of each site’s program, we can project in general terms which of these data sources will be accessed to explore the major topics of interest (see Attachment E).

The most intensive data collection for the implementation study will take place in three visits to each evaluation site, each for two or three days. A first visit will occur early in the period of program operations and after some of the sample for the study has been enrolled. A second visit will be conducted during the subsequent year, with the exact timing depending on the length of the program and the schedule of its activities. In most sites a third visit will be required to document the implementation of the program as a later cohort of the sample experiences it. Two-person teams led by the site study leader will conduct each visit. The site study leader will be a senior project member who has been the regular point of contact with the grantee and who can communicate clearly, organize work effectively, provide strong analytical thinking, and remain objective and professional. The two-person approach to the site visits will increase the effectiveness of probing during interviews and the accuracy of information obtained. It also builds in flexibility to accommodate site schedules, allowing site visitors to split up and cover different interviews if the need arises.

Preparation for site visits will involve customizing the plan and protocols to each site, in two steps. The first step is to prepare a site-specific logic model or “program framework.” Beginning with a general template, the site study leader will fill in what is known at the outset of data collection about the logic of a site’s program: the planned inputs, contextual factors and external influences, and program vision, as well as the intermediate and longer-term outcomes the program seeks to affect. This process will highlight gaps in our understanding of what the program developer and the site leaders believe are the processes for affecting youth behavior and the factors that will affect the program’s success. It will also provide a basis for identifying implementation fidelity benchmarks.

The second step in customizing, which can begin even before a site’s program is fully implemented, is creating a preliminary program profile and a preliminary control condition profile. The site study leader will review existing documents available from the program developer and program site leaders, as well as notes from discussions during the site selection and readiness assessment process, to gather as much information as possible on the topics listed in the topic guide. These documents might include implementation plans, grant applications, program budgets or justifications, communications with evaluation staff, staffing plans, and materials used to communicate about the program. The site study leader will use this information to create the beginnings of a site/program profile and a control condition profile. The entries in these profiles, and the gaps in the partially completed profiles, will focus our attention on what needs to be investigated or confirmed in further data collection. The site study leader will use these profiles to plan site visits, so that individual and small-group interviews focus on information that cannot be obtained from other sources.

The site study leader will then create customized discussion guides to ensure that we collect the needed information in an efficient, consistent way from the most appropriate respondents. The site-specific plan will include a customized topic guide, which may elaborate on or provide “local language” versions of topic definitions, and may eliminate some topics as not relevant or already thoroughly explored. The plan will identify which information will be collected from which sources, key respondents who should be interviewed, and other sources that should be tapped. Implementation study leaders will review the site visit plans and customized discussion guides for each site to help ensure consistency across sites and to facilitate inclusion of topics to inform cross-site issues that are emerging from early visits.

On-site data collection will be done in five ways. We will conduct interviews with key personnel, group discussions with frontline staff, focus group discussions with participants, observation of program activities, and discussions with personnel at control group locations (in sites with cluster random assignment). Collecting data from diverse respondents who may have different information or perspectives will allow us to triangulate information and gain a more complete understanding of program implementation.

Interviews. During each visit, site visitors will conduct individual and small-group interviews with people representing the following roles or perspectives:

  • Program leadership at the site (staff with major responsibility for implementing and delivering the program)

  • Representative of the sponsoring organization (school district, nonprofit organization, public agency)

  • Key school or community-based organization representatives (depending on site locations)

  • Program partners, including funders and other parties involved in delivering service components

  • Community members knowledgeable about related services for adolescents

The interviews will be conducted with tailored protocols based on the master topic guide, customized by site study leaders. Site visitors will request copies of any documents identified in these interviews that might provide additional information about relevant topics. In addition, site visitors will work with other evaluation team members to make or facilitate requests for program records related to the participation of evaluation sample youths.

Interviews and Group Discussions with Frontline Staff. The individuals who lead activities and provide services to youths play an important role in the program. They have a unique perspective on the training and support they received for carrying out their responsibilities, the implementation of some key program features, and the strengths and needs of the youths with whom they work. Discussing these topics directly with frontline staff will ensure that our understanding of each program is informed by the experiences of those responsible for implementing key activities.

We will invite as many frontline staff (and others who conduct program activities) as feasible and practical to participate in discussions. The site visitors will work with program leaders to arrange the discussion with staff at a convenient time and location. These discussions will be guided by a protocol (Attachment B), which will be customized in advance of the site visit.

Focus Group with Youth Participants. Another perspective that is crucial for understanding program implementation is that of the youths who participated in the program. We will convene a group of participating youths and talk to them about their decisions to participate in specific program activities, their opinions about the activities in which they participated, the aspects of the program that they liked or would change, and their participation in other similar programs.

Site visitors will work with program staff to identify and recruit up to 12 program participants per focus group, from multiple program locations if feasible or from just one or two if locations are too dispersed. Focus groups will be conducted using a general guide (Attachment C) which, like the other protocols, will be tailored to each site to reflect the program’s nomenclature and design. Food will be provided and a small incentive (e.g., a $10 iTunes card) will be offered.

Focus Group with Youth in the Control Group. Focus groups with youth in the control group may allow us to explore possible contamination, such as could occur if program and control group youth date one another. The site visit leader will explore with grantees and school or agency staff the best way to recruit these youth. Food will be provided and a small incentive (e.g., a $10 iTunes card) will be offered.

Program Observation. Observing some program activities can help deepen site visitors’ understanding of information obtained in interviews and group discussions and provide illustrations of the way the program works. Site visitors will observe typical program activities with youth and record descriptive information about the setting, staffing, participants, topics covered and messages conveyed, and engagement of staff and participants in the activity.

To the extent feasible, site visits will be scheduled so that site visitors can observe program activities in several program locations, selected in consultation with the program leaders to represent a range of activities, settings, and participant characteristics. For example, if the program and evaluation are being conducted in multiple locations within a school district or community, site visitors will arrange to observe activities in settings that illustrate the variation that exists in program staffing, school or setting characteristics, and youth characteristics. Information from the observations will not be used to rate the program or to generate outcome or mediating variables; it will be used to help site visitors understand how the program works and to illustrate information in the program/site profile or evaluation report.

Discussions about the Counterfactual. Site visits will clarify the counterfactual: the services available to control group youth. Two scenarios are possible. If the site evaluation design involves random assignment of classes, our field staff will conduct interviews with teachers of those classes using the first part of a topic guide (Attachment D); site visit staff may also conduct focus groups with control group youth. In sites where the design involves random assignment of individuals, we will investigate any alternative services intentionally provided to control group youth, as well as the range of services within the community that may be available to youth in both groups, by interviewing relevant staff at organizations seen as the major sources of alternative services and possibly conducting focus groups with youth. We will identify those organizations through our contacts with site staff and our own independent web-based research about the site. Those interviews will be guided by the outline in the second half of Attachment D.

B3. Methods to Maximize Response Rates and Deal with Nonresponse

Site visits will be planned well in advance so that all identified respondents can participate in individual or group interviews, as appropriate. We anticipate that refusals to participate and absences will be rare.

B4. Tests of Procedures or Methods to be Undertaken

No pretest of the implementation study protocols has been conducted.

B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The TPP Replication Evaluation implementation study site visits will be overseen by staff of Abt Associates Inc., the contractor selected in September 2011 to conduct the study. Listed below are the individuals whom HHS consulted on the collection and/or analysis of the implementation data; they include the contractor staff from the feasibility and design study for the TPP Replication Evaluation, as well as members of the PPA Technical Work Group who attended a TWG meeting in spring 2010 to review and provide input on the overall project design.


Meredith Kelsey

Abt Associates Inc.

55 Wheeler Street

Cambridge MA 02138


Jean Layzer

Belmont Research Associates

42 Fairmont Street

Belmont MA 02478


Kimberly Francis

Abt Associates Inc.

55 Wheeler Street

Cambridge MA 02138


Randall Juras

Abt Associates Inc.

55 Wheeler Street

Cambridge MA 02138


Alan Hershey

Mathematica Policy Research, Inc.

P.O. Box 2391

Princeton, NJ 08543

(609) 275-2384


Ellen Kisker

Twin Peaks Partners, LLC

7639 Crestview Drive

Longmont, CO 80504

(303) 834-8364


Alicia Meckstroth

Mathematica Policy Research, Inc.

P.O. Box 2391

Princeton, NJ 08543

(614) 505-1401


Rachel Shapiro

Mathematica Policy Research, Inc.

P.O. Box 2391

Princeton, NJ 08543

(609) 936-279-6384


Jennifer Manlove, Karen Walker, and Kristine Andrews

Child Trends

4301 Connecticut Ave. NW

Washington, DC 20008-2333

(202) 362-5580


PPA TECHNICAL WORK GROUP MEMBERS


James Jaccard, Ph.D.

Professor of Psychology

Department of Psychology

Florida International University

945 Roderigo Ave.

Coral Gables, FL 33134

Phone: 305-348-0274


Meredith Kelsey

Abt Associates

55 Wheeler St.

Cambridge, MA 02138


Christine Markham

The University of Texas School of Public Health

P.O. Box 20186

Houston, TX 77225

(713) 500-9646


Pat Paluzzi

President

Healthy Teen Network

1501 Saint Paul St., Suite 124

Baltimore, MD 21202

(410) 685-0410


Susan Philliber

Philliber and Associates

16 Main St.

Accord, NY 12404

(845) 626-2126


Michael Resnick

Division of Adolescent Health and Medicine

University of Minnesota

717 Delaware St. SE, Suite 370

Minneapolis, MN 55414-2959

(612) 624-9111


Jeffrey Smith, Ph.D.

Professor of Economics and Faculty Associate, Survey Research Center, Institute for Social Research

Department of Economics

University of Michigan

238 Lorch Hall

611 Tappan St

Ann Arbor, MI 48109-1220

(734) 764-5359


Don Winstead

Deputy Secretary

Florida Department of Children and Families

1317 Winewood Blvd.

Building 1, Room 202

Tallahassee, FL 32399-0700

(850) 921-8533


Inquiries regarding statistical aspects of the study design should be directed to:


Lisa Trivits, Ph.D.

Office of the Assistant Secretary for Planning and Evaluation (ASPE)

U.S. Department of Health and Human Services

200 Independence Ave, SW

Washington, DC 20201

(202) 205-5750


or


Diana Tyson, Ph.D.

Office of the Assistant Secretary for Planning and Evaluation (ASPE)

U.S. Department of Health and Human Services

200 Independence Ave, SW

Washington, DC 20201

(202) 401-6670


Dr. Trivits and Dr. Tyson are the TPP Evaluation project officers. Both have overseen development of the current implementation instruments.

Inquiries related to the Teen Pregnancy Prevention Program, or evaluations of it, may be directed to:

Amy Farb, Ph.D.

Office of Adolescent Health

Office of the Assistant Secretary for Health

U.S. Department of Health and Human Services

1101 Wootton Parkway, Suite 700

Rockville, MD 20852

(240) 453-2836

