Teen Pregnancy Prevention Replication Evaluation: Replication Study

OMB: 0990-0397

APPENDIX B


Evaluation of Teen Pregnancy Prevention Replications


Discussion Guide for Use with HEAD OF GRANTEE AGENCY, PERSON LISTED AS CONTACT PERSON ON PROPOSAL (updated by OAH)
Program Name:

Program Location:

Sponsoring Organization:

Individual(s) Interviewed: (names and titles)

Contract Staff (as appropriate):

Date of Communication:


Instructions for Site Visitors


This visit has a set of very specific goals:


  • To ensure that we have a complete and up-to-date understanding of the roles and responsibilities of the grantee and partners and their staff;


  • To expand our understanding of aspects of readiness and preparation that would support strong replication of a program model (preliminary information has been abstracted from the grant proposal and other extant documents and is incorporated into the profile you received. Please make sure that you have read this and are thoroughly familiar with it so that you can probe for updated information and identify incorrect information);


  • To understand the plan for replication of the program model, the adaptations that were approved and made, and the extent to which the replication was implemented as planned (again, you will have been given a summary of the replication plan as contained in the proposal and updated in the request for continuation funding. Your task will be to use discussion and direct observation to determine how the program is actually implemented and to determine the extent to which the various aspects of the replication were implemented as planned);


  • To understand the local context in which the replication is being implemented (this includes the school or agency environment as well as the social structure, behavioral norms, and the availability of resources and services in the local community);


  • To understand the ways in which the grantee changed or adapted aspects of the replication plan in response to local needs or pressures;


  • To understand the challenges encountered in replicating the program model and in other aspects of implementation, the extent to which staff are able to address those challenges and the strategies they employ to address them; and


  • To document the services provided to the members of the control group.


These goals will have been articulated in correspondence with the sites before the visit, but you should reiterate them at the beginning of any discussion with staff. You should hand the following statement to everyone you interview, and an appropriate version of it to youth participants in focus groups.


Thank you for taking time to talk about (Name of Program). As you know, we are conducting the TPP Replication Evaluation. As an important part of that effort, we are visiting programs that are participating in the evaluation two or three times during their grant period, in an effort to understand and document the process of replicating an evidence-based model in real-world settings, the challenges that arise, and how staff on the ground respond to those challenges. The information we gather will serve two purposes: it will help future program operators and policymakers understand what is needed to replicate a program with fidelity, to implement it as intended, with the populations originally targeted; and it will be used to help us understand variations in program impacts, where they occur.


You may, of course, choose not to discuss any topic or end the discussion at any time. We will combine the information from this visit and subsequent visits, with information from your program documents and the performance and fidelity data you have collected, to create a narrative account of your program and the process of replicating the program model you selected. At the same time, we will combine the information about your program with information about other programs in the study to identify consistent themes that apply more generally across a range of program types and replication efforts. Neither your name nor the names of any individuals will be reported, and the notes we take about our discussions will not be shared with or provided to the federal government or anyone else except the members of the evaluation team.


A: Readiness/Preparation: Sponsoring Agency


A1. Age, size, structure, mission of the sponsoring agency. Type and scope of current operations.

(Use profile and check for accuracy)


A2. Position and saturation in the community.


Probes: How is the agency viewed in the community – in terms of its mission, the accessibility of its programs and services, its ability to reach and serve needy populations?


A3. Prior experience with programming for youth and with sexual health programming

Probes: If not mentioned earlier, what experience has the agency had with youth programming, sexual health programming? How successful were these earlier efforts in terms of attracting and retaining the target population, ability to implement the intervention as planned, any outcomes measured? Any adverse reactions/opposition from community members?


A4. Selection of program model for replication


Probes: What information did you use in determining the need for the program (problems in community, statistics on teen pregnancy, births, STIs)? Did you seek advice from others in the community or involve others in the choice of the program? What were your considerations in selecting (name of program model)? In what ways did it appear appropriate to the needs you identified? Did you foresee any challenges in implementing this program model – if so what were your concerns (agency policies, community opposition, school district concerns about aspects of the program)? What was your vision for the program and what it might accomplish or lead to?


A5. External support for the program


Probes: What resources, if any, were there in the community to support the program (sexual health services, youth programs as sources for referral into the program or sources for additional services)? Were there organizations or individuals in the community you felt you could count on to support the program (school district or school staff, local government agencies, private agencies)?


B. Readiness/Preparation: Staffing


B1. Recruitment and selection of staff for the program


Probes: What was your plan for staffing the program (supervisory vs. front-line staff)? Was your plan to use existing staff to implement the program or to recruit staff specifically for this program? What were the advantages vs. disadvantages of that decision? If the decision was to use existing staff, how did you select them, and what were the criteria for selection? If the decision was to hire new staff, how did you recruit them, and what qualifications and skills were you looking for?


B2. Staff training prior to implementation


Probes: What amount and kind of training did you feel was necessary for staff to have? What type and amount of training did they receive before the program began? Who provided the training? Did you feel it was adequate? Were staff required to do any other type of preparation?


B3. Staff commitment


Probes: How committed are staff to this specific program? Do you think they believe in the program’s goals? Feel the activities and content are appropriate for the youth population they are working with? Did their feelings about the program change as a result of the pilot? In what ways?


C. Readiness/Preparation: Site-Specific Replication Plan


C1. Approved changes/adaptations to the program model (Use program profile and probe for changes or updates)


Probes: My understanding is that your proposed plan for replication was as follows (name target population, recruiting and retention strategies, staffing, program components, number and length of sessions, settings, delivery strategies). Have I missed anything? Did you make any subsequent changes to the plan with OAH approval? What were the reasons for the change(s)?


D. Implementation: Putting the Program in Place


D1. Settings for the program


Probes: Were you able to implement the program in the number and type of schools (other settings) that you planned? What obstacles did you encounter? Were you able to overcome them? How?


D2. Staffing the program


Probes: Did you make any changes in program staffing as a result of the pilot year? What were they? What is the workload (case flow) for front-line staff? Is it more or less than you expected? What are the reasons for the difference? Have you lost any of your original staff? How many and over what period?


D3. Target population


Probes: Are you serving the youth you planned to serve, in terms of numbers, characteristics, risk factors? If not, what barriers to your original plan did you encounter? What outreach strategies have you developed to recruit participants? How do you recruit youth for the program? Have you encountered problems with retention? What strategies have you developed to improve retention?


D4. Schedule for program activities


Probes: How is the program delivered? In how many sessions, of what length, and over what period of time? What challenges to scheduling the program did you encounter? How does scheduling affect retention?


D5. Program components/activities


Probes: Have you been able to implement all the components/activities required by the program model (as adapted for the replication)? If not, which ones have you had to drop or modify? What were the reasons for the change?


E. Implementation: Administrative and Supervisory Processes


E1. Working with partners

Probes: Were you able to work productively with the partners you originally proposed? What problems or barriers did you encounter? What roles did the partners play in implementing the program? Which partnerships were most effective?


E2. Decision-making and problem-solving processes and strategies


Probes: Who is involved in making decisions about the program, solving problems that arise? How do you bring front-line staff into the process?


E3. Maintaining school and community support


Probes: What have you done to maintain support for the program in schools (or community agencies)? What difficulties have you encountered?


E4. Rules and standards


Probes: In addition to the performance standards that OAH requires you to meet, are there other standards or rules that you have developed to ensure strong implementation of the program? What are they?


F. Support for Staff Performance


F1. In-service training for staff


Probes: Do you provide in-service training for your front-line staff? What type and amount do you provide? What about new staff … how are they trained?


F2. Consultation and coaching


Probes: In addition to any in-service training, who can front-line staff go to for advice, consultation? Does this happen as a regularly scheduled activity, or as needed?


F3. Monitoring, evaluation and feedback


Probes: Who is responsible for monitoring staff performance, in particular monitoring fidelity to the program model and effectiveness of delivery? How is that information used, in addition to reporting it to OAH? Is it used to provide feedback to front-line staff? Who provides the feedback and on what schedule? What has been staff reaction to the monitoring tools and any feedback? Do they find it helpful? Do they believe that the monitoring tools assess performance accurately?


G. Community Context


G1. External events that affected program implementation


G2. Community characteristics (ask only about gaps in our information)


Probes: Urbanicity, population size, SES, race/ethnicity, major industries? Are there major religious affiliations, practices, or influences? Are there major issues or challenges facing the community?


G3. Community attitudes toward the problem of teen pregnancy


Probes: What are the prevailing attitudes towards adolescent sexual and other risk behaviors? What are the beliefs about teen pregnancy (e.g., a large problem, a manageable problem)? Are teen sexual behavior and pregnancy perceived as problems by members of the community?


G4. Visibility of the program and community response


Probes: Is this program (highly) visible in the community? What is the level of community support for and/or opposition to the program from schools/school supervisors/community leaders? What are the sources of that support or opposition? Have you received any positive or negative messages about your program? Are there particular components of the program that are perceived positively or negatively by the community?



Abt Associates Inc. Appendix B: Discussion Guide for Head of Grantee Agency

File Type: application/msword
File Title: APPENDIX B
Author: Seth F. Chamberlain
Last Modified By: CTAC
File Modified: 2012-06-29
File Created: 2012-06-29
