Guide for frontline staff

Teen Pregnancy Prevention Replication Evaluation: Replication Study



OMB: 0990-0397


APPENDIX E


Evaluation of Teen Pregnancy Prevention Replications


Discussion Guide for Use with FRONTLINE STAFF (health educators, facilitators)


Program Name:

Program Location:

Sponsoring Organization:

Individual(s) Interviewed: (names and titles)

Contract Staff (as appropriate):

Date of Communication:


Instructions for Site Visitors


This visit has a set of very specific goals:


  • To ensure that we have a complete and up-to-date understanding of the roles and responsibilities of the grantee and partners and their staff;


  • To expand our understanding of aspects of readiness and preparation that would support strong replication of a program model (preliminary information has been abstracted from the grant proposal and other extant documents and is incorporated into the profile you received. Please make sure that you have read this and are thoroughly familiar with it so that you can probe for updated information and identify incorrect information);


  • To understand the plan for replication of the program model, the adaptations that were approved and made, and the extent to which the replication was implemented as planned (again, you will have been given a summary of the replication plan as contained in the proposal and updated in the request for continuation funding. Your task will be to use discussion and direct observation to determine how the program is actually implemented and to determine the extent to which the various aspects of the replication were implemented as planned);


  • To understand the local context in which the replication is being implemented (this includes the school or agency environment as well as the social structure, behavioral norms, and the availability of resources and services in the local community);


  • To understand the ways in which the grantee changed or adapted aspects of the replication plan in response to local needs or pressures;


  • To understand the challenges encountered in replicating the program model and in other aspects of implementation, the extent to which staff are able to address those challenges and the strategies they employ to address them; and


  • To document the services provided to the members of the control group.


These goals will have been articulated in correspondence with the sites before the visit, but you should reiterate them at the beginning of any discussion with staff. You should hand the following statement to everyone you interview, and an appropriate version of it to youth participants in focus groups.


Thank you for taking time to talk about (Name of Program). As you know, we are conducting the TPP Replication Evaluation. As an important part of that effort, we are visiting programs that are participating in the evaluation two or three times during their grant period, in an effort to understand and document the process of replicating an evidence-based model in real-world settings, the challenges that arise, and how staff on the ground respond to those challenges. The information we gather will serve two purposes: it will help future program operators and policymakers understand what is needed to replicate a program with fidelity, that is, to implement it as intended with the populations originally targeted; and it will be used to help us understand variations in program impacts, where they occur.


You may, of course, choose not to discuss any topic or end the discussion at any time. We will combine the information from this visit and subsequent visits, with information from your program documents and the performance and fidelity data you have collected, to create a narrative account of your program and the process of replicating the program model you selected. At the same time, we will combine the information about your program with information about other programs in the study to identify consistent themes that apply more generally across a range of program types and replication efforts. Neither your name nor the names of any individuals will be reported, and the notes we take about our discussions will not be shared with or provided to the federal government or anyone else except the members of the evaluation team.


A. Readiness/Preparation: Staff Background


A1. Staff education and experience


Probes: Can you tell me about yourself – how long you have been with the agency, what you were doing before you came here? What aspects of your education and experience do you see as most helpful for this job? Experience with youth programs? Sexual health services or interventions? Social services?


B. Readiness/Preparation: Staffing


B1. Structure of program staffing


Probes: Can you help me understand how the project works, what your responsibilities are, who you report to? In your view, is the staffing appropriate to mount a strong implementation of the program? If not, what additional staff do you think would make the program stronger – numbers and type of staff?


B2. Staff training prior to implementation


Probes: What amount and kind of training did you receive before the program began? Who provided the training? Did you feel it was adequate? Were you required to do any other type of preparation?


B3. Staff commitment


Probes: How committed do you feel to this specific program? Do you believe in the program’s goals? Do you feel the activities and content are appropriate for the youth population you are working with? Did your feelings about the program change as a result of the pilot? In what ways? How about other frontline staff?


D. Implementation: Putting the Program in Place


D1. Settings for the program


Probes: Were you able to implement the program in the number and type of schools (other settings) that you planned? What obstacles did you encounter? Were you able to overcome them? How? If the obstacle remained, what changes did you make in your strategy for implementing the program? How has this affected the implementation of the program, your ability to recruit and retain youth, other aspects of the program?


D2. Staffing the program


Probes: Were there any changes in program staffing as a result of the pilot year? What were they? What is your workload (case flow)? Is it more or less than you expected? What are the reasons for the difference?




D3. Target population


Probes: Are you serving the youth you planned to serve, in terms of numbers, characteristics, risk factors? If not, what barriers to your original plan did you encounter? What outreach strategies have you developed to recruit participants? How do you recruit youth for the program? Have you encountered difficulties in recruiting? Have you encountered problems with retention? What strategies have you developed to improve retention? What are barriers to participation that you have little or no control over?


D4. Schedule for program activities


Probes: How is the program delivered? In how many sessions, of what length, and over what period of time? What challenges to scheduling the program did you encounter? How does scheduling affect retention? How does it affect your ability to deliver the program?


D6. Program components/activities/materials


Probes: Have you been able to implement all the components/activities required by the program model (as adapted for the replication)? If not, which ones have you had to drop or modify? What were the reasons for the change? Do you feel that the materials, manuals, and other guidance provided are adequate to help you implement the program with fidelity? What else would be helpful?


D7. Gaps in/problems with program content


Probes: Are there activities or program content that are inappropriate for the population you are serving? That seem out of date? Are there gaps in content, information that your youth population needs that is not part of the program? How have you dealt with these issues?


D8. Satisfaction with program model


Probes: Overall, do you feel that the program model you are replicating is the correct choice for the youth population you are serving? If not, in what ways is it less than ideal? In retrospect, would you choose a different program model to replicate? Which one (or what characteristics would be important)?


D9. Response of participants


Probes: How engaged are youth in the activities/content of the program? What aspects of the program/activities/content are they most/least responsive to? Have you had any feedback from them about the program? What kinds of comments do they make about the program? Have you made any changes as a result of these comments? What kinds of changes did you make?







E. Implementation: Administrative and Supervisory Processes


E1. Working with partners

Probes: Were you able to work productively with the partners that were originally proposed? What problems or barriers did you encounter? What roles did the partners play in implementing the program? Which partnerships were most effective?


E2. Decision-making and problem-solving processes and strategies


Probes: Who is involved in making decisions about the program, solving problems that arise? Are you or other frontline staff involved? How?


F. Support for Staff Performance


F1. In-service training for staff


Probes: Do you receive in-service training? What type and amount? Who does the training? Do you provide feedback about the relevance and effectiveness of the training? Does your supervisor or trainer make changes in response to your feedback? What about new staff … how are they trained?


F2. Consultation and coaching


Probes: In addition to any in-service training, who can you go to for advice, consultation? Does this happen as a regularly scheduled activity, or as needed? Do you find the consultation helpful? In what ways?


F3. Monitoring, evaluation and feedback


Probes: Who is responsible for monitoring staff performance, in particular monitoring fidelity to the program model and effectiveness of delivery? Is that information used to provide feedback to you and other frontline staff? Who provides the feedback and on what schedule? How do you feel about the monitoring and feedback – has it been helpful? Do you think that the monitoring tools assess performance accurately? What would help you do your job better?



Abt Associates Inc. Appendix E: Discussion Guide for Use with FRONTLINE STAFF

Author: Seth F. Chamberlain
File Created: 2012-06-29
File Modified: 2012-06-29
