TPP Observation Form

Teen Pregnancy Prevention (TPP) Performance Measures for FY2020
OMB: 0990-0438

Program Observation Form for TPP Grantees
Grantee:
Program Implementer(s):
Location:
Observer:
Observation Date:
Session Number/Name:
Duration of Session:
# of Participants:

Introduction: The purpose of the observation form is to measure the fidelity and quality of
program delivery. Please use the guidelines below when completing the observation form and
do not change the scoring provided; for example, do not circle multiple answers or score a 1.5
rather than a 1 or a 2.
You should complete the observation form after viewing the entire session, but you should
read through the questions prior to the observation. It is also helpful to take notes during your
viewing; for example, for Question 1, each time the implementer gives an explanation, place a
checkmark next to the appropriate rating.
Instructions: The following questions assess the overall quality of the program session and
delivery of the information. Use your best judgment and do not circle more than one response.
1. In general, how clear were the program implementer’s explanations of activities?
1 (Not clear)     2     3 (Somewhat clear)     4     5 (Very clear)
1 - Most participants do not understand instructions and cannot proceed; many questions asked.
3 - About half of the group understands, while the other half asks questions for clarification.
5 - 90-100% of the participants begin and complete the activity/discussion with no hesitation and no
questions.
2. To what extent did the implementer keep track of time during the session and activities?
1 (Not on time)     2     3 (Some loss of time)     4     5 (Well on time)
1 - Implementer does not have time to complete the material (particularly at the end of the session);
regularly allows discussions to drag on (e.g., participants seem bored or begin discussing unrelated
issues in small groups).
3 - Misses a few points; sometimes allows discussions to drag on.
5 - Completes all content of the session; completes activities and discussions in a timely manner (using
the suggested time limitations in the program manual, if available).

3. To what extent did the presentation of materials seem rushed or hurried?
1 (Very rushed)     2     3 (Somewhat rushed)     4     5 (Not rushed)
1 - Implementer doesn’t allow time for discussion; doesn’t have time for examples; tells participants they
are in a hurry; body language suggests stress or hurry.
3 - Some deletion of discussion/activities; sometimes states but does not explain material.
5 - Does not rush participants or speech but still completes all the materials; appears relaxed.
4. To what extent did the participants appear to understand the material?
1 (Little understanding)     2     3 (Some understanding)     4     5 (Good understanding)
Use your best judgment based on participant conversations and feedback.
Roughly: 1 - Less than 25% seem to understand; 3 - About half; 5 - 75-100% understand.
5. How actively did the group members participate in discussions and activities?
1 (Little participation)     2     3 (Some participation)     4     5 (Active participation)
Use your best judgment based on listening to the discussions and feedback.
Roughly: 1 - Less than 25% participate; 3 - About half participate; 5 - 75-100% participate.
6. Using the scale below, rate the implementer on each of the following qualities:
a) Knowledge of the program
1 (Poor)     2     3 (Average)     4     5 (Excellent)

1 - Cannot answer questions, mispronounces names; reads from the manual.
5 - Provides information above and beyond what’s in the manual; seems very familiar with the concepts
and answers questions with ease.
b) Level of enthusiasm
1 (Poor)     2     3 (Average)     4     5 (Excellent)

1 - Presents information in a dry and boring way; lacks personal connection to material;
appears "burned out."
5 - Makes clear that the program is a great opportunity; gets participants talking and excited;
outgoing.
c) Poise and confidence
1 (Poor)     2     3 (Average)     4     5 (Excellent)

1 - Appears nervous or hurried; does not have good eye contact.
5 - Does not hesitate in addressing concerns. Well organized, not nervous.

d) Rapport and communication with participants
1 (Poor)     2     3 (Average)     4     5 (Excellent)

1 - Doesn’t remember names; does not "connect" with participants; acts distant or unfriendly.
5 - Gets participants talking and excited; very friendly; uses people’s names when appropriate;
seems to understand the community and its needs.
e) Effectively addressed questions/concerns
1 (Poor)     2     3 (Average)     4     5 (Excellent)

1 - Engages in "power struggles"; responds negatively to comments; gives inaccurate
information; doesn’t direct participants elsewhere for further info.
5 - Answers questions of fact with information, questions of value with validation; if doesn’t
know the answer, is honest about it and directs them elsewhere.
7. Rate the overall quality of the program session.
1 (Poor)     2     3 (Average)     4     5 (Excellent)

Summary measure of all the preceding questions. Assesses both the extent of material covered and the
performance of the implementer.
Excellent sessions look like:
- Participants are doing rather than talking about activities
- Non-judgmental responses to questions
- Answering questions of fact with information, questions of value with validation
- Good time management and good organization
- Adequate pacing: not too fast, and discussion does not drag
- Effective checks for understanding
Poor sessions look like:
- Lecture-style presentation of the content
- Reading the content from the notebook
- Stumbling through the content and failing to make connections to what has been discussed
previously or what participants are contributing
- Uninvolved participants
- Getting into power struggles with participants about the content
- Judgmental responses
- Flat affect and a boring style
- Unorganized and random delivery
- Losing track of time

Note: The following questions (8, 9, and 10) are for the grantee’s internal use only, for program
improvement purposes. These questions are optional and will not be reported to OAH or ACYF
for performance measurement purposes.
8. Briefly describe any implementation problems you noticed, including any major changes to the
content or delivery of the material, time wasted in getting the session started or finished, etc.:
____________________________________________________________________________________________________
____________________________________________________________________________________________________
____________________________________________________________________________________________________
________________________________________________________________________
9. Please note at least one major strength of the session and/or the facilitator’s delivery of the material:
____________________________________________________________________________________________________
____________________________________________________________________________________________________
____________________________________________________________________________________________________
________________________________________________________________________
10. Other Comments: Use the space below for additional comments regarding strengths or weaknesses
of the session, particularly if there is anything that affected your ratings above.
____________________________________________________________________________________________________
____________________________________________________________________________________________________
____________________________________________________________________________________________________

____________________________________________________________________________________
____________________________________________________________________________________

July 2011

