Program Leader Debrief Interview Protocol

Pilot Study of Participant Outcomes Survey for the Creative Forces®: NEA Military Healing Arts Network Community Arts Engagement Grant Program

Attachment A3 CF Program Leader Debrief Interview Protocol

OMB: 3135-0146

ATTACHMENT A
CREATIVE FORCES COMMUNITY ENGAGEMENT PROGRAM
PARTICIPANT OUTCOMES SURVEY PILOT TEST INSTRUMENTS
PROGRAM LEADER DEBRIEF INTERVIEW PROTOCOL
PROGRAM:__________________________________
DATE OF INTERVIEW:__________________________
INTERVIEWER:_______________________________
Thank you for participating in this feedback session about the pilot study of a Participant Outcomes Survey
for the Creative Forces®: NEA Military Healing Arts Network Community Arts Engagement Grant Program.
Your input will help the National Endowment for the Arts and its cooperator, Mid-America Arts Alliance,
identify what went well and what needs to be improved for implementing the Participant Outcomes Survey
as part of a future evaluation.
No one associated with this program, the National Endowment for the Arts, or the Mid-America Arts
Alliance will be able to hear or read your comments. Your contributions today will not be attributed to you or
your organization in any reports; any identifying information will be removed from our notes. Additionally,
neither your participation nor your comments will affect the outcome of any present or future grant
applications, contract proposals, or cooperative agreement proposals with the National Endowment for the
Arts or the Mid-America Arts Alliance.
Here are several important reminders for today’s conversation:
• Your participation is completely voluntary.
• You may skip any question or stop participating at any time.
• The interview will last no longer than 60 minutes.
• Today’s discussion will be recorded so we can accurately capture your thoughts, examples, and insights
while still participating actively in the conversation.
We encourage you to be thoughtful and honest about your experiences with the pilot; there are no right or
wrong answers.
1. Coordination with program leaders: This survey requires coordination with program leaders to access
participants. For the survey pilot test, this coordination included the initial webinar and our subsequent
meeting. From your perspective, what went well in preparing the program leaders and coordinating the
survey? What could be improved?
2. Outreach to participants: From your perspective, what went well with the outreach to your program’s
participants? How was the outreach received? What recommendations do you have for making it more
effective? In your answer, please consider the advance information that was shared with participants.
3. The feasibility of communicating with participants via email: Ideally, program participants receive
information about the survey, along with a link to access it, via email. What worked well? What are your
recommendations for improving communication with participants?
4. Engaging participants: What recommendations do you have for engaging program participants in the
survey in the future?
5. Administration of the survey across multiple programs that use different implementation models and
timeframes: The survey methodology needs to be flexible enough to accommodate different program
models and timeframes. How compatible was the methodology with your program model and timeline?
6. Availability of comparison groups at the pilot sites: In the future evaluation, Creative Forces may want
to conduct a comparison study with the survey. In that study, participants of the Community
Engagement Programs would take the survey, as they did for this pilot; in addition, the survey would be
taken by similar people who are not part of a community engagement program. Are you aware of any
local organizations, programs, or agencies that might provide a comparison group?
The next few questions focus on participants’ experiences, and you may not have any information about them.
I’d still like to ask these questions, in case you are aware of anything we should consider.
7. Use of individual, unique codes for accessing the survey: For the pilot testing, each participant received
a unique access code for the survey. This access code provides an additional layer of confidentiality. Are
you aware of any feedback about the codes that would be useful?
8. Survey technology: The survey was administered electronically, and participants could access it from
smartphones, tablets, and computers. In your view, how easily were participants able to access the
survey electronically?
9. General: Do you have any other feedback about the process? Is there anything you wish had gone
differently? What other support would be helpful for you or your organization if the survey were
administered again?
10. Closing: We’re just about done with today’s interview. As we wrap up, is there anything else you’d like to
share that we haven’t talked about today?
Thank you again for your participation. Have a great day.


