Attachment E: Monitoring, Evaluation, and Learning Plan
DO NOT DISTRIBUTE DRAFT REPORT. FOR INTERNAL PURPOSES ONLY.
DRAFT PLAN | March 2024
The National Endowment for the Arts’
‘ArtsHERE’ Equity Pilot Initiative
Monitoring, Evaluation, and Learning Plan
Submitted to
Patricia Moore Shaffer and Kathryn Zickuhr
Office of Research and Analysis, The National Endowment for the Arts
Prepared by
James Bell Associates
2000 15th St. N, Suite 100
Arlington, VA 22201
(703) 528-3230
www.jbassoc.com
Connie Park, Project Director
Crystal Coles, Deputy Director
Kirsten Keene, Project Manager
Alexandria Smith, Research Assistant
Contents
Change Log
Background
Monitoring, Evaluation, and Learning Plan
Guiding Principles and Frameworks for the MEL Plan
Theory of Change and Logic Model
Domains and Research Questions
Data Collection Methods
Description of Data Sources
Data Analysis
Limitations of the MEL Plan
OMB Clearance
Timeline
Process for Updating MEL Plan
Component 1. Monitoring Plan
Monitoring Plan Purpose and Scope
Monitoring Plan Questions
Monitoring Methods
Component 2. Evaluation Plan
Evaluation Plan Purpose and Scope
Evaluation Questions
Evaluation Methods
Component 3. Learning Plan
Learning Plan Purpose and Scope
Learning Plan Questions
Learning Methods
Stakeholder Engagement Plan
Exhibits
Exhibit 1. Theory of Change
Exhibit 2. Logic Model
Exhibit 3. MEL Plan High-Level Timeline
Exhibit 4. Primary Research Questions for the Monitoring Plan
Exhibit 5. Primary Research Questions for the Evaluation Plan
Exhibit 6. Primary Research Questions for the Learning Plan
Exhibit 7. Stakeholder Engagement Strategies
List of Appendices
Appendix A. Indicator Summary Table
Appendix B. ArtsHERE Detailed Project Timeline
Change Log
The table below will document any adjustments to the monitoring, evaluation, and learning (MEL) plan over the entirety of the ArtsHERE pilot, including the nature of changes, who made the changes and when, and the rationale and/or learning driving the change. It will serve as a tool for reflecting on adjustments and course corrections made to the initiative and the MEL plan throughout the pilot period as ArtsHERE partners continually learn. More description of the process for updating the plan is provided later in the document.
Nature of the change | Who made the change | When change was made | Learning/Rationale driving the change
Background
Through ArtsHERE, the NEA is investing in a range of projects from eligible organizations
throughout the arts and cultural ecosystems that have demonstrated a commitment to equity within
their practices and programming. Examples of this commitment include programming reflective of
the community it serves, planning activities inclusive of diverse voices, and consistent engagement
with underserved groups/communities that have rich and inspiring artistic and cultural contributions
to share. For this evaluation, “underserved group/community” refers to those whose opportunities to
experience the arts have been limited by factors such as geography, race/ethnicity, economics,
and/or disability. This initiative will also support sharing these organizational stories with the broader
arts and cultural sectors.
Specifically, the NEA will support this work through nonmatching, project-based subgrants and
professional development activities including learning opportunities and peer networking. Subgrants
will be awarded to a range of organizations, such as those that center arts and cultural activities
within their communities; work at the intersection of the arts and other domains (such as community
development, health/well-being, or economic development); and are diverse in terms of geography,
scale of operations and programming, and budget size. Throughout this document, ArtsHERE
subgrant recipients will be referred to as grantees. The NEA is partnering with South Arts, a
Regional Arts Organization (RAO), to undertake ArtsHERE. The intent of ArtsHERE is to strengthen
the capacity of organizations already engaging with underserved groups/communities to increase
arts participation, learn from their experiences in undertaking this work, and connect these
organizations to each other and other relevant entities through technical assistance and peer-learning opportunities designed and facilitated by the Mid-America Arts Alliance. These opportunities
are intended to bolster, amplify, and extend effective organizational strategies and ways of working.
While specific learning opportunities will be designed collaboratively with the grantees, topics may
include strategic planning; budgeting; grant management; community engagement; diversity, equity,
inclusion, and accessibility (DEIA) training; studying the characteristics of healthy arts and cultural
ecosystems; or other related topics. In the long-term, investments made through the initiative will
build grantee capacity to sustain meaningful community engagement and increase arts participation
for underserved groups/communities.
Specific activities associated with ArtsHERE are framed through the three pillars listed below.
• Investment. Nonmatching, project-based grants ranging from $65,000 to $130,000 will be awarded to approximately 95 eligible organizations across the country.
• Learning. Grantee leadership and staff will participate in technical assistance and peer-learning communities with other ArtsHERE grantees for knowledge-sharing, network-building, peer-based learning, and other offerings.
• Evaluation. As a pilot, the initiative will be documented and evaluated to better understand the activities supported and how grantees approached the work. The evaluation will result in a summary of lessons learned and may inform the future of the ArtsHERE initiative.
The pilot initiative reflects goals and objectives identified in the NEA 2022–2026 Strategic Plan and
Equity Action Plan. ArtsHERE partners have further articulated the connections between the
intentions, activities, and goals in the theory of change (exhibit 1) and logic model (exhibit 2). An
additional description of the ArtsHERE initiative, including eligibility for organizations and allowable
activities, is documented in the ArtsHERE Program Guidelines.
The pilot initiative includes two task order contracts for the evaluation. The first contract (Task Order #1) will be implemented by James Bell Associates (JBA), funded and coordinated by the NEA Office of Research and Analysis (ORA). JBA will facilitate this evaluation through the development of a Monitoring, Evaluation, and Learning Plan and is responsible for facilitating the Learning Plan. A second contract (Task Order #2) will be issued to a second contractor for implementation of the evaluation and monitoring components. Throughout the MEL plan, JBA will be referred to as the first contractor. The contractor responsible for activities and deliverables in Task Order #2 will be referred to as the second contractor.
Monitoring, Evaluation, and Learning Plan
The Monitoring, Evaluation, and Learning (MEL) plan will guide efforts to monitor progress, make
midcourse corrections if necessary, and evaluate outcomes of ArtsHERE. In particular, the MEL plan
will guide the measurement and evaluation approaches used to answer key evaluation questions
and provide a common thread to capture the range of activities conducted by subgrantees.
Motivating this plan is an interest in learning how the NEA might support similar field-building
initiatives in other areas of its portfolio. Beyond learning how it might improve its own funding practices and activities, NEA plans for ArtsHERE to generate insights to strengthen the arts
and culture sector and to inform future practice and national strategy for public funding for the arts.
NEA is also interested in understanding ArtsHERE’s role in supporting the development of local and
national grantee connections and organizational capacities.
To accomplish these objectives, the MEL plan will include three separate but related components.
• Component 1. Monitoring Plan
• Component 2. Evaluation Plan
• Component 3. Learning Plan
The remainder of this document details the MEL approach. Sections address guiding principles and
frameworks that undergird its design and implementation, the theory of change and logic model,
data collection and analysis plans, distinct characteristics of each MEL component, stakeholder
engagement practices, and dissemination and communication strategies.
Guiding Principles and Frameworks for the MEL Plan
The approach to planning and implementing the MEL plan components is undergirded by the
principles of codesign and developmental, culturally responsive and community-engaged evaluation
frameworks. Developmental Evaluation, coined by Michael Quinn Patton, is a growing area of
evaluation practice advocated for emerging programs and innovations. 1 It works well in contexts in
which "implementation is likely to change in response to emerging conditions on the ground". 2 The
practice focuses on continuous adaptation and learning to understand implementation, capture
decision points, feed data back to key groups, and demonstrate progress. Patton describes its principles as sensitizing concepts rather than operational rules; Developmental Evaluation is therefore as much a mindset as it is a process or way of doing something. 3 Inherent within its principles are attention to
learning and development, evaluation rigor, systems-thinking, timely feedback, and
cocreation/codesign. This transformative approach is based on a diversified and multicultural
collaboration presented through an inclusive power and privilege lens that uses codesign to actively
involve all key groups within the evaluation design process to help ensure the results/outcomes meet
their needs. This process requires time for relationship, capacity, reflexivity, creativity, and
consensus building across all partners in the codesign space. As such, conducting evaluations
within the context of this work requires flexibility to adapt to the complex and evolving conditions
within communities.
Power and privilege perspectives are grounded in critical theory and assume power differentials,
both earned and unearned, are central to all human interactions. Being community engaged in
evaluation means the work is carried out in a manner that helps identify and equitably shift the
power difference among communities (i.e., ArtsHERE grantees, funders, researchers, and technical
assistance providers). All key groups are meaningfully engaged across all aspects of the
evaluation—ranging from theory of change development through validating data collection
instruments, helping to cocreate recommendations, and sharing the findings. Authentic participation
______
1. Michael Quinn Patton, Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use (New York, NY: Guilford, 2011).
2. USAID, Implementing Developmental Evaluation: A Practical Guide for Funders (USAID, 2019), 7.
3. Michael Quinn Patton, “What Is Essential in Developmental Evaluation? On Integrity, Fidelity, Adultery, Abstinence, Impotence, Long-Term Commitment, Integrity, and Sensitivity in Implementing Evaluation Models,” American Journal of Evaluation 37, no. 2 (March 22, 2016): 250–65, https://doi.org/10.1177/1098214015626295.
involves both listening to grantees and integrating their feedback throughout the evaluation process.
It involves a coconstructive process in which “legitimate knowledge” comes in a variety of forms from
multiple sources. Active and meaningful grantee engagement is an essential component of carrying
out respectful research with historically and continually underfunded organizations providing services
to underserved communities.
Furthermore, engaging grantees allows for a better understanding of the cultural context in which a
project operates and ensures evaluation processes and findings reflect the voices, traditions, and
experiences of participating communities and key groups. These values support the NEA to “honor
the cultural context in which an evaluation takes place by bringing needed, shared life experience,
and understandings to the evaluation tasks at hand”. 4 Culturally responsive and community engaged
approaches enhance the ethics, rigor, and impact of evaluation studies. 5,6,7,8
The MEL plan draws on principles from these approaches, such as shared leadership and
bidirectional learning, to guide decision making, address critical questions, and engage in learning
and evaluation in service of and in contribution to equity.
Theory of Change and Logic Model
The theory of change and logic model are visual representations of the intentions and envisioned
progress toward goals. The purpose of the theory of change is to broadly describe overall
aspiration(s) and how the initiative will achieve them. It illustrates the underlying assumptions and
context and lays the groundwork for the logic model, which articulates the connections between the
activities (i.e., three pillars), outputs, and outcomes. The logic model serves as a blueprint for priority
indicators and data collected throughout the implementation of this MEL plan.
______
4. Harry T. Frierson, Stafford Hood, and Gerunda B. Hughes, “A Guide to Conducting Culturally Responsive Evaluation,” in The 2002 User-Friendly Handbook for Project Evaluation (National Science Foundation, 2002), 63.
5. Amy Besaw, Joseph P. Kalt, Andrew Lee, Jasmin Sethi, Julie Boatright Wilson, and Marie Zemler, The Context and Meaning of Family Strengthening in Indian America (Annie E. Casey Foundation by the Harvard Project on American Indian Economic Development, 2004).
6. Lisa R. Thomas et al., “The Community Pulling Together: A Tribal Community–University Partnership Project to Reduce Substance Abuse and Promote Good Health in a Reservation Tribal Community,” Journal of Ethnicity in Substance Abuse 8, no. 3 (August 18, 2009): 283–300, https://doi.org/10.1080/15332640903110476.
7. Lisa Rey Thomas et al., “Research Partnerships between Academic Institutions and American Indian and Alaska Native Tribes and Organizations: Effective Strategies and Lessons Learned in a Multisite CTN Study,” The American Journal of Drug and Alcohol Abuse 37, no. 5 (August 22, 2011): 333–38, https://doi.org/10.3109/00952990.2011.596976.
8. Emily J. Tomayko et al., “Healthy Children, Strong Families 2: A Randomized Controlled Trial of a Healthy Lifestyle Intervention for American Indian Families Designed Using Community-Based Approaches,” Clinical Trials 14, no. 2 (January 9, 2017): 152–61, https://doi.org/10.1177/1740774516685699.
These visuals are undergirded by the program design plan and were developed in collaboration with
NEA, South Arts, and committees with representatives from the five partnering RAOs. A Technical
Working Group (TWG) composed of seven individuals from a diverse range of backgrounds and
experiences in the arts and cultural sector, including research, evaluation, and arts administration,
also provided feedback on the theory of change and logic model.
Reflective of the developmental nature and commitment to iterative learning undergirding the
initiative, these visuals will be used throughout the implementation to assess, reflect, and document
how, if at all, the initiative was implemented as intended and how shifts and course corrections were
informed by continuous learning. The theory of change and logic model will be revisited annually and
revised, as appropriate.
Exhibit 1. Theory of Change
Exhibit 2. Logic Model
Domains and Research Questions
The MEL plan is guided by a set of consensus-driven research questions organized around eight
key domains of interest. Developed in collaboration with NEA, South Arts, partnering RAO
committees, and the ArtsHERE Technical Working Group (TWG), the research questions align with
key activities, outputs, and outcomes articulated in the theory of change and logic model. For more
detailed information on the alignment between the research questions and logic model components,
please see Appendix A. Indicator Summary Table.
The domains and research questions for the MEL plan are listed below.
Domain 1: Organizational characteristics of applicants and grantees
1.1. What was the process for determining which organizations received grants?
1.2. What are the characteristics, at time of application, of organizations that apply for and those that receive grants? How do they compare by key descriptive characteristics (e.g., high-poverty census tracts, majority race/ethnicity of census tracts, rurality, communities engaged, arts versus non-art organizations)?
Domain 2: Description of communities engaged by grantees
2.1. What are the key descriptive characteristics (e.g., high-poverty census tracts, majority
race/ethnicity of census tracts, rurality) of communities engaged by grantees?
Domain 3: Grantee programs
3.1. How have grantees engaged underserved communities prior to their grants?
3.2. How do capacity-building efforts provided through learning opportunities support grantee
engagement with underserved communities during the grant? What works well? What
challenges or barriers do grantees experience?
3.3. In what ways do grantees demonstrate commitment to equity in meeting the
needs/interests of their communities?
3.4. What are organizations doing to integrate arts/culture into programming with their
communities? How does this vary across NEA-defined disciplines?
3.5. What other priorities and/or programs are addressed through the grants?
Domain 4: Organizational capacities of grantees
4.1. What are grantee organizational capacities prior to ArtsHERE?
4.2. What do grantees view as community needs/interests that they meet or address through
their ArtsHERE capacity building project?
4.3. What changes or developments, whether positive or negative, can be attributed to
ArtsHERE in terms of organizational or program growth?
Domain 5: Grantee connections
5.1. What role(s) do grantees play in their community arts ecosystem?
5.2. What connections are grantees able to form or strengthen in their communities, within a
broader arts ecosystem, with other grantees, and with public funders, including other
RAOs and the NEA?
5.3. How, if at all, does ArtsHERE support grantees in connecting with their communities,
within a broader arts ecosystem, with other grantees, and with public funders, including
other RAOs and the NEA?
Domain 6: Grantee learning
6.1. What did the provision of learning opportunities look like under ArtsHERE, and who participated in services?
6.2. How do grantees experience participation in learning opportunities?
Domain 7: Grantee funding
7.1. How, if at all, did not requiring a match benefit grantees?
7.2. In what ways, if any, did receiving funding support grantee priorities and programs?
Domain 8: ArtsHERE lessons learned
8.1. What overall lessons can be shared with funders and the arts ecosystem about the
pillars (investment, learning, and evaluation)?
8.2. What lessons could inform the NEA’s own grantmaking processes, as well as those of RAO partners? (Grantmaking can be considered in the widest sense—inclusive of communications/outreach, customer service, technical assistance, etc.)
8.3. What opportunities were available to provide input, feedback, and overall thoughts
regarding the development of ArtsHERE?
Data Collection Methods
This developmental and descriptive MEL plan will use mixed methods to address the research
questions. Multiple data collection strategies will be used to comprehensively capture quantitative
and qualitative data to enable analyses that address (1) how ArtsHERE was conceptualized and
implemented, (2) the experiences of key stakeholders implementing and participating in this pilot
initiative, and (3) lessons and practices to inform future public arts funding and national strategy. The
plan’s aims will be addressed using a combination of approaches tailored to each research question. The Indicator Summary Table in Appendix A specifies the methodological approach for each individual question.
Description of Data Sources
To address the research questions, data will be collected through both primary and secondary methods. Primary data collection will include surveys of all regional review panelists, learning
opportunities facilitators and coaches, and grantees participating in both required and optional
evaluation activities. At least 35 grantees are expected to consent to participate in the optional
evaluation activities, which include virtual interviews with a smaller subset of 15 grantees to inform
case studies, a final grantee survey, and participation in the TWG. Additionally, the first contractor
will engage in ongoing observation and feedback loops, periodic learning logs, and interviews with
NEA and South Arts staff overseeing the development and implementation of ArtsHERE, as well as
RAO program design and implementation partners.
A document review will be conducted on existing data developed or being collected on an ongoing
basis as part of ArtsHERE implementation. Primary data sources have been identified for each
research question and domain (see Appendix A. Indicator Summary Table).
Primary data collection
Primary data will be collected using surveys and interviews. Details regarding these data sources
are provided below.
Surveys. Web-based surveys will be developed and administered to regional review panelists, the
ArtsHERE planning group (NEA, SA, RAOs, committee chairs, and others), learning opportunities
facilitators and coaches, and grantees during the implementation period. This method of data
collection ensures that a broad range of perspectives is represented in the overall analysis and
strengthens the generalizability and validity of findings. All surveys will be developed for ORA
approval prior to OMB submission. Questions will be entered into a FedRAMP-compliant secure
online survey platform (e.g., Qualtrics). Specific surveys are described below.
• Review Panelist Feedback Survey. A web-based survey will be administered to application
review panelists who participated in phase 2 of the review process at one timepoint immediately
following completion of the review panel process (September 2024 at the latest). The survey will
consist of open- and closed-ended questions that capture panelists’ demographic
characteristics, experience serving on prior review panels, and perspectives on the panel review
process. A link will be sent to each panelist for completion.
• Grantee Baseline Survey. A one-time web-based survey will be administered to all grantees upon acceptance of a grant award (October 2024). The survey will consist of open- and closed-ended questions that capture grantee self-assessment of foundational organizational
characteristics and capacities, community needs and priorities, and program and community
demographics. A link will be sent to each grantee for completion. One survey response will be
submitted by each grantee and should reflect input from the core group involved in local planning
and implementation.
• Grantee Learning Opportunities Quarterly Survey. A brief web-based survey will be administered to all grantees quarterly, beginning after month 3 of the grant period (January 2025 through April 2026). It will consist of open- and closed-ended questions that capture self-assessment of learning opportunities received, including cohort convenings, one-on-one coaching, and topical expert workshops. Grantees will rate the engagement, quality, relevance, and effectiveness of cohort-based and one-on-one learning opportunities, as well as share perceptions of how services can be improved.
• Learning Opportunities Tracking Form. Following each organizational service occurrence
(November 2024 through April 2026), a brief web-based tracking form will be completed by
learning opportunity providers. The form will consist of open- and closed-ended questions to
capture the supportive services provided to grantees, including cohort convenings, one-on-one
coaching, and topical expert workshops. The tracking form will cover topics including service
type, content of service provision, participating partners, engagement experience, facilitators,
and challenges.
• Grantee Final Survey. A web-based survey will be administered to all grantees at one timepoint
before the end of the award period (June 2026). The survey and specific topics addressed will
be developed as part of a second OMB package (submitted March 2025), prepared by the first
contractor in close collaboration with the second contractor. Survey questions will be based on
emergent learning from the grantee applications, grantee baseline survey, and discussions with
the planning group and the TWG. It will consist of open- and closed-ended questions that
capture self-assessment of the following potential topics: experiences with engaging in the
feedback process, most useful supports for building capacity, what has worked well in engaging
underserved populations, accomplishments and challenges in engaging underserved
populations, program commitment to equity, how grantees center arts and cultural activities, and
their role(s) in their arts ecosystem. A link will be sent to each grantee for completion. One
survey response will be submitted by each grantee and should reflect input from the core group
of individuals involved in local planning and implementation. Tokens of appreciation, in the
amount of $30 or as appropriate based on the length of the survey, will be distributed to each
survey respondent via email using electronic gift cards.
Interviews and Group Discussions. The second contractor will conduct interviews with grantees,
and the first contractor will hold group discussions with the ArtsHERE planning group. To help facilitate and
manage the flow of the discussion, protocols will be developed for all interviews. They will include
general questions and probes, while allowing the interviewer flexibility to explore emergent themes.
The NEA, South Arts, RAOs, and the TWG will have the opportunity to review and provide feedback
on protocols. Interviews and group discussions will last no longer than 90 minutes and will be recorded and transcribed for analysis.
• Interviews with grantees. Virtual interviews will be conducted with grantees who opt into the
evaluation at one timepoint toward the end of the grant period (March through April 2026). These
interviews will be conducted, analyzed, and reported by the second contractor. Information from
interviews will elaborate on emerging findings from other data sources and will form the basis of
the development of case studies under Component 2. Evaluation Plan. Key topics explored
during interviews will include staffing, resources, challenges, community-based relationships,
and grantee experiences. To ensure a diversity of perspectives, the semi-structured interviews
will be conducted using a purposive sample of 15 grantee organizations (for a total of 15
interviews with up to 60 participants). Each interview will include up to four individuals per team
to allow the contractor to interview key informants representing a variety of roles such as
program manager, volunteer, and other staff knowledgeable of partnership dynamics and
impacts of arts organizations. Each participant will receive a token of appreciation of $75 per
hour. This is a nonprobability sampling technique that relies on researcher judgment to select the
sample with the goal of focusing on population characteristics most salient to key study research
questions. Grantees will be selected based on a variety of organizational characteristics, which
may include service and/or budget size (small, medium, large), urban/rural locations, artistic
disciplines, grantee experiences implementing different arts and cultural strategies, priority
underserved communities engaged, and intersections with arts in other sectors (e.g., health,
education, community development, repair of harm from systemic injustices). NEA and South
Arts will provide input on the final sample.
o Cultural prompts. Prior to conducting the virtual interviews with grantees, the second
contractor will engage them in a cultural prompt activity. Cultural prompts are a research
technique with open-ended activities used with participants to uncover the emotional and
evocative thoughts associated with a topic of interest. A potential topic proposed for the
cultural prompts includes “What does equity in the arts mean to you?” This prompt is
meant to identify the way(s) in which arts and cultural organizations funded through
ArtsHERE understand the meaning of equity. All grantees recruited for the interviews will
be asked to respond using any method of their choosing, including narrative responses,
poems, photographs, and visual art. Interviewers will then discuss the responses during
the 15 interviews.
• Group discussion with planning group. Virtual group discussions will be facilitated with NEA,
South Arts, and RAO representatives involved in the oversight, planning, and implementation.
The purpose is to obtain planning group reflections on lessons learned related to planning and
implementation. NEA anticipates the group discussions will occur at two timepoints: following the
panel review process (October 2024) and at the end of the grantee award period (September
2026) to gather lessons learned.
Learning logs. The first contractor will implement learning logs with the planning group (NEA, South
Arts, and RAOs) to document and reflect on their own experiences and learning. Each log will
consist of four open-ended prompts intended to facilitate ongoing reflection on experiences and
‘emergent learning’ after key program activities/milestones (see learning log schedule below).
Learning logs will be administered through a FedRAMP-compliant platform and accompanied by discussions facilitated by the evaluator as part of the learning plan.
Learning logs will be administered to the planning group on the following schedule:
• Learning log topic: panel/selection process in May 2024
• Learning log topic: analyses of application and award data in June 2024
• Learning log topic: Grantee Learning Opportunities Quarterly Survey results every 3 months from January 2025 through January 2026
• Learning log topic: mid-pilot, APR reactions/reflections in December 2025
Secondary data collection
Secondary data will be tracked and collected using a variety of existing data sources. This method
helps reduce grantee burden and leverages the wealth of existing information created and tracked
for the purposes of implementation.
Tracking information from grantees, the Cooperator program data, and other program data
sources. Some of the information needed to answer research questions about implementation will
be captured through program data sources. These data will be tracked and/or reviewed regarding
overall activities, awarding the subgrants, and learning opportunities design and implementation.
These data sources include—
• Cooperator Program Data (e.g., number and type of organizations applying and awarded)
• Learning Opportunities Provider Data (e.g., schedule of services, attendance/roster sheets for learning opportunities participation)
Reviewing existing materials. A narrative document review will be conducted to gather descriptive
information from materials already created for ArtsHERE. These are described below.
• ArtsHERE parts I and II applications for all awarded grantees will be reviewed for information
on baseline organizational characteristics such as organizational missions, structures and
operations (e.g., years of operation, number of board members, overall budget size), staffing
capacities, capacity building needs, sectors of focus, characteristics of communities being
engaged, and other organizational grantee practices (e.g., priorities and practices of
organization, programs in place).
• Grantee annual progress reports for all awarded grantees will be reviewed for information on
distinct grantee practices (e.g., integration of arts/culture into programming), successes and
barriers to engaging underserved communities, and ArtsHERE experience (e.g., impact of
ArtsHERE participation on future activities). The emergent learning from the annual progress
reports will inform the learning component (under Component 3) as well as the development of
case studies (under Component 2).
• Final Descriptive Reports for all awarded grantees will be reviewed for information on
organizational characteristics and any changes made. Topics of interest will include distinct
grantee practices (e.g., updated approaches to strategies to enhance programming), successes
and barriers to engaging the community, organizational practices (e.g., updated
program/services in place, organizational capacity, knowledge gained from project), and overall
organizational or program growth that occurred as a result of funding. Additionally, the
Geographic Location of Project Activity (or GEO) portion of the final report will be used to better
understand and track the location of activities and who is likely to benefit from them.
Development of data collection instruments and piloting
Once an initial draft of the MEL plan is approved, data collection instruments will be developed in conjunction with ongoing plan refinement. The theory of change, logic model, and program
design will guide the review of existing forms and measures. Development of all data collection
forms will require close collaboration and input from NEA, the Cooperator, relevant RAO committees
(i.e., the Organizational Services Committee), and the TWG. To the extent possible, the forms will
align with existing NEA reporting forms to ensure data can be aggregated or compared across
initiatives.
Primary data collection instruments (e.g., surveys, tracking forms, interview protocols) identified
and/or developed to address evaluation questions will be brief, approximately 30 questions or fewer,
to keep respondent burden low and increase the likelihood of completion.
Following NEA approval, each form/instrument will undergo cognitive testing to gather information on the time needed to complete the instrument, as well as on comprehension, usability, and overall user experience, to improve the quality and reliability of the evaluation instruments. This will
be completed with up to five individual volunteers. Volunteers who are not in the actual grantee
sample but whose characteristics closely match selected grantees (e.g., individuals from arts and
cultural organizations and affiliates) will participate in cognitive testing of the instruments.
Additionally, ArtsHERE Evaluation Committee members will participate in cognitive testing of
learning logs and any other relevant instruments and protocols. A brief report of the results and
implications from cognitive testing will be shared with the NEA for discussion, and subsequent
revisions will be approved by the NEA.
Data Analysis
The primary and secondary data collected will be analyzed using both quantitative and qualitative
methods. Examination of data from a variety of sources will provide a cross-check on the different
data collection activities and may point to issues to be further explored in subsequent data collection
activities or analyses. To ensure accuracy, validity, and reliability, a protocol for data analysis has
been established. These approaches are described in detail below.
Qualitative data analysis
Standard qualitative procedures will be used to analyze and summarize information from the
grantees and federal and RAO stakeholders. Qualitative data analysis software will be used to
organize, code, triangulate, and identify themes. In preparation for qualitative analysis, the second
contractor will use standardized templates to organize and document the information abstracted
from data sources. Qualitative data will be integrated with quantitative data and analyzed together
when practicable. This full integration will facilitate data triangulation.
For primary data collected through interviews, a hybrid process of deductive (literature driven) and
inductive (data driven) methods is proposed to analyze the information collected. 9 The second
contractor will develop a codebook containing code names for each of the research questions and
key learning constructs; the process will also include open coding to identify concepts that emerge in
the data. Codes will allow the contractor to search for topics across the data to identify patterns and
outliers. 10 The contractor will conduct interrater reliability checks, whereby samples of interview
transcripts are coded by two analysts at the outset of the coding process to gauge agreement
among coders. The coded text will be searched to gauge consistency and triangulated across
participants and data sources. This process will reduce large volumes of qualitative data to a
manageable number of topics/themes/categories 11 that can be analyzed to address the research
questions.
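To illustrate the interrater reliability check described above, the minimal sketch below computes Cohen’s kappa, one common agreement statistic, for two analysts’ code assignments on a shared sample of transcript segments. The plan does not name a specific statistic, and the analyst labels and codes shown are hypothetical.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: (p_o - p_e) / (1 - p_e), where p_o is observed
    agreement and p_e is chance agreement from each coder's marginals."""
    n = len(coder_a)
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    counts_a, counts_b = Counter(coder_a), Counter(coder_b)
    labels = set(coder_a) | set(coder_b)
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n) for c in labels)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes applied by two analysts to ten transcript segments
analyst_1 = ["equity", "capacity", "equity", "funding", "capacity",
             "equity", "funding", "capacity", "equity", "funding"]
analyst_2 = ["equity", "capacity", "capacity", "funding", "capacity",
             "equity", "funding", "equity", "equity", "funding"]

print(f"Cohen's kappa: {cohens_kappa(analyst_1, analyst_2):.2f}")  # ~0.70
```

Values in this range are often treated as acceptable agreement; disagreements surfaced this way are typically resolved through discussion before full coding proceeds.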
Qualitative analysis of secondary data will be more targeted, as it will draw from specific variables
within each identified data source (e.g., qualitative data will be pulled directly from applications,
annual progress reports, and final descriptive reports to answer research questions). The data will
be entered into the standardized templates and will be systematically reviewed and categorized
according to the pre-established indicators.
Quantitative data analysis
All quantitative data tracked or received will be reviewed for completeness and accuracy of entry.
For secondary data sources, such as Cooperator program data, the activities conducted by the
planning group, grantees, and learning opportunities providers will be summarized by type and
frequency.
For quantitative data generated from web-based surveys such as the baseline and learning
opportunities grantee forms, frequency distributions will be calculated to summarize trends and
patterns across survey items and to examine variability in the data. The second contractor will
produce descriptive statistics to summarize relevant quantitative items and groups of items. For
______
9
Jennifer Fereday and Eimear Muir-Cochrane, “Demonstrating Rigor Using Thematic Analysis: A Hybrid Approach of Inductive and
Deductive Coding and Theme Development,” International Journal of Qualitative Methods 5, no. 1 (March 2006): 80–92,
https://doi.org/10.1177/160940690600500107.
10
Paul Mihas and Odum Institute, “Learn to Build a Codebook for a Generic Qualitative Study,” SAGE Research Methods, March
27, 2019, https://doi.org/10.4135/9781526496058.
11
Amanda Coffey and Paul Atkinson, Making Sense of Qualitative Data Complementary Research Strategies (Thousand Oaks,
Calif: Sage, 2013).
ArtsHERE Monitoring, Evaluation, and Learning Plan v4
16
instance, survey items that rate each grantee’s self-perceived level of engagement in learning
opportunities activities will be tabulated as means and percentages. The survey data will be
examined across all grantees participating in the evaluation, as well as by key descriptive
characteristics (e.g., organization budget size, organization or program activity location, new
grantees, disciplines) to learn more about grantee perceptions and experiences.
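As a concrete, purely illustrative sketch of this tabulation, the pandas snippet below computes a frequency distribution for a Likert-scaled engagement item and summarizes it by key descriptive characteristics; the column names and values are hypothetical, not drawn from actual ArtsHERE instruments.

```python
import pandas as pd

# Hypothetical survey extract: one row per grantee response
df = pd.DataFrame({
    "budget_size": ["small", "large", "small", "medium", "large", "small"],
    "rurality":    ["rural", "urban", "rural", "urban", "urban", "rural"],
    # Self-rated engagement in learning opportunities (1 = low, 5 = high)
    "engagement":  [4, 5, 3, 4, 2, 5],
})

# Frequency distribution across all responding grantees
print(df["engagement"].value_counts(normalize=True).sort_index())

# Mean engagement by a key descriptive characteristic (budget size)
print(df.groupby("budget_size")["engagement"].mean())

# Percentage reporting high engagement (4 or 5), by rurality
high = df["engagement"] >= 4
print(high.groupby(df["rurality"]).mean().mul(100).round(1))
```

The same pattern extends to any grouping variable named in the plan (e.g., new versus returning grantees, or NEA-defined disciplines).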
Limitations of the MEL Plan
The proposed developmental evaluation approach is a flexible method that is designed to be
adapted in future evaluation phases as the initial information gathered and analyzed helps generate
new thinking. 12 By using rigorous discussion protocols and measures, as well as soliciting rich
explanatory narratives from ArtsHERE planning partners, the evaluation approach will describe how
activities have increased the capacity of organizational sites with attention to each unique site’s
approach to its engagement with the NEA. The data collection methods proposed are designed to
generate reflection on the relationship between NEA goals and organizational site experiences
relative to the communities being engaged. By embedding equity-focused prompts across data
collection tools, evaluators will obtain insights about who is (and is not) invited to participate in the
grant opportunity, how responsive capacity-building supports are (i.e., one-on-one coaching, cohort learning, and topic-based support), who has (and who should have) decision-making powers, etc.
Threats to generalizability
One limitation in the scope of this assessment is related to the qualitative case study interviews. The
second contractor will interview up to 15 grantee organizations to gather more in-depth data about
their experiences. With less than 15 percent of the total grantees being interviewed, there is a
potential that the findings from interview data will not fully capture the experiences of all grantees.
This is particularly a concern because of the expected heterogeneity of NEA grantees. This issue will
be addressed through purposive sampling (with criteria selected in consultation with NEA, the
Cooperator, and the TWG) to ensure the interviews include a wide variety of grantees. The benefit of
the case study approach is the opportunity for a more contextualized, in-depth understanding of
constructs and trends seen in the survey and program data, as well as the reporting forms
completed by all grantees. Additionally, based on initial themes emerging from the interviews, the
second contractor will develop a grantee final survey to learn more about these experiences from all
grantees prior to the end of the grant period. This will help to obtain a larger sample of grantees
participating in the evaluation from which to summarize ArtsHERE experiences and will help to
identify which case study examples are more representative, and which are outliers.
______
12. Michael Quinn Patton, Developmental Evaluation.
OMB Clearance
This is a descriptive study which uses a mixed-methods approach to understand the implementation,
participation, and overall lessons learned from the ArtsHERE pilot initiative. NEA estimates that up to 35 grantees will self-select to participate in the evaluation and be engaged in primary data collection through surveys. A smaller subsample of 15 will participate in in-depth interviews,
which will inform case studies. The first contractor will prepare two OMB clearance packages for
data collection instruments. Package #1 will be submitted in April 2024 and will include the following
data collection instruments: Grantee Baseline Survey, Review Panelist Feedback Survey, Grantee
Learning Opportunities Quarterly Survey, Learning Opportunities Tracker, learning logs, Annual
Progress Report, and Final Descriptive Report. Package #2 will be submitted by the first contractor
in March 2025 and will include Final Grantee Survey, grantee interview protocol, and any other data
instruments developed over the course of the evaluation. A separate generic clearance package will
be completed by NEA for the planning group discussion topics and grantee reflective prompts
through Mentimeter as necessary.
For each OMB clearance package submission, Supporting Statements A and B with input from the
NEA ORA Contracting Officer’s Representative (COR) will include a short description of the purpose
and use of data, an estimate of the burden, and information about the respondents. Each
submission will include a copy of the survey questions and interview protocols that will be used for
data collection. The request for approval will be submitted to OMB by the COR.
Timeline
The period of performance for the MEL plan is 48 months: approximately 12 months of planning
time, 24 months of subgrant data collection, followed by 12 months of data analysis and reporting. The high-level timeline in exhibit 3 below provides a visual of the adjusted timeline that
includes program implementation and evaluation stages and milestones throughout the 48-month
period. However, an additional 8 months is recommended (October 1, 2026, through May 31, 2027)
to ensure that final analyses, reporting, and dissemination tasks can be completed by the second
contractor through Task Order #2. A more detailed project timeline is included in Appendix B.
ArtsHERE Detailed Project Timeline. The schedule included in Appendix B will be used to complete
key monitoring, evaluation, and learning activities. Dates will be finalized in collaboration with NEA.
Exhibit 3. MEL Plan High-Level Timeline
Process for Updating MEL Plan
The MEL plan guides efforts to monitor progress, make midcourse corrections if necessary, and
evaluate outcomes of the ArtsHERE pilot initiative. At each stage of the process, MEL plan
development will be both inclusive and participatory to facilitate partners having greater ownership of
the evaluation as their interests are clearly reflected. The plan will be reviewed annually with NEA,
the Cooperator, the TWG, and other relevant stakeholders, and suggested updates will be provided to the NEA as needed. Activities and data sources will inform the process for identifying and
determining necessary changes. For instance, the theory of change and logic model serve as a
foundation for planning and designing the MEL plan and will be revisited annually in collaboration
with NEA, Cooperator, RAO committees, and learning opportunities providers. Changes may have
implications for the MEL plan, which can be revisited in tandem.
Learning logs will capture the reflections of planners and organizational service providers on the
processes and learning at key milestones of the initiative (e.g., post award, end of year 1 grantee
reporting, annual evaluation reports). These reflections will also serve as an iterative tool to identify necessary MEL plan changes.
All revisions to the MEL plan will be approved by NEA. Throughout implementation, a Change Log
(included at the start of this document) will be maintained to capture the nature, timing, and rationale
for changes over time.
Lastly, it is important to note that although the developmental evaluation provides a mechanism to
codesign, collaborate, and approach learning through an emergent design, some midcourse project
changes will not be possible. Due to costs and time associated with OMB and Institutional Review
Board (IRB) amendments, shifting the study midcourse or guaranteeing that an entirely new approach can be integrated (e.g., new data collection tools, or questions beyond the scope of what was approved by OMB and IRB) will not be feasible.
Component 1. Monitoring Plan
Monitoring Plan Purpose and Scope
The overarching goal for Component 1. Monitoring Plan is to document and monitor processes and
activities contributing to progress toward intended goals and objectives at the NEA, Cooperator,
RAO, and learning opportunities provider levels. The second contractor will be responsible for this
component.
A mixed-method design, which integrates quantitative and qualitative data, will be used to obtain a
full picture of program activities, outputs, and relevant contextual information of ArtsHERE
implementation.
The second contractor will share analyses with partners and grantees participating in the evaluation
and will provide opportunities for these groups to inform interpretation of the analyses. Analyses will
focus on understanding implementation and program performance and on providing near real-time feedback to inform processes and continuous improvement efforts.
Monitoring Plan Questions
Monitoring plan questions inform the following overarching research domains:
• Domain 1: Organizational characteristics of applicants and grantees
• Domain 6: Grantee learning
• Domain 7: Grantee funding
• Domain 8: Lessons learned
Monitoring questions inform both overall research questions and specific subquestions aimed
at tracking and assessing implementation progress and continuous improvement. Primary questions
of focus for monitoring are listed in exhibit 4.
Exhibit 4. Primary Research Questions for the Monitoring Plan
Research Questions (RQ)
RQ 1.2. What are the characteristics at time of application of organizations that apply for
and those that receive a grant? How do they compare by key descriptive characteristics
(e.g., high-poverty census tracts, majority race/ethnicity of census tracts, rurality,
communities engaged, arts versus non-art organizations)?
• How many organizations expressed interest in ArtsHERE by state and region?
• Was the anticipated number of applicants (i.e., 300 +/-) invited to complete part II of the application process? How many organizations submitted part II applications in total and by state and region?
• How many grants were awarded in total and by state and region?
• What was the funding range received by grantees? Did the funding range vary by region?
RQ 6.1. What did the provision of learning opportunities look like under ArtsHERE, and who participated in the services?
• What types of learning opportunities were provided?
• What were the topics of learning opportunities offered?
• To what extent did grantees participate in learning opportunities (dosage in hours and by type and topic)?
• To what extent were learning opportunities responsive to grantee feedback?
RQ 7.2. In what ways, if any, did receiving funding support grantee priorities and
programs?
• What grantee program expenses did the grant contribute to?
RQ 8.3. What opportunities were available to provide input, feedback, and overall thoughts
regarding the development of ArtsHERE?
• How frequently was the Technical Working Group (TWG) engaged in providing input?
• In what ways was the TWG engaged in providing input on ArtsHERE?
• How many opportunities for feedback and sharing on evaluation findings are provided throughout the grant period?
Monitoring Methods
Program performance monitoring methods will include grantee surveys, direct observation, and
review of existing project documentation provided by RAOs and the Cooperator. Primary data
sources will be—
• Application Part 2 and Project Budget Form attachment includes information on allowable grant expenses approved and the categories/types of activities and costs supported by the funding.
• Cooperator Program Data on grants includes but is not limited to the number and range of awards.
• Grantee Learning Opportunities Quarterly Survey tracks grantee feedback and perceptions of participation in learning opportunities activities.
• Documentation of TWG meetings uses both notes provided by the first contractor and information gathered by the second contractor from direct participation in the TWG meetings.
• Learning Opportunities Tracker includes facilitator and coach forms for tracking topics and frequency of learning opportunities by cohort (a tabulation sketch follows this list).
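The minimal sketch below shows one way such tracker records could be tabulated into participation dosage by grantee and service type; the field names and records are illustrative assumptions, not the actual tracker schema.

```python
import pandas as pd

# Hypothetical tracker extract: one row per service occurrence attended
tracker = pd.DataFrame({
    "grantee_id":   ["G01", "G01", "G02", "G02", "G03", "G01"],
    "service_type": ["cohort convening", "one-on-one coaching",
                     "topical workshop", "cohort convening",
                     "one-on-one coaching", "topical workshop"],
    "topic":        ["strategic planning", "budgeting", "DEIA",
                     "community engagement", "grant management", "DEIA"],
    "hours":        [2.0, 1.0, 1.5, 2.0, 1.0, 1.5],
})

# Dosage in hours per grantee, broken out by service type (supports RQ 6.1)
dosage = tracker.pivot_table(index="grantee_id", columns="service_type",
                             values="hours", aggfunc="sum", fill_value=0)
print(dosage)

# Total hours delivered by topic across all grantees
print(tracker.groupby("topic")["hours"].sum())
```

Summaries like these can feed the near real-time monitoring feedback described earlier.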
Data collection guidance for ArtsHERE partners (e.g., documentation, reports)
During the instrument design and development process, instructions for ArtsHERE partners
responsible for each data source will be outlined. This will ensure information is gathered with fidelity
to its intended purpose. The second contractor will join existing meetings scheduled by NEA, the
Cooperator, RAOs, and/or committees to provide a training or walk-through of data collection
processes.
Component 2. Evaluation Plan
Evaluation Plan Purpose and Scope
The overarching goal for Component 2. Evaluation Plan is to describe implementation of and
participation in ArtsHERE and to explore feedback for NEA, RAOs, and participating grantee organizations that generates lessons learned for public arts funders and informs national strategy.
This descriptive work is intended to support NEA’s understanding of how ArtsHERE pillars (i.e.,
investment, learning, and evaluation) relate to organizational practices and capacities at the funder
and grant recipient levels over the course of the pilot initiative. It will also provide descriptive
information about the organizations that received an ArtsHERE grant, to inform how future initiatives
can continue to reflect the commitment to advancing equity in arts access and funding. It may also
be used to inform future evaluation efforts.
The evaluation component assesses the outcomes identified in the logic model while iteratively
adapting to real-time implementation learning monitored and tracked through the Monitoring
(Component 1) and Learning (Component 3) Plans.
Evaluation Questions
The evaluation component will draw on data and learning from the monitoring and learning plans to
comprehensively inform aspects relevant to all research domains. The evaluation plan questions
explore and describe more in-depth aspects within the following domain areas:
• Domain 1: Organizational characteristics of applicants and grantees
• Domain 2: Description of communities engaged by grantees
• Domain 3: Grantee programs
• Domain 4: Organizational capacities of grantees
• Domain 5: Grantee connections
• Domain 6: Grantee learning
• Domain 7: Grantee funding
• Domain 8: Lessons learned
Evaluation plan questions encompass several overarching MEL plan research questions previously
presented. Exhibit 5 indicates key overarching research questions and priority subquestions and
topics to be explored through the evaluation component. Additional subquestions may be
determined with partners through the monitoring and learning plan implementation and feedback.
Exhibit 5. Primary Research Questions for the Evaluation Plan
Research Questions (RQ)
RQ 1.2. What are the characteristics at time of application of organizations that apply for and
those that receive grants? How do they compare by key descriptive characteristics?
• How many grantees were first-time NEA/RAO grant recipients in total and by state and region?
• What were the organizing structures and operations of applicants and grantees?
• How were applicant and awardee organizations staffed at time of application?
RQ 2.1. What are the key descriptive characteristics of communities engaged by grantees?
• Who was/were the priority underserved community/communities engaged at time of application?
• What were the characteristics of those engaged by grantees at the end of the award?
• Which factors limited the opportunities for underserved groups/communities to benefit from arts programming?
• How did grantees engage underserved groups/populations throughout ArtsHERE?
• What were grantee perspectives, reflections, and experiences of community engagement at the end of the grant?
RQ 3.1. How have grantees engaged underserved communities prior to their grants?
• What services or programs did grantees provide at the time of grant?
• What were the primary "sectors" and "disciplines" of focus for funded organizational programming at time of grant?
RQ 3.2. How do capacity building efforts provided through learning opportunities support grantee engagement with underserved communities during the grant? What works well? What challenges or barriers do grantees experience?
• What strategies were used to engage underserved communities?
• What were grantee accomplishments in engaging communities?
• What worked well for grantees in engaging underserved communities prior to and during ArtsHERE?
• What were challenges faced in engaging underserved groups/communities?
RQ 3.3. In what ways do grantees demonstrate commitment to equity in meeting the needs/interests of their communities?
• How do grantees define equity?
• How did grantee organizations demonstrate commitment to equity within their practices and programming at time of application?
RQ 3.4. What are organizations doing to integrate arts/culture into the programming with their communities? How does this vary across NEA-defined disciplines?
• How did grantees integrate arts and cultural activities into their programming at time of application?
• Why did organizations outside the arts and cultural sector use arts and cultural programming/strategies to engage their communities?
RQ 3.5. What other priorities and/or programs are addressed through the grants?
• What were the main reasons why grantees applied for the grants?
• What were the challenges to implementing the ArtsHERE project as planned?
RQ 4.1. What are grantee organizational capacities prior to ArtsHERE?
• What were the staffing capacities of organizations at time of award?
• What grantee goals were relevant to capacities/capacity needs at time of award?
RQ 4.2. What do grantees view as community needs/interests that they meet or address through their ArtsHERE capacity building project?
• What was the importance of the project to grantees and their communities at time of application?
• What were grantee perspectives on the community needs/interests that their program helped to address at the end of award?
• What did grantees identify as additional or remaining gaps in program and/or community interests and needs at the conclusion of funding?
RQ 4.3. What changes or developments, whether positive or negative, can be attributed to ArtsHERE in terms of organizational or program growth?
• How do grantees perceive their organizational capacity prior to ArtsHERE and by the end of award?
• How does staffing relate to organizations' implementation of their ArtsHERE projects?
• What learnings and/or practices implemented under ArtsHERE do grantees plan to sustain after the grant?
• How, if at all, has ArtsHERE (including participation in a learning cohort) strengthened and/or supported progress toward grantees' organizational missions and goals?
• Do ArtsHERE grantees show early indicators toward long-term or system-level outcomes by the end of the grant project? If so, in which indicators?
RQ 5.3. How, if at all, does ArtsHERE support grantees in connecting with their communities, within a broader arts ecosystem, with other grantees, and with public funders, including other RAOs and the NEA?
• What were the ways in which ArtsHERE supported grantees in fostering/strengthening connections with communities, within a broader arts ecosystem, with other grantees, and with public funders, including other RAOs and the NEA?
• How could this or other future funding opportunities foster connections with communities, within a broader arts ecosystem, with other grantees, and with public funders, including other RAOs and the NEA?
RQ 6.2. How do grantees experience participation in learning opportunities?
• How did grantees perceive the learning opportunities provided in terms of the following: meeting grantees' needs, being engaging, responsiveness, effectiveness, and overall quality?
• What were the additional areas of need relevant to learning opportunities that were identified by grantees?
• What did grantees perceive as most useful in supporting their capacity building?
RQ 7.1. How, if at all, did not requiring a match benefit grantees?
• What were grantee reflections on participation in the investment pillar at the end of the grant?
• In what ways did the grant support grantee organizational priorities and programs?
RQ 7.2. In what ways, if any, did the funding support grantee priorities and programs?
• What were grantee perceptions of how funding supported priorities/programs?
RQ 8.3. What opportunities were available to provide input, feedback, and overall thoughts regarding the development of ArtsHERE?
• What were grantee experiences with engaging in the feedback process throughout ArtsHERE?
• What were the planning committee member experiences with engaging in the feedback process?
Evaluation Methods
The evaluation will collect data for the 2024–2026 ArtsHERE pilot initiative. All data collection,
analysis, and reporting described in this section is the primary responsibility of the second
contractor. The evaluation will use a mixed-methods design that integrates quantitative and
qualitative data and allows flexibility to work through challenges. The second contractor will share
analyses with the first contractor, ArtsHERE planning group, and TWG and will provide opportunities
(including sensemaking sessions) to inform interpretation of the analyses. A sensemaking session,
also known as a results briefing, provides an opportunity for preliminary findings to be shared with
stakeholders for the purposes of developing a shared understanding of the findings. This facilitated
reflection assists the evaluator with translating findings into knowledge to inform program
improvement and reporting. Analyses will be centered on describing characteristics of organizations
that apply and receive funding, knowledge regarding communities being engaged by grantees,
grantee views regarding community needs/interests, learning about barriers faced by grantees,
organizational priorities/programs, and how funding contributed to organizational or program growth.
Evaluation methods will include surveys, review of existing data (including reports submitted by
grantees), and interviews. Primary data sources are described below.
• Grantee Baseline Survey will be sent to all grantees immediately following notice of award to better understand grantee capacity needs.
• Document review of secondary data includes grantee applications, annual progress reports, and final descriptive reports (including GEO forms) as well as program documentation developed by the Cooperator, committees, and learning opportunities provider (e.g., program descriptions, websites).
• Interviews with up to 15 grantees (up to 4 individuals per grantee team) will form the basis for the case studies (described below).
• Grantee Learning Services Quarterly Survey tracks grantee feedback and perceptions of participation in learning opportunities activities.
• Facilitated group discussion with ArtsHERE planners (including 10–15 representatives from federal and RAO ArtsHERE program planners and learning opportunities providers) provides reflections on lessons learned.
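Returning to the descriptive analyses noted above (characteristics of organizations that apply for and receive funding), below is a minimal sketch in Python of a frequency tabulation over a survey export; the file name and field names are hypothetical placeholders, not the actual instrument layout.

    import csv
    from collections import Counter

    def describe(path, field):
        # Frequency distribution of one organizational characteristic
        # (e.g., state, discipline) across baseline survey responses.
        with open(path, newline="", encoding="utf-8") as f:
            return Counter(row[field] for row in csv.DictReader(f))

    # Example usage: distribution of responding organizations by state.
    print(describe("baseline_survey.csv", "state").most_common())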
Case Studies
Case studies are a data source designed to help understand how organizational characteristics
contribute to the value of arts programs for their communities through a cultural, social, and
economic lens. NEA anticipates the grantee interview data collection effort will include 15 group
interviews, conducted primarily virtually (via phone or videoconferencing), with a maximum of 4
individuals included in each interview. Grantees will select who from their organization should
participate in the interview. Virtual interviews shall be conducted using consent procedures and a
secure infrastructure to conduct and audio-record interviews. Interviews will last between 60 and 90
minutes. They will be audio-recorded directly using the videoconferencing platform and
professionally transcribed. The second contractor should anticipate the need for brief follow-up and
member checking with up to five grantees. This process will ensure trustworthiness and credibility in
the second contractor's analysis and interpretations of the interview notes.
In addition to interviews, which will be conducted with up to 4 individuals per grantee organization,
case studies will also include document review as a data source to obtain additional information on
the organization, interactions with their communities, and their arts ecosystem.
The design process will include integration of qualitative data collection methods (e.g., partner
interviews, evaluator reports) that will be adjusted with feedback from the TWG and NEA. The data
will be analyzed using a blend of content analysis and theme identification. Content analysis is a
flexible method for analyzing text data from documents and transcripts generated from the
organizational site interviews. Coding will be both deductive (a priori) and inductive (emergent). The
analysis will be guided by the research questions and the categories of questions in the interview
protocols, but also allow for new categories to be identified in the data. Theme identification
continues the process by using axial coding to identify and classify the data categories and themes
that emerge from the codes. The results will generate case studies to capture and facilitate best
practices for various types of arts and cultural organizations seeking to engage more effectively with
underserved groups/communities.
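To make the deductive/emergent coding approach concrete, below is a minimal sketch in Python of a first deductive pass, assuming an a priori codebook of keyword patterns applied to de-identified transcript segments; the codes, patterns, and sample segments are invented for illustration, and emergent codes would be added to the codebook as analysts identify them.

    import re
    from collections import Counter

    # Hypothetical a priori codebook: code name -> keyword patterns (illustrative only).
    CODEBOOK = {
        "community_engagement": [r"\bcommunity\b", r"\boutreach\b"],
        "capacity_building": [r"\bcapacity\b", r"\bstaffing\b", r"\btraining\b"],
        "equity": [r"\bequity\b", r"\bunderserved\b"],
    }

    def code_segments(segments):
        # Apply each codebook pattern to each transcript segment; return code frequencies.
        counts = Counter()
        for seg in segments:
            for code, patterns in CODEBOOK.items():
                if any(re.search(p, seg, re.IGNORECASE) for p in patterns):
                    counts[code] += 1
        return counts

    segments = [
        "Our outreach focused on rural residents near the venue.",
        "The grant let us add staffing for program delivery.",
    ]
    print(code_segments(segments))  # Counter({'community_engagement': 1, 'capacity_building': 1})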
Recruitment, Consenting, and Communication
The second contractor will consult with the NEA and the TWG to determine the sample and
appropriate recruitment and data collection protocols. Through this process, the contractor will
develop criteria for obtaining diverse viewpoints (e.g., positive and negative experiences in the
program; using and not using the program, including organizational supports). These discussions will
describe the intended respondents (including inclusion criteria), projected sample size, and
recruitment and contact strategies. NEA will also include email survey invitation templates and
follow-up processes for maximizing response rates. Through the consent process and ongoing
communication, the goals of data collection will be transparently described to participants. The
second contractor will develop a plan for collecting and storing sensitive information that ensures
privacy both for staff and participants. To protect grantee and participant privacy in the
dissemination phase, the second contractor will pay careful attention to how data are stored,
analyzed, and shared. Evaluation staff will schedule interviews, obtain consent from participants, conduct interviews,
manage transcription, and process data (e.g., quality check, clean transcripts).
Protection of Human Subjects
The first contractor will prepare the two required Paperwork Reduction Act clearance packages for
full OMB approval (for the monitoring and evaluation plans) with sufficient time to accommodate the
60-day and 30-day clearance processes. They will create a draft of supporting statements A and B,
as well as all necessary attachments. Potential attachments include email language and scripts for
all recruitment. The OMB package attachments will also include the IRB exemption/approval letter,
consent forms, guidance, and all data collection instruments. A draft of the package will be submitted
to the NEA for review and approval before data collection is expected to begin. After all necessary
revisions have been made based on NEA recommendations, NEA will submit the final package for
OMB approval. The first contractor will address any further revisions to the plan or instruments as
needed until approval is received from OMB. Once the OMB packages are submitted, the contractor
will catalog comments received by the NEA during the 60-day notice period and write a response to
each remark, describing either changes made or rationale for decisions. If comments are received
by OMB during the 30-day notice period, the contractor will write a response to each remark,
describing either changes made or rationale for decisions. NEA will prepare and submit the generic
clearance package for the ArtsHERE planning group discussion topics.
IRB approval must be obtained prior to OMB review to ensure the rights of human subjects are
protected throughout the evaluation. Although NEA anticipates an exempt status due to the minimal
risk posed by the evaluation questions and activities, the first contractor will submit the initial data
collection protocol and tools to WCG IRB to ensure human subjects protection is adequate.
Informed Consents to Participate in the Evaluation
The ArtsHERE descriptive process evaluation is not highly sensitive and presents no more than
minimal risk of harm to subjects. The proposed research questions and methods involve no
procedures for which written consent is normally required outside of the research context. None of
the evaluation procedures or questions should cause great discomfort, and grantees may decline to
participate in any of the voluntary evaluation activities. Upon applying for the grant, applicants will
receive an evaluation disclaimer describing how application data and all required data collection (i.e.,
the Grantee Baseline Survey, Grantee Learning Services Quarterly Survey, Annual Progress Report,
Final Descriptive Report) will be used for MEL plan purposes. Following notice of the grant award,
grantees will be asked to consent to participation in optional activities, including interviews. Grantees
may withdraw from these additional evaluation activities at any time without jeopardizing their grant
or any other current grants from the NEA or the cooperator, or pending or future grant applications to
the NEA or the cooperator.
Evaluation consent language will be presented at the beginning of all primary data collection
instruments to remind participants about the nature of the evaluation, ensure transparency on how
the data collected will be used, and will include language that explains why decisions related to
participation will not affect grantee eligibility or competitiveness for future NEA or Cooperator grants.
For voluntary evaluation interviews, each interview will begin with a verbal description of study risks
and benefits and confirmation of consent. Each interview will be recorded (audio-only), transcribed,
and de-identified prior to analysis. All interview and discussion group participants will be verbally
reminded at the start of the discussion that participation is voluntary and will have no impact on their
grant funding, employment, or involvement in the ArtsHERE work. Any questions will be answered at
this time. It will be emphasized that there are no costs associated with participation in the study, and
individuals who participate in the optional grantee interview will receive a token of appreciation of
$75 per hour.13 Since all data will be collected virtually, virtual consent will be obtained for all
surveys and verbal consent for interviews and discussion groups. Additionally, each grantee team
that completes the optional final grantee survey will receive a token of appreciation of $30, or
another appropriate amount based on the survey length. Evaluation team members will be assigned
as interviewers in a way that optimizes study execution.
______
13 Grantees will be compensated for all time-intensive, voluntary evaluation activities. The NEA has identified the grantee interviews as a specific evaluation activity that will require compensation for grantee time. Up to four individuals will participate in each grantee interview for the case studies, and each individual will be compensated. Given the purpose and objectives of ArtsHERE, utilizing an equity-based approach that supports closing the gap between underrepresented communities while also enhancing consistent engagement of underrepresented populations requires appropriate tokens of appreciation that can offset financial burden caused by structural inequities. OMB has provided guidance to offer $50 to $75 per hour for interviews and focus groups. To address these areas of concern, NEA intends to use a higher stipend of $75/hour to engage stakeholders within the arts community in evaluation activities.
Any subject-identifiable information (including names, contact information, etc.) will not be released
without a participant’s explicit permission and review of the content, to the extent permitted by law.
The evaluator may ask to identify a participant to be able to attribute direct quotes or case studies in
reports, presentations, or other materials. A participant may choose to remain anonymous in any
reports, presentations, or other materials.
Risks, Data Security, and Privacy
Strict procedures will be maintained by both contractors to prevent confidential data from
inadvertently being released. Deidentified data will be stored on a secure server and will be available
only to the evaluation staff through password protection and encryption keys. All discussion group
transcripts containing identifiers (e.g., name, title) will be deidentified, replaced with a unique
identification number, encrypted, and password protected. Only clean, edited data files will be
provided to staff responsible for analyzing the data. Transcripts will not be shared outside of the
evaluation contractor teams. All contractor staff will have received data security and human subjects
research training. Staff are asked to sign a pledge of confidentiality. Security is maintained on the
database by a confidential system of user identifiers and passwords. Data will not be made available
to users external to the study. Identifiable information will be destroyed at the end of the study.
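As a sketch of the deidentification step described above, assuming a known list of participant names per transcript; the names and ID scheme are illustrative, and the resulting name-to-ID crosswalk would be stored separately under encryption and destroyed at the end of the study.

    import secrets

    def deidentify(text, names, crosswalk=None):
        # Replace each known participant name with a stable unique ID; the returned
        # crosswalk must be stored separately (encrypted) and destroyed at study end.
        crosswalk = crosswalk if crosswalk is not None else {}
        for name in names:
            pid = crosswalk.setdefault(name, "P-" + secrets.token_hex(4))
            text = text.replace(name, pid)
        return text, crosswalk

    clean_text, crosswalk = deidentify(
        "Jordan Lee said the cohort meetings helped. Jordan Lee noted timing issues.",
        ["Jordan Lee"],
    )
    print(clean_text)  # e.g., "P-3f9a1c2e said the cohort meetings helped. ..."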
Reporting and Dissemination Strategies
All evaluation reporting and dissemination/feedback loops will be the responsibility of the second
contractor. They will develop a reporting plan which incorporates sensemaking sessions with the
ArtsHERE team and the TWG at minimum, and potentially with other stakeholders. The second
contractor will collaborate with the planning group to develop evaluation products for dissemination,
including case study information, resources, and tools to capture and facilitate best practices for arts
and cultural organizations to engage more effectively with underserved communities. The emphasis on
engaging the planning group in the development of dissemination materials underscores the
commitment to aligning the evaluation efforts with the strategic goals and priorities. Additionally, the
evaluators will engage the TWG in discussions about evaluation to identify best practices in
messaging and product development to ensure findings reach arts and cultural organizations, as
well as other intended audiences.
Component 3. Learning Plan
Learning Plan Purpose and Scope
The purpose of Component 3. Learning Plan is to generate questions that can be used to assess the
assumptions of the theory of change and logic model and to identify knowledge gaps. The questions
are identified in consultation with the planning group through regular meetings, as well as through
quarterly discussions with the TWG. As grantees implement ArtsHERE within their communities,
there are opportunities to better understand how these programs work within and for underserved
groups and communities to support organizational capacity and foster arts engagement. Paired with
monitoring and evaluation data, the learning plan will support documentation of implementation and
effectiveness. Data and reflections gleaned from the learning plan and various components of the
MEL plan outlined above can be used by the NEA to assess progress toward goals, make
midcourse corrections, and/or advocate for replication and scale-up of effective practices and
components.
The learning plan is intended to facilitate the development of, and respond to, learning questions
from the team, inclusive of NEA and the Cooperator, to potentially inform decision making and
improvement. Priority learning areas are aligned with the NEA's core mission to address topics of
importance in the strategic and equity action plans and to ensure learning activities build on each
other and yield useful results.
Learning Plan Questions
While the entirety of ArtsHERE and the MEL plan centers on learning, the primary focus of this
component is on questions which inform the following overarching ArtsHERE research domains:
• Domain 1: Organizational characteristics of applicants and grantees
• Domain 3: Grantee programs
• Domain 4: Organizational capacities of grantees
• Domain 5: Grantee connections
• Domain 6: Grantee learning
• Domain 7: Grantee funding
• Domain 8: Lessons learned
Several overarching research questions are incorporated. Exhibit 6 indicates key overarching
research questions, priority subquestions, and topics to be explored through the learning plan
component.
Because of the developmental focus of the learning plan, learning questions will be generated in
collaboration with and by the partners (including grantees via the TWG) through the ongoing
reflection and analysis of MEL plan data and emergent results. Learning methods and established
feedback loops (described below) will facilitate this iterative and adaptable process.
Exhibit 6. Primary Research Questions for the Learning Plan
Research Questions (RQ)
RQ 1.1. What was the process for determining which organizations receive grants?
• What were the recruitment/promotional practices to encourage applications? Did they differ from prior efforts?
• How did the panel process work?
• How did the panel composition differ from prior panel compositions?
• What was the experience of panelists in the review process?
RQ 3.1. How have grantees engaged underserved communities prior to their grants?
• What services or programs did grantees provide at the time of the grant?
• What did grantees anticipate learning from their projects?
RQ 3.2. How do capacity building efforts provided through learning opportunities support grantee engagement with underserved communities during the grant? What works well? What challenges or barriers do grantees experience?
• What were grantee accomplishments in engaging communities?
• What strategies were used to engage underserved communities?
• What challenges were faced in engaging underserved groups/communities?
• What worked well for grantees in engaging underserved communities prior to and during ArtsHERE?
RQ 3.4. What are organizations doing to integrate arts/culture into the programming with their communities? How does this vary across NEA-defined disciplines?
• How did grantees integrate arts and cultural activities into their programming at time of application?
• Why did organizations outside the arts and cultural sector use arts and cultural programming/strategies to engage their communities?
RQ 3.5. What other priorities and/or programs are addressed through the grants?
• What priority topics were identified by grantees in their applications for their capacity building projects?
• What capacity-building activities did grantees apply for? How do grantees see these activities as strengthening their organization's capacity?
• What grantee activities were supported through funding?
• What were the challenges to program implementation?
RQ 4.1. What are grantee organizational capacities prior to ArtsHERE?
• What were the organizational capacities at the time of the grant award?
• What grantee goals were relevant to capacities/capacity needs at time of grant award?
RQ 5.1. What is the role(s) of grantees in their community arts ecosystem?
• What were grantee perspectives on the role(s) of their organizations within their community arts ecosystem?
RQ 5.2. What connections are grantees able to form or strengthen in their communities, within a broader arts ecosystem, with other grantees, and with public funders, including other RAOs and the NEA?
• What were grantee perspectives on their connections with their communities, other grantees, and public funders, including other RAOs and the NEA? How, if at all, do these change throughout the grant?
RQ 5.3. How, if at all, does ArtsHERE support grantees in connecting with their communities, within a broader arts ecosystem, with other grantees, and with public funders, including other RAOs and the NEA?
• What were the ways in which ArtsHERE supported grantees in fostering/strengthening connections with communities, within a broader arts ecosystem, with other grantees, and with public funders, including other RAOs and the NEA?
• How could this or other future funding opportunities foster connections with communities, within a broader arts ecosystem, with other grantees, and with public funders, including other RAOs and the NEA?
RQ 6.1. What did the learning opportunities provision look like under ArtsHERE and who participated in services?
• What did recruitment/onboarding/training look like for learning opportunities providers?
• What were provider experiences of learning opportunities delivery successes?
• What were provider perspectives of areas for growth relevant to providing learning opportunities?
• How responsive were learning opportunity providers to grantee needs?
• What were the characteristics of organizations who most frequently participated in learning opportunities? What were the characteristics of those who least frequently participated in learning opportunities?
RQ 6.2. How do grantees experience participation in learning opportunities?
• How did grantees perceive the learning opportunities provided in terms of the following: meeting grantees' needs, being engaging, responsiveness, effectiveness, and overall quality?
• What were grantee perspectives on areas for improvement for learning opportunities?
• What were the additional areas of need relevant to learning opportunities that were identified by grantees?
• What did grantees perceive as most useful in supporting their capacity building?
RQ 7.1. How, if at all, did not requiring a match benefit grantees?
• What were grantee reflections on participation in the investment pillar at the end of the grant?
RQ 7.2. In what ways, if any, did receiving funding support grantee priorities and programs?
• What were grantee perceptions of how funding supported their priorities/programs?
RQ 8.1. What overall lessons can be shared with funders and the arts ecosystem about the pillars (investment, learning, and evaluation)?
• What are key lessons learned from the ArtsHERE initiative and their implications for specific processes/pillars?
RQ 8.2. What lessons could inform the NEA's own grantmaking processes, but also those of RAO partners?
• What aspects of ArtsHERE have worked well, and what are areas for growth?
• Did ArtsHERE work as originally designed? If not, what modifications were needed to improve the model?
RQ 8.3. What opportunities were available to provide input, feedback, and overall thoughts regarding the development of ArtsHERE?
• What were grantee experiences with engaging in the feedback process throughout ArtsHERE?
• What were planning committee member experiences with engaging in the feedback process?
Learning Methods
The learning plan draws on data collection and sources relevant to the monitoring and evaluation
components and includes the introduction of learning logs, surveys, and direct observation; the
review of project documentation from RAO committees, the learning opportunities provider, and the
Cooperator; and reports submitted by grantees. Specific data sources are described below.
• Grantee Baseline Survey will be sent to all grantees immediately following notice of award to better understand characteristics of organizations funded and their capacity needs.
• Review Panelist Survey will capture panelist perspectives, understanding, and experience with goals for the panel; representation on the panel; process flow; overall perspectives on process and timing; involvement in decision making; how applicants were narrowed down; confidence in selecting organizations with desired characteristics; and final reflections on successes, areas for improvement, etc.
• Group discussion with ArtsHERE communications committee will gather perspectives, understanding, and experience with program recruitment/promotional practices to encourage applications; how they went; lessons learned; and reflections on whether this was different from prior panels.
• Grantee Learning Opportunities Quarterly Survey will track grantee feedback and perceptions of participation in learning opportunities activities.
• Facilitated group discussion with ArtsHERE planners will bring together 10–15 representatives from federal and RAO ArtsHERE program planners and learning opportunities providers to review the logic model and theory of change, as well as provide reflections on lessons learned.
• Learning Opportunities Grantee Feedback Form will track grantee feedback and perceptions of participation in learning opportunities activities, emergent needs, and areas for learning opportunities model improvement.
• Document review of secondary data will include application data from unsuccessful applicants and grantees, annual progress reports, and final descriptive reports (including GEO forms) as well as ArtsHERE program documentation developed by the Cooperator, committees, and learning opportunities provider (e.g., panel training materials, panel information, etc.).
• Evaluator direct observation of project planning and implementation will include direct participation of the first contractor in one panel review meeting with each individual RAO site (n = 6) as well as ongoing participation in ArtsHERE committee cochair meetings.
• Learning logs and strategic learning discussions will document and prompt ongoing reflection on the ArtsHERE planning group's experiences and learning while implementing activities and components.
• Interviews will be conducted with up to 15 grantees (up to 4 individuals per grantee team).
Reporting and Feedback Loops
A developmental approach will be used to address the goals of the learning plan. The second
contractor will share timely data collection and analyses on an ongoing basis through monthly snapshots
and quarterly activity briefs, and the first contractor will engage the NEA, Cooperator, and TWG in
strategic learning discussions and activities during each key phase to foster growth and innovation.
As part of the developmental evaluation approach, facilitation of dialogue will provide opportunities
for feedback loops that will be put in place through ongoing participation in team meetings, learning
logs, and data reporting to articulate and reflect on learnings, as well as more formal monthly,
quarterly, and annual reports. These feedback loops will ensure that timely input is provided on what
is and is not working in the initiative, and guidance on implementation and application of findings is
shared. These practices will integrate diversity, equity, inclusion, and accessibility (DEIA) priorities
and support the development and implementation of actionable next steps for NEA and RAOs.
Key MEL activities that will both inform and be informed by the learning plan include annual revisiting
and updates (as needed) to the theory of change, logic model, and learning agenda to ensure a
nimble and developmental approach is taken to identifying gaps in knowledge and priority learning
questions. The first contractor will participate in planning committee meetings and engage in real-time identification of what is and is not working; document and assess strategic decisions and track
outcomes and impacts through the monitoring and evaluation components; facilitate and document
strategic learning sessions with the team, learning opportunities provider, and the TWG. During Task
Order #1 (spanning the 48-month contract period), the first contractor will prepare three annual
reports and one final report summarizing key findings and lessons learned from the learning
component. The planning group and NEA leadership will be engaged in the review and discussion of
reflections from the annual learning reports.
Learning memos will summarize team reflections and learning generated through the learning logs
and evaluator direct observations. Memos will document the planning and implementation and
provide practical guidance. They will be drafted in collaboration with NEA and the Cooperator and
distributed to the team at least quarterly or following key implementation milestones.
The team proposes to facilitate voluntary prompts throughout the grant award period to invite
grantees to share input on various topics to improve learning opportunities and program
implementation. These open-ended prompts would provide ongoing anonymous feedback from
grantees about which topics they may be interested in learning; their vision for their capacity-building
project; meanings associated with "capacity building"; or what they might hope to learn from their peers.
The prompts would be facilitated using, e.g., Mentimeter wordcloud or thought bubble features, and
any questions asked through Mentimeter for the purpose of improving ArtsHERE implementation
would be submitted for generic PRA clearance. Mentimeter is an interactive presentation and polling
tool that enables real-time audience engagement and feedback gathering. Grantees will receive a
link to these prompts and can respond anonymously. The first contractor will download the data
periodically to assess trends/shifts in thinking over time. Results will be shared with NEA/RAOs on
an ongoing basis. While this is mostly for learning purposes, it can also inform the formation of
evaluation questions about lessons learned.
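To illustrate the periodic download-and-trend step described above, below is a minimal sketch in Python, assuming each download is saved as a dated CSV with a free-text "response" column; the directory layout, file naming, and column name are assumptions, as Mentimeter's actual export format may differ.

    import csv
    from collections import Counter
    from pathlib import Path

    def term_trends(export_dir):
        # Tally response terms per dated export so shifts in grantee thinking
        # can be compared across time periods.
        trends = {}
        for path in sorted(Path(export_dir).glob("prompts_*.csv")):  # e.g., prompts_2025-01.csv
            counts = Counter()
            with open(path, newline="", encoding="utf-8") as f:
                for row in csv.DictReader(f):
                    counts.update(w.strip(".,!?").lower() for w in row["response"].split())
            trends[path.stem] = counts.most_common(10)
        return trends

    # Example usage: show the top terms for each download period.
    for period, top_terms in term_trends("mentimeter_exports").items():
        print(period, top_terms)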
Comprehensively, these activities will provide NEA with information about how learnings from this
initiative might be applied to other NEA programs and initiatives. The schedule of proposed learning
activities is included in the project timeline (see Appendix B. ArtsHERE Detailed Project Timeline).
Stakeholder Engagement Plan
Engaging stakeholders in the evaluation process is fundamental to achieving equitable evaluations.
It not only enhances the quality and relevance of the evaluation but also promotes a more inclusive
and participatory approach that aligns with principles of equity. Evaluation participants and
stakeholders reflect diverse roles and perspectives across multiple levels, including funders,
program administrators, technical advisory members, and organizational site participants. More
specifically, all key groups, including the NEA, South Arts, five participating RAOs, grant review
panelists, the TWG, and the sample of ArtsHERE funded organizations that consent to participation
in the evaluation project will be engaged. Having a robust approach to engaging collaborative
partners provides an opportunity to listen to participants and integrate their feedback throughout the
evaluation process. This approach will enable the evaluation contractors to engage the arts
community from multiple perspectives (e.g., federal, regional, local engagement).
This plan proposes active engagement of key groups, including grantees, within the implementation
of the evaluation to gain insight into how the project increases organizational capacity for community
organizations that may have been overlooked within prior federal funding opportunities. This
approach to community engagement and culturally responsive evaluation is informed by frameworks
for community-based and equitable evaluation.14,15,16 The evaluation will draw on principles from
these approaches, such as shared leadership and bidirectional learning to guide decision making,
address critical questions, and engage in evaluation in service of and in contribution to equity.
Authentic engagement is intentional work with communities that seeks to lift the voices of
marginalized communities. The stakeholder engagement plan seeks to include perspectives from
interested parties involved in arts programs at federal, state, and local levels by intentionally
engaging with them and integrating their perspective at every stage. Evaluation contractors will
utilize existing partnerships between the NEA and RAOs, as well as the participation of grantees to
connect the contracted evaluator with potential stakeholders who can guide the direction of this
work. The three-pronged authentic engagement approach is described below.
______
14 Naomi C. Z. Andrews et al., "Research and evaluation with community-based projects: Approaches, considerations, and strategies," American Journal of Evaluation 40, no. 4 (April 16, 2019): 548–561, https://doi.org/10.1177/1098214019835821.
15 Abraham Wandersman, "Moving Forward with the Science and Practice of Evaluation Capacity Building (ECB): The Why, How, What, and Outcomes of ECB," American Journal of Evaluation 35, no. 1 (October 16, 2013): 87–89, https://doi.org/10.1177/1098214013503895.
16 Equitable Evaluation Initiative's Equitable Evaluation Framework, available at https://www.equitableeval.org/framework.
• Ongoing engagement of the TWG. This group is comprised of researchers, artists, arts administrators, arts participants, and arts funders to provide input that informs the design, feasibility, and appropriateness of the ArtsHERE initiative and the evaluation. The composition will shift from arts researchers and evaluators in the first cohort to primarily grantees following the first 12-month cycle. The TWG will provide input during all stages of the evaluation, including data collection, analysis, reporting, and product development.
• Intermittent engagement of grantees. This group includes arts representatives, researchers, community members, community partners, and program staff. It will be engaged through evaluation presentations and open-ended prompts during cohort meetings, learning opportunities feedback forms, and evaluation findings webinars.
• Strategic engagement of planning group members (i.e., NEA, South Arts, RAOs, and committee chairs). In addition to holding group discussions with the planners for data collection/evaluation purposes, this group will be engaged strategically at key timepoints to review and discuss MEL plan findings, to establish regular and timely feedback loops, and to share important updates with the evaluation team. Please see the Learning Plan discussion for more details.
Principles of Stakeholder Engagement
The development of a plan for ongoing, intermittent, and strategic stakeholder engagement was
informed by the following general principles:
1. Obtain critical and relevant input from each group of stakeholders.
2. Create structures for engagement to ensure all voices can be heard.
3. Be mindful of burden for stakeholders and avoid unnecessary requests for input (i.e., avoid
soliciting input that cannot be incorporated to influence study elements).
4. Utilize input from stakeholders to inform all phases of the study.
5. Recognize constraints of time and study resources and utilize research staff time efficiently in
gathering stakeholder input.
Exhibit 7 depicts key active engagement strategies that will be used for the MEL plan.
Exhibit 7. Stakeholder Engagement Strategies
(Each entry lists the groups engaged, venue, purpose, and feedback mechanism.)

Planning phase: 12 months (October 2022 through September 2023)
• TWG members: arts researchers, evaluators, and practitioners (ongoing). Venue: quarterly TWG meetings facilitated by the evaluator. Purpose: inform and review the logic model and theory of change, MEL plan, draft data collection instruments, and evaluation recruitment materials for OMB clearance package #1. Feedback mechanism: verbal feedback during virtual TWG meetings; written feedback between meetings; ad hoc calls/discussions as needed.
• NEA and South Arts (SA) (strategic). Venue: monthly scheduled meetings. Purpose: share information and progress updates; provide input on evaluation from funder and Cooperator. Feedback mechanism: verbal feedback during virtual meetings; written feedback following meetings.
• ArtsHERE planning group: NEA, SA, RAOs, committee chairs, and others (strategic). Venue: virtual group meetings scheduled prior to key evaluation deliverables or milestones. Purpose: provide updates; review drafts; provide input on evaluation processes. Feedback mechanism: verbal feedback during virtual meetings; written feedback following meetings.

Planning and implementation phase: 12 months (October 2023 through September 2024)
• TWG members: arts researchers, evaluators, and practitioners (ongoing). Venue: quarterly TWG meetings. Purpose: inform and review the MEL plan; review and provide feedback on instruments for PRA package #1. Feedback mechanism: verbal feedback during virtual TWG meetings; written feedback between meetings; ad hoc calls/discussions as needed.
• NEA and South Arts (SA) (strategic). Venue: monthly scheduled meetings; cochair meetings as scheduled. Purpose: share information and progress updates on implementation; provide input on evaluation. Feedback mechanism: verbal feedback during virtual meetings; written feedback following meetings.
• ArtsHERE planning group: NEA, SA, RAOs, committee chairs, and others (strategic). Venue: virtual group meetings scheduled prior to key evaluation deliverables or milestones; learning logs. Purpose: provide updates on implementation; review draft data collection instruments and memos; provide input on evaluation processes; provide learning reflections. Feedback mechanism: verbal feedback during virtual meetings; written feedback following meetings and memo review; ad hoc calls as needed; written reflections in learning logs.

Pilot implementation with grantees: 24 months (October 2024 through September 2026)
• TWG (transitions to primarily ArtsHERE grantees; ongoing). Venue: quarterly TWG meetings. Purpose: review and provide input on MEL plan findings; help troubleshoot challenges to data collection; pilot data collection instruments for PRA package #2. Feedback mechanism: verbal feedback during virtual TWG meetings; written feedback between meetings; ad hoc calls/discussions as needed.
• ArtsHERE grantees (intermittent). Venue: quarterly surveys; interviews in year 2; open-ended Mentimeter prompts. Purpose: provide reflections on satisfaction with learning opportunities; share grantee experiences and growth; provide feedback on topics of interest to grantees. Feedback mechanism: survey responses (both open- and closed-ended); verbal feedback during virtual interviews; open-ended Mentimeter prompts.
• ArtsHERE grantees (intermittent). Venue: grantee cohort meetings (will occur periodically, as needed). Purpose: contractors will participate in grantee cohort sessions to share learnings. Feedback mechanism: verbal feedback during meetings; written feedback following meetings.
• ArtsHERE evaluation committee (strategic). Venue: monthly scheduled meetings. Purpose: provide feedback on evaluation implementation. Feedback mechanism: evaluation chair or contractors will provide verbal updates during virtual meetings; written feedback following meetings.
• NEA and SA (strategic). Venue: monthly scheduled meetings. Purpose: share information and progress updates on implementation; provide input on evaluation. Feedback mechanism: verbal feedback during virtual meetings; written feedback following meetings.
• ArtsHERE planning group: NEA, SA, RAOs, committee chairs, and others (strategic). Venue: monthly virtual RAO cochair meetings or via email; learning logs. Purpose: receive evaluation updates; provide learning reflections; engage in sensemaking process as MEL plan findings are shared. Feedback mechanism: evaluator will obtain verbal feedback during virtual meetings; written reflections in learning logs; quarterly email summary of learning opportunities feedback from grantees; ad hoc calls/discussions as needed.

Final analysis and dissemination: 8 months (October 2026 through May 2027)
• TWG members: arts researchers, evaluators, practitioners, and grantees (ongoing). Venue: quarterly TWG dissemination meetings. Purpose: inform and review findings; help identify dissemination channels. Feedback mechanism: verbal feedback during virtual TWG meetings; written feedback between meetings; ad hoc calls/discussions as needed.
• ArtsHERE grantees (intermittent). Venue: final webinars (example: 4 conducted between January and May 2027). Purpose: share preliminary and final findings in a series of webinars for grantees. Feedback mechanism: verbal feedback during virtual meetings; written feedback following meetings.
• NEA and SA (strategic). Venue: monthly scheduled meetings. Purpose: share information and progress updates; obtain input on evaluation analysis and dissemination from funder and Cooperator. Feedback mechanism: verbal feedback during virtual meetings; written feedback following meetings.
• ArtsHERE planning group: NEA, SA, RAOs, committee chairs, and others (strategic). Venue: virtual group meetings scheduled following key evaluation deliverables or milestones. Purpose: receive evaluation updates; review draft final reports and dissemination products; engage in sensemaking process as MEL plan findings are shared. Feedback mechanism: verbal feedback during virtual meetings; written feedback following meetings.