ArtsHERE Supporting Statement B

ArtsHERE Monitoring, Evaluation, and Learning Plan Data Collection: NEA Pilot Equity Initiative

OMB: 3135-0148
The National Endowment for the Arts
ArtsHERE Grant Program Forms
OMB Information Collection Request - New Collection
Justification – Part B Supporting Statement
Last updated: April 19, 2024

Table of Contents
B1. Objectives
B2. Methods and Design
B3. Design of Data Collection Instruments
B4. Collection of Data and Quality Control
B5. Response Rates and Potential Nonresponse Bias
B6. Production of Estimates and Projections
B7. Data Handling and Analysis
B8. Contact Person(s)
Table of Attachments

B1. Objectives

Project Objectives
The purpose of the proposed information collection is to monitor, evaluate, and generate lessons
learned from the activities of the ArtsHERE pilot initiative. Motivating this evaluation project is an
interest in understanding whether and how, by participating in this pilot initiative, subgrantee
organizations have strengthened their engagement with underserved groups/communities that have
rich and dynamic cultural identities. The National Endowment for the Arts (NEA) is also interested in
learning from this initiative how it might support similar field-building initiatives in other areas of its
portfolio.
Generalizability of Results
The developmental, descriptive study is intended to present an internally valid description of the
implementation of ArtsHERE. Results are not intended to promote statistical generalization to other
service populations.
Appropriateness of Study Design and Methods for Planned Uses
The study design for the ArtsHERE pilot initiative will apply a developmental, descriptive approach to
achieve the study objectives described above. Developmental evaluation, pioneered by Michael Quinn
Patton,¹ involves diverse collaboration through inclusive co-design, requiring time for relationship-building, capacity-building, creativity, and consensus-building among partners. Flexibility is essential to
adapt evaluations to the complex and evolving conditions within communities. For this reason, the
ArtsHERE evaluation design will involve more than one information collection request (or amendment to
the first PRA package) to ensure that these principles can be applied to the development of future data
collection instruments.
The initiative’s Monitoring, Evaluation, and Learning (MEL) plan, shown in Attachment E, will use mixed
methods to address the research questions proposed in Supporting Statement A. Multiple data
collection strategies will be used to comprehensively capture quantitative and qualitative data. The
study design is well-suited for evaluating ArtsHERE as it effectively captures developmental processes,
aligning closely with the initiative's pilot stage of implementation and the NEA’s prioritized evaluation
questions and needs. The methods and measures are carefully sequenced to produce learning about the
ArtsHERE pillars, outputs, and outcomes (see ArtsHERE logic model in Supporting Statement A). The MEL
plan adheres to co-design and culturally responsive evaluation frameworks, prioritizing continuous
adaptation and learning. These frameworks are well-suited for assessing pilot programs, focusing on
understanding implementation dynamics, providing timely feedback, and fostering continuous learning.
The NEA will use the information collected to answer the research questions and provide insight into the
overarching study objectives, documenting the implementation of the ArtsHERE pillars and learning
from stakeholder (i.e., grantees, NEA, South Arts, RAOs) experiences throughout the initiative to inform
future considerations for program and evaluation improvements. In addition, findings from this study
will be shared with Technical Work Group (TWG) members and grantees. Beyond learning how ArtsHERE implementation might improve its own activities, the NEA plans to harvest lessons that can be shared with other funders, organizations, and cultural practitioners seeking to do this work.

¹ Michael Quinn Patton, Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use (New York, NY: Guilford, 2011).

The descriptive study will provide data on grantee, service provider, and ArtsHERE planning group
perspectives and is not intended to be an impact evaluation. As noted in Supporting Statement A, this
information is not intended to be used as the principal basis for public policy decisions and is not
expected to meet the threshold of influential or highly influential scientific information.

B2. Methods and Design

Target Population
Information will be collected from (1) the cohort of 95 grantees who are participating in ArtsHERE as
part of their grant activities; (2) approximately 100 review panelists who participated in the second
phase of the application review process; (3) approximately 15 learning opportunities providers
(ArtsHERE technical assistance coaches and facilitators); and (4) up to 15 members of the ArtsHERE
planning group (i.e., NEA, South Arts, and RAO representatives).
Sampling
The findings from this data collection and the larger process study are not intended to be generalizable;
therefore, probability sampling is not a priority.
Required evaluation data collected from grantees (i.e., Grantee Baseline Survey, Grantee Learning
Opportunities Quarterly Survey) will be collected by sending an invitation email to project directors (or
designees) from all 95 ArtsHERE grant recipients. Recipients will receive a link to these web-based
surveys in the invitation. In addition, required grantee reporting forms (i.e., the Annual Progress Report and the Final Descriptive Report) will be used by South Arts for monitoring as well as for evaluation, and will be administered via GO Smart, a cloud-based grant application system.
Voluntary evaluation data collected from review panelists (i.e., Review Panelist Survey) will be collected
by sending an invitation email to all panelists who participated in Phase II of the review panel process.²
Recipients will receive a link to this web-based survey in the invitation.
Required evaluation data collected from learning opportunities providers (i.e., Learning Opportunities
Tracker) will be collected from all facilitators and coaches providing technical assistance to ArtsHERE
grantees, and from staff and consultants directly involved in the delivery of support services to grantees.
Voluntary evaluation data collected from the ArtsHERE planning group (i.e., Learning Logs) will be
collected by sending an invitation email to staff from the NEA, South Arts, and RAOs who are involved in
the planning of the initiative. It is expected that up to 15 individuals will complete the logs at each
administration point. Learning logs will be administered to the planning group on the following
schedule:

• Learning log topic: panel/selection process in May 2024
• Learning log topic: analyses of application data in June 2024
• Learning log topic: Grantee Learning Opportunities Quarterly Survey results every 3 months from January 2025 through January 2026
• Learning log topic: mid-pilot, APR reactions/reflections in December 2025

² ArtsHERE is using a two-phase approach to ensure access and minimize the burden of a lengthy single application for grantees. In Phase I, organizations completed a short Expression of Interest, which was reviewed by at least three panelists serving as screeners. Selected organizations from Phase I were invited to move on to Phase II to complete a full application, which review panelists will examine and discuss in review panel meetings.

B3. Design of Data Collection Instruments

Development of Data Collection Instruments
The data collection instruments were developed based upon the essential data needed to answer the
ArtsHERE research questions. The evaluation contractors engaged staff from NEA, South Arts, Mid-America Arts Alliance (M-AAA), and other RAOs in various iterations of draft instrument review and
worked to address all feedback provided.
All data collection instruments underwent cognitive testing to gauge comprehension, usability, and
overall user experience of the instruments. The cognitive testing process followed a structured approach
to obtain feedback on the surveys, forms, and other data collection instruments that will be used to
gather information from ArtsHERE grantees, application review panelists, service providers, and
members of the ArtsHERE planning group. Participants were asked to complete one or more
instruments of their choosing along with the instrument review form for each instrument, allowing for
real-time feedback on comprehension, usability, and overall user experience to improve the quality and
reliability of the evaluation instruments. Testing was conducted in two parts, one set of instruments
with 6 testers and another set with 4 testers, using two samples of informants in positions similar to
intended respondents (e.g., artists, administrators, researchers, educators, and/or evaluators); no
instrument was tested by more than 6 testers. These tests were instrumental in determining the burden
estimates. Following the tests, the instruments were refined to minimize burden and improve utility.
The revised instruments were subject to review and feedback by key stakeholders, including staff from
NEA, South Arts, M-AAA, and other RAO representatives before they were finalized.
The Review Panelist Survey was developed using existing panel review overview and training
documents as well as through discussions with South Arts and NEA. The purpose of the tool is to
understand the experience of panelists in Phase II of the panel review process and identify areas for
improvement. This voluntary web-based survey will be administered to application review panelists
immediately following completion of the review panel process (September 2024 at the latest). The
survey will consist of open- and closed-ended questions that capture panelists’ demographic
characteristics, experience serving on prior review panels, and perspectives on the panel review process.
A link will be sent to each panelist for completion.
The two required grantee reporting forms (i.e., the Annual Progress Report and the Final Descriptive
Report) were adapted from previously approved and field-tested instruments and ArtsHERE application
questions. The evaluation contractors identified questions from the application materials that would
benefit from follow-up to identify change over time. Additionally, standard descriptive questions from
the NEA’s reporting forms were included to ensure consistent data collection across NEA grant
programs, enabling potential comparisons.

• Annual Progress Reports for all awarded grantees will be reviewed for information on distinct grantee practices (e.g., integration of arts/culture into programming), successes and barriers to engaging underserved communities, and experience of ArtsHERE (e.g., impact of ArtsHERE participation on future activities). The emergent learning from the annual progress reports will inform the learning component of the MEL plan as well as the development of case studies.

• Final Descriptive Reports for all awarded grantees will be reviewed for information on organizational characteristics and any changes made. Topics of interest will include distinct grantee practices (e.g., updated approaches and strategies to enhance programming), successes and barriers to engaging the community, organizational practices (e.g., updated program/services in place, organizational capacity, knowledge gained from project), and overall organizational or program growth that occurred as a result of funding. Additionally, the Geographic Location of Project Activity (or GEO) portion of the final report, which is already cleared through OMB control number 3135-0140, will be used to better understand and track the location of activities and who is likely to benefit from them.

Two instruments were designed to measure experiences with providing and/or receiving technical
assistance through ArtsHERE learning opportunities:
• The Learning Opportunities Tracking Form was developed to capture the supportive services that are provided to grantees, including cohort convenings, one-on-one coaching, and topical expert workshops. As the provider of learning opportunities, M-AAA was closely involved in the development and planning of the tracker. Following each organizational service occurrence (November 2024 through April 2026), this required web-based tracking form will be completed by learning opportunities providers. The form will consist of open- and closed-ended questions that will cover topics including service type, content of service provision, participating partners, engagement experience, facilitators, and challenges.

• The Grantee Learning Opportunities Quarterly Survey was developed to understand grantees' self-assessment of learning opportunities received, including cohort convenings, one-on-one coaching, and topical expert workshops. As the provider of learning opportunities, M-AAA was closely involved in operationalizing key domains assessed in the survey. This required web-based survey will be administered to all grantees quarterly, beginning after month 3 of the grant period (January 2025 through April 2026). It will consist of open- and closed-ended questions that capture grantees' satisfaction with learning opportunities, including engagement, quality, relevance, and effectiveness of cohort-based and one-on-one organizational services, as well as perceptions of how services can be improved.

The Grantee Baseline Survey was developed as a needs assessment for grantees and also will collect
specific grantee background and demographic information to obtain a comprehensive grasp of grantees'
baseline characteristics. This includes organizational strengths, capacities, community connections, NEA
and peer relationships, partner involvement, and capacity-building goals. This required one-time web-based survey will be administered to all grantees upon acceptance of a grant award (approximately
October 2024). One survey response will be submitted by each grantee and should reflect input from
the core group involved in local planning and implementation.
The Learning Logs were developed to capture the reflections of ArtsHERE planners on the processes and
learning at key milestones of the initiative. These voluntary reflection results will also inform group
discussions and MEL plan changes. The learning logs are part of the ArtsHERE learning plan, which is
intended to facilitate the development of, and respond to, learning questions from the team, inclusive
of NEA and South Arts, to potentially inform decision making and improvement. Members of the
ArtsHERE Evaluation Committee provided input on the learning logs during cognitive testing. Each log
will consist of 4 open-ended prompts intended to facilitate ongoing reflection on experiences and
‘emergent learning’ after key program activities/milestones.

B4. Collection of Data and Quality Control

The evaluation contractors will administer all web-based evaluation instruments (i.e., Review Panelist
Survey, Learning Opportunities Tracking Form, Grantee Learning Opportunities Quarterly Survey, and
Grantee Baseline Survey) electronically through a FedRAMP compliant platform. Each respondent will
receive an invitation to participate in the survey via email (Attachment B). Email invitations will be
addressed to each staff person by name and each individual will receive a unique survey link. If required
by IRB, informed consent will be obtained electronically before the respondent proceeds to the survey
questions. Respondents can complete the survey at a time that is convenient for them, including over
multiple sittings, if needed. By using unique survey links, the evaluation contractors can troubleshoot
any challenges that individual respondents may have accessing the survey and schedule email reminders
to only respondents who have not completed the survey. The evaluation contractors will send up to 2
email reminders (Attachment B) to staff who have not started or completed the survey.
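
To illustrate how unique survey links can support targeted follow-up, the sketch below shows one way the reminder logic might work. This is a minimal Python sketch, not the contractors' actual system; the tracking file and its column names (completed, reminders_sent, unique_link) are hypothetical assumptions, while the two-reminder cap comes from the plan above.

    import csv

    REMINDER_LIMIT = 2  # the plan specifies up to 2 email reminders per respondent

    def respondents_needing_reminder(tracking_csv):
        """Return respondents who have not completed the survey and have not
        yet reached the reminder limit. All column names are hypothetical."""
        pending = []
        with open(tracking_csv, newline="") as f:
            for row in csv.DictReader(f):
                completed = row["completed"].strip().lower() == "yes"
                reminders_sent = int(row["reminders_sent"] or 0)
                if not completed and reminders_sent < REMINDER_LIMIT:
                    # each respondent keeps their own unique survey link
                    pending.append((row["name"], row["email"], row["unique_link"]))
        return pending

Each pending respondent would then be queued for a personalized reminder using the Attachment B language and their unique link.
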
The Annual Progress Report and Final Descriptive Report are required grant reporting forms that will be
administered through the GO Smart grants management platform. The evaluation contractor and GO
Smart team will use the GO Smart system to email all grantees at their GO Smart profile addresses,
directing them to the grant portal to complete required reports. Grantees will log in using existing
credentials to access their private account, which they used to submit their Expression of Interest and
application. Grantees will find the Annual Progress Report and Final Descriptive Report linked to their
original application. They can click START to begin or EDIT for forms in progress, and may return to the
portal as needed before the deadline to submit the reports. Once submitted, they will receive an email
confirmation with a copy of their completed report. After submission, reports cannot be modified by the
applicant unless granted additional access by an ArtsHERE administrator or evaluation contractor. All
report responses will be accessible via PDF and CSV spreadsheet to ArtsHERE and GO Smart
administrators as well as the evaluation contractor.
The Learning Logs will be administered through a secure shared workspace that is a FedRAMP compliant
platform. Respondents will receive a link to access the log, which will be accessible to other members of
the ArtsHERE planning group. Responses will be shared anonymously in the workspace. Information
collected in the learning logs is not to be distributed outside of the planning group. Responses provided
through this form will inform follow-up discussions with the planning group.
Data will be monitored for quality and consistency through weekly data quality check reports. These
reports identify duplicates, missing values, validation failures, date inconsistencies, and out-of-range values. Team
members will follow up on any quality issues identified in the reports with the respondent, if
appropriate.
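
As a concrete illustration of these checks, the following Python/pandas sketch flags each issue type named above. It is a hedged example under assumed column names (respondent_id, rating, submitted_on), an assumed 1-5 valid range, and an assumed reference date; the actual reports may be produced differently.

    import pandas as pd

    def weekly_quality_report(df):
        """Flag duplicates, missing values, out-of-range values, and date
        inconsistencies in a survey extract. All field names are hypothetical."""
        issues = {}
        # duplicate submissions: the same respondent ID appearing more than once
        issues["duplicates"] = df[df.duplicated(subset="respondent_id", keep=False)]
        # missing values, counted per column
        issues["missing_counts"] = df.isna().sum()
        # valid-range check on a hypothetical 1-5 rating item
        issues["out_of_range"] = df[~df["rating"].between(1, 5)]
        # date inconsistency: submissions dated before an assumed period start
        submitted = pd.to_datetime(df["submitted_on"], errors="coerce")
        issues["date_inconsistencies"] = df[submitted < pd.Timestamp("2024-10-01")]
        return issues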

B5. Response Rates and Potential Nonresponse Bias

Response Rates
Maximizing response rates is critical to the administration of these data collection efforts. The content
and format of the instruments were developed in close consultation with key stakeholders, and the
grantee reporting forms were informed by previously developed, OMB-approved instruments. Though
these data collection activities are not designed to produce statistically generalizable findings and
participation in the optional evaluation data collection activities is wholly at the respondents’ discretion,
response rates will be collected when applicable and possible for quality improvement purposes. Based
on previous evaluator experiences with similar data collection instruments, the following response rates
are estimated:
• For the review panelist survey, the target response rate is 90%, with an expected response rate of 70%.
• For optional grantee evaluation surveys and forms, the target response rate is 75%, with an expected response rate of 50%.
• For required grant reporting forms, the target response rate is 100%, with an expected response rate of 90%.
• For service providers' learning opportunities tracking forms, the target response rate is 90%, with an expected response rate of 70% (see the illustrative count calculation following this list).
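
Applied to the approximate population sizes given in section B2, the expected rates above imply rough respondent counts, as in the following illustrative Python calculation (the pairing of populations and rates is an assumption for illustration, not an official projection):

    # population sizes from section B2 (approximate)
    populations = {
        "review panelists": 100,
        "grantee evaluation surveys/forms": 95,
        "required grant reporting forms": 95,
        "learning opportunities trackers": 15,
    }
    expected_rates = {
        "review panelists": 0.70,
        "grantee evaluation surveys/forms": 0.50,
        "required grant reporting forms": 0.90,
        "learning opportunities trackers": 0.70,
    }
    for group, n in populations.items():
        # expected completes = population size x expected response rate
        print(f"{group}: roughly {round(n * expected_rates[group])} responses")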

Data collection strategies that emphasize flexibility, privacy, and a respect for the respondent’s time
facilitate timely participation. The following strategies will be implemented to maximize participation in
the data collection:
• Introduction and notification: Strategies to introduce and notify respondents about data collection are used for several instruments. The purpose of each instrument is clearly stated in the introductory text, as is the time estimate for completion of each instrument.

• Timing of data collection: Individualized discussions were held with stakeholders to determine optimal periods for data collection to minimize respondent burden and to facilitate recall.

• Administration: For surveys, reminder emails will be sent (per discussion above) to promote participation and a high response rate.

• Alternate response methods: Respondents will be given the option to use an alternate method for responding to surveys or interviews, such as submitting a paper or PDF version with written responses to questions, or by submitting responses over the phone, if this method helps to increase participation.

• Assurances of data privacy: Respondents to all surveys and interviews will be assured that reported data are aggregated and not attributable to individuals or specific grantee organizations. The following text is included in the introductory text of all grantee evaluation instruments:
“These data will be made available to the program Evaluator and will not be shared with the
NEA, South Arts, and RAOs except as described below. Information collected for evaluation
purposes, including individual information deemed sensitive in nature, is considered confidential
and will remain anonymous and private to the extent permitted by law. When results of the ArtsHERE evaluation are shared with the public via reports, presentations, and other materials,
these results will only be shared in aggregate form (percentages, means, summaries) to protect
the identity of participants. Any subject-identifiable information (including names, contact
information, etc.) will not be released without a participant’s explicit permission. The Evaluator
may ask to identify a participant to attribute direct quotes or case studies to them in reports,
presentations, or other materials, and the participant may choose to remain anonymous.
“Reporting on grant activities, including annual progress and final reports, and completing forms
or surveys intended to collect information or feedback that can inform ArtsHERE services is
required of all grantees. Your responses in this survey will not impact your current or future
awards from the NEA or its partners. You will not receive any compensation for responding to
the survey. You may decline to answer any question you wish. Under the Paperwork Reduction
Act of 1995, no persons are required to respond to a collection of information unless such
collection displays a valid Office of Management and Budget (OMB) control number. The OMB
control number for this survey is OMB No. 3135-XXXX, which expires XX/XX/XXX.”

Nonresponse
As participants will not be randomly sampled and findings are not intended to be representative, nonresponse bias will not be calculated. The evaluation contractor will, however, track refusal rates and refusal demographics to gain an understanding of potential patterns in data collection participation and
refusal. For some data collections, respondent demographics (i.e., non-identifiable grantee organization
descriptors) will be documented and reported in written materials associated with the data collection.
B6. Production of Estimates and Projections
The data will not be used to generate population estimates, either for internal use or dissemination.
B7. Data Handling and Analysis
Data Handling
The evaluation contractors will be responsible for collection, storage, and maintenance of the data.
Exceptions include data collected in GO Smart (i.e., the Annual Progress Report and Final Descriptive
Report); those data will be securely transferred to the evaluation contractors for handling and analysis.
All sensitive and personally identifiable information will be stored and maintained in accordance with
NEA requirements; the evaluation contractors have capabilities for the safe storage of sensitive
information meeting federal guidelines.
Once the data have been received, the evaluation contractors will utilize statistical software, such as
SAS, to process and clean the data. This involves renaming variables, converting character variables to
numeric, cleaning dates, and cleaning data entry errors. Next, the evaluation contractors will remove
duplicates, recode missing values, create clean variables, assign labels, and recode write-ins. The
contractors will create any needed analysis variables. The qualitative data from open-ended fields in
surveys will be retained verbatim in analysis files (if answers are collected by hard copy or by phone,
responses will be entered verbatim into the analysis files by an evaluator).
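
Although the plan names SAS, a short pandas sketch can illustrate the same cleaning steps (renaming, type conversion, date cleaning, de-duplication, and recoding). Every variable name and code value below is a hypothetical placeholder, not drawn from the actual instruments:

    import pandas as pd

    def clean_extract(raw):
        """Illustrative cleaning pipeline; all names are placeholders."""
        df = raw.copy()
        # rename variables to analysis-friendly names
        df = df.rename(columns={"Q10_budget": "org_budget"})
        # convert character variables to numeric; bad entries become missing
        df["org_budget"] = pd.to_numeric(df["org_budget"], errors="coerce")
        # clean dates into a single consistent type
        df["submitted_on"] = pd.to_datetime(df["submitted_on"], errors="coerce")
        # remove duplicates, keeping each respondent's most recent submission
        df = df.sort_values("submitted_on").drop_duplicates(
            subset="respondent_id", keep="last")
        # recode a "prefer not to answer" sentinel to missing
        df["org_budget"] = df["org_budget"].replace(-9, pd.NA)
        return df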

Data Analysis
The primary and secondary data collected will be analyzed using both quantitative and qualitative
methods. Examination of data from a variety of sources will provide a cross-check on the different data
collection activities and may point to issues to be further explored in subsequent data collection
activities or analyses.
• Qualitative Data. Standard qualitative procedures will be used to analyze and summarize information from the grantees and federal and RAO stakeholders. Qualitative data analysis software will be used to organize, code, triangulate, and identify themes. In preparation for qualitative analysis, evaluators will use standardized templates to organize and document the information abstracted from data sources. Qualitative data will be integrated with quantitative data and analyzed together when practicable. This full integration will facilitate data triangulation. Qualitative analysis of secondary data will be more targeted, as it will draw from specific variables within each identified data source (e.g., qualitative data will be pulled directly from applications, annual progress reports, and final descriptive reports to answer research questions). The data will be entered into the standardized templates and will be systematically reviewed and categorized according to the pre-established indicators.

• Quantitative Data. For secondary data sources, such as Cooperator program data, the activities conducted by the planning group, grantees, and learning opportunities providers will be summarized by type and frequency. For quantitative data generated from web-based surveys such as the Grantee Baseline Survey and Grantee Learning Opportunities Quarterly Survey, frequency distributions will be calculated to summarize trends and patterns across survey items and to examine variability in the data. The evaluation contractors will produce descriptive statistics to summarize variances and means for relevant quantitative items and groups of items. For instance, survey items that rate each grantee's self-perceived level of engagement in learning opportunities activities will be tabulated as means and percentages (illustrated in the sketch following this list). The survey data will be examined across all grantees participating in the evaluation, as well as by key descriptive characteristics (e.g., organization budget size, organization or program activity location, new grantees, disciplines) to learn more about grantee perceptions and experiences.
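
As an illustration of the tabulations described above, the following pandas sketch computes a frequency distribution and subgroup means and variances; the data, item, and grouping variable are invented for the example:

    import pandas as pd

    # invented example data standing in for cleaned survey responses
    survey = pd.DataFrame({
        "budget_size": ["small", "small", "medium", "large", "large"],
        "engagement": [4, 5, 3, 4, 2],  # hypothetical 1-5 engagement rating
    })

    # frequency distribution across a survey item, as percentages
    print(survey["engagement"].value_counts(normalize=True).mul(100))

    # means and variances overall and by a key descriptive characteristic
    print(survey["engagement"].agg(["mean", "var"]))
    print(survey.groupby("budget_size")["engagement"].agg(["mean", "var"]))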

Data Use
Evaluators do not intend to create a public-use data file based on the information collected. Information
collected may be aggregated and incorporated into documents or presentations that are made public
such as through conference presentations, websites, or social media. The following are some examples
of ways in which the evaluation contractor may share information resulting from these data collections:
learning opportunity/technical assistance plans, brief memos (to the ArtsHERE planning group, TWG,
grantees), presentations, infographics, project-specific reports, or other documents relevant to
stakeholders such as NEA leadership and staff. In sharing findings, evaluators will describe the project
methods and limitations with regard to generalizability and use as a basis for policy.
B8. Contact Person(s)
James Bell Associates is conducting this information collection and developed the plans for data
collection in collaboration with the NEA under Task Order Call # 59310522F0015. For questions about how data will be collected and analyzed, please contact Connie Park, [email protected].

Table of Attachments
Attachment A: Instruments
• Instrument 1: Review Panelist Survey
• Instrument 2: Grantee Baseline Survey
• Instrument 3: Annual Progress Report
• Instrument 4: Grantee Learning Opportunities Quarterly Survey
• Instrument 5: Learning Opportunities Tracker
• Instrument 6: Learning Logs
• Instrument 7: Final Descriptive Report

Attachment B: Email Invitation and Reminder Language
• Template 1: Review Panelist Survey Email Template
• Template 2: Grantee Baseline Survey Email Template
• Template 3: Grantee Learning Opportunities Quarterly Survey Email Template
• Template 4: Learning Opportunities Tracker Email Template
• Template 5: Learning Logs Email Template

Attachment C: Cognitive Testing Report
Attachment D: IRB Determination Letter
Attachment E: MEL Plan
