Attachment C: Cognitive Testing Report

ArtsHERE Monitoring, Evaluation, and Learning Plan Data Collection: NEA Pilot Equity Initiative

OMB: 3135-0148

REPORT | March 2024

ArtsHERE Cognitive Testing Report
James Bell Associates


Table of Contents
Overview
Review Panelist Survey
Grantee Baseline Survey
Annual Progress Report
Learning Opportunities Tracker
Grantee Learning Opportunities Quarterly Survey
Learning Logs
Final Descriptive Report
Appendices
A. Cognitive Testing Email(s)
B. Pilot Review Questions


Overview
The ArtsHERE Cognitive Testing Report, prepared by James Bell Associates, documents the
examination and assessment of seven data collection and reporting instruments. The cognitive
testing process was designed to gauge comprehension, usability, and overall user experience of the
instruments. The purpose of this report is to provide insightful feedback aimed at enhancing the
quality and reliability of these assessment tools. Through rigorous cognitive testing, areas of strength
and areas for improvement are illuminated, ultimately contributing to the refinement of these
instruments for more effective evaluation practices.

Testing Processes
The cognitive testing process followed a structured approach to obtain feedback on the surveys,
forms, and other data collection instruments that will be used to gather information from ArtsHERE
grantees,¹ application review panelists, service providers, and members of the ArtsHERE planning
group (i.e., the NEA, South Arts, Regional Arts Organizations, and evaluation contractors).
Participants were asked to complete one or more instruments of their choosing along with the
instrument review form for each instrument, allowing for real-time feedback on comprehension,
usability, and overall user experience to improve the quality and reliability of the evaluation
instruments.
The cognitive testing process for the data collection instruments of the ArtsHERE Pilot involved five
professionals from the arts and cultural sector who were recruited from the ArtsHERE Technical
Working Group and volunteered to participate. This report provides a comprehensive overview of the
feedback received from the participants during the testing phase.
The participants in the cognitive testing process were selected based on their expertise and
experience within the arts and cultural sector. Their diverse backgrounds encompassed various
roles such as artists, administrators, researchers, educators, and evaluators, ensuring a well-rounded perspective on the data collection instruments.
The following instruments were tested:
1. Review Panelist Survey
2. Grantee Baseline Survey
3. Annual Progress Report
4. Grantee Learning Opportunities Quarterly Survey
5. Learning Opportunities Tracker
6. Learning Logs
7. Final Descriptive Report

______
¹ Subgrants will be awarded to a range of organizations, such as those that center arts and cultural activities within their communities; work at the intersection of the arts and other domains (such as community development, health/well-being, or economic development); and are diverse in terms of geography, scale of operations and programming, and budget size. Throughout this document, ArtsHERE subgrants are referred to as grants, and ArtsHERE subgrant recipients are referred to as grantees.
The feedback received from the cognitive testing process provides valuable insights into the
strengths and areas for improvement of the data collection instruments for the ArtsHERE Pilot. By
incorporating the suggestions and recommendations provided by the participants, the instruments
can be refined to better serve the needs of stakeholders within the arts and cultural sector.

Reviewers
This report provides a comprehensive overview of the feedback received from ArtsHERE planners
(the NEA, South Arts, M-AAA) and volunteer reviewers during the cognitive testing phase. The
participants in the cognitive testing process were selected based on their expertise and experience
within the arts and cultural sector. Their diverse backgrounds encompassed various roles such as
artists, administrators, researchers, educators, and evaluators, ensuring the perspectives of
grantees and the organizational service/learning opportunity providers are reflected in the testing of
the data collection instruments.
The cognitive testing process for the ArtsHERE data collection instruments was conducted in two
parts; no instrument was tested by more than six testers. For the cognitive testing:
1. The instruments that will be administered to grantees or panelists (i.e., all instruments except
the learning logs) were tested among participants selected for their expertise and experience
in the arts and cultural sector. JBA extended invitations to select members of the Technical
Working Group (TWG) and external arts researchers/evaluators to participate in the review
of the data collection forms. A final sample of 6 individuals participated in the cognitive
testing of these instruments. Coming from diverse regions, such as the Mid-Atlantic, Pacific
Northwest, Great Lakes, and Southwestern areas of the United States, the reviewers brought
a range of perspectives to the ArtsHERE reviewing process. These experts, including artists,
administrators, educators, actors, and authors, applied their collective expertise in
evaluation, research, academic pursuits, practice, and arts administration to conduct a
comprehensive review. This sample ensured the thorough capture of varied viewpoints from
potential grantees and recipients of organizational service and learning opportunities.
2. Learning logs were tested by reviewers recruited from the ArtsHERE Evaluation Committee,
as the intended respondent for the logs would be the ArtsHERE planning group. This group
consists of staff from NEA, the Cooperator, RAOs, evaluation partners, and learning opportunities
service providers. A final sample of 4 RAO staff participated in the cognitive testing of the logs.


Review Panelist Survey
The Review Panelist Survey is a voluntary web-based survey administered to all ArtsHERE
application review panelists who participated in the second stage of the review process. This survey
will be sent at one timepoint following completion of the review panel process (September 2024 at
the latest).
Number of questions: 10 numbered questions

How long did it take to complete the form?
• Time range: 6-20 minutes

Recommendations: Adjust the time estimate in the introductory paragraph to reflect the time
estimates provided by reviewers. The revised sentence will read, “We realize how limited your time is;
the survey should take an average of 10 to 20 minutes to complete.”

Is the purpose of this data collection instrument/form/protocol clear?
• It is unclear why the respondent should take the survey when their feedback may inform the implementation of the program.
• Should include a clear statement of the purpose of the survey. Absent this, the purpose is not clear.
• Underline expected time.

Recommendations:
• Revise the language to state, “Additionally, the evaluation will inform the implementation of the ArtsHERE program.”
• Modify the text in the second introductory paragraph to include a sentence that states, “The purpose of the survey is to hear from you about your perspectives, understanding, and experience with the panel review process.”
• The estimated time to complete the survey will be underlined.

Were there any terms or phrases that are confusing or unclear? Consider any
cultural or language aspects that you think might affect the interpretation of
certain questions.
• No.

Recommendations: No change needed.


Are there any questions that you are not confident you could answer as a
respondent? If so, please specify which ones and why they were difficult to
answer.
• Question 1
  o Explain why the survey is requesting the respondent’s name.

Recommendations: The request for respondent names aims to align panelists' responses with
information in the grant panelist intake form database, which contains additional demographic details
for each panelist. If evaluators have access to a panelist roster with email addresses for survey
distribution, the name field can be omitted. Responses will then be linked to the email addresses to
which the survey links are sent.
• Question 8a
  o This question may be too general to capture specific data to inform areas of improvement. The follow-up question for neutral to very unsatisfied responses may capture data to improve the process.

Recommendations: No change. The table includes specific components and activities within the
panel review process that NEA and RAOs would like feedback on. Question 8a will prompt
respondents to explain neutral to very unsatisfied ratings.
• Question 10
  o Provide suggestions for improvement.

Recommendations: Question 10 will be re-worded to read, “Please provide suggestions for
improving the panel review process.”

Did you feel that the response options provided were appropriate and covered the
range of possible answers?
• Question 5
  o Clarify if key partners are government-based, non-profit, or private.

Recommendations: It may be helpful for NEA and RAOs to know the breakdown of reviewers and
experiences by these categories. An additional question can be added that states, “Please select the
category that best describes your affiliation: Government-based/Non-profit/Private”.
• Question 11
  o Add a question to collect data on a panelist’s review experience.

    Suggestion: Hypothetically, based on your overall experience as a reviewer for the ArtsHERE application process, would you participate as a panelist reviewer for a similar grant program in the future? Would you recommend a colleague to accept an invitation to be a reviewer? Please explain.

Recommendations: This suggested question may not be suitable because it asks respondents to
speculate about future actions rather than focusing on their current experiences and feedback.
Additionally, responses to hypothetical scenarios may not accurately reflect reviewers' actual
intentions or behaviors.

How would you rate the overall user experience of the evaluation form,
considering clarity, simplicity, and level of effort to complete?
• This should be shorter; only ask the necessary questions.

Recommendations: No change. Questions 1-6 are completed using a drop-down menu, which is
not expected to take much time. The questions regarding panelist experiences help to address
ArtsHERE research questions and will provide important information to inform future panel review
processes.
• User friendly and easy to complete.

Recommendations: No change needed.

Other Feedback
• Intro Paragraph 3
  o Add who this data will be made available to.

Recommendations: Correct the typographical error in the first sentence of the third introductory
paragraph so that it states, “These data will be made available to Regional Arts Organizations and the NEA.”
• Question 9
  o Add “panelist review” before “process.”

Recommendations: Revise Question 9 to read, “What worked well with the panel review process?”


Grantee Baseline Survey
The Grantee Baseline Survey is a required web-based survey sent to grantees to complete at one
timepoint upon grant acceptance in October 2024. This survey collects information on grantees’
organizational characteristics, capacities, community needs, priorities, and program demographics.
The survey will be administered on a FedRAMP compliant platform, and grantees will receive a link
to complete the survey by email. Responses should reflect input from staff at the grantee
organization involved in the organization’s ArtsHERE grant.
Number of questions: 19 numbered questions

How long did it take to complete the form?
• Time range: 10-30 minutes.

Recommendations: The fourth introductory paragraph currently states, “…the survey should take
an average of 15 to 20 minutes to complete”. Based on the estimates provided, this will be revised to
read, “…the survey should take an average of 10 to 30 minutes to complete”.

Is the purpose of this data collection instrument/form/protocol clear?
• Introduction
  o Introductory sentence could be split into two sentences.
  o Possible typo in paragraph 2: organization’s to organizations’
  o Underline the expected time.

Recommendations:
• The introductory sentence will be broken up to read, “The National Endowment for the Arts (NEA) is conducting an evaluation of ArtsHERE. The evaluation aims to understand the project activities supported through this program and how grantees approach the work.”
• JBA will correct the typo, which occurs in the first sentence of the second introductory paragraph. The sentence will now read, “…organizations’ characteristics and capacities, community(ies) needs and priorities, and program and community demographics…”
• The length of time to complete the survey will be underlined in the introductory paragraph.

• Need to be more explicit about how the data will be used to better serve historically and continuously underfunded cultural organizations in the future.

Recommendations: The following sentence will replace the second sentence in the second
introductory paragraph: “Your responses will enable ArtsHERE funders, including Regional Arts
Organizations and the NEA, to gain a clearer understanding of your organization's characteristics
and experiences. This understanding will inform the development of learning and information
services tailored to meet the specific needs and interests of ArtsHERE-awarded organizations,
particularly those historically underfunded cultural organizations.”

Were there any terms or phrases that are confusing or unclear? Consider any
cultural or language aspects that you think might affect the interpretation of
certain questions.
• Question 2
  o Clarify the term “program” (i.e., standard or grant-funded programming).

Recommendation: The questions in this section of the survey are focused on characteristics of the
grantee’s overall organization. The question will be clarified to ensure that the respondent
understands this question is referring to the standard programming and not just the grant-funded
programming.
• Question 7
  o Remove the prompts and have the respondents answer as they interpret the question.
  o Clarify the phrase “organizational capacities” (also applies to Question 8). Perhaps provide examples.

Recommendations: Prompts for open-ended survey questions provide structure and guidance to
ensure that respondents cover important aspects of their staffing and/or board structure while
allowing for flexibility and depth in their responses, so the prompts will be retained. However, the
current prompts for Questions 7 and 8 mention “organizational capacities,” which has not yet been
clearly defined per the ArtsHERE FAQ document; respondents therefore may not have a clear and
consistent understanding of what is meant by this term. The prompts for Question 7 can be
re-worded and simplified to:
a. What has helped your staffing and/or board structure to be successful in these areas?
b. How, if at all, do you envision ArtsHERE will support your staffing and board structure to be
successful?
• Clarify the phrase “underserved groups/communities.”

Recommendation: This term appears in Questions 13, 16, 17, and 19. A definition will appear the
first time the term is used (Question 13): “The term ‘underserved’ refers to those whose
opportunities to experience the arts are limited relative to geography, race/ethnicity, economics, or
disability.”


• Question 13
  o Define fiscal year (i.e., government FY or organizations’ own FY).

Recommendation: Grantees have the flexibility to reflect on their participation trends based on their
own fiscal year. This will be clarified to include the following statement, “Please respond based on
your organization's fiscal year.”
• Question 15a
  o Add community strengths to the end of the parenthetical.

Recommendation: “Community strengths” will be added to the end of the parenthetical in the
question prompt.

Are there any questions that you are not confident you could answer as a
respondent? If so, please specify which ones and why they were difficult to
answer.
• Question 3
  o Clarify whether “mentorship/observation” is a distinct service delivery model.

Recommendation: No change needed. An “other” category is provided to capture other distinct
service delivery models that are not represented in the list. Respondents may write in any other
category relevant to this question.
• Question 10
  o Add the reverse of this question: What financial or fundraising strengths have you identified, if any?

Recommendations: Add another question prompt following Question 10 to include an open-ended
question regarding fundraising strengths.
• Question 13
  o Add a follow-up question about the grantee’s perception of growth trends.

Recommendations: The evaluator is mindful of adding questions, which would increase respondent
burden. Perception of growth trends may be captured in Question 14, which asks respondents to
share about community needs. JBA does not recommend adding a follow-up question.

Did you feel that the response options provided were appropriate and covered the
range of possible answers?
• Question 4
  o Add screen-based programming for computers, tablets, or phones that does not require internet, for areas that have limited or non-existent internet access.

Recommendations: Screen-based programming that does not require internet may be used for
both in-person and virtual methods. JBA advises against including this in Question 4 to avoid
potential confusion.
• Question 5b
  o Clarify whether “hours” means the number of hours for each category or each organization’s relative assignment of hours to each category.

Recommendations: Question 5b does not ask respondents to specify hours for each staffing
category. This question asks for the number of full- and part-time staff in each category.
Instructions will specify that respondents should provide a count of individuals for each category.
• Question 18
  o Clarify if key partners are government-based, non-profit, or private.

Recommendations: Although including details about government-based, non-profit, or private
status would offer more insight about the grantees, evaluators are cautious about increasing the
survey burden with additional questions. JBA does not recommend adding this information.

How would you rate the overall user experience of the evaluation form,
considering clarity, simplicity, and level of effort to complete?
• Review whether all questions are necessary for the betterment of these organizations in the future.

Recommendations: No change needed. All questions in the survey tie back to ArtsHERE research
questions of interest to NEA and RAOs.
• Seems user friendly and straightforward.

Recommendations: No change needed.

Other Feedback
• Add a brief thank you for participation at the end.

Recommendations: A thank you message will be added at the end of the survey.


• The questions should also help respondents understand themselves and their work better from answering them. Rework the questions so they focus on strengths and areas for improvement.

Recommendations: The survey will establish a baseline understanding of ArtsHERE awardees and
inform the types of support that could benefit these organizations. While this data collection may
prompt some self-reflection among grantees, much of this will occur throughout ArtsHERE with the
provided learning opportunities. JBA advises against altering the questions. The descriptive
information gathered in this survey will be valuable to grantees and their peers.


Annual Progress Report
The Annual Progress Report is a required web-based reporting form to be completed by awarded
grantees at one timepoint at the end of the first year of their award, administered in the GO Smart
grant management system. Its purpose is to gather information on grantee organizations’ practices,
community roles, successes, barriers, connections, and experiences with ArtsHERE learning
opportunities. The information from this data collection will be used by the NEA for program
monitoring and by the evaluator(s) for monitoring, evaluation, and learning reports and feedback
loops, including sample selection for the evaluation case studies.
Number of questions: 6 numbered questions

How long did it take to complete the form?
• Estimates for grantees to complete the form start at two hours.
• Since this would be a year’s worth of activities, this may need more time and multiple people to complete.
• Likely will not be finished in one sitting.

Recommendations: For estimating the time needed to complete the report, JBA proposes a range.
JBA recommends 2 hours for the upper end, which was suggested by one respondent. Additionally,
JBA suggests allocating approximately 10 minutes per open-ended item to determine the lower
end of the estimate. There are 5 open-ended questions and an additional 6 prompts within these
questions. JBA estimates the total time for completing them to be 110 minutes (11 items x 10
minutes each).

Is the purpose of this data collection instrument/form/protocol clear?
• Yes.

Recommendations: No change needed.

Were there any terms or phrases that are confusing or unclear? Consider any
cultural or language aspects that you think might affect the interpretation of
certain questions.
• All terms or phrases are understandable and clear.

Recommendations: No change needed.


Are there any questions that you are not confident you could answer as a
respondent? If so, please specify which ones and why they were difficult to
answer.
• All questions could be answered by a respondent.

Recommendations: No change needed.
• Questions 2, 5, and 8 are redundant with each other.

Recommendations: The evaluator suggests bolding key words within each question (Q2, 5, and 8)
to differentiate the intent of each of these questions.

Did you feel that the response options provided were appropriate and covered the
range of possible answers?
• Yes.

Recommendations: No change needed.
• Question 1
  o Add another box that includes Board Development and Fundraising to provide a project type for small to mid-sized organizations.

Recommendations: No change needed. The response options provided in the report align with
options available in the ArtsHERE grant application form. Any additional response options could be
typed into the “Other” response option, which will have an open-ended space to type in additional
categories.

How would you rate the overall user experience of the evaluation form,
considering clarity, simplicity, and level of effort to complete?
• This has been an improvement, and the overall user experience should be very good.
• The form is not simple for users and requires a higher level of effort for only a year’s worth of activities.

Recommendations: Reviewers provided conflicting feedback regarding user experience. The report
form will be housed on the GO Smart platform, which will have a more user-friendly format than the
Word documents that testers reviewed. The purpose of the closed and open-ended report format is
to provide a comprehensive overview of grantees’ activities over the past year. The depth of
information gathered through open-ended questions allows for a nuanced understanding of
achievements, challenges, and areas for improvement. Although the template may require a higher
level of reflection and consideration compared to more streamlined forms, this investment in time
pays off by capturing valuable insights to address the goals of the evaluation. By encouraging
respondents to provide detailed responses, the impact of ArtsHERE can be better assessed and can
inform decisions moving forward.

Other Feedback
• This could be difficult to collect as a survey because there may be more than one person completing this.

Recommendations: No change needed. The Annual Progress Report will be programmed into the
GO Smart web-based platform, and grantee teams may access the form at any point for completion
individually or as a team.


Learning Opportunities Tracker
The Learning Opportunities Tracker is a required web-based tracking form completed by
organizational service providers (RAO staff and contractors) after every learning opportunity session
provided to grantees between November 2024 and April 2026. The form gathers information about the
learning opportunity services respondents provided to grantees, such as cohort convenings,
coaching, and workshops, covering service type, content, partners, engagement, facilitators,
and challenges.
Number of questions: 15 numbered questions

How long did it take to complete the form?
• Approximately 7 minutes.

Recommendations: No change needed.

Is the purpose of this data collection instrument/form/protocol clear?
• The instructions were not clear in describing that the form was being used to track activities.

Recommendations: The second sentence of the first paragraph in the introductory text will be
revised to state, “As part of the evaluation, this survey will help track the learning opportunities
provided to ArtsHERE grantees on an ongoing basis.”

Were there any terms or phrases that are confusing or unclear? Consider any
cultural or language aspects that you think might affect the interpretation of
certain questions.
• No.

Recommendations: No change needed.

Are there any questions that you are not confident you could answer as a
respondent? If so, please specify which ones and why they were difficult to
answer.
• No.

Recommendations: No change needed.


Did you feel that the response options provided were appropriate and covered the
range of possible answers?
• Yes.

Recommendations: No change needed.

How would you rate the overall user experience of the evaluation form,
considering clarity, simplicity, and level of effort to complete?
• This is straightforward. Providers may end up using stock answers to fill in narrative content.

Recommendations: No change needed.

Other Feedback
• It seems duplicative of something that the organization may already have for their own reflection and refinement.
  o Unclear why this collection is helpful for grantees.

Recommendations: This tracking form was modeled from past technical assistance tracking forms
used by RAOs. While not a direct benefit to grantees, this data collection will help provide rich
information on the supports provided to them throughout the grant, and will help improve the
technical assistance and learning opportunities provided to all ArtsHERE grantees. This information
will also inform the information that will be pre-filled into grantee satisfaction surveys.
• Question 2 notes that it covers questions 3-3a, but question 3a doesn’t exist.

Recommendations: The mention of Question 3a will be removed from the skip logic instructions
under Question 2.
• Question 12 notes that it is optional; does that mean all other questions are required?

Recommendations: No questions will be required or programmed for forced response. However,
the evaluator recommends marking Questions 12-15 as optional to indicate that these are lower
priority for respondents to complete.


Grantee Learning Opportunities Quarterly
Survey
The Grantee Learning Opportunities Quarterly Survey is a required web-based survey administered
to all grantees quarterly, beginning after month 3 of the ArtsHERE grant period (January 2025
through April 2026). The survey consists of questions that capture grantees’ self-assessment of
learning opportunities received, including cohort convenings, one-on-one coaching, and topical
expert workshops. Grantees will rate learning opportunities on engagement, quality, satisfaction,
relevance, and effectiveness of cohort-based and 1:1 services, and will share perceptions of how
services can be improved.
Number of questions: 11 numbered questions

How long did it take to complete the form?
• Time range: 6-12 minutes

Recommendations: Adjust the time estimate in the introductory paragraph to reflect the time
estimates provided by reviewers. The revised sentence will read, “We realize how limited your time is;
the survey should take an average of 10 to 15 minutes to complete.”

Is the purpose of this data collection instrument/form/protocol clear?
• Yes.

Recommendations: No change needed.

Were there any terms or phrases that are confusing or unclear? Consider any
cultural or language aspects that you think might affect the interpretation of
certain questions.
• None.

Recommendations: No change needed.
• The term “learning opportunities” is not intuitive; suggest adding the definition throughout the survey to reaffirm its meaning.


Recommendations: A text box that defines learning opportunities will appear at the top of each
page of the web-based survey. The text box will read, “’Learning opportunities’ include monthly
cohort sessions, one-on-one meetings with your assigned coach, and workshops with topic-based
experts”.

Are there any questions that you are not confident you could answer as a
respondent? If so, please specify which ones and why they were difficult to
answer.
• None.

Recommendations: No change needed.

Did you feel that the response options provided were appropriate and covered the
range of possible answers?
• Question 5a
  o Consider changing to elicit a qualitative response.
    Suggestion: Provide any feedback on how engaging the learning opportunities were.

Recommendations: Question 5a will be re-worded to read, “Please provide any additional thoughts
or insights you have about how engaging the learning opportunities you received during the past
month were for you.”
• Question 8
  o Add “N/A” or “we did not use” as a scale option.

Recommendations: The instructions for this question will indicate that only the learning
opportunities that respondents indicated they received in Question 2 will be shown here.
• Question 10
  o Change language to be open ended.
    ▪ Suggestion: Describe any other types of learning opportunities that would be beneficial to you for building knowledge, skills, connections, or capacity? Please share below.

Recommendations: Question 10 will be re-worded to read, “What other types of learning
opportunities would you find valuable for enhancing your knowledge, skills, connections, or
capacity?”


• Question 11
  o Change language to be open ended.
    ▪ Suggestion: What, if any, additional feedback would you like to share about the learning opportunities and support you received in the past month?

Recommendations: Question 11 will be re-worded to read, “What, if any, additional feedback would you like to share about the learning opportunities and support you received in the past month?”

How would you rate the overall user experience of the evaluation form,
considering clarity, simplicity, and level of effort to complete?
• Simple to complete.

• Very straightforward; good examples help explain what is meant by the constructs in the stems, e.g., relevance, engagement, etc.

Recommendations: No change needed.

Other Feedback
• Questions 2 and 3
  o Would the respondent receive either question 2 or 3? Or would they receive both?

Recommendations: No change needed. The instructions for Question 3 indicate that respondents will only receive this question if they did not receive learning opportunities in the past month.
• Question 2c
  o Do the questions following a “no” answer only show up the third time this question is asked? The first 3 questions under Q2 are the same except for this addition under 2c.

Recommendations: The survey instructions will be revised to indicate that the prompt under
Question 2 (“Please indicate the type(s) and topic(s) of learning opportunity activities you
participated in during the past month”) will only appear after all documented learning opportunities
(pre-filled based on monthly provider data) have been presented.

Other Notes
• The survey was tested as a monthly survey, and its frequency was reduced from monthly to quarterly at the request of the NEA in order to reduce burden for respondents.


Learning Logs
The voluntary learning logs will capture the reflections of members of the ArtsHERE planning group
(i.e., the NEA, South Arts, Regional Arts Organizations, and evaluation contractors) at key
milestones of the initiative (e.g., post award, end of year 1 grantee reporting). Results will also serve as a tool for iterative reflection to identify necessary MEL plan changes.
Number of questions in each topic: 4 numbered questions

How long did it take to complete the form?
• Time range: 5-30 minutes

Recommendations: Revise the sentence in the introduction to reflect the estimated time range provided by reviewers. Text will be revised to, “We realize how limited your time is; the log should take an average of 5 to 30 minutes to complete.”

Is the purpose of this data collection instrument/form/protocol clear?
• Yes.

Recommendations: No change needed.

Were there any terms or phrases that are confusing or unclear? Consider any
cultural or language aspects that you think might affect the interpretation of
certain questions.
• Distinguish the word “lessons” from observations of what went well and what the challenges were. It is not clear how they differ.

Recommendations: While lessons learned encompass a broader spectrum of insights derived from both challenges and facilitators, observed challenges and facilitators are more specific and immediate in nature, focusing on identifying obstacles and supportive factors during project execution. All three concepts are valuable for continuous improvement and informed decision-making. The fourth question in each topic could be re-worded to emphasize the future focus of lessons learned. The question can be revised to, “What insights are you learning from the [topic of learning log] that could guide future work for NEA and RAOs?” We suggest removing the second part of the question (“Please also think about any unexpected outcomes or successes.”) because it is not directly tied to future-focused lessons learned.


• The term “organizational services” is not intuitive; suggest adding the definition throughout the survey to reaffirm its meaning.

Recommendations: In Learning Log Topic 3, replace all mention of “organizational services” with
“learning opportunities” as this is the new term used for ArtsHERE technical assistance and
capacity-building activities. Revise the final sentence of the Learning Log Topic 3 instructions to say,
“Please fill out this learning log based on what you learned from the information that has recently
been shared with you regarding recent Learning Opportunities Survey results. The term “learning
opportunities” refers to any type of topic-based workshop, one-on-one coaching or consultations,
and peer cohort convenings provided to increase the knowledge, skills, connections, and/or capacity
of grantee organizations to work toward their own project and organizational goals.”

Are there any questions that you are not confident you could answer as a
respondent? If so, please specify which ones and why they were difficult to
answer.
• Yes; the term “lessons” is not clearly distinguished from observations of what went well and what the challenges were.

Recommendations: No additional change needed.

Did you feel that the response options provided were appropriate and covered the
range of possible answers?
• Yes.

Recommendations: No change needed.

How would you rate the overall user experience of the evaluation form,
considering clarity, simplicity, and level of effort to complete?
• Fine.

• 9 out of 10.

Recommendations: No change needed.


Other Feedback
• The learning logs should include the grant development process. This would be helpful for future grant planning and development. It also would shed light on how we can work better together in the event there is another national/RAO-type grant.

Recommendations: No change needed. Learning logs should be implemented in real time. As such, the window for capturing reflections on the grant development process has passed. However, as part of the learning plan there will be discussions held with the planning group to reflect on various aspects of ArtsHERE. The Evaluator Annual Report also includes reflections from the grant and evaluation development process that can be further discussed with the planning group.


Final Descriptive Report
The Final Descriptive Report is a required web-based reporting form to be completed by awarded
grantees at one timepoint, at the end of their grant. Its purpose is to gather information on
organizational characteristics, perceptions, experiences, and outcomes during the grant award
period. It will be administered in the GO Smart grant management system. The report will help track
the project's impact.
Number of questions: 12 numbered questions

How long did it take to complete the form?
• Estimates for grantees to complete the form start at 90 minutes.

• Since this would be a year’s worth of activities, this may need more time and multiple people to complete.

Recommendations: To estimate the length of time to complete the report, JBA suggests starting at the low end of 90 minutes (as was suggested by one respondent) and allotting 10 minutes per open-ended question to calculate the upper end.

Is the purpose of this data collection instrument/form/protocol clear?
• Yes.

Recommendations: While reviewers indicated the purpose of this data collection instrument is
clear, we noticed that there is no introductory text. Introductory text will be added to align with the
text in the Annual Progress Report.

Were there any terms or phrases that are confusing or unclear? Consider any
cultural or language aspects that you think might affect the interpretation of
certain questions.
• Define “cultural strategies.”


Recommendations: This comment is in reference to Question 8 (“Outcomes of capacity-building
projects supported by ArtsHERE may not be evident during or immediately after a grant project’s
period of performance. What early indications of change, if any, could your organization see by the
end of your grant project?”). One of the response options for this question is, “Improved cultural
strategies that engage communities my organization serves.” We recommend providing further
clarification within this response option that states, “Cultural strategies may include ways in which
your programs or services are centered in cultural activities, traditions, and identities.” There is
currently not a definition of “cultural strategies” in the ArtsHERE program guidelines, but activities,
traditions, and identities were listed in reference to culture throughout the guidelines. It is
recommended that NEA and South Arts provide feedback on this proposed definition.
• Define “organizational culture.”

Recommendations: The term "organizational culture” is not used in the report template. No change
needed.
• Question 11
  o Change “race/ethnicity” to “racism” as a limitation.

Recommendations: The wording of Question 11 is included as a standard question for research
purposes and data analysis, and includes Congressionally mandated descriptors of underserved
populations engaged. JBA recommends keeping the question as-is, but this feedback brings up an
important flaw in the wording of this question. Upon reflection, the current categories offered do not
align with the survey question and do not address the root causes of limited opportunities for
historically underrepresented communities to benefit from arts programming. It is suggested that an
additional question be added with response options that include categories that address the
following bolded topics (descriptions would also be provided for each option):
• Access and representation: Barriers to accessing arts programming due to geographical location, economic constraints, or lack of representation within the arts sector. Limited access to arts institutions, museums, theaters, and galleries.

• Financial barriers: Cost associated with participating in arts programming.

• Cultural relevance: Arts programming that does not resonate with the cultural backgrounds and experiences of underrepresented communities. Lack of representation of diverse voices, stories, and art forms.

• Educational disparities: Disparities in arts educational opportunities for underrepresented communities. Limited resources for arts education.

• Systemic discrimination: Discrimination and bias within the arts sector for individuals from underrepresented communities, including limited opportunities for employment, exhibition, and recognition. Structural inequities within funding mechanisms, hiring practices, and artistic programming.

• Language and communication: Language barriers; lack of materials and communications provided in languages spoken by the community.

• Transportation and infrastructure: Limited ability for individuals to physically access arts venues and events.

Are there any questions that you are not confident you could answer as a
respondent? If so, please specify which ones and why they were difficult to
answer.
• Question 8
  o Drop the first sentence, since it sets up the grantee to feel like they must check off boxes.

Recommendations: Remove the first sentence of Question 8 (“Outcomes of capacity-building
projects supported by ArtsHERE may not be evident during or immediately after a grant project’s
period of performance.”).

Did you feel that the response options provided were appropriate and covered the
range of possible answers?
• Question 1
  o Add another box that includes Board Development and Fundraising to include a project type for small to mid-sized organizations.

Recommendations: No change. There is an “Other capacity-building activities” response option
where the respondent can specify other capacity-building project types.
• Question 4
  o Decrease to 3 choices: Not at all, To some extent, To a large extent. Five choices are too many.

Recommendations: No change. It is preferred that we maintain the current response options to
better detect variability and subtlety in responses. It is not expected that the additional response
options will significantly impact respondent fatigue.

How would you rate the overall user experience of the evaluation form,
considering clarity, simplicity, and level of effort to complete?
• Overall user experience should be very good.


• The form is not simple for users and requires a higher level of effort for only a year’s worth of activities.

Recommendations: Reviewers provided conflicting feedback regarding user experience. The report
form will be housed on the GO Smart platform, which will have a more user-friendly format. The
purpose of the closed and open-ended report format is to provide a comprehensive overview of
grantees’ activities over the past year. The depth of information gathered through open-ended
questions allows for a nuanced understanding of achievements, challenges, and areas for
improvement. Although the template may require a higher level of reflection and consideration
compared to more streamlined forms, this investment in time pays off by capturing valuable insights
and facilitating meaningful reflection for the grantee. By encouraging respondents to provide detailed
responses, we can better assess the impact of ArtsHERE and make informed decisions moving
forward.

Other Feedback
• Reframe questions to ask the respondents to tell the story of their yearly activities.

• Add questions exploring challenges and lessons learned.

Recommendations: The final report offers descriptive insights using a mix of quantitative and qualitative questions to gather comprehensive data about the grantee while being mindful of respondent burden. We do not recommend adding a challenges and lessons learned section, as these will be captured in other data sources. Although constrained by question number and format in this document, the evaluation will draw from diverse data sources (including document review, interviews, discussion prompts, and surveys) to fully capture the grantees’ experiences.
• This could be difficult to collect as a survey because there may be more than one person completing it.

Recommendations: No change needed. The Final Descriptive Report will be programmed into the GO Smart web-based platform, and grantee teams may access the form at any point for completion individually or as a team.


Appendices
A. Cognitive Testing Email(s)
Exhibit A1. Cognitive Testing Request Email
Version Date: 1/22/24 
Subject: Reviewing ArtsHERE Data Collection Instruments for the National Endowment for the Arts
Email Body: 
Hello [Enter Name],
In our last Technical Working Group meeting we mentioned that we would be looking for help to improve our evaluation and review our data collection forms for ArtsHERE. We are now asking individuals like you for feedback on comprehension, usability, and overall user experience to improve the quality and reliability of the evaluation instruments.
We are asking members of the Technical Working Group to volunteer to provide feedback on one (or
more) of the surveys, forms, and other data collection instruments that will be used to gather
information from ArtsHERE grantees, panelists, and service providers:
1. Baseline Survey with Grantees
2. Review Panelist Feedback Survey
3. Organizational Services Grantee Survey Form
4. Organizational Services Tracker
5. Grantee Annual Progress Report
6. Grantee Final Descriptive Reports
Note: This is an optional activity for the Technical Working Group, although testing instruments will
count as one of the 4 meetings that members are expected to participate in during the year. If you
participate, someone from our JBA staff will contact you with follow-up instructions.
Thank you for your consideration in volunteering to review data collection instruments for ArtsHERE!
Our team thinks you will be able to provide valuable insight and would love your feedback.
We would share the document(s) with you on January 29th and would need your feedback by
February 9th, 2024.


Please let us know no later than this Friday, January 26th, 2024, if you will be available to provide a review and, if so, how many forms you can review (there are 6 forms total). I can be reached by e-mail at [Insert email].
Sincerely,
[Insert Sign Off] 
Thank you for your continued support for this project!  

Exhibit A2. Follow-Up Cognitive Testing Email- Link Provided
Version Date: 1/22/24
Subject: Instructions for ArtsHERE Data Collection Instrument Reviewing for the National
Endowment for the Arts
Email Body: 
Hello [Enter Name],
Thank you for volunteering to review data collection instruments for ArtsHERE!
Our team thinks you will be able to provide valuable feedback for the [Insert Data Collection
Instrument Name].
The purpose of this instrument is to [Insert Description from Instrument Itself].
The data collection form, along with the Review Questions, is attached to this email. Please look over the review questions before beginning the data collection forms.
Once you have reviewed the instrument, we ask that you respond to the attached 6 questions
regarding comprehension, usability, and your overall experience. We ask that you complete this
review and submit the completed Word document by February 9, 2024.
We are interested in learning your thoughts about the length of the survey, questions asked in the
survey, and any other information that may help with data collection.  
Thank you for your continued support for this project!  
[Insert Sign Off] 


B. Pilot Review Questions
Exhibit B1. ArtsHERE Instrument Pilot Review Questions
Name of instrument reviewed: ___________________________________
Date instrument was reviewed: ___________________________________
The following questions aim to gather valuable feedback on comprehension, usability, and overall
user experience to improve the quality and reliability of the evaluation instrument that you recently
reviewed. Please provide a brief response to the following questions.
1. How long did it take to complete the form?
2. Is the purpose of this data collection instrument/form/protocol clear?
3. Were there any terms or phrases that are confusing or unclear? Consider any cultural or
language aspects that you think might affect the interpretation of certain questions.
4. Are there any questions that you are not confident you could answer as a respondent? If so,
please specify which ones and why they were difficult to answer.
5. Did you feel that the response options provided were appropriate and covered the range of
possible answers?
6. How would you rate the overall user experience of the evaluation form, considering clarity,
simplicity, and level of effort to complete?


