Supporting Statement for Pilot Study of Participant Outcomes
Survey for the Creative Forces®: NEA Military Healing Arts
Network Community Arts Engagement Grant Program, Part B
Last Updated: June 30, 2022
Table of Contents
B.1 Respondent universe and sampling methods
B.2 Procedures for the collection of information
B.3 Methods to maximize the response rates and to deal with nonresponse
B.4 Test of procedures or methods to be undertaken
B.5 Individuals consulted on statistical aspects & individuals collecting and/or analyzing data
Table of Attachments
Attachment A: Participant Outcomes Survey and other Instruments
Attachment B: Participant Outcomes Survey Source Scales
B.1 Respondent universe and sampling methods
Describe (including a numerical estimate) the potential respondent universe and any
sampling or other respondent selection method to be used. Data on the number of entities
(e.g., establishments, State and local government units, households, or persons) in the
universe covered by the collection and in the corresponding sample are to be provided in
tabular form for the universe as a whole and for each of the strata in the proposed sample.
Indicate expected response rates for the collection as a whole. If the collection had been
conducted previously, include the actual response rate achieved during the last collection.
For the pilot test of the Participant Outcomes Survey, data collection employing
statistical methods consists of pre- and post-surveys of participants of community arts
engagement programs, with a minimum of 350 participants across up to 10 grant projects. This
section outlines the selection criteria that define the sample for the study and describes the
potential respondent universe and anticipated response rates.
B.1.1 Sample for pilot testing: programs and participants
The study’s sample will be based on the selection of grant recipients for the first cycle of
the Creative Forces Community Engagement Grant Program. These grantees will be selected
through a competitive panel process administered by the Mid-America Arts Alliance (M-AAA),
acting as a cooperator for the National Endowment for the Arts. Grant applications are reviewed
by an independent review panel coordinated by M-AAA consisting of arts professionals,
including artists and creative arts therapists, targeted military-connected individuals, and
laypersons, who reflect a wide geographic, ethnic, and minority representation as well as diverse
aesthetic and cultural points of view. M-AAA will award approximately 25-30 matching grants
for arts-based community engagement projects that engage targeted military-connected
communities.
As noted in Supporting Statement Part A, the sample for this pilot test will consist of at
least 350 participants across up to 10 grant projects. All participants in the selected programs will
be asked to complete the survey at the beginning (pre) and end (post) of the program. Grant
projects will be selected for the pilot test based on the following criteria:
• Organization provides non-clinical arts engagement to one or more of the following military-connected populations exposed to trauma: active-duty service members, guardsmen, reservists, veterans, military and veteran families, caregivers, and health care workers providing care for military service members and veterans.
• Organization implements non-clinical arts engagement activities utilizing one of the following service delivery models:1
  o Ongoing Class/Other Ongoing Engagement: a class, group, and/or ensemble that meets regularly for a distinct time period
  o Ongoing Drop-in Program: ongoing, drop-in programs, such as an open studio, where participation may or may not be consistent
• Organization has planned activities during the pilot study implementation period (January through June 2023).
• Organization plans to collect enrollment data from participants in advance of the non-clinical arts engagement activities.
Funded activities may be in person, virtual, or offered through hybrid delivery and take
place in a clinical, community, or virtual setting. Projects will be selected to ensure, to the extent
possible, that a range of artistic disciplines and military populations (i.e., Active Duty Service
Members, veterans, Guardsmen and/or reservists; military and veteran families; and caregivers
and health care workers providing care for military service members and veterans) are
represented within the pilot testing.
1 These service delivery models are supported by the Community Engagement Grant Program.
B.1.2 Response rates
The target response rate for the pre survey is 60%, the threshold at which potential biases
are acceptably small. The research team will administer the post survey to respondents of the pre
survey. Due to anticipated participant attrition from program activities and from the pilot study,
response rates for the post survey are expected to be lower. There is little research from relevant
programs to provide estimates of post-survey response rates. In a study of Creative Forces
Community Projects,2 response rates varied across sites, but several achieved rates of
approximately 60%. We have chosen 50% as our estimated post-survey response rate. The pilot
study design incorporates several specific strategies to help boost response rates (see Section B.3).
Other instruments are expected to have higher response rates. We anticipate that 83% of
participants will grant permission for their enrollment form data to be shared with the contractor,
and that 100% of grantee project directors will agree to participate in a debrief interview
following the end of the survey administration period.
2 Community Connections Projects study March 31, 2020, unpublished report. In this study of early Creative Forces community
engagement projects, participants and program staff reported that participants are often reluctant to participate in surveys. The
two main reasons were concerns about privacy and general survey fatigue among veterans and other military-connected
individuals. However, participants also reported a high level of commitment to their Creative Forces community engagement
program and strong relationships with staff and other participants. During the evaluation, some members of focus groups cited
these two factors as their primary reasons for their participation in the focus group. Outreach methods take these concerns and
factors into account.
Exhibit 1. Estimated Number of Entities and Respondents for Pilot Survey and Other
Instruments
Data Source: Enrollment Form
  Timing of Data Collection: Prior to the beginning of the program
  Respondents: Community Engagement Program Participants
  # Participating Organizations: 10
  Anticipated # of Respondents: 290
  Response Universe: 350
  Estimated Response Rate: 83%

Data Source: Participant Outcomes Survey – Pre
  Timing of Data Collection: Just after the beginning of the program
  Respondents: Community Engagement Program Participants
  # Participating Organizations: 10
  Anticipated # of Respondents: 210
  Response Universe: 350
  Estimated Response Rate: 60%

Data Source: Participant Outcomes Survey – Post
  Timing of Data Collection: Just prior to the end of the program
  Respondents: Community Engagement Program Participants
  # Participating Organizations: 10
  Anticipated # of Respondents: 175
  Response Universe: 350
  Estimated Response Rate: 50%

Data Source: Interview Protocol
  Timing of Data Collection: After end of survey administration period
  Respondents: Grantee Project Director
  # Participating Organizations: 10
  Anticipated # of Respondents: 10
  Response Universe: 10
  Estimated Response Rate: 100%
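The anticipated respondent counts in Exhibit 1 are the response universe multiplied by the estimated response rate for each instrument. The following minimal Python sketch reproduces that arithmetic using the figures from the exhibit; the script itself is illustrative only.

# Anticipated respondents = response universe x estimated response rate,
# using the figures reported in Exhibit 1.
instruments = {
    "Enrollment Form": (350, 0.83),
    "Participant Outcomes Survey - Pre": (350, 0.60),
    "Participant Outcomes Survey - Post": (350, 0.50),
    "Grantee Project Director Interview": (10, 1.00),
}

for name, (universe, rate) in instruments.items():
    anticipated = round(universe * rate)
    print(f"{name}: {universe} x {rate:.0%} = ~{anticipated} respondents")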
B.2 Procedures for the collection of information
Describe the procedures for the collection of information, including
• statistical methodology for stratification and sample selection,
• estimation procedure,
• degree of accuracy needed for the purpose described in the justification,
• unusual problems requiring specialized sampling procedures, and
• any use of periodic (less frequent than annual) data collection cycles to reduce burden.
The pilot study of the Participant Outcomes Survey is a one-time data collection using a
web-based survey. The survey has a pre/post design matched at the individual level. As
described in the previous section, the sample will be selected at the grantee organization level
based on the selection criteria outlined there. Information about grantee organizations will be
obtained from grant applications and the Mid-America Arts Alliance (M-AAA), the agency’s
cooperator. Contact and demographic information about participants will be collected through a
program enrollment form administered by grantee organizations and shared with the contractor,
provided the participant gives explicit permission to do so.
B.3 Methods to maximize the response rates and to deal with nonresponse
Describe methods to maximize response rates and to deal with issues of non-response. The
accuracy and reliability of information collected must be shown to be adequate for
intended uses. For collections based on sampling, a special justification must be provided
for any collection that will not yield "reliable" data that can be generalized to the universe
studied.
Multiple methods will be used to maximize response rates for the pilot study.
1. Require organizational commitment to the pilot test: As part of the Creative Forces
Community Engagement Grant Program guidelines, grant recipients are required to
participate in a robust structure of technical assistance provided by Creative Forces and M-AAA
to facilitate project-level capacity building, including monitoring, evaluation, and learning, and
data collection. Grant recipients are also required to utilize/complete
participant/audience surveys developed and provided as part of the grant program.
Participation in the pilot study is voluntary, however, and project directors will be invited to
participate and asked to commit their organization to the six-month study period in advance
of any data collection.
2. Directly support the role of program directors: Several steps will be taken to encourage
maximum engagement of program directors. Before the pre-survey, directors will be invited
to attend a remote training session led by the external evaluator at which they will receive a
description of the origins and future use of the survey, learn about the purpose and process
for the pilot testing, learn how to recruit participants and administer the enrollment form
prior to participants’ engagement in activities, and have an opportunity to ask questions.
One researcher will be assigned to each organization to provide continuity in the
collaboration and cultivate support for the project. The research team will be available to the
directors throughout the study.
3. Manage survey administration externally: Although directors provide support and
participants’ enrollment data, they will not be responsible for administering the survey. The
survey will be fully conducted by the research team rather than site-level staff to reduce
burden to the staff, streamline the data collection process, and ensure confidentiality of
responses.
4. Customize outreach and administration to program design: Outreach and implementation
plans will be customized for each project, taking into consideration the service delivery
model (i.e., ongoing class/other ongoing engagement, ongoing drop-in program) and
timeframe. Programs will vary in length and may run at any time between January and June
2023. Rather than conducting the post survey on a common date, the post survey will occur
immediately prior to the end of each program before contact with the participant is lost.
5. Use direct participant recruitment with ease of survey access: Individuals participating in the
pilot study will be contacted individually by the research team via email and will be
provided with a personalized link to the survey. Contact and demographic information on
participants will be collected through an enrollment form provided to participating grant
projects.
6. Provide email reminders to participants: Participants will receive at least three email
reminders for both the pre and post survey according to the timeline for each project.
7. Monitor survey completion rates by program: Follow up with directors to troubleshoot
issues (e.g., program dropout) that contribute to low response rates.
8. Provide high levels of privacy and confidentiality: For respondents attending program
activities requiring pre-enrollment, the invitation to the survey and the introduction in the
survey will explain that an individual’s survey access code is unique to this survey. When
the link is used, individual responses are entered into a database that does not include
participants’ names but links the responses made on surveys at the beginning and end of the
program. Only the contract researchers have access to the database and responses. The
connection between individuals and the unique link is kept in a separate, password-protected
file that only the researchers can access. No person associated with an individual’s program
or Creative Forces will be able to see an individual’s survey or even know whether they
participated in this study (see the sketch following this list).
9. Inform respondents of the nature of the items: The survey includes arts-related items as well
as items pertaining to non-arts outcomes, such as social connectedness. Based on the results of
cognitive testing, the introduction now alerts respondents to non-arts items related to other
aspects of their lives.3
10. Avoid survey fatigue: The survey is estimated to take about 10 minutes, and items that share
the same response scales are grouped to increase ease of responding to a subject area.
11. Show association with Creative Forces and the National Endowment for the Arts: The
Creative Forces and National Endowment for the Arts logos will be used on
communications with program staff and participants and on the survey. This will convey the
significance and legitimacy of the survey.
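Items 5 and 8 above describe personalized survey links whose access codes are generated for each participant and stored apart from any identifying information. The Python sketch below shows one way such codes could be generated and kept in a separate crosswalk file; the file names, field names, and survey URL are illustrative assumptions, not part of the study protocol.

import csv
import secrets

# Hypothetical enrollment records shared (with permission) by a grantee project.
participants = [
    {"name": "Participant A", "email": "a@example.org"},
    {"name": "Participant B", "email": "b@example.org"},
]

# Generate a unique, non-identifying access code for each participant.
for p in participants:
    p["access_code"] = secrets.token_urlsafe(8)

# The name-to-code crosswalk lives in a separate, restricted file that only
# the research team can open; survey responses are keyed by code alone.
with open("access_code_crosswalk.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "email", "access_code"])
    writer.writeheader()
    writer.writerows(participants)

# The personalized link emailed to each participant embeds the code, so pre
# and post responses can be matched without storing names in the response
# database (the survey URL below is a placeholder).
for p in participants:
    print(f"https://survey.example.org/?code={p['access_code']}")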
B.4 Test of procedures or methods to be undertaken
Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an
effective means of refining collections of information to minimize burden and improve
utility. Tests must be approved if they call for answers to identical questions from 10 or
more respondents. A proposed test or set of tests may be submitted for approval separately
or in combination with the main collection of information.

3 During cognitive testing of the survey, two of the nine interviewees raised concerns when the survey shifted from
arts-related items to those reflecting the target outcomes. They recommended informing respondents of this shift in
the introduction; otherwise, they might discontinue the survey. Adding the following statement appeared to reduce
concerns for the remainder of the interviews: "People who have participated in community arts programs sometimes
report that the program affects other areas of their lives not directly related to making art, such as their health, how
connected they feel to other people, how they feel about themselves, or how they handle challenges. Thirty-two items
focus on these things."
In October 2021, the pilot survey underwent cognitive testing with nine members of the
military-connected population who were also involved in community arts engagement programs
as staff members. The objectives of the testing were to detect issues of usability, clarity, and
readability in the survey instrument. Changes were made to the survey instrument following the
completion of cognitive testing.
Statistical analyses for this pilot data collection will include response analysis, item and
scale analysis, and exploratory outcomes analysis, including by subgroup (i.e., military-connected
population, service delivery model), provided the participant count exceeds 50 for any subgroup.
B.4.1 Response and Item Analysis
The steps listed above to increase response rates will help minimize nonresponse bias.
Overall response rates will be calculated for each grantee organization and for each survey by
dividing the number of pre and post survey respondents by the number of participants reported
by the organization. To test for nonresponse biases, the demographics (age group, military
connection, race/ethnicity, gender) of participants collected through the enrollment form will be
compared with demographics of pre and post survey respondents. If statistically significant
differences arise between respondents to the pre-survey and those who complete the post-survey,
weighting will be employed in the outcomes analyses. Other potential sources of nonresponse
bias, such as service delivery model and level of participation, will also be considered, and
item-level response rates and patterns will be calculated. Depending on the results of these analyses,
statistical procedures will be used to mitigate the impact of biases and missing data.
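As an illustration of the planned response analysis, the sketch below computes per-organization response rates and compares the demographic makeup of enrollees and pre-survey respondents with a chi-square test. The file names, column names, and the use of pandas and SciPy are assumptions made for the example, not a prescription of the study's actual procedures.

import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical data files: one row per enrolled participant and one row per
# pre-survey respondent; column names are assumed for illustration.
enrollment = pd.read_csv("enrollment.csv")
pre_survey = pd.read_csv("pre_survey.csv")

# Overall response rate per grantee organization: respondents / enrolled.
enrolled_counts = enrollment.groupby("org_id").size()
respondent_counts = pre_survey.groupby("org_id").size()
response_rates = (respondent_counts / enrolled_counts).fillna(0)
print(response_rates)

# Nonresponse bias check: compare the distribution of one demographic
# characteristic (here, age group) between enrollees and respondents.
table = pd.concat(
    [
        enrollment["age_group"].value_counts(),
        pre_survey["age_group"].value_counts(),
    ],
    axis=1,
    keys=["enrolled", "responded"],
).fillna(0)
chi2, p_value, dof, _ = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p_value:.3f}")  # a small p-value flags possible bias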
B.4.2 Scale Analysis
The existing scales for Social Connectedness, Resilience, and Independence and
Successful Adaptation to Civilian Life have established psychometric properties, must remain
intact, and therefore require limited psychometric analyses. Cronbach’s alpha will be calculated
for those scales to determine whether the pre-existing, validated scales maintain the
psychometric properties within the population and program context. Cronbach’s alpha will also
be used to test the internal consistency of the Creative Expression pilot items and to identify
which items do not contribute to the scale.
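For reference, Cronbach's alpha is computed from the item variances and the variance of the scale total. The sketch below implements the standard formula along with an item-removal check of the kind described above; the data are simulated and the five-item scale is a placeholder, not the actual survey content.

import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) matrix of item scores."""
    n_items = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the scale total
    return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

# Simulated responses from 100 respondents to a hypothetical five-item scale
# scored 1-5; real data would come from the pilot survey.
rng = np.random.default_rng(0)
scores = rng.integers(1, 6, size=(100, 5)).astype(float)
print(f"alpha = {cronbach_alpha(scores):.2f}")

# Item contribution check: recompute alpha with each item removed; items whose
# removal raises alpha are candidates that do not contribute to the scale.
for j in range(scores.shape[1]):
    reduced = np.delete(scores, j, axis=1)
    print(f"alpha without item {j + 1}: {cronbach_alpha(reduced):.2f}")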
B.4.3 Outcomes Analysis
To assess changes in the four outcome areas, change scores will be calculated as the
difference between post- and pre-survey scores for each of the subscales and for two stand-alone
items that address general health and well-being. We propose paired t-tests to analyze pre/post
change, assuming the data meet the
psychometric criteria. Outcomes will also be disaggregated and analyzed by demographic and
program variables to determine whether change over time differs for subgroups or program
features.
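A minimal sketch of the proposed pre/post comparison, assuming a file of matched pre and post scores linked at the individual level; the file name, subscale names, and column naming convention are illustrative assumptions rather than the study's actual data layout.

import pandas as pd
from scipy.stats import ttest_rel

# Hypothetical matched file: one row per participant with pre and post
# subscale scores, linked by the unique survey access code.
matched = pd.read_csv("matched_pre_post.csv")

subscales = ["social_connectedness", "resilience", "adaptation", "creative_expression"]
for subscale in subscales:
    pre = matched[f"{subscale}_pre"]
    post = matched[f"{subscale}_post"]
    change = post - pre                    # post-minus-pre change score
    result = ttest_rel(post, pre)          # paired t-test on matched scores
    print(f"{subscale}: mean change = {change.mean():.2f}, "
          f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")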
B.4.4 Implementation procedures and methodology
The pilot study will also test the following procedures and methodology:
1. Coordination with program directors: This study and the future data collection will
require coordination with program directors to access participants through
enrollment form data. In both cases, funding agreements will require cooperation
with the survey. During the pilot study, the contractor will provide information and
instructions to guide program director involvement. At the end of the study,
feedback from directors about the process will be obtained through an interview.
2. Method of matching pre and post survey responses: Enrollment data will provide the
contract researcher with contact information for participants, including an email
address. Unique codes will be generated for individual participants to access the
survey. The unique code will allow the contract researcher to link the pre and post
survey data (see the sketch following this list).
3. Survey technology: The survey will be administered electronically and will be
formatted for administration on mobile devices and computers.
4. Administration of the survey across multiple programs that use different
implementation models and timeframes: The survey methodology needs to be
flexible enough to accommodate different program models and timeframes, while
protecting the survey’s statistical integrity. The pilot study will document these
program variables and their impacts on survey administration.
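As noted in item 2 above, pre and post responses are linked through each participant's unique access code. The short pandas sketch below illustrates that match under the assumption, made only for illustration, that the survey platform exports one file per administration containing an access-code column.

import pandas as pd

# Hypothetical exports from the survey platform; each record carries the
# participant's unique access code but no name.
pre = pd.read_csv("pre_survey_responses.csv")
post = pd.read_csv("post_survey_responses.csv")

# An inner join keeps only participants who completed both surveys, which is
# the matched sample used in the pre/post outcomes analysis.
matched = pre.merge(post, on="access_code", suffixes=("_pre", "_post"))
print(f"{len(matched)} participants matched across pre and post surveys")
matched.to_csv("matched_pre_post.csv", index=False)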
B.5 Individuals consulted on statistical aspects & individuals collecting and/or analyzing
data
Provide the name and telephone number of individuals consulted on statistical aspects of
the design and the name of the agency unit, contractor(s), grantee(s), or other person(s)
who will actually collect and/or analyze the information for the agency.
The Arts Endowment contracted ProgramWorks to develop the Participant Outcomes
Survey and to conduct cognitive testing. Veritas Management Group is contracted to conduct the
pilot testing of the survey.
Exhibit 2. Individuals Consulted
Parties doing the data collection and analysis for the development of the survey and cognitive testing:

Name: Shawn Bachtler
  Title (Project Role): Project manager
  Organizational Affiliation and Address: ProgramWorks; 8155 13th Ave SW, Seattle, WA 98106
  Phone Number: 206-595-5878

Name: Candace Gratama
  Title (Project Role): Co-project manager
  Organizational Affiliation and Address: ProgramWorks; 8155 13th Ave SW, Seattle, WA 98106
  Phone Number: 206-229-8530

Name: Kari Peterson
  Title (Project Role): Statistical expertise and analysis
  Organizational Affiliation and Address: ProgramWorks; 8155 13th Ave SW, Seattle, WA 98106
  Phone Number: 206-697-1473

Parties doing the data collection and analysis for the Participant Outcome Pilot Survey:

Name: Debra Holden
  Title (Project Role): Project director
  Organizational Affiliation and Address: Veritas Management Group; 970 Autumn Close, Alpharetta, GA 30004
  Phone Number: 919-824-0369

Name: Gal Hal-Miller
  Title (Project Role): Technical Assistance Plan Lead
  Organizational Affiliation and Address: Veritas Management Group; 970 Autumn Close, Alpharetta, GA 30004
  Phone Number: 919-824-0369

Name: Uduak Bassey
  Title (Project Role): Project Management and Evaluation Support; Implementation Plan Lead
  Organizational Affiliation and Address: Veritas Management Group; 970 Autumn Close, Alpharetta, GA 30004
  Phone Number: 919-824-0369

National Endowment for the Arts staff consulted:

Name: Patricia Moore Shaffer
  Title (Project Role): Deputy Director | Research and Analysis
  Organizational Affiliation and Address: National Endowment for the Arts; 400 7th Street SW, Washington, DC 20506
  Phone Number: 202-682-5535

Name: Melissa Menzer
  Title (Project Role): Program Analyst | Research & Analysis
  Organizational Affiliation and Address: National Endowment for the Arts; 400 7th Street SW, Washington, DC 20506
  Phone Number: 202-682-5548