ProgramWorks, Seattle

NATIONAL ASSEMBLY OF STATE ARTS AGENCIES
CARES ACT FUNDING SURVEY
COGNITIVE TESTING SUMMARY REPORT
MAY 26, 2021
This summary report includes:
Project Description
Methodology
Analyses and Results
Recommendations
Appendix

Project Description
In 2020, the National Endowment for the Arts (Arts Endowment) distributed funding to State Arts
Agencies (SAAs) through the 2020 Coronavirus Aid, Relief, and Economic Security (CARES) Act. The
National Assembly of State Arts Agencies (NASAA) is developing the CARES Act Funding Survey to
understand the impact of that funding. As part of survey development, NASAA contracted with
ProgramWorks (PW) for cognitive testing of survey items with a representative sample of interviewees.
The findings will be used for the Paperwork Reduction Act clearance needed for the survey.

Methodology
PURPOSE OF COGNITIVE TESTING
The purpose of cognitive testing of data collection instruments is to reduce respondent-related
measurement error by evaluating the quality of questions and the extent to which the
questions solicit the desired information. The goal for cognitive testing of the CARES Act Funding
Survey is to ensure future respondents interpret the items as NASAA intends and the survey ultimately
yields data that is easy to manage and straightforward to interpret.

COGNITIVE TESTING SAMPLE: CHARACTERISTICS AND RECRUITMENT
Throughout this report, “participant” refers to the eight SAA representatives who participated in
interviews. “Grantee” refers to the entities that received funding from the SAAs. One representative
from a regional arts organization (RAO) was also contacted to provide feedback on the survey instrument.
NASAA developed a pool of participants for cognitive testing, with the sample representing a range of
SAA characteristics including:

• SAA geographical location;
• SAA budget size;
• Staff position with SAA (e.g., Executive Director, Deputy Director, grants officers);
• Methodology for collecting data; and
• How CARES Act funds were administered.

NASAA and ProgramWorks collaborated to recruit and schedule eight participants for cognitive testing.
Each participant was assigned to one of two ProgramWorks interviewers. In addition to the SAA
representatives, an RAO representative was invited to provide feedback on the instrument.

COGNITIVE TESTING PROTOCOL
Cognitive testing participants completed the 12-item CARES Act Funding Survey online.
Simultaneously, they completed the Interviewee Recording Form (see Appendix), responding to five
prompts for each numbered survey item:
1. Is the data/information available?
2. Can you answer the question with the available data/information?
3. Comprehension: Is the question fully understandable?
4. My confidence in accurately answering this is: (rating scale)
5. I have additional comments.

After completing the survey and the Interviewee Recording Form, each participant submitted the
recording form to the PW interviewer. PW also obtained the survey data entered by the participants.
Prior to each interview, PW reviewed the participant’s Interviewee Recording Form and survey data and
identified items for targeted follow-up questioning. This included items where:
• The data/information was unavailable, or the question could not be answered with the available data;
• The participant reported a problem with comprehension;
• The participant did not have full confidence in accurately answering the item;
• The participant indicated they had comments; and/or
• The participant’s survey data did not meet expectations for content.
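This flagging logic is mechanical enough to express in a few lines of code. The sketch below is purely illustrative: the field names and the use of the form's 4-point confidence scale are assumptions, and the report does not state that PW automated this review.

```python
# Illustrative sketch of the flagging criteria (hypothetical field names;
# the report does not say this review was automated).
def needs_follow_up(entry: dict) -> bool:
    """Return True if a recording-form entry meets any flagging criterion."""
    return (
        entry["available"] != "Yes"          # data/information unavailable
        or entry["answerable"] != "Yes"      # cannot answer with available data
        or entry["comprehension"] != "Yes"   # reported comprehension problem
        or entry["confidence"] < 4           # less than full confidence (1-4 scale)
        or entry["has_comments"]             # participant left comments
        or not entry["data_as_expected"]     # survey data did not meet expectations
    )

# Example: full marks everywhere except a confidence rating of 3.
print(needs_follow_up({
    "available": "Yes", "answerable": "Yes", "comprehension": "Yes",
    "confidence": 3, "has_comments": False, "data_as_expected": True,
}))  # True: confidence below the top rating triggers follow-up
```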

During the interview, PW and the participant reviewed each item. First, the participant read the item
silently. If the item was not flagged for follow-up questioning, PW asked the participant if they had any
comments or questions about the item. If the item was flagged for questioning, PW followed up with
prompts such as:
• What information do you need to answer that question?
• How did you arrive at that answer?
• Why is the confidence rating for this item 1/2/3?
• Can you rephrase the question (or response choice) in your own words?
• Please describe your thinking.
• What makes that item confusing/ambiguous/difficult to answer?
• How could that question be improved?
• Do you have any additional comments?


After reviewing all items, the participant was asked to comment on their experience with the survey
format, including the ease of accessing and completing the survey and whether any other issues arose.
PW made notes of the participants’ responses throughout the interview.

Analyses and Results
PARTICIPANT ACCESS AND USE OF THE ONLINE SURVEY
Participants found the survey platform easy to use, and no technical issues arose when accessing or
completing the survey. Participants recommended alerting future respondents, in advance, to the
information that will be requested, either through an option to preview the survey online or by
circulating the items beforehand. They also recommended allowing respondents to stop and restart the
survey and to return to previous questions.
TIME TO COMPLETE THE SURVEY
Participants reported the survey took 29.4 minutes, on average, to complete, with a range of 5 to 90
minutes. Those who reported longer times said they needed to consult with colleagues and/or extract
data. Those who reported shorter times acknowledged that it would take longer to complete the survey
when they had all of their data to report. All participants reported that additional time will be needed to
extract information from spreadsheets, applications, and/or final reports. Time estimates for extracting
and compiling this data ranged from 2 to 35 hours. As noted in the Results section, some participants
were not certain they could provide the requested data, completely and accurately, based on their
existing data.
ANALYSES
As noted above, the Interviewee Recording Forms guided the interviews for cognitive testing. Data from
the eight Interviewee Recording Forms are aggregated in Table 1. As this table shows, there was
considerable diversity in response to the prompts, with no discernible pattern across items.
PW reviewed participants’ interview answers for each item to identify both common and unique
responses regarding the construction of the survey, overall, and of the individual survey items.
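As one illustration of how counts like those in Table 1 can be produced, the sketch below tallies recording-form responses per item with pandas. The column names and rows are invented for illustration; this is not a procedure the report attributes to PW.

```python
# Hypothetical sketch: tally recording-form responses per survey item.
import pandas as pd

# One row per participant per survey item (values mirror the form's options).
forms = pd.DataFrame([
    {"item": 1, "available": "Yes", "understandable": "Yes"},
    {"item": 1, "available": "Yes", "understandable": "Yes"},
    {"item": 2, "available": "No", "understandable": "There's a problem"},
    # ... one row per participant per item ...
])

# Counts per item for the availability prompt, as in Table 1's first columns.
print(pd.crosstab(forms["item"], forms["available"]))
```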


Table 1.
Aggregated Participant Responses from the Interviewee Recording Form

INTERVIEWEE RECORDING FORM: AGGREGATED PARTICIPANT RESPONSES (N = 8*)

          Is this data/        Can you answer the      Comprehension: Is     I have
          information          question with the       the question fully    additional
          available?           available data/info?    understandable?       comments.
Item #    Yes   No   N/A       Yes   No                Yes   Problem         Checked
1         8     -    -         8     -                 8     -               8
2         8     -    -         8     -                 7     1               4
3         3     4    -         3     5                 7     1               7
4         1     4    2         2     6                 7     1               4
5         3     5    -         3     5                 7     1               4
6         3     5    -         3     5                 5     1               4
7         6     2    -         4     4                 5     2               4
8         7     1    -         7     1                 8     -               6
9         8     -    -         8     -                 7     1               5
10        4     1    2         4     3                 4     4               4
11        5     3    -         6     1                 7     -               3
12        2     2    2         6     2                 8     -               7

*Where an item total within a column does not equal 8, one or more participants did not provide a response.

RESULTS
Aggregated analyses of the Interviewee Recording Forms, survey data, and interviews identified 1) six
factors that adversely affected participants’ responses across multiple items and 2) challenges related to
the interpretation of individual items.
The six factors that adversely affected participants’ responses across multiple items pertained to a
disconnect between the data requested in the survey and the data that SAAs have collected. Those
factors are summarized here, rather than at the item level, because they affect multiple items. Given
the approach to distributing the funding to the SAAs, this information suggests a few item changes but
may be most relevant for interpreting the survey data.

1. The Arts Endowment did not specify data collection requirements for the CARES Act funding.
All participants reported that they could have provided the requested data, had they been
informed of the requests at the outset. This would have enabled them to include the data points
in the grantee applications or final reports. As an aside: Participants believed there were
minimal data collection requirements because the Arts Endowment made every effort to
distribute the funding quickly, and there was great appreciation for that.
2. Participants were unsure how to address CARES Act funding received from other sources.
Participants questioned whether they should include CARES Act funding that SAAs and grantees
received from sources other than the Arts Endowment. Further, they believed it would be difficult
to distinguish CARES Act funding from different sources in grantee data.


3. Isolating CARES Act funding in grantee data is difficult. Participants indicated it may be difficult
to isolate the utilization and impact of CARES Act funding, depending on the approach to
recording funding and expenditures in the final reports.
4. The data are not yet available, as the deadlines for grantee final reports are in May or June.
SAA grantees typically submit annual final reports then, at which time the data will become
available. For now, this limits reporting on the creation and retention of staff positions, as well
as expenditures on facilities/infrastructure.
5. Current SAA data requirements for CARES Act funding recipients do not align with data
requested in the survey. While all SAAs collect data from CARES Act funding recipients, it may
not align with the data requested in the survey. One participant said, “The ‘total dollar amount
that your grantee invested in facilities/infrastructure [item 7],’ we did not ask the question that
way. I don’t know if we will get that specific information in the final reports.” Some SAAs collect
categorical data (check boxes) for funding in sub-categories of infrastructure and operations,
such as rent or utilities, but do not track dollar amounts. They can determine how the funds are
used but are unable to estimate the amount of CARES Act funding used for
facilities/infrastructure. As a result, those participants entered 0 for item 7.
There are similar issues for the survey items that address staff positions (items 3 through 6).
One person commented, “We asked about the numbers of people laid off, the total loss of
revenue, cancelled programs. We asked for losses, not retention.” While they will have data on
staffing levels before and after the CARES Act funding, they did not ask whether the retention or
addition of staff is a result of that funding. Other participants asked grantees whether the
funding supported jobs, but they did not track the number of jobs, differentiate between
created and retained positions, or differentiate between full-time and part-time positions.
Some SAAs also reported challenges related to the type of entity that received CARES Act
funding. For example, one SAA distributed funds to organizations to support infrastructure and
employees but considered it an “infrastructure” grant and did not collect data on staffing. They also
provided grants to individual artists for “loss of gross income” based on tax returns. This data for
individual artists is available, so they reported “Yes” on item 8 and could provide the data for
item 9 about the artists supported. However, they did not report this as retained positions in
items 5 and 6 because the funding did not go to organizations. Reflecting on this, the participant
said, “We gave grants to artists and organizations. This survey seems to be only looking at
organizations. Those who make grants to both can’t parse this out. I actually collected more
information from the artists than the organizations, but there is no useful way to provide that
information in this survey.”
6. Data may exist but are not easily accessed. For example, three SAAs reported that data for
creation and retention of staff positions (items 4 and 6), the amount of funding grantees
invested in facilities or infrastructure (item 7), and specific benefits (item 11) will be embedded
in narrative sections of grantees’ final reports. Extracting the data would require considerable
staff time and may ultimately not be reliable. Two participants believed they could, at best,
create estimates based on narrative and numerical data. One participant’s response reflected
those of several others:
“The survey was easy. However, I could get frustrated because we weren’t given guidance
as to what data to collect. We could track jobs and facilities on the front end, and it would
have been helpful for them. We could go back and tweak the data to support the
impact…We have it, but we don’t have an easy way to get it. Anecdotally, we know that
this had an impact on retaining jobs and paying bills. It’s just hard to pull that in a clean,
easy way because we did not know we were reporting on it.”
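To illustrate why pulling such figures out of narrative reports is slow and error-prone, consider a keyword scan like the hypothetical sketch below. The sample text and patterns are invented; candidate matches would still need human review, consistent with participants' doubts about reliability.

```python
# Hypothetical sketch: scan narrative final-report text for staffing figures.
import re

narrative = (
    "The grant allowed us to retain two full-time staff, and we "
    "rehired 3 part-time instructors once programs resumed."
)

# A number (digit or spelled out) within ~40 characters of a staffing keyword.
pattern = re.compile(
    r"\b(\d+|one|two|three|four|five)\b[^.]{0,40}"
    r"\b(staff|positions?|employees?|instructors?)\b",
    re.IGNORECASE,
)
for match in pattern.finditer(narrative):
    print(match.group(0))  # e.g., "two full-time staff"
```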
It is important to note that some SAAs used their standard process and data collection systems for
existing grants programs to administer the CARES Act funding. They reported that these systems are
more likely to include the variables represented in the survey (e.g., separate data points for contractors
and staff positions), which will enhance their ability to provide valid and reliable data.
Table 2 summarizes the results of the cognitive testing by item number. Items with asterisks have
corresponding recommendations for revisions in the Recommendations section (see Table 3). The
information in Table 2 provides the rationale for the recommendations in Table 3.
Table 2.
Survey Item – Results of Cognitive Testing

Intro
Understandable and straightforward.

Item 1
Item is understandable and straightforward.

Item 2*
Participants had questions/comments about formulating an answer.
• The item allows multiple responses but does not indicate “select all that apply” or “select only one.” That information should be included.
• If only one response is allowed, provide criteria for determining the top selection; for example, base the selection on the total amount of funding, total number of grants, or other parameter.

Item 3*
Item is understandable and straightforward.
• Data may not be readily available or valid.

Item 4*
Item is understandable and straightforward.
• Data may not be readily available in requested categories.
• Data may be available but not fit the categories.
• One participant was uncertain about the meaning of “match” for the fourth entry field and suggested simplifying to “…numbers for the above categories…”
• To improve interpretation of the last option (the estimate), participants recommended adding a field to explain how the estimate was calculated.

Item 5*
Item is understandable and straightforward.
• Data may not be readily available or valid.

Item 6*
Item is understandable and straightforward.
• Data may not be readily available in requested categories.
• Data may be available but not fit the categories.
• One participant was uncertain about the meaning of “match” for the fourth entry field and suggested simplifying to “…numbers for the above categories…”
• To improve interpretation of the last option (the estimate), participants recommended adding a field to explain how the estimate was calculated.

Item 7*
Item is understandable, but participants had questions/comments about formulating an answer.
• Does the item refer exclusively to CARES Act funding received directly from the Arts Endowment, or does it include Arts Endowment CARES Act funding from other sources?
• What does “invested” mean? For example, is paying rent an investment?
• Is there a way to report the number of organizations using funds for infrastructure if the dollar amount was not collected?

Item 8*
Item is understandable, but participants had questions/comments about formulating an answer.
• All eight participants reported collecting qualitative and/or quantitative data.
• What does “impact” mean? How should the question be answered if additional data were collected for the Arts Endowment CARES Act funding, but it was not specifically impact data?

Item 9*
Item is understandable, but participants had questions/comments about formulating an answer.
• The question was deemed “too broad,” and the type of data requested is unclear. Participants suggested adding examples and specificity.
• Data may be qualitative/narrative only.

Item 10*
Item is understandable, but participants had questions/comments about formulating an answer.
• Participants requested guidance about the type and format of data that should be submitted to ensure it is useful. For example, SAAs may have extensive narrative data or a large spreadsheet that, in raw form, would require considerable scrutiny by NASAA. Alternatively, some SAAs do not have resources to curate or extract those data.
• Participants requested guidance about reporting data from grantee applications versus final reports.

Item 11*
Item is understandable, but participants had questions/comments about formulating an answer. These related to the line items:
• “Maintained facilities” – Does this mean “was able to retain facility use” or “physical maintenance”?
• “Added to other relief efforts” – There was uncertainty about the scope/intention of this item. If considered broadly, most will answer “agree.”
• “Helped support artists” – Participants were confused about this item, noting that 1) funds could not directly go to individual artists and 2) if the item includes both direct and indirect support, all SAAs will answer “agree” based on inference.
• Items referring to types of “programs” – Participants wondered about the relevance, as funding was intended for operating support and not programs. If the item includes both direct and indirect support, all SAAs will answer “agree” based on inference. Two SAAs recommended deletion for that reason.

Item 12
Item is understandable and straightforward.

Recommendations
Recommendations for changes to the survey items are provided in Table 3. These changes are intended
to increase the accuracy, consistency, and validity of the data, ultimately making it more interpretable.
For some items, the recommendations include specific changes. Other items will require NASAA to
consider the intention behind the item and the choice of language. For those items, considerations are
offered.
Table 3.
Recommendations for Survey Item Revisions

Item 2
Original wording: The item does not indicate “select only one” or “select all that apply.” It currently allows selection of multiple responses.
Recommended revision: Add “select all that apply” OR add “select only one,” adjust the settings to allow only one response, and provide criteria for determining the top selection. For example, base the selection on the total amount of funding, total number of grants, or other parameter.
Reason for change: Make the expectation for the response fully clear to increase accuracy of responses and consistency across respondents.

Items 3, 5
Original wording: The item responses include “yes,” “no,” and “I don’t know.”
Recommended revision: Add a response option: “Data were collected but lack validity.”
Reason for change: This option allows respondents to indicate the data has been collected but lacks validity. This will assist with interpretation of the data in items 4 and 6.

Items 4, 6
Original wording: “If you don’t have the numbers that match the above categories…”
Recommended revision: “If you don’t have the numbers for the above categories…”
Reason for change: Simplify wording to remove uncertainty about “match.”

Items 4, 6
Original wording: “…please include an estimate of full-time equivalent positions.”
Recommended revision: “…please include an estimate of full-time equivalent positions and explain how the estimate was calculated.”
Reason for change: This information increases interpretability.

Item 7
Original wording: What was the total dollar amount that your grantees invested in facilities/infrastructure, using Arts Endowment CARES Act funding?
Recommended revision: Either in the survey introduction (recommended, as it applies to multiple items) or in the item, specify whether the item/survey refers exclusively to CARES Act funding received directly from the Arts Endowment, or also includes CARES Act funding from other sources.
Reason for change: Make the expectation for the response fully clear to increase accuracy of responses and consistency across respondents.

Item 7
Original wording: Same as above.
Recommended revision: Consider synonyms such as “used for” or “committed to” to remove the allusion to actual investment.
Reason for change: Clarify or change terminology to increase comprehension as well as accuracy of responses and consistency across respondents.

Item 7
Original wording: Same as above.
Recommended revision: Consider also collecting data on the number of organizations that received money for infrastructure.
Reason for change: Some SAAs may not have data on dollar amounts. The number of organizations provides an alternate statistic.

Item 8
Original wording: “…did your agency collect any data on the impact of Arts Endowment CARES Act funding?”
Recommended revision: Determine whether the item focuses specifically on impact, or whether it includes any/all data related to the Arts Endowment CARES Act funding, and adjust wording.
Reason for change: Clarify or change terminology to increase comprehension and to increase accuracy of responses and consistency across respondents. Provide examples of the type of data requested.

Item 9
Original wording: “If yes, …”
Recommended revision: Provide examples.
Reason for change: Clarify to increase comprehension as well as accuracy of responses and consistency across respondents.

Item 10
Original wording: “…please attach any grantee-level data that you collected specifically for the grants that included Arts Endowment CARES Act dollars.”
Recommended revision: Provide guidance on the type and format of data that will be useful to NASAA, with examples. If guidance is provided, consider deleting “any.”
Reason for change: Make the expectation for the response fully clear to minimize burden on SAAs and increase the likelihood that the data NASAA receives will be usable. Note that some requests may require considerable time for SAAs to format/curate the data.

Item 11
Original wording: “Maintained facilities”
Recommended revision: Edit to clarify. For example, “Maintained facilities (physical maintenance)” or “Maintained facilities (physical maintenance and/or retained access to facilities).”
Reason for change: Clarify or change terminology to increase comprehension as well as accuracy of responses and consistency across respondents.

Item 11
Original wording: “Added to other relief efforts”
Recommended revision: Edit or add examples to clarify.
Reason for change: Clarify or change terminology to increase comprehension as well as accuracy of responses and consistency across respondents.

Item 11
Original wording: “Helped support artists”
Recommended revision: Edit to clarify “support.” For example, “Provided direct support to artists,” “Helped support artists indirectly,” or “Helped support artists through __X___.”
Reason for change: Clarify or change terminology to reduce inference and increase comprehension as well as accuracy of responses and consistency across respondents.

Item 11
Original wording: “…continued delivery of…arts education programs…community arts programs…new content…”
Recommended revision: No recommended edits.
Reason for change: Noted here to flag for interpretation or deletion, as the responses are likely based on inference.

Appendix


INTERVIEWEE RECORDING FORM

INSTRUCTIONS: For each item, please indicate the following.
• Is the data/information available? Do you (or will you) have the data or information requested? (Yes / No / Not applicable)
• Can you answer the question with the available data/info? Does the available data/info enable you to answer the question? (Yes / No)
• Comprehension: Is the question fully understandable? Consider the wording and all terms. Check “there’s a problem” if you have any questions or recommendations for the question or response choices. (Yes / There’s a problem)
• My confidence in accurately answering this is: Rate how confident you are in answering the question accurately. (1 = Low, 2, 3, 4 = High)
• I have additional comments. Check the box if you have any additional questions, concerns, comments, etc. No need to detail them here, but we recommend making notes to remind yourself. (Check if true)

Interviewee name: ________________
Time to complete survey: ________________

          Is this data/        Can you answer the      Comprehension: Is     My confidence in      I have
          information          question with the       the question fully    accurately answering  additional
          available?           available data/info?    understandable?       this question is:     comments.
Item #    Yes / No / N/A       Yes / No                Yes / Problem         1 (Low) to 4 (High)   Check if true
1
2
3
4
5
6
7
8
9
10
11
12

Now that you are done, what are your reactions or thoughts overall?

SURVEY INSTRUMENT

CARES Act Funding Impact Survey v0
In spring 2020, as a response to the COVID-19 pandemic, Congress passed the Coronavirus
Aid, Relief, and Economic Security (CARES) Act, which packaged numerous relief efforts for
the American public. The CARES Act included $75 million in funds distributed through the
National Endowment for the Arts. By law, 40% of those dollars were allocated to state arts
agencies and regional arts organizations. Your agency received this Arts Endowment
CARES Act funding as a supplement to your FY19 Partnership Agreement dollars. No new
reporting requirements were associated with the provision of those funds; however, we know
that many states attempted to track the impact of the CARES Act funds. In order to better
understand the impact of these specific Arts Endowment CARES Act dollars on a national
level, the National Assembly of State Arts Agencies (NASAA) and the National Endowment for
the Arts are administering this short survey. Please answer the following questions to the
best of your ability, based on information you have received from your grantees. This
survey will take about 5 minutes to complete, and you will be able to save and return to it
later if needed. We ask that you answer all questions to the best of your ability and to the
fullest extent possible. All results will be anonymized and reported in the aggregate.
We greatly appreciate both your efforts to answer these questions and your hard work in
allocating and reporting these funds in the first place. Please submit your response by
DATE and direct any questions to Patricia Mullaney-Loss at [email protected].


1. Please select your state:
Alabama
Alaska
American Samoa
Arizona
Arkansas
California
Colorado
Connecticut
Delaware
District of Columbia
Florida
Georgia
Guam
Hawaii
Idaho
Illinois
Indiana
Iowa
Kansas
Kentucky
Louisiana
Maine
Maryland
Massachusetts
Michigan
Minnesota
Mississippi
Missouri
Montana
Nebraska
Nevada
New Hampshire
New Jersey
New Mexico
New York
North Carolina
North Dakota
Northern Mariana Islands
Ohio
Oklahoma
Oregon
Palau
Pennsylvania
Puerto Rico
Rhode Island
South Carolina
South Dakota
Tennessee
Texas
Utah
Vermont
Virgin Islands
Virginia
Washington
West Virginia
Wisconsin
Wyoming


2. How did your state administer Arts Endowment CARES Act funding?
As stand-alone grants, using only Arts Endowment funds
As a supplement to previously allocated General Operating Support
(GOS) grant funds
As a supplement to previously allocated grant funds other than GOS
As part of an emergency relief funding grant that included a mixture of Arts
Endowment CARES Act funds and other funds
Other - Write In


3. Did your agency track how many staff positions your grantees were able to
create as a result of Arts Endowment CARES Act funding?
Yes
No
I am not sure

4. If yes, please report the total number of staff positions your grantees
created as a result of Arts Endowment CARES Act funding (entries must be numeric):
Fulltime:
Part time:
Contractors:
If you don’t have numbers that match the above categories, please include
an estimate of fulltime equivalent positions:


5. Did your agency track how many staff positions your grantees were able to
retain as a result of Arts Endowment CARES Act funding?
Yes
No
I am not sure

6. If yes, please report the total number of staff positions your grantees
retained as a result of Arts Endowment CARES Act funding (entries must be numeric):
Fulltime:
Part time:
Contractors:
If you don’t have numbers that match the above categories, please include
an estimate of fulltime equivalent positions:

7. What was the total dollar amount that your grantees invested in
facilities/infrastructure, using Arts Endowment CARES Act funding?
(Entry must be currency.)


8. Apart from information about staff positions created or retained or
investments in facilities, did your agency collect any data on the impact of
Arts Endowment CARES Act funding?
Yes
No
I am not sure


9. If yes, what data did you collect related to the impact of Arts Endowment
CARES Act funding?

10. If yes, please attach any grantee-level data that you collected specifically
for grants that included Arts Endowment CARES Act dollars.
Browse...
(Accepts up to 3 files. Allowed types: png, gif, jpg, jpeg, doc, xls, docx, xlsx, pdf, txt,
mov, mp3, mp4. Max file size: 10 MB)


11. Arts Endowment CARES Act funding allocated through my agency
allowed for the following benefits in my state:
(For each row, select one: Agree / Neither Agree nor Disagree / Disagree / Not applicable)
Retained jobs
Created jobs
Maintained facilities
Added to other relief efforts
Helped support artists
Helped support our state’s cultural infrastructure
Helped leverage local, state, and/or private support
Allowed for the continued delivery of arts education programs
Allowed for the continued delivery of community arts programs
Allowed for the continued delivery of new content using digital technology
Assisted organizations to sustain themselves while shifting to alternative/online content delivery and programming
Enter another option
Enter another option
Enter another option


12. Is there anything else you would like to relay about the importance,
impact, or challenges related to Arts Endowment CARES Act funding?

Thank You!
Thank you for taking the CARES Act Funding Impact Survey. If you have any questions,
please contact [email protected].

