Supporting Statement for the Evaluation of the Poetry Out Loud 
Program, Part A 

Table of Attachments
ATTACHMENT A: CONSENT AND ASSENT FORMS
ATTACHMENT B: PRINCIPAL AND SCHOOL DISTRICT SUPERINTENDENT
LETTER OF SUPPORT
ATTACHMENT C: DATA SHARING AGREEMENT
ATTACHMENT D: PRE- AND POST-SURVEY
ATTACHMENT E: POL STUDENT INTERVIEW PROTOCOL
ATTACHMENT F: POL TEACHER INTERVIEW PROTOCOL
ATTACHMENT G: STATE ARTS AGENCY ADMINISTRATOR INTERVIEW
PROTOCOL
ATTACHMENT H: STUDENT FOCUS GROUP PROTOCOL
ATTACHMENT I: COGNITIVE TESTING REPORT

Part A. Justification
1. Explain the circumstances that make the collection of information necessary.
This study is a new data collection request; the data to be collected are not available from any other source. The data collection activities are
planned for September 2018 through June 2019. The study will provide the National Endowment
for the Arts (NEA) a better understanding of student-level outcomes associated with the Poetry
Out Loud program.
Founded in 2005, Poetry Out Loud (POL) is a national arts education program, implemented annually, that encourages the study of great poetry. The program consists of a tiered poetry recitation competition offered to high schools across the country and supported by free educational materials. Beginning at the classroom level, typically during the fall semester, winners advance to a school-wide competition, then to a regional competition (if implemented in the state), then to a state competition, and ultimately to the national finals in Washington, DC, held in late April or early May. The program is a partnership among the National Endowment for the
Arts (NEA), the Poetry Foundation,1 and the state and jurisdictional arts agencies of the United
States. POL serves more than 3 million students and 50,000 teachers from 10,000 schools in
every state plus Washington, DC, the US Virgin Islands, and Puerto Rico.
Information about the competition and instructional resources is provided through the
Poetry Out Loud website (poetryoutloud.org). Participating teachers use the Poetry Out Loud

1 The Poetry Foundation, publisher of Poetry magazine, is an independent literary organization committed to a vigorous presence for poetry in our culture.

toolkit (including the Teacher’s Guide and classroom posters) and online resources (including
lesson plans, learning recitation videos, and information on how to run a competition) to teach
poetry recitation and run classroom competitions. Students select, memorize, and recite poems
from an online anthology of more than 900 classic and contemporary poems. Information on
evaluation criteria and judging is also publicly available on the website.
Poetry Out Loud is generally implemented in schools and classrooms in one of two ways: requiring mandatory student participation or allowing students to participate voluntarily. Mandatory participation means that a teacher requires his or her entire class(es) to participate in the Poetry Out Loud program; some schools may additionally require grade-level or even school-wide participation. In contrast, some schools may opt to have students participate voluntarily, meaning that students self-select to participate in Poetry Out Loud, whether in the classroom or in an after-school club.
Each organizing partner makes significant contributions to program planning and implementation. Each year, the NEA and the Poetry Foundation collaboratively develop or update the content and design of all Poetry Out Loud program materials (including the Teacher’s Guide, anthology, poster, and website); coordinate and provide technical assistance to program managers at the state arts agencies; plan the Poetry Out Loud National Finals; and invest in expanding the program’s reach to new audiences. The NEA provides funding to state arts agencies to implement the program and to run the national finals, as well as support and resources for state and local-level partners, teachers, and students. The Poetry Foundation provides funding for the program’s prizes, travel, permissions, website, materials, and distribution of materials, in addition to support and resources for state and local-level partners, teachers, and students. Each state arts agency is responsible for administering Poetry Out Loud in its state. This includes

publicizing the program, recruiting schools to implement Poetry Out Loud in the classroom, and
conducting a state competition. Each state arts agency receives an NEA grant of $17,500 to assist
with expenses of Poetry Out Loud program coordination.
The study supports the Agency’s FY 2018-2022 Strategic Plan, which seeks in part to
“expand and promote evidence of the value and impact of the arts for the benefit of the American
people” (Strategic Objective 3.2). The current evaluation study will be the first since 2008. The
prior implementation evaluation, which was commissioned by the Poetry Foundation, focused on
the reach, support, and engagement with POL by students and participating schools, providing
compelling evidence that the program had continued to grow over the course of three years and to reach increasingly diverse students, rural schools, and schools with and without existing strong arts programs. Additionally, the evaluation found that POL helped to facilitate both the engagement and retention of teachers by providing them with resources to bolster existing curricula.
With respect to student-level outcomes, the evaluation focused largely on poetry appreciation
and engagement. However, since the evaluation engaged only state-level POL student
champions, these study findings are not assumed to be representative of POL participants in
general.
The current evaluation was requested by NEA senior leadership and program partners
who seek to build upon the past evaluation by increasing understanding of POL’s impact on
student participants. Specifically, agency and partner staff expressed interest in understanding
the impact of POL on students who had not volunteered to participate – that is, students whose
teachers required their participation in POL (“mandatory student participation”) – in order to
reduce or eliminate the bias associated with self-selection. The study will focus on assessing student outcomes in poetry appreciation and engagement, as well as student-level outcomes associated with social and emotional development and academics. In order to more fully understand the impact of POL, a quasi-experimental design was sought that establishes a comparison group of students who did not participate in POL.
Program managers are also interested in understanding the effectiveness of the program
when it is implemented under conditions promoted by the POL partners as optimal. The current
study is structured as an efficacy study in order to examine the student-level benefits of this
program under these optimal conditions. Because POL programming varies across schools and not
all schools that implement POL do so under optimal conditions, the present study is not intended to
be representative of the entire universe of schools implementing POL.

The study design is grounded in the agency’s understanding of the program, as reflected in the
POL logic model. This logic model (see Figure 1), which was updated in 2017 following a
review of the 2008 evaluation study and a literature review, was based on the following theory of
change: by providing educators and students with access to comprehensive poetry resources, engaging students with thoughtful curricula that encourage the performative aspect of poetry,
and creating a national competition structure to challenge students and celebrate their
accomplishments, the NEA expects that by participating in Poetry Out Loud: students’ academic
and performance skills are strengthened, students’ social and emotional health improves,
teachers’ knowledge of and confidence in teaching poetry increases, and students and their
community’s awareness and appreciation of poetry and arts programming increases.2

2 While the POL logic model signals the program’s intent to positively affect teachers and communities, NEA senior leaders and program managers focused the current study on student-level outcomes only, since the impact on students was considered of primary importance.

Figure 1: Poetry Out Loud Logic Model

2. Indicate how, by whom, and for what purpose the information is to be used. Except
for a new collection, indicate the actual use the agency has made of the information
received from the current collection.
A2.1 Study Overview
The study will focus on assessing student outcomes in poetry appreciation and
engagement, as well as on social and emotional development and academic achievement, using a
rigorous quasi-experimental design combined with qualitative data collection and analysis of
program implementation in a purposive sample of 10 schools. Figures 2, 3, and 4 outline the
research questions, outcomes, constructs, indicators, and data sources for this study. Each matrix
represents one of the following three domains: (1) student academic engagement and
performance; (2) student socio-emotional development; and (3) student poetry appreciation.

Figure 2: Poetry Out Loud Academic Engagement and Performance Evaluation Matrix
Outcomes addressed: Learn/Engage; Analytical Cap; Lit History; ELA Proficiency. Data sources: student surveys, student interviews, teacher interviews, administrator interviews, and student records.

Research question 1: Does POL have a positive impact on students’ reading comprehension and/or analytical skills (particularly regarding poetry)?
- Constructs: reading comprehension; analytical skills reading poetry
- Indicators: scale scores on standardized tests of reading comprehension

Research question 2: Are POL students more likely to be comfortable using metaphor, simile, or a wider vocabulary in writing or in speaking after the program?
- Constructs: comfort with different poetry forms and devices; vocabulary development
- Indicators: relevant results from interviews

Research question 3: Does student participation in POL correlate with increased academic engagement in English classes and/or in school more generally?
- Constructs: academic achievement in English classes and in school; academic engagement in English classes; academic engagement in school; academic motivation in school; post-high school aspirations
- Indicators: standardized ELA scale scores; standardized ELA proficiency scores; relevant ELA assessments; student GPA; number of absences; number of suspensions; relevant results from interviews and surveys

Figure 3: Poetry Out Loud Social and Emotional Development Evaluation Matrix
Outcomes addressed: Confidence; Sense of Self; Community Engagement; Art Prog. Data sources: student surveys, student interviews, teacher interviews, administrator interviews, and student records.

Research question 1: Do students experience increased self-confidence in their public speaking abilities, social skills, intellectual abilities, or in general after participating in POL?
- Constructs: self-confidence
- Indicators: scaled survey scores related to confidence in public speaking; relevant results from interviews

Research question 2: Do students feel more secure, empowered, and/or articulate in expressing themselves after participating in POL?
- Constructs: self-confidence; empowerment
- Indicators: scaled scores related to comfort with self-expression; relevant results from interviews

Research question 3: Are students more likely to engage in civic activities during or after participation in POL?
- Constructs: civic engagement and leadership
- Indicators: survey scores related to participation in community activities; survey scores related to involvement in student leadership; relevant results from interviews

Research question 4: Are students more likely to engage in extracurricular activities during or after participation in POL?
- Constructs: in- and out-of-school engagement
- Indicators: survey scores related to participation in extracurricular activities, school clubs, and/or after-school programs; relevant results from interviews

Figure 4: Poetry Out Loud Poetry Appreciation and Engagement Evaluation Matrix
Outcomes addressed: Poetry Exposure; Arts Appreciation; Exposure to SAA/NEA/PF. Data sources: student surveys, student interviews, teacher interviews, administrator interviews, and student records.

Research question 1: Does participating in POL correlate with students’ increasing their likelihood of reading or writing poetry for pleasure?
- Constructs: behaviors related to reading poetry; behaviors related to writing poetry
- Indicators: agreement with reading poetry; agreement with writing poetry; relevant results from interviews and surveys

Research questions 2 and 3: Does POL promote the sharing of poems among students and, if so, by what means? Do students talk about poetry or POL on social media networks after participation versus before?
- Constructs: sharing poetry with peers; sharing poetry via social media (Facebook, Instagram)
- Indicators: frequency scale of poetry exchanges by social media type; relevant results from interviews and surveys

Research question 4: Does a teacher’s or a school’s participation in POL correlate with greater incorporation of poetry in classroom/school instruction?
- Constructs: increased poetry content in curriculum
- Indicators: frequency scale of poetry inclusion in curriculum; relevant results from interviews

Research question 5: Does POL participation correlate with any attitudinal changes toward poetry, academics, public speaking/performing, or post-high school aspirations?
- Constructs: attitudes toward poetry; attitudes toward public speaking; post-high school aspirations
- Indicators: scale of attitude toward poetry; scale of comfort with public speaking; attitude about finishing high school; percentage planning to go to college; relevant results from interviews

The quasi-experimental design will include pre- and post-student surveys of POL
program participants and non-participants and statistical analyses of student record data (e.g., test
scores, attendance) for all students in the 10 schools selected for the study. In the analyses of
student record data, we plan to use propensity score matching to construct a comparison group of
non-POL participants who are similar to the students who participate in POL. Additional details
about this process are later described in Supporting Statement B. In addition, we will conduct
qualitative data collection in 10 schools to help understand POL program implementation and the
experiences of those in the comparison group.
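To illustrate the matching step described above, a minimal sketch follows. It assumes nearest-neighbor matching on a logistic-regression propensity score; the synthetic data and the covariates shown (GPA, absences, grade level) are hypothetical stand-ins for the de-identified fields districts would provide, and this is not the contractor’s actual analysis code.

```python
# Minimal sketch of propensity score matching for the student-records analysis.
# Hypothetical covariates and synthetic data; illustrative only.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
n = 500
records = pd.DataFrame({
    "pol_participant": rng.integers(0, 2, n),  # 1 = participated in POL
    "gpa": rng.normal(3.0, 0.5, n),
    "absences": rng.poisson(4, n),
    "grade_level": rng.integers(9, 13, n),
})
covariates = ["gpa", "absences", "grade_level"]

# 1. Estimate each student's propensity to participate from observed covariates.
model = LogisticRegression().fit(records[covariates], records["pol_participant"])
records["pscore"] = model.predict_proba(records[covariates])[:, 1]

# 2. Match each POL participant to the non-participant with the closest score.
treated = records[records["pol_participant"] == 1]
control = records[records["pol_participant"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
_, idx = nn.kneighbors(treated[["pscore"]])
matched_control = control.iloc[idx.ravel()]

# 3. Outcomes (e.g., ELA scores, absences) would then be compared between
#    `treated` and `matched_control` to estimate associations with POL participation.
print(f"{len(treated)} participants matched to {len(matched_control)} comparison students")
```

In practice, covariate balance between the matched groups would be checked before any outcome comparison is made.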
 To reduce duplicative data collection and combine all data into one dataset, the research
team considered linking the administrative and survey data. However, linking these two datasets
requires a student-level unique identifier that falls under the FERPA definition of personally identifiable information (PII). According to FERPA (34 C.F.R. 99.3), PII includes any information that, alone or in combination, is linked or linkable to a specific student and would allow an individual to identify the student. POL-participating schools receive federal funds and are subject to FERPA. As a result, schools can only disclose PII with prior written consent from the parent or eligible student. Asking for and collecting written parent consent (as opposed to the proposed passive consent approach) from every student would greatly reduce the number of students participating in the study, thus compromising the desired 80% survey response rate and greatly reducing the number of students in the administrative dataset. Educational researchers can use de-identified student information that is not subject to FERPA’s requirements. Therefore, the
study design calls for educational records from schools that do not contain PII. While this means
that the contractor will not be able to link the administrative data to the survey data, obtaining

de-identified student information will enable the study contractor to reach the desired sample
size required for the study’s design (as discussed in Supporting Statement B).
Qualitative data collection at each of the 10 schools will include one 60-minute focus
group with students, four 45-minute student interviews, two 45-minute teacher interviews, and one 30-minute administrator interview (with either a school principal or a State Arts Agency administrator).
The number and duration of interviews described add up to six hours – the typical length of a
school day, which is the length of time planned for site visits. The combination of student focus
groups and interviews should allow us to speak with approximately 100 students and 20 teachers
across study schools.
The contractor will pursue a passive consent strategy.3 That is, parents will be informed
about the study via a letter from the research team that the school assists in distributing to
students. Parents who object to their child participating in the study can return the signed form, and that child will not be included in the study; all others will be included. Youth participating in
interviews will be informed by the interviewer that participating in the interview is voluntary,
and that they can opt out at any time. Youth taking the survey will be notified electronically, on
the first page of the survey, that taking the survey is voluntary and that they can opt out at any
time. Students will receive an individual email with a unique link to the survey. This survey will
be accessible to students on a computer, tablet, or smartphone. Ideally, teachers at each school
will dedicate class time for students to take the survey either using a classroom set of computers

3 Generally speaking, fewer people opt out of opportunities than opt in. Therefore, the risk of nonresponse bias may be lower with the proposed approach because a smaller share of the sample will opt out. Weights will be constructed to adjust for differences between participants and nonparticipants.

or a school computer lab. However, we recognize that each school context is different, and some
students may need to take the survey outside of classroom time. Additional detail on the survey
administration process is provided in Supporting Statement B. Students will provide a verbal or
electronic assent to participate. (See Attachment A for consent and assent forms in English and
in Spanish.)
A2.2 Purpose of the Data Collection
The purpose of this evaluation study is primarily to assess the student-level outcomes
associated with the POL program. The study aims to discern specific student-level outcomes that
are associated with participation in POL. This is an efficacy study. Efficacy studies examine the
benefits of an intervention under optimal conditions for the implementation of the Poetry Out Loud
program in schools. Because POL programming varies across schools and not all schools that
implement POL do so under optimal conditions, the present study is not intended to be representative of the entire universe of schools implementing POL. Instead, the study will observe the outcomes of interest under optimal conditions,4 maximizing the likelihood of observing program effects.

4 Optimal conditions as determined by the Poetry Out Loud program partners are as follows: states should have an overall count of participating students exceeding 2,500; an overall count of participating schools exceeding 20; presence of ancillary activities supporting state finals competitions, direct student exposure to a working artist, and celebratory activities for students and families such as a welcome banquet or reception; formal teacher recognition at the state level; opportunities for winning students to perform at local arts events throughout the state; strong support for the POL program from executive leadership at the state arts agency; workshops for teachers and/or students facilitated by the state arts agency; matching or overmatching of POL grant money with funds from the state arts agency; and an annual program assessment.

A2.3 Who Will Use the Information
The study will provide program participants (i.e., state arts agency administrators, teachers, and students), the NEA, and its partners with information about the benefits of participation in POL. This information will be used to assess the agency’s investment in this program and as an input for programmatic decision-making. Study findings will be shared with the public through various media and press outlets, including but not limited to the NEA website, the POL website, and partner websites.
3. Describe whether, and to what extent, the collection of information involves the use
of automated, electronic, mechanical, or other technological collection techniques or
other forms of information technology.
The NEA takes very seriously its responsibility to minimize burden on respondents and
designed this project with that goal in mind. First, by designing a web-based student survey, the
Agency has eliminated hundreds of hours of labor that would have been required to administer a
paper-based on-site survey. By making the survey web-based, it becomes possible to survey every student at a participating school. Because there are minimal costs associated with adding participants, every student at a participating school will have a chance to answer the survey. Thus, the electronic nature of the student survey provides the most efficient mechanism for the NEA to capture responses from students. The electronic surveys and all communication about the survey will be compliant with Section 508 of the Rehabilitation Act.
In addition, this program evaluation is multi-modal and uses a mixed-method approach,
such as triangulating administrative records with interviews and focus groups to build a
comprehensive overview of the POL program, its implementation, and the impact it has on
student outcomes. The study team will obtain administrative records from school districts

through data sharing agreements, and will also collect qualitative data at up to six schools via
remote video- and tele-conferencing systems. All other data collection will be done in person.
4. Describe efforts to identify duplication. Show specifically why any similar
information already available cannot be used or modified for use for the purposes
described in item 2 above.
There is no similar ongoing data collection being conducted that duplicates the efforts of
the proposed data collection for the study. The current study is designed to complement but not
duplicate the previous implementation evaluation study of the POL program conducted in 2008.
The current evaluation study will be the first since 2008. As noted in the response to question 1,
the prior implementation evaluation focused on the reach, support, and engagement with POL by
students and participating schools. While there were some reported student-level outcomes
related to poetry appreciation and engagement, these study findings are not assumed to be
representative of POL participants in general since the evaluation engaged only state-level POL
student champions. The current study, using a quasi-experimental design, is intended to produce
findings about student-level outcomes across multiple domains (academic engagement and
performance, social and emotional development, and poetry appreciation and engagement) when
the program is implemented under optimal conditions. An exhaustive literature review conducted
by the study team confirmed that there is little research on participation in poetry programs by
high school students.
5. If the collection of information impacts small businesses or other small entities,
describe any methods used to minimize burden.
There are no small business entities or other small entities involved in this data
collection.
6. Describe the consequence to Federal program or policy activities if the collection is

not conducted or is conducted less frequently, as well as any technical or legal
obstacles to reducing the burden.
This is a voluntary, short-term study spanning one school year. The study will provide the NEA with quantitative and qualitative measures to gain valuable insights into the relationships
between the implementation of the POL program under optimal conditions and student level
outcomes. Without this evaluation, the NEA will have no methods for analyzing and assessing
the impact of its program and policy choices.
Conducting the collection less frequently or with fewer POL programs would not only
impede the Agency’s ability to track impact, but would also deprive students of an opportunity to
learn more about the impact of their participation in POL.
7. Explain any special circumstances
The proposed data collection activities are consistent with the guidelines set forth in 5
CFR 1320.6 (Controlling Paperwork Burden on the Public-General Information Collection
Guidelines). There are no special circumstances that require deviation from these guidelines.
The results will not be representative of the entire universe of states that are optimally implementing POL because, within optimally implementing states, there is substantial variation in how POL programs are implemented. Rather, results are representative of programs that have
similar characteristics to the programs implemented by the schools included in the 10-school
sample.
8. Comments in Response to the Federal Register Notice and Efforts to Consult
Outside Agency
On Tuesday, April 6, 2017, a 60-Day Federal Register Notice was published at 73 FR
12746 Volume 82, No. 65. Number of comments received: 1.

8.a Consultations Outside the Agency
The study team conducted a literature review in advance of launching the study to
determine what research existed on poetry education programs and research methods previously
used to evaluate those programs. The study team also assembled a Technical Working Group (TWG) to review and provide input on the study plan. TWG members include arts education researchers,
arts organization administrators, and teachers. That group met in August 2017 on a 90-minute
conference call to provide feedback to the study team on the evaluation plan. This working group
will also provide feedback to the study team on the draft report.
Cognitive testing of the student survey was conducted in October and November 2017.
Section B4 “Pre-Testing of Procedures” provides details on the pre-test of the survey
instruments. See also Attachment I.
9. Explain any decision to provide any payment or gift to respondents, other than
remuneration of contractors or grantees.
No payments or gifts will be given to respondents.
10. Describe any assurance of confidentiality provided to respondents and the basis for
the assurance in statute, regulation, or agency policy.
At the beginning of the survey and all qualitative data collections, participants will
receive written or verbal assurance that their participation is voluntary, that they can opt out at
any time, that their responses will not be reported individually, and that their responses will never be linked to them as individuals. Researchers will combine all participant responses
and report them in aggregate form only. The survey data set provided to NEA at the end of the
study will not contain any personally identifying information (PII)—such as name or address of

respondents—that could permit disclosure or identification of respondents, directly or by
inference. Data for subgroups with cell sizes lower than 10 will be redacted or suppressed.5
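As a simple illustration of this suppression rule, the sketch below masks any reported subgroup count below 10; the subgroup labels and counts are hypothetical, and only the under-10 threshold comes from this supporting statement.

```python
# Minimal sketch of the small-cell suppression rule described above.
# Subgroup labels and counts are hypothetical.
import pandas as pd

counts = pd.DataFrame({
    "subgroup": ["English learners", "Special education", "All students"],
    "n_students": [7, 12, 450],
})

# Report exact counts only when the cell holds at least 10 students.
counts["reported_n"] = [
    str(n) if n >= 10 else "<10 (suppressed)" for n in counts["n_students"]
]
print(counts[["subgroup", "reported_n"]])
```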
SPR will use Survey Gizmo to collect the pre- and post-student survey and will use the
“Anonymous Response” feature to avoid storing identifiable information such as geo-location or
IP addresses. Furthermore, as discussed in the Study Overview, the research team will not be
collecting any PII within the survey. The “Anonymous Response” setting is compatible with
email campaigns. When these two features are used together, Survey Gizmo will track which contacts have not yet completed the survey and send them any reminders that have been set up, but researchers have no visibility into this process and will not be able to tie survey responses to specific email addresses. Upon completion of the project, SPR will ensure the secure destruction of all data originally provided or collected, employing digital or physical shredding of electronic or physical data. When disposing of sensitive electronic data, SPR uses secure deletion software that overwrites disks a minimum of 7 times for reusable media (USB drives and hard drives) and uses physical destruction (cross-cut shredding) for non-reusable media (e.g., CDs/DVDs).
11. Provide additional justification for any questions of a sensitive nature, such as
sexual behavior and attitudes, religious beliefs, and other matters that are
commonly considered private.
Data on student gender, race/ethnicity, free and reduced lunch status, special education, and
English learner status will be collected through this evaluation study. These data are collected to

5 See https://nces.ed.gov/pubs2011/2011603.pdf

assess whether outcomes are affected by students’ demographic characteristics. Race/ethnicity
survey questions comply with OMB standards.
12. Provide estimates of the hour burden of the collection of information.

Participant Description | Instrument or Activity | Number of Participants per Site | Total Participants across 10 Sites | Average Hours per Response | Number of Responses per Person | Total Responses | Estimated Burden (Hours)
Students | Interviews | 4 | 40 | 0.75 | 1 | 40 | 30
Students | Focus group | 6 | 60 | 1 | 1 | 60 | 60
Students | Baseline (Pre) Surveys | 1,800 | 18,000 | 0.25 | 1 | 18,000 | 4,500
Students | Follow-up (Post) Surveys | 1,440 | 14,400 | 0.25 | 1 | 14,400 | 3,600
Parents | Consent | 1,800 | 18,000 | 0.1 | 1 | 18,000 | 1,800
District Staff | Complete MOU | 1 | 10 | 2 | 1 | 10 | 20
Administrators | Interviews | 1 | 10 | 0.5 | 1 | 10 | 5
School/District Staff | Provide administrative data | 2 | 20 | 8 | 2 | 40 | 320
School Staff | Coordinate survey distribution | 1 | 10 | 8 | 1 | 10 | 80
Teachers | Interviews | 2 | 20 | 0.75 | 1 | 20 | 15
TOTAL | | | | | | 50,590 | 10,430

Note that the burden estimate assumes every parent will review the passive consent letter
sent home with students. Additionally, the research team is estimating between two and four individual student interviews per site, since the number of interviews may be affected by actual site visit schedules. Because four interviews is the desired number, we have included this top-end estimate in the burden chart. Note also that the burden estimate for the student follow-up (post) surveys reflects the fact that only students responding to the pre-survey will be invited to participate in the follow-up survey. The response rate for the pre-survey is estimated at 80%.
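For reference, the following sketch recomputes the table’s totals from the per-site figures; the follow-up survey row applies the estimated 80% pre-survey response rate to the 1,800 students per site. It is illustrative only and simply reproduces the arithmetic behind the 50,590 responses and 10,430 burden hours shown above.

```python
# Recompute the burden totals from the per-site figures in the table above.
# Each row: (activity, participants per site, hours per response, responses per person).
rows = [
    ("Student interviews",       4,                 0.75, 1),
    ("Student focus groups",     6,                 1.0,  1),
    ("Baseline (pre) surveys",   1800,              0.25, 1),
    ("Follow-up (post) surveys", int(1800 * 0.80),  0.25, 1),  # 80% of pre-survey takers
    ("Parent consent",           1800,              0.1,  1),
    ("District MOU",             1,                 2.0,  1),
    ("Administrator interviews", 1,                 0.5,  1),
    ("Administrative data",      2,                 8.0,  2),
    ("Survey coordination",      1,                 8.0,  1),
    ("Teacher interviews",       2,                 0.75, 1),
]
n_sites = 10
total_responses = sum(per_site * n_sites * resp for _, per_site, _, resp in rows)
total_hours = sum(per_site * n_sites * hours * resp for _, per_site, hours, resp in rows)
print(total_responses, total_hours)  # 50590 responses, 10430.0 hours
```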
13. Provide an estimate for the total annual cost burden to respondents or recordkeepers resulting from the collection of information

Research participants do not incur any costs other than their time responding.
14. Provide estimates of annualized costs to the Federal Government.
The total one-time contracted cost to the Federal Government for this project is $360,000,
representing an annualized cost of $180,000 for a two-year project.
15. Explain the reasons for any program changes or adjustments reported on the
burden worksheet.
This is a new information collection request and will add 8,627 burden hours and 32,690
total annual responses to OMB’s inventory.
16. For collections of information whose results will be published, outline plans for
tabulation and publication. Address any complex analytical techniques that will be used.
Provide the time schedule for the entire project, including beginning and ending dates of
the collection of information, completion of report, publication dates, and other actions.
A.16.1 Recruiting Schools
After IRB and OMB approval, in approximately May 2018, the SPR research team will
begin recruiting 10 schools to participate in the study, using a two-stage recruitment strategy.
First, NEA and State Arts Agency officials will send principals and school district superintendents a letter of support (Attachment B) to encourage participation in the research and to introduce SPR. The NEA and State Arts Agency officials will use the draft letter, customizing it with relevant additional detail. Once the letter from the NEA/SAAs has been sent to the school principals and school superintendents, in approximately June 2018, the research team will begin contacting schools. The team will first contact school principals via email, modeled on
Attachment B. The initial communication will contain key information about the study and will
let principals know that the research team would like to schedule a 15-minute phone
conversation, which serves as a screener to assess the appropriateness of the school as a study
site. The purpose of the call is to briefly describe the purpose of the study and the research

activities we plan to undertake. The phone call will also be used to determine whether the
selected school meets the necessary criteria for the study and find out if the principal would be
willing to participate in the research activities. The phone call will cover (1) the purpose of the study, (2) a preview of school characteristics that would best fit the aims and needs of the study, (3) what the research team will be asking of schools, (4) what data the team will collect, (5) how the data will be used, and (6) the benefits of participating in the study.
Once we assess the school site information and determine which schools meet the criteria
for the study, the research team will then contact the school superintendent’s office using a
similar process to the one followed with the school principals. The main goals of communicating with the school district officers are to obtain additional information, determine school site eligibility, establish data sharing agreements (Attachment C) to access de-identified student-level data, and ensure that the research protocol is followed at school sites within the district.
A.16.2 Student Surveys
The research team will conduct online pre- and post-surveys (Attachment D) for all
students in the 10 selected schools during SY2018-19, both POL participants and non-participants. Pre-surveys will be administered in July – September 2018, and post-surveys will be administered in April – June 2019.
A.16.3 Student Records
The research team will collect student-level administrative records in July 2019 from the
selected schools, per the data sharing agreements with each of the school districts. Data will be

collected for all students enrolled in the schools selected for the study. The data will include the following, in a standardized format:
1. Unique identifiers for all students (student proxy id generated by the school district);
2. Participation in POL identifier for current and prior academic year;
3. Student-level demographic information (e.g., gender, race/ethnicity, free and reduced
lunch status, special education, English learner);
4. Grade level;
5. Relevant assessment data in English Language Arts and language proficiency tests for
school year (SY) 2018-19 and, if applicable, one prior academic year (SY2016-17);
GPA and ELA end-of-course grades; and
6. Student-level records of attendance, suspensions, and expulsions.
A.16.4 School Site Visits, including Virtual Site Visits
The research team will make either day-long site visits or virtual site visits to each of the
schools between January and May 2019. Data will be collected through (1) semi-structured interviews, following prepared interview protocols, with POL-participating students (Attachment E) and teachers (Attachment F), and (2) focus groups with POL-participating students, following a prepared protocol (Attachment H). State arts agency administrators will be interviewed by phone prior to the site visit, also following a prepared protocol (Attachment G).
A.16.5 Data Analysis and Report
After data collection is complete, in July 2019, all quantitative and qualitative data will
be analyzed and a final evaluation report will be prepared. The report will be submitted in
November 2019. Supplemental publication products, at NEA’s request, will be submitted in
December 2019. These may include graphic fact sheets, PowerPoint slides summarizing the

study findings, selected quotes from educators regarding POL, and/or interview transcripts and
other raw data.
Monthly reports will be submitted about study progress throughout the entire study
period. Exhibit A.16 outlines the schedule for data collection and reporting.
Exhibit A.16 Study Schedule

Activity | Expected Activity Period
Recruit Schools | May – August 2018
Train site visit researchers | 1 month after OMB approval
Conduct Pre-Test Surveys | September – November 2018
Conduct site visits | January – May 2019
Conduct Post-Test Surveys | April – June 2019
Data analysis | July – September 2019
Final Report | November 2019

17. If you are seeking approval to not display the expiration date for OMB approval of
the information collection, explain the reasons that display would be inappropriate.
NEA will display the expiration date of OMB approval and OMB approval number on all
instruments associated with this information collection, including forms and questionnaires.
18. Explain each exception to the topics of the certification statement identified in
Certification for Paperwork Reduction Act Submissions.
No exceptions are necessary for this information collection. The agency is able to certify
compliance with all provisions under Item 19 of OMB Form 83-I.

