Summer of Innovation Stand-alone Implementation Report, Dec. 13, 2013

NASA Office of Education STEM Challenges

OMB: 2700-0150

Evaluation Study of Summer of Innovation Stand-Alone Program Model FY 2013: Implementation Report

Contract No. NNH08CD70Z, Order No. NNH13CH35D

December 13, 2013

Prepared for:
National Aeronautics and Space Administration (NASA)
Office of Education

Submitted by:
Alina Martinez
Tamara Linkow
Nicole Brooke
Abt Associates Inc.
55 Wheeler Street
Cambridge, MA 02138

Jackie DeLisi
Abigail Jurist Levy
Education Development Center, Inc.
43 Foundry Avenue
Waltham, MA 02453

Evaluation Study of SoI Stand-Alone Program Model FY2013:
Implementation Report
Table of Contents
Acknowledgements ............................................................ iii
Executive Summary ............................................................ iv
1. Introduction ................................................................ 1
2. SoI Project ................................................................. 3
   2.1 SoI Development and Evaluation Stages ................................... 3
       2.1.1 2010 Pilot ......................................................... 3
       2.1.2 Summer 2011 ........................................................ 4
       2.1.3 Summer 2012 ........................................................ 6
       2.1.4 Summer 2013 ........................................................ 6
3. Evaluation Design and Methodology ........................................... 8
   3.1 Objectives and Research Questions for the Implementation Evaluation .... 8
   3.2 Data Sources ............................................................ 8
       3.2.1 Camp Registration Data ............................................. 8
       3.2.2 Parent Surveys ..................................................... 9
       3.2.3 Baseline Student Survey ............................................ 9
       3.2.4 Common Instrument Validation Dataset .............................. 10
       3.2.5 Site Visit Observations ........................................... 10
       3.2.6 Focus Groups with Teachers ........................................ 12
       3.2.7 Interviews with SoI Awardee PIs and Lead Center Staff ............. 12
   3.3 Study Sample ........................................................... 12
   3.4 Analysis ............................................................... 13
       3.4.1 Qualitative Data Analysis ......................................... 13
       3.4.2 Quantitative Data Analysis ........................................ 13
   3.5 Limitations ............................................................ 13
4. Findings ................................................................... 15
   4.1 Parent and Student Characteristics ..................................... 15
       4.1.1 Parent Characteristics ............................................ 15
       4.1.2 Student Demographics .............................................. 16
Abt Associates Inc.

Acknowledgements ▌pg. i

   4.2 SoI Participation ...................................................... 17
   4.3 Student Baseline Interest in Science ................................... 19
   4.4 Site Visit Observations ................................................ 23
       4.4.1 Ratings on Dimensions of Success .................................. 23
   4.5 Teacher Focus Groups ................................................... 28
       4.5.1 Student Activities ................................................ 28
       4.5.2 Planning and Preparation .......................................... 32
       4.5.3 SoI Resources ..................................................... 33
       4.5.4 Professional Development and Support .............................. 35
   4.6 PI and Lead Staff Interviews ........................................... 37
       4.6.1 Recruitment ....................................................... 37
       4.6.2 Activities ........................................................ 39
       4.6.3 Partnerships ...................................................... 43
5. Discussion ................................................................. 45
   5.1.1 What are the characteristics of SoI camps and their participants? ... 45
   5.1.2 To what extent do SoI camps meet program quality expectations as defined by the PEAR Dimensions of Success (DoS) rubrics? ... 45
   5.1.3 What supports and challenges do Awardees/Centers face in implementing SoI curricula? How do they handle these challenges? ... 47
   5.1.4 What staff, materials, and NASA resources are necessary for successful SoI activities? ... 48
   5.1.5 How early and to what extent must plans and preparation begin for successful project implementation? ... 49
6. Recommendations ............................................................ 50

Appendix A. Data Verification Checklist ....................................... 52
Appendix B. Parent Survey ..................................................... 53
Appendix C. Student Baseline Survey ........................................... 56
Appendix D. Dimensions of Success ............................................. 61
Appendix E: Teacher Focus Group Protocol ...................................... 63
Appendix F: PI Interview Protocol ............................................. 65
Appendix G: Parent Characteristics from the Full and Analytic Samples ......... 67


Acknowledgements
This report was prepared with assistance from our colleagues Melissa Velez, Elisabeth Ericson, Gabe
Schwartz, Stephanie Althoff, Richard Fournier, Erica Fields, and Lisa Marco-Bujosa.
This report is the result of a collaborative effort involving numerous individuals. The authors would like to thank the evaluation leads at the individual NASA Summer of Innovation (SoI) camps, Awardees, and NASA Centers and the Jet Propulsion Laboratory (referred to as “Centers” for the remainder of this document) for their assistance in providing information and materials on their SoI
programming. We are indebted to the individuals at the camps who shared their time and participated
in interviews and focus groups with the site visit teams, as well as the SoI students and parents who
participated in the evaluation data collection activities. We also would like to thank the members of
the evaluation team from Abt Associates and Education Development Center who helped organize
and conduct the site visits. Finally, we would like to thank the NASA staff members and contractors,
as well as members of the Summer of Innovation Technical Review Forum, who have provided
guidance on the framing and design of this evaluation and feedback on the findings of this report.
The data collection, analysis, and reporting of this material were conducted in accordance with OMB
Control No. 2700-0150 (expiration February 28, 2015).


Executive Summary
The National Aeronautics and Space Administration’s (NASA) Office of Education launched the
Summer of Innovation Project (SoI) in 2010 in response to President Obama’s “Educate to Innovate”
campaign. SoI is a multi-year project intended to engage the nation’s youth in NASA’s broad mission
and to inspire them to pursue education in science, technology, engineering and math (STEM) fields
leading to involvement in the country’s STEM workforce. SoI provides middle school students,
including those who are traditionally underrepresented and underserved in STEM (i.e., females,
minorities excluding Asians, and low-income students), with opportunities to engage in NASA-developed activities during the summer.
In fiscal year 2013 (FY2013), the SoI project operated through NASA Centers and SoI Awardees that
were previously selected through a competitive process. NASA sought to provide its
Awardees/Centers and their collaborators or partners with tools to offer high-quality STEM
opportunities to middle school youth, while also allowing Awardees/Centers to leverage their own
resources and expand their capacity to influence the educational trajectories of students
underrepresented in the STEM fields.
The 2013 evaluation efforts focused on investigating the SoI stand-alone model, which holds
particular promise as a summer engagement model for middle school students that may be replicable
across Federal government programs. Camps implementing the stand-alone model offer middle
school students a minimum dosage of 30 hours of selected NASA SoI curricula, independent of other
summer programs.
The implementation evaluation used descriptive statistics and content analytic techniques to describe
the approaches implemented at 11 SoI stand-alone camps during the summer of 2013. Parent and
student surveys, PI and Center Education Lead interviews, teacher focus groups, camp observations, and registration data all provided evidence to generate the following key implementation findings:
• Camps included in the evaluation primarily served the SoI target population of students underrepresented in STEM. The students attending these camps had high educational expectations, with over one-half expecting to complete a graduate degree. Relatedly, SoI students and their parents reported that a key reason they decided to attend SoI was to learn about NASA, science, and space.

• Camp recruitment most often occurred through personal communications with family, friends, community members, and educators. Additionally, efforts to foster repeat participation seemed to be successful, as evidenced by one-third of the students at study camps reporting previous SoI participation.

• Generally, the camp classrooms that were observed scored high on various measures of quality assessed using the Dimensions of Success (DoS) observational tool. Classrooms observed tended to score high on dimensions related to organization, materials, space utilization, participation, purposeful activities, engagement with STEM, inquiry, relationships, and youth voice. However, classrooms did not score as well on the STEM content learning, reflection, and relevance dimensions.


• Teachers, PIs, and lead camp staff reported that significant time and resources are required to successfully recruit students, train educators, and plan and implement camp activities. They all viewed the hands-on nature of the SoI activities as key to engaging students.

Based on the findings of this study, SoI should consider the following recommendations as it moves
forward with programming:
• Continue to encourage active engagement of parents and students in the community through outreach events, as these events are a successful mode of recruitment.

• Recruitment efforts that highlight the STEM content covered in SoI camps could be successful in generating high camp enrollments, as the opportunity to learn about NASA, science, and space is a key reason why students sign up for SoI. NASA should continue to monitor recruitment efforts to determine the extent to which camp leaders are focusing on students in low-income neighborhoods as well as students who may not have attended SoI camps in previous years.

• The Technical Review Forum (TRF) recommends broadening recruitment to reach students not already interested and involved in STEM activities. The SoI project team should consider whether a change in the target population is desired and feasible.

• Target earlier release of funds to camps, given the continued challenges camps face when funding is received late in the spring. The TRF recommended releasing funds as early as January or February. Early release of funds would ensure sufficient time for recruitment of students and educators, and would allow for adequate planning of camp activities.

• Continue to provide educators with access to hands-on curricula and materials, as these are key to student engagement. Include a review of curricular materials to ensure that factors such as reading level and time allotment are appropriate for the recommended grade level. Increase suggestions for implementation, including explicit connections between the activities and current technology and authentic scientific and engineering practices.

• Increase resources for professional development, which is key to successful implementation of SoI curricula. Increased funding for professional development could increase the number of educators trained, lengthen the training provided, and extend the content covered.

• Provide some professional development that focuses on the areas in which camps were rated lower on the DoS. Including explicit strategies to improve how camp activities encourage student reflection, demonstrate the relevance of activities for students’ lives, and support student understanding of STEM concepts could result in increased camp quality on the DoS dimensions.

• Provide additional implementation strategies as part of the professional development. In addition to suggestions for creating student roles within teams that mirror the roles of scientists and engineers across project teams, strategies could include ideas for saving materials and time, or sets of activities that could use the same material resources.

• Increase teacher planning time prior to camp so teachers can consider ways to incorporate reflection into camp activities and ensure the activities link to learning goals. More planning time might also allow teachers to collaborate before camps to figure out what materials they will need and how they could obtain them, and to devise back-up plans, all of which would ensure ample and efficient use of materials during the camps.

• Consider including student teamwork and collaboration as possible program outcomes and foci for evaluation. Provide mechanisms for camp leadership to learn about possible methods of maximizing available resources, such as suggestions for where to obtain materials, ways to reuse materials, ways to maximize teachers’ time, or possible types of partners who could provide additional resources. In addition, consider establishing mechanisms for camp leaders to share successful recruitment and resource management strategies.


1. Introduction
Through its investment in science, technology, engineering, and mathematics (STEM) education projects,
the National Aeronautics and Space Administration (NASA) aims to strengthen NASA and the Nation's
future workforce; attract and retain students in STEM disciplines; and engage Americans in NASA's
mission. 1 NASA’s Office of Education launched the Summer of Innovation Project (SoI) in 2010 in
response to President Obama’s “Educate to Innovate” campaign, which aimed to improve education in
STEM for American youth. 2 SoI is a multi-year project intended to engage the nation’s youth in NASA’s
broad mission and to inspire them to pursue education in STEM leading to involvement in the country’s
STEM workforce. Through partnerships with educators, STEM learning networks and a variety of
organizations, SoI provides middle school students, including those who are traditionally
underrepresented and underserved in STEM (i.e., females, minorities excluding Asians, and low-income
students), with opportunities to engage in NASA-developed activities during the summer.
SoI’s goal statement reads, “In order to support NASA’s vision of equal access to a quality STEM
education, the Summer of Innovation engages and supports external partners in the delivery of evidence-based summer engagement opportunities in STEM to youth from underserved/underrepresented
populations with the intent of increasing interest and participation in STEM and contributing toward the
national-level impact of increased numbers of high school graduates pursuing STEM majors and
careers.” 3 To achieve this goal, in fiscal year 2013 (FY2013), the SoI project operated through NASA
Centers and SoI Awardees that were previously selected through a competitive solicitation process;
NASA Centers received SoI funds, 2010 Awardees received no-cost extensions, and 2011 Awardees
received another phase of their grant award. NASA sought to provide its Awardees/Centers and their
partners 4 with tools to offer high-quality STEM opportunities to middle school youth, while also allowing
Awardees/Centers to leverage their own resources and expand their capacity to influence the educational
trajectories of students underrepresented in the STEM fields.
Reflecting its commitment to evidence-based decision making, NASA’s Office of Education contracted
with Abt Associates Inc. and its subcontractors, Education Development Center (EDC) and DataStar
(collectively referred to as the study team) to conduct the Evaluation Study of NASA’s Summer of
Innovation (SoI) Stand-Alone Model FY2013 between January 2013 and June 2014. The study team
engaged in evaluation activities with the following key objectives: to (1) identify the key components of
successful implementation of SoI stand-alone activities and the primary challenges to implementation; (2)
estimate short-term changes in student outcomes after participation in SoI; (3) explore the relationship
between SoI camp quality, as measured by the Dimensions of Success tool, and student outcomes; and (4)

1 National Aeronautics and Space Administration, 2013. About NASA’s Education Program. Retrieved from http://www.nasa.gov/offices/education/about/index.html.

2 The White House, Office of the Press Secretary, November 23, 2009. President Obama Launches “Educate to Innovate” Campaign for Excellence in Science, Technology, Engineering, and Mathematics (STEM) Education. Retrieved on December 11, 2011, from http://www.whitehouse.gov/the-press-office/president-obama-launches-educate-innovate-campaign-excellence-science-technology-en.

3 NASA, 2012. Summer of Innovation: Project Redesign and Evaluation. Report to the Executive Office of the President, Office of Management and Budget, p. 6.

4 Organizations that partner with NASA Centers are officially designated as “collaborators” and those that partner with Awardees are “partners”; the term “partners” is used throughout this report to refer to both.


document the processes and supporting materials for the performance data submitted to NASA by
Awardees/Centers. The SoI Stand-Alone Model evaluation seeks to answer the following research
questions:
Implementation
1) What are the characteristics of SoI camps and their participants?
2) To what extent do SoI camps meet program quality expectations as defined by the PEAR
Dimensions of Success (DoS) rubrics? 5
3) What supports and challenges do Awardees/Centers face in implementing SoI curricula? How do
they handle these challenges?
4) What staff, materials, and NASA resources are necessary for successful SoI activities?
5) How early and to what extent must plans and preparation begin for successful project
implementation? 6
Outcomes
6) Are SoI students’ levels of STEM interest and engagement similar at the start of SoI and in the
fall?
a) Did self-reported interest in STEM change significantly between the baseline and follow-up surveys? Are there differences by subgroups (e.g., gender)?
b) To what degree does SoI youth self-reported interest in STEM at follow-up differ from
youth involved in other out-of-school time programming?
c) Do students report participating in STEM—in-school, extracurricular, or out-of-school—
more frequently since SoI participation than they did in the previous school year? Are
there differences by subgroups (e.g., gender)?
7) Are there correlations between camp characteristics and project quality and the student attitudes
and behaviors?
This report, the second in a series of reports from this evaluation, presents the evaluation’s findings
related to the implementation of SoI activities in summer 2013, reporting evidence related to research
questions one through five (listed above). 7 It begins with a brief overview of SoI, which draws upon
information gleaned from previous evaluations. The next section describes the evaluation design and
methodology used to analyze both qualitative and quantitative data. Section four reports the findings based on camp registration data, parent surveys, student baseline surveys, the Common Instrument Validation dataset, site visit observations, teacher focus groups, and PI and center lead staff interviews. The report concludes with a discussion of the implementation findings, which explores the characteristics of students and the quality of the camps, identifies camps’ successes and challenges, and presents recommendations for future SoI programming.

5 See http://caise.insci.org/sparks/128-dimensions-of-success-dos-afterschool-science-observation-tool for more information about the DoS.

6 Information on the processes and materials that camps use to register students for SoI will also be provided as necessary to supplement the information that NASA obtains through its own efforts.

7 The first report described how camps document and prepare performance data submitted to NASA. The third report will investigate SoI outcomes, addressing research questions 6 and 7.


2. SoI Project
During SoI’s fourth year (FY2013), SoI funds were provided to NASA Centers as well as SoI Awardees
that were previously selected through a competitive process (collectively referred to as
Awardees/Centers). Through the Awardees/Centers, SoI focused on providing intensive and interactive
summer STEM experiences to underrepresented, underserved students entering 4th to 9th grades. NASA
also awarded mini-grants to a range of educational partners (e.g., museums, schools or school districts,
youth organizations) to integrate STEM content into existing summer and after-school student programs.
The SoI project model aligns with existing literature on enhancing student interest and engagement in
STEM, particularly in informal settings. 8 A key component of the SoI summer programming through the
Awardees/Centers was the integration of selected NASA content in hands-on, problem- and inquiry-based
STEM activities, to create summer enrichment experiences for students. To support these unique summer
learning experiences, SoI provided professional development opportunities for educators that focused on
the implementation of the NASA content during the summer program.
Research indicates that hands-on, inquiry-based activities delivered in informal environments are key
factors in helping to develop critical thinking skills and play a significant role in increasing students’
interest and engagement in STEM and the likelihood that they will consider science-related occupations. 9
SoI aims to foster students’ interest and involvement in STEM activities and ultimately, to increase the
overall number of students pursuing STEM degrees and related careers and, more specifically, increase
the proportion of underrepresented students who pursue these paths.

2.1 SoI Development and Evaluation Stages

While the core components and goals of SoI have remained consistent since the project’s start in 2010,
both the evaluation and NASA’s FY2013 SoI project have evolved building on previous efforts that are
briefly described below.
2.1.1 2010 Pilot

Project funding and requirements

The SoI pilot in 2010 employed a multi-faceted approach to reach and engage middle school students in
STEM learning with NASA content and experiences. NASA provided funding for SoI activities to the
nine NASA Centers and the Jet Propulsion Laboratory (JPL); awarded cooperative agreements through a
competitive process to four Space Grant Consortia (Idaho, Massachusetts, New Mexico, and Wyoming)
and contracted with one external organization (Paragon TEC Inc.). NASA also issued an open call
inviting interested groups to use NASA content during their existing summer activities for students;
however, no funding was provided. 10

8 A review of relevant literature investigating the relationship between informal science education and student engagement in STEM, with a particular emphasis on the middle and high school levels, was conducted by EDC under a previous contract with NASA. See Fournier, R., DeLisi, J., & Levy, A.J. (2011). NASA Summer of Innovation (SoI): Informal science education and student engagement. Newton, MA: Education Development Center, Inc.

9 National Institute on Out-of-School Time. (2007). A Review of the Literature and the INSPIRE Model: STEM in Out-of-School Time. Wellesley, MA: Wellesley College.

10 SoI Planning Meeting Handout, April 2010, p. 4.


NASA’s requirements differed for each of these groups. The four Space Grant Consortia and the
contractor (Paragon TEC Inc.) were expected to provide professional development to the educators who
would lead the summer activities; implement intensive, hands-on activities to middle school students;
infuse NASA content into the summer activities; develop a STEM community of learning for sustained
engagement over a 36-month period; and evaluate the effectiveness of their programs. 11 NASA Centers in
2010 received fewer SoI resources than the Space Grant Consortia and the contractor and were operating
under tighter time constraints. The Centers’ SoI funding was provided to support collaborations with
partners that offered summer camps for the targeted student audience and agreed to provide at least 30 content hours in STEM (of which 7.5 hours were to be NASA content) during these camps. Therefore, NASA Centers were not expected to provide professional development activities, nor were they required
to provide follow-on activities during the academic school year (although some did). Finally, NASA did
not detail explicit expectations for the organizations that responded to the open call as they did not receive
funding.
Evaluation Activities

The 2010 pilot evaluation activities examined both SoI’s implementation and the project’s short-term
outcomes. Implementation and outcomes data were collected from various sources during the pilot to
produce lessons learned and recommendations regarding program design, implementation, and program
evaluation. The pilot evaluation provided insight into the SoI pilot implementation and served as an
important resource in the redesign of SoI for FY 2011.
2.1.2 Summer 2011

Project funding and requirements

Building on the pilot’s successes and responding to the lessons learned, NASA made significant changes to SoI for FY2011. For example, because of potential partners’ limited ability to implement large-scale STEM summer experiences, SoI was refined to emphasize “strengthening the capacity of community and school-based organizations to engage 4th through 9th grade students in NASA-themed intensive, high-quality, inquiry-based learning experiences.” NASA also sought to build the sustainability of the Awardees’ programs by encouraging partnerships with formal and informal STEM organizations so that they could eventually operate high-quality STEM programs at scale without additional SoI funding.
NASA continued to utilize multiple funding models for SoI in FY 2011. Four primary approaches were
taken: 1) competitive national awards for large, proven providers of STEM learning experiences to the
targeted student groups; 2) funding to the nine NASA Centers and JPL to support collaborations with
small to mid-size organizations interested in enhancing their STEM activities; 3) continued funding to the
2010 pilot Awardees; and 4) mini-grants to various organizations desiring to engage students in NASA
content.
In May 2011, eight national awards were made to organizations implementing summer programs in
Pennsylvania, Nebraska, South Dakota, Indiana, Georgia, Texas, California, and Puerto Rico. These
national Awardees received the most funding, made the longest commitment, and were responsible for the
greatest number of requirements. In partnership with schools, districts, or state departments of education,
as well as community/professional organizations and partners, Awardees were to provide students with 40
hours of STEM activities that utilized NASA content over the summer and an additional 25 hours of

11 FY 2010 Cooperative Agreement Notice (CAN) for the National Needs Grant: Summer of Innovation Pilot (announcement NNH10ZNE004C, January 27, 2010, p. 12).


sustained engagement activities during the school year. They were also required to direct certified
teachers supporting summer learning and other certified teachers supporting SoI sustained engagement
activities to NASA professional development opportunities, and to participate in the national evaluation
by collecting parent consent forms, administering student and teacher surveys, and completing forms
describing their activities.
The nine NASA Centers and JPL were required to be actively engaged in the SoI project and to
collaborate with established organizations. These collaborations were to provide students with a minimum
of 20 hours of NASA content over the summer, and the partner organizations were expected to implement
two follow-up activities during the school year. Centers and their partners were also required to
participate in the national evaluation by collecting parent consent forms and administering student
surveys.
The fourth approach used by SoI in FY 2011, facilitated by the Space Grant Foundation, was the mini-grant
award, whereby small, one-time awards were made to various organizations that were interested in
enhancing STEM opportunities for the targeted audience. These organizations were required to provide
only a minimum of six hours of NASA-related programming to students. They were not included in the
national evaluation.
Evaluation Activities

The goal of the 2011 national evaluation was to gather data that would inform NASA’s continued
development of the project as well as to assess whether evidence supports the progression to a more
rigorous, summative, impact evaluation. As such, the formative evaluation focused on describing SoI’s
implementation and associated outcomes, but did not determine whether there is a causal link between the
project and outcomes.
The 2011 national evaluation consisted of four parts: an implementation evaluation, an outcomes
evaluation, a sustained engagement evaluation, and a feasibility study. The implementation evaluation
relied on Awardee focus group data, PI interview data, and class- and activity-level data to describe the
various approaches taken by the Awardees to implement student activities, and to examine the feasibility
and utility of the project’s requirements. The outcomes evaluation used a one group, pre-post design that
compared participants' responses on baseline and the first follow-up surveys to examine the extent to
which SoI activities met the project's short-term objectives and goals and to identify potential areas of
promising practices. The sustained engagement evaluation included interviews with Awardee PIs and the
collection of implementation forms from instructors to describe the school year activities as well as the
administration of second follow-up surveys to sampled students and certified teachers to provide insight
into student interest in and attitudes towards science during the school year and teachers’ access to and
comfort in teaching NASA content. The feasibility study included fall site visits and interviews with camp
staff to explore whether student learning outcomes—broadly conceptualized as both content knowledge
and higher-order reasoning skills—were measurable outcomes.
Several key findings from the 2011 national evaluation were influential in the evolution of SoI:
• National Awardees and Centers demonstrated substantial variation in their approaches to
implementing student activities. Some Awardees created new stand-alone one-week SoI camps,
others integrated SoI into their own pre-existing programs lasting for multiple weeks or even months,
and others utilized a combination of the two approaches.
• Student survey responses from Awardees suggested a significant increase in student interest in a
science career, although there was no significant change in students' responses on items designed to
gauge their interest in science.
• Because sustained engagement relied upon SoI summer teachers infusing NASA SoI content into
their school year curricula, the students exposed to NASA SoI content during the school year may or
may not have attended SoI during the previous summer.
• Camp staff interviewed in the fall of 2011 agreed that student learning occurs but that it is not the
primary outcome of SoI. Student engagement in STEM activities and exposure to STEM content were
cited most often as the project's goals for students.
• A longer planning period prior to the summer would likely improve program recruitment and
implementation and facilitate successful participation in the evaluation.
2.1.3 Summer 2012

Project funding and requirements

NASA continued to fund the five pilot Awardees selected in 2010, the eight national Awardees selected
in 2011, the nine NASA Centers and JPL, and also continued mini-grants to various organizations. In
2012, programming requirements were similar to those in FY 2011; national Awardees were required to
provide 40 hours of student STEM activities utilizing NASA content over the summer, while NASA
Center partnerships needed to provide 20 hours of student STEM activities utilizing NASA content
during the summer. For the national awards, organizations receiving SoI funding were required to direct
certified teachers supporting summer learning and other certified teachers supporting SoI sustained
engagement activities to NASA professional development opportunities; NASA Center partnerships were
not expected to provide professional development for classroom teachers.
Evaluation activities

In FY2012, NASA engaged in internal evaluation planning. NASA established a group of external
experts in evaluation, STEM programming, and out-of-school time activities and convened several
technical review forums. The forums provided NASA with guidance on refining future evaluations to
ensure they inform the development of SoI. There was no evaluation of SoI in 2012.
2.1.4 Summer 2013

Project funding and requirements

In 2013, NASA continued funding to the five pilot Awardees, the eight national Awardees selected in
2011, the nine NASA Centers and JPL, and also continued mini-grants to various organizations. The nine
Centers, JPL and the eight 2011 national awardees collaborated with various organizations around the
country to serve over 39,000 rising fourth through ninth grade students and to engage over 4,000
educators at SoI camps.
Changes were made to the SoI requirements to better fit with implementation realities in the field. The
sustained engagement requirements were eliminated, funded Awardees and Centers were required to use
the NASA SoI curricula, and the student-to-teacher ratio was expected to remain lower than 20 students to
one teacher. Additionally, Awardees and Centers were required to offer outreach activities to parents or
caregivers of students engaged in NASA activities.
Evaluation activities

The 2013 evaluation efforts focused on camps operating SoI independently of other summer programs (i.e.,
the stand-alone model). While some camps infuse NASA content and activities into pre-existing
programs, the SoI Stand-Alone project model offers middle school students a minimum dosage of 30
hours of selected NASA SoI curricula during the summer. SoI content features hands-on, problem-based
activities in an appropriate learning progression. NASA believes the stand-alone model holds particular
promise as a summer engagement model for middle school students that may be replicable across Federal
government programs.
Four Awardees/Centers were involved in the FY2013 evaluation: Chester County Intermediate Unit
(CCIU), Rio Grande Valley Science Association (RGVSA), Glenn Research Center (GRC), and Johnson
Space Center (JSC). These four Awardees/Centers were selected because they implemented stand-alone
SoI camps in summer 2013, had been successful in recruiting students for SoI project participation in the
past, and were geographically diverse. Across the Awardees/Centers, a total of 11 camps were included in
the evaluation, and eight of those camps were visited during the summer of 2013.
Together with the monitoring activities that NASA conducts, the 2013 evaluation helped to: (1) identify
the key components of successful implementation of SoI stand-alone activities and the primary challenges
to implementation; (2) estimate changes in student outcomes after participation in SoI; (3) explore the
relationship between SoI project quality and student outcomes; and (4) document supporting records for
the performance data submitted to NASA by Awardees/Centers.


3. Evaluation Design and Methodology
The implementation evaluation uses descriptive statistics and content analytic techniques to describe the
approaches implemented at 11 SoI stand-alone camps during the summer of 2013. The findings are
informed by responses to parent and student surveys, PI and Center Education Lead interviews, teacher
focus groups, and camp observations and registration data.

3.1 Objectives and Research Questions for the Implementation Evaluation

This report aims to identify the key implementation components of successful SoI stand-alone activities
and the primary challenges to implementation. It addresses the implementation research questions across
selected FY2013 SoI camps implementing the stand-alone model during the summer of 2013. 12 The
specific questions include:
Implementation
1) What are the characteristics of SoI camps and their participants?
2) To what extent do SoI camps meet project quality expectations as defined by the PEAR
Dimensions of Success (DoS) rubrics? 13
3) What supports and challenges do Awardees/Centers face in implementing SoI curricula? How do
they handle these challenges?
4) What staff, materials, and NASA resources are necessary for successful SoI activities?
5) How early and to what extent must plans and preparation begin for successful project
implementation? 14

3.2 Data Sources

To explore the characteristics of SoI participants at stand-alone camps and the quality of the camps, as
well as to identify the supports and resources necessary for success and the challenges encountered by
SoI camps, the evaluation team analyzed both quantitative and qualitative data. Seven key data sources were used: camp
registration data, parent surveys, student baseline surveys, the Common Instrument Validation dataset,
camp site visit observations, focus group discussions with lead teachers, and interviews with
Awardee/Center PIs and lead staff. We describe each of these data sources below and, in the following
section, discuss how the data were analyzed.
3.2.1 Camp Registration Data

Study team site liaisons coordinated with the Awardee/Center or camp evaluation leads to obtain
registration information, including parent and student names and contact information. Camp evaluation
leads were required to submit registration data to the study team within one week of the camp’s end date.
These data, in conjunction with the student attendance sheets, identified the study sample and enabled the
study team to contact parents and students in the fall of 2013, during the follow-up period.

12 A subsequent report will explore the outcomes research questions.
13 See http://www.pearweb.org/tools/dos.html for more information about the DoS.
14 Information on the processes and materials that camps use to register students for SoI will also be provided as necessary to supplement the information that NASA obtains through its own efforts.


3.2.2 Parent Surveys

Parent surveys were collected to provide data on parents’ demographic characteristics such as their
highest level of education achieved, whether or not they were employed in a STEM occupation, their
children’s experience with afterschool activities, and their reasons for enrolling their children in SoI camp
(Appendix B). The parent survey was developed by NASA in collaboration with several external
experts 15 and the study team. Surveys were available in English and Spanish. To address the research
questions related to project outcomes, parent surveys were collected from 1,121 parents of 6th to 9th grade
students participating in the 11 camps included in the evaluation.16 The estimated response rate for the
parent survey was 86% (Exhibit 3.1).
Exhibit 3.1  Parent Survey Response Rates

Total Rising 6th-9th Grade Participants (a):                        1,309
Total Number of Surveys Received from Parents in the Study Sample:  1,121
Response Rate:                                                      86%

a: The total number of 6th-9th grade SoI participants is the number of students who either appeared on camp rosters or completed a baseline
survey. Actual participation in SoI is being confirmed during the follow-up period through telephone reminder calls. Therefore, the total number
of students reported here may overestimate the total number of students who actually participated in study camps as some students on rosters may
not have actually participated in the camps.

3.2.3 Baseline Student Survey

The baseline student survey was developed by NASA in collaboration with several external experts (as
noted earlier) and the study team. The survey also benefited from the feedback of SoI Awardee and
Center representatives. The survey questions were selected to gather information from students about
their demographics, educational plans, experience in SoI, interest in and enthusiasm for science, and their
participation in organized and leisure science activities (Appendix C). The 11 survey questions were
selected and/or adapted from previously field-tested instruments, eliminating the need for cognitive
testing. The sources for the survey question items included:
• Student Baseline Survey, High School Longitudinal Study (HSLS) of 2009, IES/Department of Education
• Assessing Women and Men In Engineering (AWE), Middle School Students Pre-Activity Surveys and Immediate Post-Activity Surveys for Middle School-Aged Participants – Science and Engineering (2009)
• 4-H Science Youth Survey (2012)
• Summer of Innovation Baseline Student Survey (2011)
• The Common Instrument (2010), PEAR and the Noyce Foundation

15 The external experts include: Laura LoGerfo, the Project Officer for the High School Longitudinal Study of 2009 at the U.S. Department of Education National Center for Education Statistics; Gil Noam, Founder and Director of the Program in Education, Afterschool & Resiliency (PEAR), Harvard University; and Sara Spiegel, Director of Administration at the Noyce Foundation. Several experts also advised on the evaluation design, including Henry Frierson, University of Florida; Anita Krishnamurthi, Afterschool Alliance; Carol Stoel, National Science Foundation; Robert Tai, University of Virginia; and Diego Zapata-Rivera, Educational Testing Service.
16 In total, 1,447 parents in the target sample (i.e., parents of 6th to 9th grade students) completed a parent survey, but for 326 parents the evaluation team was unable to confirm through either camp rosters or student baseline surveys that their children ever attended one of the SoI camps included in the evaluation.


The baseline surveys were sent to the 11 camps included in the evaluation prior to the camp start date.
The study team coordinated with the designated camp evaluation leads to ensure that survey
administration procedures were understood and followed at each camp. Camps administered the survey in
classrooms on the first day of camp and returned the surveys to DataStar within one week of camp end
date. Next, DataStar entered the student survey data into an electronic data file. In total, 1,012 baseline
surveys were collected from students with parent consent, resulting in a 77% response rate (Exhibit 3.2).
Exhibit 3.2  Student Baseline Survey Response Rate

Total 6th-9th Grade Students Participating at Study Camps:  1,309
Total Number of Students with Parent Consent (a):           1,012
Response Rate:                                              77%

a: In total, 1,160 baseline surveys were collected from target students, but parent consent was not obtained for 109 students. Collection of parent
consent forms is continuing into the follow-up period; therefore, the number of student baseline surveys for the outcomes analysis may increase.
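Stated as arithmetic, the response rates in Exhibits 3.1 and 3.2 are simply the surveys received (or consented students) divided by the 1,309 eligible participants. A minimal sketch; the function name is illustrative, not from the report:

```python
def response_rate(numerator: int, eligible: int) -> int:
    """Survey response rate as a whole percentage, rounded to the nearest percent."""
    return round(100 * numerator / eligible)

# Figures from Exhibits 3.1 and 3.2: 1,121 parent surveys and 1,012 consented
# student baseline surveys out of 1,309 rising 6th-9th grade participants
print(response_rate(1_121, 1_309))  # 86
print(response_rate(1_012, 1_309))  # 77
```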

3.2.4 Common Instrument Validation Dataset

The Common Instrument Validation dataset, developed by the Program in Education, Afterschool &
Resiliency (PEAR) at Harvard University for the Noyce Foundation, provides national benchmarking data
on a variety of questions about student interest and engagement in science for use in informal science
programs. The dataset includes the question items that make up the enthusiasm for science scale as well
as other questions from the 2005 and 2009 National Assessment of Educational Progress (NAEP) Science assessments and the AWE Science and Engineering survey. The data come from surveys of
middle school students participating in various STEM programs during the summer and school year from
across the country. The CI Validation data provide a benchmark against which to compare the SoI student data.
3.2.5 Site Visit Observations

To provide data on the quality of SoI programs, pairs of study team members conducted site visits to
observe activities at eight SoI camp sessions, from across the Awardees and Centers included in the study.
Camps were selected based on logistical considerations and recommendations from SoI leaders. Data were
collected by certified observers using the Dimensions of Success (DoS) observation tool.17 The DoS is
an observation tool that focuses on 12 dimensions of quality in STEM out-of-school programs (Exhibit
3.3), which are grouped into four broader domains. (Appendix D contains details on the 12 dimensions.)
Exhibit 3.3  The Four Domains and Twelve Dimensions of the DoS

Domain: Features of the learning environment—that make the environment suitable for STEM programming
Dimensions: 1. Organization; 2. Materials; 3. Space Utilization

Domain: Activity engagement—looking at how the activity engages students
Dimensions: 4. Participation; 5. Purposeful Activities; 6. Engagement with STEM

Domain: STEM knowledge and practices—particularly the extent they help students understand STEM concepts, make connections, and participate in practices of STEM professionals
Dimensions: 7. STEM Content Learning; 8. Inquiry; 9. Reflection

Domain: Youth development in STEM—looking at whether interactions encourage student participation, activities are relevant to students' lives and experiences, and students are encouraged to voice their ideas and opinions and make meaningful choices
Dimensions: 10. Relationships; 11. Relevance; 12. Youth Voice

17 To become certified observers, nine study team members participated in a two-day DoS training webinar and each conducted two practice observations.


Consistent with the guidelines for DoS administration, at each camp visited, two classrooms were
observed twice, once on each day of the two-day visit, for a total of four observations per camp. During
each of the observations, the 12 dimensions were rated using a four-level rubric representing increasing
quality, where a rating of “1” indicates that evidence is absent, “2” indicates there is inconsistent
evidence, “3” indicates there is reasonable evidence, and “4” indicates there is compelling evidence. 18
According to the developers of the DoS, ratings of three or four on a dimension are desirable ratings.
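One way to summarize ratings under this rubric is the share of observations reaching the desirable levels ("3" or "4") on each dimension. A minimal sketch with hypothetical ratings; the data values are illustrative, not the study's actual observations:

```python
def pct_desirable(ratings_by_dim: dict[str, list[int]]) -> dict[str, float]:
    """Percent of observations rated 3 ("reasonable evidence") or
    4 ("compelling evidence") for each DoS dimension."""
    return {dim: 100 * sum(r >= 3 for r in rs) / len(rs)
            for dim, rs in ratings_by_dim.items()}

# Hypothetical ratings from four observations at one camp (two classrooms, two days each)
ratings = {"Organization": [3, 4, 3, 2], "Inquiry": [2, 2, 3, 4]}
print(pct_desirable(ratings))  # {'Organization': 75.0, 'Inquiry': 50.0}
```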
Examples of key prompts used in determining ratings for each dimension are presented in Exhibit 3.4.
Exhibit 3.4  Key Prompts for each DoS Dimension

FEATURES OF THE LEARNING ENVIRONMENT
Organization
• Are the activities delivered in an organized manner?
• Are materials available and do transitions flow?
• Is a back-up plan available if conditions change?
Materials
• Are the materials appropriate for the students, aligned with the STEM learning goals, and appealing to the students?
Space Utilization
• Is the space utilized in a way that is conducive to out-of-school-time (OST) learning?
• Are there any distractions that impact the learning experience?

ACTIVITY ENGAGEMENT
Participation
• Are students participating in all aspects of activities equally?
• Are some students dominating group work?
Purposeful Activities
• Are the activities related to the STEM learning goals?
Engagement with STEM
• Are students doing the cognitive work while engaging in hands-on activities that help them explore STEM content?

STEM KNOWLEDGE AND PRACTICES
STEM Content Learning
• Is STEM content presented accurately during activities?
• Do the students' comments, questions, and performance during activities reflect accurate uptake of STEM content?
Inquiry
• Are students participating in the practices of scientists, mathematicians, engineers, etc.?
• Are students observing, collecting data, building explanations, etc.?
Reflection
• Do students have opportunities to reflect and engage in meaning-making about the activities and related content?

YOUTH DEVELOPMENT IN STEM
Relationships
• Are there positive student-facilitator and student-student interactions?
Relevance
• Is there evidence that the facilitator and students are making connections between the STEM content and activities and students' everyday lives and experiences?
Youth Voice
• Are students encouraged to voice their ideas/opinions?
• Do students make important and meaningful choices that shape their learning experience?

18 Shah, A., Wylie, C., Gitomer, D., & Noam, G. (2013). Technical Report for Dimensions of Success: An Observation Tool for STEM Programming in Out-of-School Time. Released by Program in Education, Afterschool, and Resiliency (PEAR) at Harvard University and McLean Hospital.


3.2.6 Focus Groups with Teachers

SoI teachers were invited to participate in one-hour focus groups during the site visit in order to gather
in-depth information about program implementation. The semi-structured interviews asked teachers about
student activities, curriculum planning and preparation, successful and challenging aspects of the
curriculum, resource availability and utilization, and professional development experiences.
With the permission of participating teachers, the focus group sessions were audiotaped, detailed notes
were taken, and the discussions were transcribed. To guide study team members in the facilitation of the
teacher focus groups, a protocol was developed and used (Appendix E). One focus group was conducted
at each of the eight camps visited. In total, 45 teachers participated in the focus groups conducted across
the camps.
3.2.7 Interviews with SoI Awardee PIs and Lead Center Staff

The study team conducted semi-structured, one-hour interviews with SoI PIs and Center education leads
at each of the Awardees/Centers involved in the evaluation. In addition, interviews were conducted with
the camp coordinators at the four camps associated with the NASA Centers included in the evaluation
(JSC and GRC). The interviews probed respondents on program structure and format, program
planning, the program's evidence base, content selection, recruitment strategies, registration processes,
training and support of educators leading the activities, data collection methods, parent and community
engagement, and the roles of partnerships. To help guide the interviews, the study team developed and
used an interview protocol, for which all study team members received training (see Appendix F). After
obtaining respondent agreement, these interviews were recorded and transcribed to ensure accuracy. All
SoI lead staff identified for interviews participated. Representatives from all four Centers and Awardees
as well as individuals from the Center camps participated in interviews between June and August 2013.
The interviewees included Awardee PIs, Center education leads and Center camp coordinators.

3.3 Study Sample

The FY2013 evaluation focused on investigating the effectiveness of the stand-alone SoI model under
optimal conditions. The stand-alone model may hold promise as a summer engagement program model
for middle school students that is replicable across the Federal government. Camps implementing the
stand-alone model offer middle school students a minimum dosage of 30 hours of selected NASA SoI
curricula, independent of other summer programs.
NASA identified a purposive sample of SoI stand-alone camps administered by Awardees or NASA
centers for the FY2013 evaluation. The selected camps were all previously funded in FY2012. Camps
were selected based on specific programmatic criteria. Eligible camps:
• Offered stand-alone SoI camp experiences, typically one week in length, utilizing NASA SoI curricula for a minimum of 30 hours during the camp; and
• Targeted rising 6th through 9th grade students.

NASA also selected camps based on demographic and geographic diversity and the total number of
students served. Logistical constraints were also considered such as the feasibility of the evaluation team
conducting site visits at multiple camps over the short camp operating timeframes. After identifying a
pool of eligible camps, NASA and the evaluation team held a series of conference calls with staff
representing each camp to gather information about camp length, dates, and projected size that was
necessary to finalize the camp selection.

NASA identified a sample of stand-alone camps administered by four Awardees and Centers: NASA
Glenn Research Center, NASA Johnson Space Center, Rio Grande Valley Science Association, and
Chester County Intermediate Unit. Eleven camps across these four Awardees/Centers participated in the
FY2013 evaluation.

3.4 Analysis

Both quantitative descriptive analyses and qualitative content analyses were conducted to investigate
questions of implementation.
3.4.1 Qualitative Data Analysis

At the conclusion of the data collection period, all members of the qualitative data collection team
reviewed the site visit notes and PI/Center education lead interviews and teacher focus group transcripts.
After closely reading these materials, the team discussed the themes that emerged, and through consensus,
crafted a coding scheme to apply to the qualitative data. The coding scheme also incorporated NASA’s a
priori topics of interest, guiding questions from the interview and focus group protocols, and patterns
emerging from the data.
An inductive coding process was used during preliminary analysis, when codes were refined and new
codes and sub-codes were generated in response to new emergent themes. During the coding process, the
study team held regular analytic meetings to review the data, discuss such themes, and agree on
refinements to coding and analysis strategies. Sub-codes were also used, as these are particularly useful
for those codes that have great frequency across respondents, allowing researchers to identify a next level
of emergent themes, and thus systematically provide more detail regarding frequently discussed topics.
The data were coded using NVivo, a qualitative software program that facilitates tagging and retrieval of
data associated with selected codes and sub-codes.
3.4.2 Quantitative Data Analysis

Descriptive Analyses

The study team cleaned and analyzed parent and student survey data and generated descriptive statistics at
baseline (i.e. counts, ranges, frequencies, means, and standard deviations) about key predictor and
outcome variables. Descriptive statistics of parent survey items include student demographics, parent
backgrounds, and parents’ motivations for registering their children for SoI. Descriptive statistics of
student survey items focus on variables such as student motivation, educational aspirations, and interest
and engagement in STEM. Descriptive statistics for DoS data include results of the four different domains
and their dimensions.
Means and standard deviations are used to describe central tendency and variation for survey items using
continuous scales. Frequency distributions and percentages are used to summarize answers given on
categorical scales. Because surveys were administered to the entire universe of students and parents within
the selected camps, the descriptive statistics for a single point in time do not need to be adjusted for
sampling design. Additionally, the high response rates to the parent and student surveys made it
unnecessary to make any adjustments for nonrespondents.
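The descriptive computations above amount to standard summary statistics: means and standard deviations for continuous-scale items, frequencies and percentages for categorical ones. A minimal sketch using Python's standard library; the variables are hypothetical stand-ins, not the actual survey items:

```python
import statistics
from collections import Counter

# Hypothetical responses to a continuous-scale item (e.g., a 1-4 interest scale)
interest = [3, 4, 2, 4, 3, 3, 1, 4]
mean = statistics.mean(interest)   # central tendency
sd = statistics.stdev(interest)    # variation (sample standard deviation)

# Hypothetical categorical item: frequency distribution and percentages
grades = ["6th", "6th", "7th", "8th", "6th", "9th"]
freqs = Counter(grades)
pcts = {g: 100 * n / len(grades) for g, n in freqs.items()}

print(f"mean={mean:.2f}, sd={sd:.2f}")
print(pcts)
```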

3.5 Limitations

This evaluation and its data collection contribute valuable information about the SoI project. Nonetheless,
the evaluation is not without limitations.

In particular, the evaluation is limited in its ability to generalize findings to camps outside of those in the
study sample. The camps that participated in the FY2013 evaluation were purposively selected to ensure
the evaluation would improve NASA’s understanding about the successes and challenges, and, ultimately,
the replicability of the most promising SoI program model. The evaluation examined the stand-alone
model implemented at camps with at least 75 students. Because the sample of camps was purposively
selected, they might have characteristics that are systematically different from the full population of camps
and even the population of camps that implement the stand-alone model. Therefore, it cannot be assumed
that the sample is representative of all SoI camps, nor that it is representative of all stand-alone SoI
camps. The focus on understanding SoI in camps with past success implementing the stand-alone model
means that the FY2013 evaluation cannot make statements about the SoI project as a whole or about all
stand-alone camps.
The Outcomes Report will include a discussion of limitations pertinent to the examination of change in
student outcomes.


4. Findings
In this chapter, we present findings related to the implementation of the stand-alone program model in
selected SoI camps, drawing on data collected through parent and student surveys, registration data from
camps, site visit observations, focus groups with teachers, and interviews with PIs and lead staff.
The first section focuses on student and parent demographic characteristics. These data indicate that the
study camps primarily served students underrepresented in STEM and that about one-third of the parents
have a college degree in a STEM field and/or work in a STEM occupation.
The second section describes how parents learn of SoI, the reasons given for participation, and the extent
to which students are repeat participants. The findings reveal that parents most often hear about SoI
through personal communications; parents indicate that their children attend SoI camps to learn more
about NASA, space, and science; students' reasons for attending SoI camps include having fun and
learning about science, NASA, and space; and one-third of the participants had attended SoI in the past.
The third section describes student interest in science and participation in science activities at baseline.
The findings document that participants have high educational expectations; that SoI students generally
enjoyed their previous science class; and that SoI students engage in a variety of science leisure activities.
The fourth section provides descriptive statistics of ratings on the DoS observational tool used to describe
the quality of camp classes. Overall, classrooms scored highly on nine of the 12 dimensions. More than
half the classrooms observed received a rating of “3” or “4” on the dimensions related to organization,
materials, space utilization, participation, purposeful activities, engagement with STEM, inquiry,
relationships, and youth voice. However, fewer than half the classrooms received a rating of “3” or “4” on
the dimensions of STEM content learning, reflection, and relevance.
The fifth and sixth sections describe the responses from teachers, PIs and lead camp staff during
interviews and focus groups. Each group of respondents noted the importance of the hands-on nature of
the activities in engaging students. Their responses also detailed the large effort that goes into successful
implementation of SoI starting from recruitment and training, and extending to planning and
implementation of activities.

4.1 Parent and Student Characteristics

Parent surveys, collected along with camp registration materials, provide information on the background
characteristics of SoI students and parents. 19
4.1.1 Parent Characteristics

The parent survey collected information about parents’ educational attainment and experience in STEM.
Exhibit 4.1 shows that about half (48.9 percent) of SoI parents at the study camps have at least a
Bachelor’s degree. However, almost one-third of the parents had only a high school diploma or GED. Of
the parents with an Associate’s degree or higher, about one-third reported that their degree was in a
STEM field, and just over one-third reported working in a STEM field.
19 Parent surveys were received from a total of 1,121 parents, but only 1,012 students completed baseline
surveys and have parent consent. Parent characteristics for these two groups did not differ substantially;
therefore, the data presented in this chapter are based on the 1,012 parent surveys for the students included
in the analytic sample (see Appendix G for a comparison of parent characteristics from the two samples).

Abt Associates Inc.

▌pg. 15

Exhibit 4.1  Parent Characteristics

                                                     %
Highest Level of Education Completed (n = 964)
  Less than high school                            8.8
  High school diploma or GED                      28.8
  Associate's degree                              13.4
  Bachelor's degree                               28.3
  Graduate degree                                 20.6
Degree in STEM Field (n = 592)a
  Yes                                             36.0
  No                                              64.0
Work in a STEM Occupation (n = 584)a
  Yes                                             39.7
  No                                              60.3

a: These questions were only asked of parents who indicated they completed an Associate's degree or higher.
Source: Summer of Innovation 2013 Parent Survey

4.1.2 Student Demographics

The largest group of students in the sample were rising 6th graders, and almost all (91 percent) were
entering 6th through 8th grade in fall 2013 (Exhibit 4.2). Fewer than 10 percent of these students were
entering high school in fall 2013. At the study camps, slightly more male than female students
participated, and the majority (70 percent) of SoI students were from an ethnic or racial group
underrepresented in STEM (i.e., Hispanic, Black, American Indian or Alaska Native, and Native
Hawaiian/Pacific Islander students); Hispanic students and Black students were the largest minority groups.
Exhibit 4.2  Student Demographic Characteristics

                                                     %
Rising Grade Level (n = 1012)
  6th Grade                                       39.7
  7th Grade                                       29.2
  8th Grade                                       22.2
  9th Grade                                        8.9
Gender (n = 1005)
  Female                                          42.8
  Male                                            57.2
Ethnicity (n = 995)
  Hispanic or Latino/Latina                       32.0
  Not Hispanic or Latino/Latina                   68.0
Race (n = 1008)a
  American Indian or Alaska Native                 1.7
  Asian                                            9.6
  Black or African American                       38.0
  Native Hawaiian or other Pacific Islander        0.1
  White                                           44.7
Underrepresented in STEMb
  Yes                                             70.1
  No                                              29.9

Notes:
a: Responses do not sum to 100 because more than one response could be selected.
b: Includes students who reported ethnicity as Hispanic or Latino/Latina or race as American Indian or Alaska Native, Black or African
American, or Native Hawaiian or other Pacific Islander. Asian students are not considered underrepresented in STEM.
Source: Summer of Innovation 2013 Parent Survey and camp registration data
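The underrepresented-in-STEM indicator defined in note b amounts to a simple coding rule over the ethnicity and race responses. The sketch below is illustrative only; the function and field values are hypothetical stand-ins, not drawn from the study's actual data files.

```python
# Illustrative coding of the "underrepresented in STEM" indicator described
# in Exhibit 4.2, note b. Response strings mirror the table's categories;
# the function name is a hypothetical convenience, not the study's code.
UNDERREPRESENTED_RACES = {
    "American Indian or Alaska Native",
    "Black or African American",
    "Native Hawaiian or other Pacific Islander",
}

def underrepresented_in_stem(ethnicity: str, races: list[str]) -> bool:
    """A student counts as underrepresented if Hispanic/Latino ethnicity is
    reported, or if any reported race is in the underrepresented set.
    Asian and White responses alone do not qualify (per note b)."""
    if ethnicity == "Hispanic or Latino/Latina":
        return True
    return any(r in UNDERREPRESENTED_RACES for r in races)
```

Because more than one race could be selected (note a), the rule checks every reported race, so a student reporting both Black or African American and White still qualifies.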


4.2 SoI Participation

In addition to questions about their background, parents were asked how they heard of SoI (Exhibit 4.3).
Personal communication was the primary method of outreach: teachers, the local school or community
center, and friends or family members were all frequently reported as sources of information about SoI.
Print announcements or advertisements distributed in the mail or in a newspaper were the least frequently
reported sources of information.
Exhibit 4.3  Parent Reports of Sources for Hearing of SoI
[Bar chart showing the percent of parents (0 to 100) reporting each source: a teacher; school or community
center; a friend or family member; other; Web/Internet search; received something in the mail; newspaper
or other advertisement.]
Notes: n = 1005
Source: Summer of Innovation 2013 Parent Survey

Information on reasons for participation in SoI was collected from both parents and students (Exhibit 4.4).
The top two reasons cited by parents (light gray bars) were to learn more about NASA and space and to
learn more about science; each of these reasons was cited by about 80 percent of parents. Fewer than
one-third of the responding parents indicated that their child attended an SoI camp “to have something to do.”
Students’ reasons for signing up for SoI largely reflected their parents’ responses.
While students (dark gray bars) most commonly reported “to have fun” as a reason for signing up for SoI,
the second and third most common reasons (to learn more about NASA and space and to learn about
science) were also the top two reasons cited by parents. Fewer than one-quarter of the students reported
signing up for SoI to make their parents happy.


Exhibit 4.4  Reports of Reasons for Participation in SoI
[Bar chart showing the percent of parents (light gray bars) and students (dark gray bars), from 0 to 100,
reporting each reason: to have fun; to learn more about NASA and space; to learn more about science; to
have something to do; to learn about what scientists and engineers do; to do well in school; to meet others
with similar interests; not sure.]
Notes: Parent n = 1005; student n = 1012. Responses do not sum to 100 because more than one response could be selected.
Source: Summer of Innovation 2013 Parent and Student Baseline Surveys.

To understand year-to-year student retention, students were asked whether they had participated in SoI in
a prior year. Exhibit 4.5 shows that about one-third of students reported previously participating in SoI,
while for over one-half of the students, summer 2013 was their first time participating.


Exhibit 4.5  Proportion of Students Who Had Previously Attended SoI
[Pie chart: No, 55%; Yes, 34%; Don't Know, 11%.]
n = 1010
Source: Summer of Innovation 2013 Student Baseline Survey

4.3 Student Baseline Interest in Science

The student baseline survey included questions that asked SoI students about previous experiences with
science in school and participation in science activities as well as a series of questions to gauge their level
of interest in science.
Responses revealed that students had high educational aspirations. Exhibit 4.6 shows that two-thirds of the
respondents expect to attain at least a Bachelor’s degree, and most of those students expect to go on to
complete a graduate degree. Given that in 2009 only 10 percent of adults in the U.S. held a graduate
degree (NCES, 2011), the SoI students appear to have high educational aspirations.
Exhibit 4.6  Student Expectations of Educational Achievement

As things stand now, how far in school do you think you will get? (n = 1009)      %
  High School or Less                                                           5.9
  Associate's College                                                           2.2
  Bachelor's Degree                                                            10.6
  Graduate Degree                                                              56.0
  Don't Know                                                                   25.4

Source: Summer of Innovation 2013 Student Baseline Survey

Students also reported information regarding the last science class they took in school. Exhibit 4.7 shows
that 29 percent of the students reported taking a general science class last year, another 27 percent did not
know, and 15 percent reported taking multiple classes. These results likely reflect that most of the
responding students were in the lower middle school grades, where they often take a general science class
that covers multiple science topics and typically have no choice regarding their science classes. In the
cases where students did report a specific science class, life science and earth science were most
commonly cited.
Exhibit 4.7  Last Year's Science Class
[Pie chart: General Science, 29%; Don't Know, 27%; Multiple Responses, 15%; the Life, Earth, Physical,
Integrated, and Other categories account for the remaining slices of 9%, 8%, 6%, 4%, and 2%.]
n = 1000
Source: Summer of Innovation 2013 Student Baseline Survey

Exhibit 4.8 demonstrates that students had positive attitudes about their previous science class.
Eighty-nine percent of students reported that they often or sometimes enjoyed their last science class, 83
percent reported never or rarely thinking the class was a waste of their time, and 70 percent said that they
were rarely or never bored in their last science class.
Exhibit 4.8  Students' Opinions of Last Year's Science Class

                                                       %
                                         Never   Rarely   Sometimes   Often
I enjoyed this class very much             2.9      7.6        39.4    50.1
I thought this class was a waste
of my timea                               61.4     21.7        12.7     4.2
I thought this class was boringa          40.2     30.1        22.2     7.5

Notes:
a: Because these statements are negatively worded, the responses of never and rarely reflect positive opinions of students’ last science classes.
n = 982 to 990
Source: Summer of Innovation 2013 Student Baseline Survey
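The aggregate figures quoted in the text follow from summing adjacent response categories in Exhibit 4.8. As a minimal arithmetic check, with the table's rounded percentages hard-coded for illustration (variable names are ours, not the study's):

```python
# Check of the narrative aggregates against Exhibit 4.8's rows.
# Each tuple holds the table's percentages in the order
# (never, rarely, sometimes, often).
enjoyed = (2.9, 7.6, 39.4, 50.1)
waste_of_time = (61.4, 21.7, 12.7, 4.2)
boring = (40.2, 30.1, 22.2, 7.5)

# Sums to 89.5 from the rounded table values; the text reports 89 percent.
often_or_sometimes_enjoyed = enjoyed[2] + enjoyed[3]
# Sums to 83.1; the text reports 83 percent.
never_or_rarely_waste = waste_of_time[0] + waste_of_time[1]
# Sums to 70.3; the text reports 70 percent.
never_or_rarely_bored = boring[0] + boring[1]
```

Small discrepancies between these sums and the narrative (e.g., 89.5 versus "89 percent") are consistent with the table values themselves being rounded to one decimal place.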

In addition to generally high opinions of in-school science classes, Exhibit 4.9 shows that students and
their parents reported participating in various out-of-school time science activities. Watching science
related television programs; playing with science games, kits, or experiments at home; accessing websites
for computer technology information; and reading science books and magazines were all activities that
over one-half of students and their parents reported students participated in at least once over the
course of the previous school year. Visiting a science museum, planetarium, or environmental center
was reported by fewer students and parents.
While the majority of responding students and their parents reported participation in various out-of-school
leisure activities, few reported that students had participated during the previous school year in organized
science activities such as a science camp, science competition, science club, or science study
group/tutoring program (Exhibit 4.9). Fewer than eight percent of students (dark gray bars)
reported participating in a science club, and fewer than nine percent participated in a science
study group or tutoring program; parent reports (light gray bars) of these activities were in line with
students’ reports. Students and their parents more frequently reported participation in the organized
out-of-school activities of science competitions and science camps, but no more than 22 percent of
students reported participating in either activity.
Exhibit 4.9  Participation in Out-of-School Science Activities
[Bar chart showing the percent of parents (light gray bars) and students (dark gray bars), from 0 to 100,
reporting each activity: watching programs on TV about nature and discoveries; playing games or using
kits or materials to do experiments or build things at home; accessing web sites for computer technology
information; reading science books and magazines; visiting a science museum, planetarium, or
environmental center; science camp; science competition; science study groups or tutoring program;
science club; none of these.]
Notes: Parent n = 983; student n = 1012. Responses do not sum to 100 because more than one response could be selected.
Source: Summer of Innovation 2013 Parent and Student Baseline Surveys.

Baseline survey data also suggested that students rated highly on outcomes of interest to the SoI project.
Students who participated in SoI expressed high existing interest in science prior to the start of the
summer activities. Exhibit 4.10 displays SoI students’ mean level of interest in science, as measured by
the Enthusiasm for Science scale, relative to the Common Instrument (CI) benchmark comparison sample
of middle school students involved in out-of-school science programming. The SoI student mean on the
scale is slightly higher than that of the CI benchmark sample and both means are greater than the scale
mid-point, suggesting that students in both groups are interested in science.
Exhibit 4.10  Common Instrument Enthusiasm for Science Measures

                                                      SoIa                    CI Benchmarkb
                                              Mean  SD   Min  Max     Mean  SD   Min  Max
Enthusiasm for Science Scalec                  3.0  0.5  1.1  4.0      2.9  0.6  1.0  4.0
Supplemental Itemsd
  Before joining this program, I was
  interested in science and science-
  related things.                              3.0  0.9  1.0  4.0      3.1  0.9  1.0  4.0
  Before joining this program, I
  participated in science activities
  outside of school.                           2.4  1.0  1.0  4.0      2.6  0.9  1.0  4.0

a: Source: Summer of Innovation 2013 Student Baseline Survey
b: Source: Program in Education, Afterschool & Resiliency (PEAR) Common Instrument national benchmarking data. More information on PEAR
can be found here: http://www.pearweb.org/
c: SoI n = 1005; CI n = 1173
d: SoI n = 995; CI n = 442

Further, students scored higher than the NAEP national sample and the CI benchmark comparison sample
of middle school students on selected science-related NAEP items. Exhibit 4.11 shows that, on average,
SoI students more frequently reported doing science-related activities outside of schoolwork, considering
science a favorite subject, and aspiring to have a science or computer job.
Exhibit 4.11  Selected Items from NAEP

                                                    SoIa                 CI Benchmarkb        NAEPc
Selected NAEP Items                         Mean  SD   Min  Max    Mean  SD   Min  Max   Mean  Min  Max
I do science-related activities that
are not for schoolwork                       2.6  0.9  1.0  4.0     2.6  1.0  1.0  4.0    2.1  1.0  4.0
Science is one of my favorite subjects       3.0  1.0  1.0  4.0     2.8  1.1  1.0  4.0    2.4  1.0  4.0
I take science only because I have to        1.9  1.0  1.0  4.0     2.1  1.0  1.0  4.0    2.4  1.0  4.0
I take science only because it will
help me in the future                        2.5  1.0  1.0  4.0     2.5  1.0  1.0  4.0    2.7  1.0  4.0
I would like to have a science or
computer job in the futured                  2.6  1.0  1.0  4.0      -    -    -    -     2.2  1.0  4.0

a: Source: Summer of Innovation 2013 Student Baseline Survey. Ns range from 974 to 997.
b: Source: Program in Education, Afterschool & Resiliency (PEAR) Common Instrument national benchmarking data. More information on PEAR
can be found here: http://www.pearweb.org/ Ns range from 929 to 1155.
c: Source: 2009 8th grade National Assessment of Educational Progress (NAEP) national data, available online at
http://nces.ed.gov/nationsreportcard/naepdata/report.aspx Ns and standard deviations for these items were not available.
d: In NAEP's original survey, this item is worded, "When I graduate from high school, I would like to have a job related to science." NAEP data
are based on responses to this question, not to NASA's phrasing (above). National data for this question were not available from PEAR. Also,
while the NAEP data for all of the other items included in the table came from a national sample of 8th graders in 2011, the data reported for this
item are from a national sample of 12th graders in 2009.

These findings document that SoI students at the participating camps had, on average, greater interest in
science than their peers. On many of the items reported above, the average SoI student scored above the
median score. This suggests that there is limited room for SoI students to increase their interest in
measurable ways, given the ceilings these question scales impose on student responses.

4.4 Site Visit Observations

Classroom student activities were observed during site visits to gather information about the
implementation of SoI activities. Data were collected by certified observers from observations of 16 SoI
classrooms at eight camps, using the Dimensions of Success (DoS) observation tool.20 Observers rated
each of the 12 dimensions on a four-level evidence scale, with “1” representing absent evidence and “4”
representing compelling evidence.
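The "3 or 4 = high quality" convention used throughout this section amounts to a simple tally of the share of observations at or above that threshold. The sketch below is illustrative only; the function name and the example ratings are hypothetical, not the study's actual observation data.

```python
# Hedged sketch of the report's quality convention: the share of
# observations on a dimension rated "3" or "4" on the DoS 1-4 scale.
def percent_high_quality(ratings: list[int]) -> float:
    """Return the percent of ratings that are 3 or 4."""
    if not ratings:
        raise ValueError("no ratings supplied")
    high = sum(1 for r in ratings if r >= 3)
    return 100 * high / len(ratings)

# Example with made-up ratings for a single dimension:
# percent_high_quality([4, 4, 3, 2, 3, 1, 4, 3]) returns 75.0
```

Percentages such as "Seventy-eight percent of classrooms observed received a rating of '3' or '4'" in the subsections below are statistics of exactly this form, computed per dimension over the observed sessions.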
4.4.1 Ratings on Dimensions of Success

Exhibit 4.12 displays the distribution of ratings on each of the 12 dimensions. In general, the sessions
observed received high ratings, following the guideline that ratings of “3” or “4” represent high quality on
a given dimension. However, three dimensions—STEM content learning, reflection, and relevance—
received low ratings across the majority of classroom sessions observed; more than half of the sessions
scored a “1” or “2” on these dimensions. The ratings on each dimension are discussed in greater detail
below, grouped by the DoS tool’s four domains: features of the learning environment, activity
engagement, STEM knowledge and practices, and youth development in STEM.

20 Shah, A., Wylie, C., Gitomer, D., & Noam, G. (2013). Technical Report for Dimensions of Success: An
Observation Tool for STEM Programming in Out-of-School Time. Released by Program in Education,
Afterschool, and Resiliency (PEAR) at Harvard University and McLean Hospital.


Exhibit 4.12  DoS Ratings of Classroom Observations
[Stacked bar chart showing, for each of the 12 DoS dimensions, the number of classroom observations
receiving each rating from “1” to “4”. Dimensions are grouped by domain: Features of the Learning
Environment (materials, organization, space utilization); Activity Engagement (participation, purposeful
activities, engagement with STEM); STEM Knowledge and Practices (STEM content learning, inquiry,
reflection); Youth Development in STEM (relationships, relevance, youth voice).]
Note: The numbers inside the bars represent the number of classroom observations with a given rating. For example, 14 classroom observations
received the rating of "4" on the materials dimension, whereas one classroom observation received a "4" rating on the relevance dimension.

Features of the Learning Environment

The learning environment domain includes dimensions of organization, materials, and space utilization.
These measure the features that make the environment suitable for informal STEM programming.
Organization. Seventy-eight percent of classrooms observed received a rating of “3” or “4” on the
organization dimension, which reflects the planning and preparation conducted for the STEM activity,
and considers the availability of materials used in the activity, the facilitator’s ability to adapt to changing
situations, and the fluidity of transitions during the session. For example, in one of the classrooms that
scored a “4” on this dimension, there were sufficient materials for each team to build the rovers they had
designed using the materials of their choice. There was sufficient time allotted for each portion of the
design process, including time for students to demonstrate their rovers at the end of the session.
In contrast, visitors observed challenges with both the availability of materials and adaptation to changing
situations in a classroom that was rated a “1”. In this class, there was only one hole-puncher available for
the entire class, even though the activity required each team to use it. While the facilitator circulated with
the single tool, teams were left to wait. In addition, the facilitator was challenged when the session period
was extended by 15 minutes. Initially, a video was started, but there was no sound, so the facilitator read
from an information sheet about kites to fill the extra time.
Materials. Eighty-four percent of classrooms observed received a rating of “3” or “4” on the materials
dimension, which assesses both the appropriateness and appeal of the materials used in the STEM
learning activity. In one of the classrooms that received a rating of “4”, the materials for the activity were
both appropriate and appealing, and they supported the learning goals. Materials included a cup, balloon,
scissors, tape, marshmallows, stacking cups, paper, string, and rubber band. As soon as the teacher gave
instructions and passed out the materials students were eager to get started with the activity and to go
outside to launch marshmallows with the launchers they designed.
In contrast, in a classroom that received a rating of “1”, there were not enough yardsticks for all of the
students to measure kite dimensions, resulting in students waiting for long periods of time, unable to
engage in the activity. In addition to being insufficient in number, the yardsticks themselves were
inappropriate for the activity because they were much longer than the kites, making the collection of kite
dimensions unwieldy.
Space utilization. Sixty-nine percent of classrooms observed received a rating of “3” or “4” on the space
utilization dimension, which gauges the extent to which the physical space in which the STEM activity is
held is conducive to out-of-school-time STEM learning. In a classroom that received a rating of “4”,
small desks were pushed together as tables for groups of students. There was enough space for students to
complete their work and move around their tables, and there was room for the teacher to circulate around
the space checking on student progress. The hallway and large open indoor areas were used for testing
because it was raining outside. In contrast, a classroom that received a rating of “2” was too small for the
number of students. While the desks were pulled together to make more space for students to work, the
groups of desks were loosely formed and difficult to distinguish. There was little room for students to
move or work, and the teacher had trouble moving from group to group because of the limited space
between tables.
Activity Engagement

The activity engagement domain dimensions—student participation, purposeful activities, and student
engagement with STEM—assess how the activity engages all students.
Participation. Seventy-two percent of classrooms observed received a rating of “3” or “4” on the
participation dimension, which reflects the extent to which students are visibly and audibly participating
in the activities. In a classroom that received a rating of “4”, all students were actively participating and
following directions throughout the entire observation, during each part of the observed activity:
constructing the plane, testing, re-designing, playing the hangman game, experimenting with the
propulsion device, and more re-designs. As soon as the teacher distributed the materials, the students
started working on building their designs. Students collaborated in their groups; the first team to finish
yelled, “We got it!” and jumped up and down, giving each other high-fives.
In contrast, in a classroom that received a rating of “1”, the majority of students were off-task throughout
the lesson. At any given point in the session, over half the students were off task: walking aimlessly
around the room, poking each other with the rockets, entertaining themselves with their cellphones,
texting, throwing materials, and talking and laughing. The teacher circulated but did not attempt to
re-engage students who were off-task.
Purposeful activities. Seventy-five percent of classrooms observed received a rating of “3” or “4” on the
purposeful activities dimension, which focuses on the structure of the learning activities, measuring the
extent to which the students understand the goals of activities and the connections between them, as well
as the amount of time spent on activities that relate to the STEM learning goals. In one classroom that was
rated a “4”, all portions of the activity clearly aligned with the learning goals. First, students read
background material about the Wright brothers and the first gliders. Then, students worked on design,
constructed their design glider, tested the glider, re-designed, tested, added new design features, and
tested again. As students were testing their gliders they were talking about adding weight during the next
re-design (“that looks like it might have too much weight in the front”). The process of designing, testing,
redesigning reinforced the aerodynamic concepts the students were learning. The teacher managed the
time well, giving students enough time to complete each construction/re-design phase.
Conversely, in a classroom that was rated a “2”, the goals related to science content or engineering were
unclear. Students were manipulating beams and funnels in order to move marbles, but did so without
explicit consideration of what was needed to build the most efficient run for the marbles or that they were
engaged in an engineering design challenge. The activity mostly involved cutting and folding paper to
make funnels, which distracted students from the engineering design process.
Engagement with STEM. Seventy-eight percent of classrooms observed received a rating of “3” or “4” on
the engagement with STEM dimension, which measures the extent to which students are working in a way
that is both “hands-on” and “minds-on”, rating both the type of activities and the type of learning
experience (i.e., passive versus active learning). Note that the DoS distinguishes the hands-on, minds-on
nature of the activities from the STEM content learning of the activity. In a class that received a rating of
“4”, the activities were hands-on and minds-on. The teacher connected Newton’s laws of motion to the
activity to support the STEM learning goal. Students engaged in a hands-on activity that involved
building their own Alka-Seltzer fueled rocket. They needed to figure out on their own where they would
place fins and the measurements for the body and cone of the rocket. The students also needed to think
about whether they wanted to use clay to hold their Alka-Seltzer, how much water they wanted to use and
the amount of Alka-Seltzer they would put in their rocket. Students were engaged in a discussion
regarding Newton’s law and chemical reactions throughout the hands-on activity.
Classrooms that rated low on this dimension typically involved hands-on activities, but the teachers did
not link the activity with STEM content. For example, instead of designing a rocket, students in one
classroom followed step-by-step instructions to construct an Alka-Seltzer-powered rocket.
STEM Knowledge and Practices

The STEM knowledge and practices domain consists of dimensions of students’ STEM content learning,
use of inquiry, and extent of student reflection. These dimensions gauge the extent to which the activities
help students understand STEM concepts, make connections, and participate in practices of STEM
professionals.
STEM content learning. Forty-four percent of classrooms observed received a rating of “3” or “4” on the
STEM content learning dimension, which considers the support students receive to build their
understanding of STEM concepts through the activities implemented. In a classroom that received a “4”
rating, the teacher presented STEM content accurately, making various connections with Newton’s Laws of
Motion and chemical reactions related to the hands-on activity of building an Alka-Seltzer-powered
rocket. By contrast, in a classroom that received a rating of “1”, the facilitator confounded the
requirements for life with the characteristics of living things. For example, a list generated by the class
included “shelter” and “food” (requirements of living things) as well as “cells” and “animals”. This
confounding made the purpose of the experience opaque for students, and made the activity confusing.
There were also errors introduced by the teacher, such as confusing oxygen with food, calling a slug an
insect, and stating that the presence of gas in the jar with yeast, sugar, and water was evidence of life,
when it is the product of a chemical reaction that does not necessarily indicate life.
Inquiry. Seventy-eight percent of classrooms observed received a rating of “3” or “4” on the inquiry
dimension, which reflects the engagement of students in STEM practices, such as making observations,
asking questions, developing and using models, planning and carrying out observations, analyzing and
interpreting data, engaging in argument from evidence, and sharing findings with peers.

In one classroom that was rated a “4”, students had the opportunity to ask questions, make observations,
collect data, develop models and share findings with their peers and the teacher. Students designed cars,
made predictions about how their cars would perform, built the cars, tested the cars, recorded data and
redesigned their models. The activity culminated in a competition between two classes. As part of the
competition, students described their car designs, tested the cars, recorded data for the distance traveled
by the cars, summarized the data and declared one class the “winner” because their cars went the farthest.
This contrasted with a classroom that received a rating of “2”, where students followed specific
instructions in building launchers, and when they tested the launchers, they just launched marshmallows
one after another, without any discussion about the results of the testing.
Reflection. Thirty-one percent of classrooms observed received a rating of “3” or “4” on the reflection
dimension, which measures the amount of explicit reflection on STEM content during the activity and the
degree to which student reflections are deep and meaningful, supporting connection-building between
concepts. In one classroom rated “4”, the teacher posed questions during the activity such as, “What did
you predict? How will you modify your design to change the result?” At the end of the activity, the
students responded to questions about how they had designed their cars and about what kinds of changes
to the design resulted in changes in the distance their cars traveled. During the conversation, connections
to Newton’s Laws and different types of energy were drawn. However, in many of the classrooms that
received lower ratings, site visitors observed opportunities both during and after SoI activities where
instructors could have, but did not, ask questions to prompt students to reflect more deeply on their own
learning.
Youth Development in STEM

The youth development in STEM domain includes the dimensions of relationships, relevance of activities,
and presence of youth voice. These dimensions measure whether interactions encourage student
participation in activities, whether activities are relevant to students’ lives and experiences, and whether
students are encouraged to voice their opinions and make meaningful decisions.
Relationships. Ninety-four percent of classrooms observed received a rating of “3” or “4” on the
relationships dimension, which assesses the nature of the relationships between the facilitator and the
students and among the students themselves, gauging from conversations and actions whether the
interactions suggest warm, positive relationships. In most classrooms, teachers had positive rapport with
their students, and peer interactions were positive and friendly.
In one classroom that received a “4” rating, peer interactions and interactions between teacher and
students were warm, positive and friendly. Students were laughing, joking and talking about content with
each other. The teacher talked to students as a group and individually, called students by name and
engaged with students as questions arose. This contrasted with the impersonal exchanges in a classroom
that received a rating of “2”. There were no negative interactions, but students did not interact with each
other except to hand materials to one another. Further, the facilitator did not call students by their names,
nor was any praise given for student ideas.
Relevance. Twenty-two percent of classrooms observed received a rating of “3” or “4” on the relevance
dimension, which focuses on the extent to which the facilitator guides students in making connections
between the STEM activities and their own lives or experiences, situating the activity in a broader
context. Across the classrooms, while some connections were made to NASA missions, instructors made
few connections to students’ lives.
In a classroom that was rated a “3”, the facilitator made connections, for example, equating the rudder
on a plane to the rudder on a boat (water pushes the rudder on a boat, air pushes the rudder on an airplane)
and relating the rudder to what students have seen on airplanes in their own experience. However,
students did not engage in the discussion of relevance to their experiences. Classrooms that were rated a
“1” lacked any discussion of the application of the STEM content beyond the camp activity.
Youth voice. Fifty-six percent of classrooms observed received a rating of “3” or “4” on the youth voice
dimension, which reflects the ways in which the STEM activity allows students to use their voices and
fulfill roles that offer them personal responsibility. There was a balance between classrooms where students were given an active role in guiding their learning and classrooms where the activities were predominantly directed by teachers.
In one classroom rated a "4", the students were encouraged to make a plan for their design and to share their ideas with one another. There were opportunities for students to make important decisions about their
design such as figuring out how a cup will carry a marble down the zip line and how the marble would be
released. As part of this process, students shared opinions about their decisions and collectively made
decisions about design, testing, and redesign. In contrast, in a classroom that was rated a "1", the activity was largely instructor-directed. Students looked to the facilitator to make decisions about the selection of project components, and the teacher did not attempt to encourage student decision-making or to solicit students' opinions on the decisions being made.

4.5 Teacher Focus Groups

Focus groups conducted with teachers during the site visits provided insight into the implementation of
student activities, the planning that went into the summer camp activities, the resources that were required
to successfully implement camps, and the professional development that had been offered.
4.5.1 Student Activities

Teachers described the ways in which they structured the activities and the classes, the hands-on nature of
the activities that students engaged in, the benefits they perceived for their students, and the challenges
they faced in implementing the activities.
Perceived benefits and successes

Most teachers described organizing and structuring activities through a similar routine; they devoted a
small amount of time to presenting background information and activity goals to the students—sometimes
with handouts or visual information posted on a blackboard, whiteboard, or projector—and then they had
the students break into groups to engage in the activity. The nature and number of the activities
themselves varied depending on the teacher, camp, and the daily plan. One teacher stated:
I do three or four activities a day. I break them up. I give them the times [for the activities] and
all of them they do in groups. I give them time for individual planning, then team planning, then
they have time to build it, and then time to test it, and then the challenge. So that’s how I do my
activities.
Most teachers indicated that they allow time for groups to show and/or demonstrate the final products
from their activity, with competitions being a favorite format. One teacher noted that “whenever we make
it into a competition, [the students] love that.”
Most activities occurred in groups, but a few teachers occasionally had students work independently. “It
varies because we have a lot of individual activities and some that are more geared toward group-centered
activities, team building and working." Students in groups were typically in the same or adjacent grades, mixed in gender, and sometimes included students who knew one another from their schools or district.
However, some teachers made efforts to create groups composed of students unfamiliar with one another, and others changed groups each day so that all students would be included and have opportunities to make new friends. Additionally, some teachers described the groups as functioning
science teams; that is, each member had a specific role for the activity. One teacher stated:
… within those groups, breaking them into their job duties, which from our session was project
manager, logistics, engineer, and … [other teachers contribute names] reporter,
communications, scientist. From there, they got to know each other in the group and I explained
to them if you can’t put a light bulb in a lamp, don’t sign up for engineer. Go where you’re
strongest at so when you’re collectively together your project will be great.
Overall, teachers indicated that the activities they conducted with students were based almost exclusively
on SoI curricula and materials, though occasionally there were curriculum modifications and/or outside
materials introduced. Teachers cited the satellite, rocket, and volcano activities as the most popular
activities. Generally, they described the activities as being fun and hands-on. One teacher stated,
“Everything is exploring. The activities they have given us for the rocketry, it gives the opportunity to
build, rebuild, redesign, come up with a hypothesis.” According to another teacher, “once [students] got
their hands-on activities, it just continued right until the afternoon. As teachers, we see that and we say
keep going! That hands-on stuff is great… when they create its good stuff!” Moreover, teachers
described their students as having more motivation and engagement precisely because of the hands-on
activities. Students often wanted to take their activity projects (e.g., rockets, roller coasters, cranes) home with them, or to continue working on their activities.
I had this one student who came back today and she said ‘guess what I did?’ ‘I said, what did you
do?’ She said, ‘Well I went home and showed my mom and my brother how to make their own kites,
so now we have kites hanging from our room’s ceiling.’
The activities also provided opportunities for students to experiment and engage in the engineering design
process. Teachers indicated that students appreciated being able to revise their designs. “What I find that
they respond to well is when we say their first try is their prototype. The idea is that they get to do it over.
They really like that second time.”
Teachers perceived the activities as a chance for students to engage in the type of work and cognitive
application of a career scientist. Some topics were new to students, and a few teachers noted that the
application or connection between the activity and real life was successful. One teacher stated that some
of her students, upon finishing an activity on aeronautics, wanted to be pilots and flight attendants.
Another teacher stated that it was the career aspect of the activities that she used to motivate students: “I
started motivating my students by telling them, you know what is in demand [in] jobs today, science,
engineers, building. So that’s what you’re going to do today. You’re going to build something today that
in the near future you can make yourself.”
Many teachers attributed the success of the hands-on activities to their contrast to formal school
environments, where “tests” and direct instruction are prevalent. “Kids were so used to being told what to
do, so to tell them to think about it and watch them think about it and try different things and find
something that works was something a lot of them haven’t experienced.” Teachers also appreciated the
departure from more structured classes. As one teacher noted, “it’s like the crane activity today—I’d love
to incorporate that into my regular class but with 25-30 kids there’s no way I could do that.” Similarly,
teachers enjoyed the “freedom” they had in planning the SoI activities; if an activity or topic was not
working well, the teacher could simply “move on,” unlike the school year where a teacher may be “stuck
to a rigorous curriculum … because you are getting ready for the tests that we have.”
In addition to students’ enthusiasm, engagement and interest in careers, teachers also described several
other perceived outcomes for students, including increased content understanding and teamwork skills.
One teacher described the ways in which students’ enthusiasm translated to retaining important concepts:
“At the beginning of each day we’ve asked them questions from the previous day . . . I’d say well over
half the students could write down at least two if not three of Newton’s laws... word for word. I mean,
they were getting them." In addition, it was in the group activities that students learned to cooperate, listen, and share ideas.
That’s where you start to see the kids cooperate. They start to realize, ‘maybe it’s not just my
opinion that matters and going back and retrying after maybe not being so successful… They get
to communicate with each other.
Challenges

Implementing, monitoring, and preparing for student activities were not accomplished without challenges.
While teachers did not describe or otherwise imply that any one challenge was overwhelming or
significantly hindered their ability to teach or facilitate the activities, they did note a number of areas that
posed challenges. These included the amount of time needed for activities, the behavior of particular
students, and activity tasks that resembled formal school work.
The amount of time for activities was a challenge mentioned by some teachers, with the most experienced
teachers generally perceiving the activities as taking students more time than allocated, while the least
experienced teachers reported that activities did not take students long enough. For example, one first-year teacher said, "when it says you have two hours to do something it's way too much time. We have three hours in the morning. With these kids you just gotta keep going." According to another teacher:
… as teachers we’re like ‘this isn’t gonna take two hours,’ so how can I supplement all this extra
time? What are they going to do because we can’t talk at them. They’re not gonna take notes. So
what are they gonna do? That’s why the hands-on stuff—sometimes NASA may put in one and a half
hours but it may only take fifteen minutes. There are a lot of good resources out there but we have to
take the time to find them.
More experienced teachers suggested that the activities tended to take longer than expected; when time did remain, they mentioned filling it with videos or alternative activities, or having students spend more time presenting their products. One teacher advised, "you don't want to start an activity that they aren't
going to be able to finish. [NASA] gave us a list of how much time the activity is going to take, but the
kids take longer than that. So after lunch I don’t want to give them an activity that is going to take them
more than an hour. I want them to finish that.” She went on to note that if she had been able to practice
more activities during her training, she would have had a “better idea of which ones I could do after
lunch.”
A few teachers noted that it was also challenging to keep students inside and focused during summer
hours. Some teachers took their activities outside in order to allow the students to “breathe fresh air.”
One teacher who did not take her students outside expressed more frustration:
… it is summer time and I think kids need a break. I know they are older, but, they still need time to
go out and play at the playground even at this age. That wasn’t able to be fit in for us to get our hours
in. They still need that break time, just like they would in the classroom, they still need the break time
to let out some energy.

Additionally, because teachers and students perceived SoI as a "summer camp" with informal learning, some teachers noted the challenge of managing work that resembled formal schoolwork. For example, teachers described some activities that asked for written observations or reflection pieces.
I think a lot of it feels like school. The hands-on [activities] are great but when you have to write and
you have to write what you saw and they sit there for so long. One kid said yesterday ‘I like it, but
sometimes it feels like school. I just got out of school.’
Others mentioned that a few activities came with packets for planning, stating that the reading and
volume of the packets discouraged some students. The writing portions of activities were also cited as a task students disliked; the students "wanted absolutely nothing to do with the writing."
Similarly, some teachers described the ways in which the camp schedule created an atmosphere similar to
school. Specifically, many discussed the need for more breaks and time for physical activities. Camp
schedules included breaks for meals and restroom breaks, but not breaks for play or structured physical
activities. Teachers recognized the need to fit in the required number of SoI content hours and suggested that camp leadership struggled to do so, limiting the time available for breaks
in the schedule. One teacher commented, “some of these kids just need to break out and run!” Another
teacher suggested that students could benefit from creating NASA themed games, such as tag or a space
version of basketball.
Finally, teachers also described the students and communities they served as both a success and a challenge. Some noted the success of SoI in providing STEM programming in these communities: "I
know there are financial issues. It would be a shame if the whole thing were dropped. To not fund
something like this especially in our area here. A lot of these kids do not have the opportunity for
something like this.”
However, other teachers noted that particular students have learning challenges that were more difficult to
overcome when trying to manage large numbers of students and student groups as they engaged in an
activity. They gave examples of some students with special needs and some who required additional
attention. For example, one teacher stated:
I have a wanderer, I have one that doesn’t like to be touched or anything like that, doesn’t like
interacting with other students when it gets too loud. So I’ve got a couple of students that I’m finding
I have to modify, I have to work with a little differently… the size of the classroom doesn’t help them
when you’ve students who are sensitive to loudness.
In addition, teachers from one camp noted that their students' families could not provide transportation and students had to take a bus to the camp. As a result, many students arrived late, missing some of the morning activities. Another camp reported that the same students often participated in the camp from year to year. While it was positive that students enjoyed the camp, it was challenging to keep the material interesting and new.
I started to see kids like a lot, repeat faces, and I’m like, you know we’re doing this again, and then I
had to find different lessons and activities for those kids who did sign up again. That would be torture
to sit through this thing all over again. But that’s how the program is set up. I try not to do the same
things, knowing it may be another time I see this student.

4.5.2 Planning and Preparation

Teachers’ involvement in program planning

Teachers discussed planning for the program as a whole, as well as planning for specific activities. In
some cases, teachers relied on NASA and their camp leaders for planning the content of their camp and
activities. Many stated that they had not been involved in planning the activities, or that the activities
came from NASA. Generally they described receiving materials and activities to include in the camp
experience, and they described occasionally supplementing activities with additional materials or
resources they found on their own. Before camps began, some camp directors and staff took care of all the
planning—with input from teachers—and all of the supply management. Dealing with materials and
supplies was a significant component of planning. Camps had to ensure that all the needed supplies were
identified and purchased, as well as make sure that the needed supplies were available to teachers each
day. Camps took care of this in different ways. Some teachers were responsible for gathering their own
materials for each day’s activities, and in other cases, the camp director or other non-teaching staff had
that responsibility.
In other camps, the teachers planned all of the activities themselves, either as a component of their PD in
advance of camp, or before the camp began. Teachers who had taught the SoI camp in prior years were
called upon to do much of this work, which they described as a more efficient and effective process.
Teachers noted that without that prior experience, planning the week’s activities was difficult. One
teacher described the process at her camp:
What we normally have is that they assign teachers to each of the content and grade areas during
PD. From that pool of teachers we decide as a team what material we are going to teach what
materials we will use. In terms of the activities, some we take from the PD and some we’d integrate
and modify, it depends on the activity. So if the teacher has some better activities than those on the
NASA website they might use those. We kind of tweak the lessons a little bit.
According to another teacher:
Because I was familiar from last year. I learned from the previous year. I think you have to chunk
these activities together. If someone could actually just chunk them all together. And say, the first day
you should work on these activities because they all go to the same concept… If you haven’t gone
through the experiments, it’s hard for you to do it on your own.
Despite the variation in their level of involvement in planning the activities, teachers all clearly appreciated the amount of planning and preparation that was done for them to organize the activities and assemble the needed materials. Many noted that SoI provided more materials than they would typically
have in their classrooms. They also noted that they were given smaller class sizes in SoI than they would
typically have during the school year, and this was also seen as a real plus. According to one teacher,
“Everything is pretty much here for you – it’s nice to know you don’t need to bring anything. Even if you
get your curriculum really late, at least you don’t have to collect anything and I think that’s good.”
According to another teacher, “so much organization went into this and it was just so nice. If we did this
in our classroom we wouldn’t have this.”
Teachers also noted that, with time and practice over the years of SoI camps, the camp leadership and
staff were getting better at planning the whole camp experience, including estimating the amount of time
that activities would take, and as a consequence, better at planning each day so that it had a rhythm that
worked well. One teacher stated, "First year was two hours longer every day. Too long for the kids, no prep time and a working lunch. They wanted us to get through so much stuff. We're slowly getting there now to where it should be."
Challenges

Teachers also described some of the challenges related to program planning, most of which involved the
timing of planning and camp activities. For example, teachers noted their desire to have more time to plan
and prepare for the camp’s activities. According to one teacher, “[We need more] time together. When we
came in I kept hearing I wish we would’ve had that day before, or time before the kids came to be able to
get all of our stuff together.” According to another teacher, “Part of [the challenge] is a time constraint.
This year the grant wasn’t released until late which left our people holding the bag. [We] did not have a
lot of lead time. If you don’t give enough time to plan, you expect to fail. Time constraints can really hurt
sometimes if you’re not given enough time to plan.”
Teachers also described different approaches that the camps took for gathering, organizing, and
distributing materials to teachers and the systems they had developed for ensuring that the program ran
smoothly and the teachers had the materials they needed. In some cases the camp leadership took on the
planning and organizing of materials, while at other camps teachers took a more active role. For example,
one teacher commented, “You have to do your shopping and have your classroom set up for the following
day. I usually stay and set up materials that they need for the following day.” Another camp, in response
to challenges in previous years, had one teacher responsible for picking up the materials for the grade
team.
4.5.3 SoI Resources

Successes

Teachers described camps with slightly different structures and approaches, including differences in the availability of materials, time, and space. The materials students worked with were perhaps the most critical resource, and, as described above, teachers perceived them to be very engaging
for students. One teacher stated: “They grabbed the materials and the Play-Doh, and stuff they really
enjoyed. It was simple stuff they could mess with in their hands.” Others described students’ interest in
the website. Several teachers mentioned students who copied down the website addresses to visit them again at home. The live feed of astronauts was particularly popular among students. One teacher explained that it
was interesting for students to see the work of scientists and relate it to SoI. “So they would see what [the
astronauts] are doing, and they would say, ‘that kind of looks like what we are doing over here.’ They
enjoyed it more. They do a little bit of research... it was a neat experience.”
Some teachers also described how the amount of available materials was an improvement over their
school year classrooms. For example, a teacher from one camp said, “[The camp leadership] have
provided everything for us. And they have already separated it. By the time we get there, they have
already separated everything for us. So when we go we just take our box and fill it all up, and we’re ready
to go. That’s been great.” Another teacher from the same camp added:
And I didn’t have to worry about copies. I didn’t have to go around looking for cardstock. She gave
me the 35 pop rockets in a purple folder, and I knew in my mind, when I saw that, I said, I need to make
sure I have that purple folder with me. I need to have it with me on Wednesday. She made the copies
for us in different folders, and all I had to do, as I was leaving, I had to pick up that folder and make
sure I took it with me. So that has helped me out a lot this year.
In addition to discussing the material resources, teachers discussed the amount of time they had available,
and described some of the successes they had in working with colleagues and dividing up staff among teaching and administrative roles. Across all sites, teachers reported enjoying working with their
colleagues and discussed ways in which they supported each other, including dividing up roles in order to
minimize prep time for any one particular teacher and allowing them to take on other tasks as needed.
One teacher described how this strategy worked in their camp: “I think the fact that we all chose we have
an activity to be the lead on is much easier on the individual because then you can plan better for it. And
teaming is a lot more productive not only for you but for the kids.”
Challenges

Although the materials were engaging and interesting for students, and some teachers stated that they had
more materials than expected, gathering and ensuring the availability of enough materials to implement
the curriculum as planned still posed one of the greatest challenges for many teachers. Several teachers
stated that they had “just enough” materials for the number of students. For example, one teacher noted,
"I shouldn't have to substitute half the supplies," and according to another, "They need to know that it doesn't work if you don't have the proper materials." A third added, "Trying to find a way to make copies, or trying to find an extra of something NASA got us is difficult." Some teachers noted the challenge of cost, and
found their camp was unable to gather all of the necessary resources due to budget constraints. “We
definitely need more money for materials for supplies. This year we didn’t have as many as last year.”
Teachers also described how their methods for adjusting to limited supplies, such as having students work in groups, occasionally created new challenges, as each student would want to bring
their product home: “To engage the kids it almost has to be one on one. Even now, it’s like take three to
four kids and build this… but at the end of the day all those kids are invested and want to take it home.”
Teachers also commented that it would be useful to have extra materials so that teachers could practice the activities in advance. Other teachers noted the ways in which limited supplies influenced their ability to prepare for implementation. One teacher noted:
I try to read over the materials, how to build it. I try to make my own. And that’s another thing. The
materials we have I’m noticing that there is just enough for the 25, for the number of students that we
have, so there isn’t enough for us to make a sample. So, I’m figuring out and I’m not going to use
that, because I need that for the kids. I like to make my own sample of some of the things to know how
it works. So I can explain it.
In addition to describing the quantity of materials for specific activities, many teachers also mentioned that they lacked basic supplies necessary to run the camp, such as paper, scissors, and folders, which limited their ability to prepare for their lessons. As explained by one teacher, "Sometimes it's just
enough, or that certain type of paper… we’re at other schools and can’t make copies.” And according to
another, “I don’t have access to the copier. I don’t even know the code. And it takes an act of Congress
for them to let me go into the office and use their copier.” In addition to physical supplies, teachers
pointed out that access to technology can be a challenge, particularly during the summer. A few said that they had no access, or limited access, to technology, including a printer or the internet. One teacher
thought this was the greatest challenge they faced: “Our biggest thing is that we didn’t have the
technology to incorporate the videos and things and that really would have been mind-blowing for the
kids to see it. The building facilities we use is packed up for the summer so we don’t have access to the
technology.”
In addition, despite the collaborative teaming approach described above, many teachers perceived the
number of teaching and administrative staff to be low across their camp, resulting in many staff members taking on administrative roles to ensure that the camps ran smoothly. A teacher noted:

That's the one thing that gets overlooked… these are camps in schools that are shut down. We have a
kid that’s sick. We don’t have a nurse. If a kid gets in a fight, we don’t have a principal. We’re here
on our own. Maybe there’s a janitor, but… you’re by yourself. We try to rotate and keep rooms small
but you’re really in a building by yourself.
Teachers described the challenge that low staff numbers created for the teacher-to-student ratio. Class size ranged from 25 to 30 across most camps, although at some smaller camps teachers reported class sizes as low as 15. While teachers agreed that classes were smaller than during the regular school year, those with class sizes toward the upper end of this range noted that it was not ideal for instruction in a summer program, and they attributed the number of available staff to the number of students enrolled and
and available funding. “They want us to have 25 to 30, and that is a lot of kids. Especially if you are
having activities that are cooperative activities, because you are always going to have kids that don’t work
well with other kids. So, when you put 30 kids to 1 teacher, that’s a problem. But 20 kids to 1 is a really,
really good number.”
Aside from materials and time, teachers also described the space where the programs were run. Although the space available varied by location (for example, school buildings or university facilities), many teachers perceived the camp locations to be too hot, loud, or cramped. One teacher described a unique challenge in her classroom that affected the implementation of the lessons: "I
have a room without tables. So we are moving desks. It’s fine, but it’s a huge issue, we’re spending our
time moving desks into groups.”
4.5.4 Professional Development and Support

Components of the professional development

Teachers identified a variety of sources of support, including the professional development (PD) and the support received from their colleagues. The PD was the primary source of growth and learning for the teachers. Although the amount and duration of PD varied from camp to camp, and from one year to the
next, the components of the PD consisted primarily of:
•	Having teachers/counselors experience a selection of the activities just as their students would during camp (in rare cases the training covered all of the activities);
•	Providing a demo of all or a selection of the activities;
•	Providing instruction on the science content that would be explored through the activities;
•	Exposing teachers/counselors to NASA's missions, culture, and/or approach to teaching science;
•	Reviewing videos, PowerPoint presentations, and websites that would support the activities; and
•	Reviewing the materials and supplies that would be used in the activities.

Most teachers reported that the PD lasted one day, with some reporting PD that lasted for up to two or
three full days. Teachers also noted that the training had evolved from previous years. In at least one camp, more extensive training was provided to first-year teachers, while returning teachers received a shorter, "refresher" version. Typically, training had lasted longer in prior years and covered more topics in greater depth. Some teachers noted this as a challenge, stating that they would have preferred spending
more time on specific activities: “That first year that’s where I picked up a lot. Since then it’s quicker.
This year it was even faster. If they would just do that, spend a little more time, it would be [much]
better.” However, some teachers noted that the training evolved in a more strategic manner, attempting to
build each year's PD on what was offered in the past. "I know it has to change every year because we don't want to be trained on the same things so what we did this year was more for 8th grade and last year
was more specifically geared towards 7th grade.”
Successes

Teachers highly valued the ways in which the PD sessions enabled them to experience the activities,
providing them with an understanding of the logistics and content of the activities so that they felt
prepared to address any potential difficulties and answer students’ questions. According to one teacher,
“It was good to do the activities beforehand so we could have a handle on what they had to do.” Another
stated, “If we are able to do them first then we can figure out some of the stumbling blocks that they may
come to.”
Other teachers commented on the content knowledge that they developed through the experience of trying
out the activities with a trainer. They appreciated the connections made between the content and the
activities, and indicated that their deepened content knowledge further prepared them for working with
the students. According to one, “I think they tried to teach you content ... I really liked how they set it up
this year... it was like an hour of content and then follow up with hands-on activity.”
Teachers’ experiences with and opinions about the trainers and their skills varied. Some attributed the
success of the PD to the trainers’ knowledge and positive energy, while others indicated that their trainers
were less skilled at explaining or presenting activities. One teacher stated positively, “They were
enthusiastic and they seemed to like what they were doing. That was good. Like we said, some of the
activities were good, some were ok, but I think the way they were presented was really nice. I liked that.”
And according to another teacher:
He was very knowledgeable, so if we had any problems if we had questions, the way he explained
it, it was like, ‘Wow! Wow! Oh, that makes perfect sense.’ He [related it to] the background
knowledge, and used vocabulary, to where he explained it and we understood. I thought that was
really, really great.
Finally, in most cases, teachers also noted that the collaboration between themselves and their colleagues
contributed to their own learning and teaching. According to one teacher, “Even as a first year you get a
really good support system. The two other teachers I’m working with… they are wonderful and they help
me if I have a question.” Other teachers, such as those who divided up roles among themselves, attributed
the success of their camp to this collaborative effort.
Challenges

Although some teachers spoke positively of the trainers, others conveyed less favorable impressions.
According to one teacher, “There were directions but she wouldn’t help us with any of this
stuff or answer questions on it. That was frustrating.” Another teacher explained that the trainers did not
have the prior experience to help them consider the implications for camp or classroom implementation.
[The trainer] was full of knowledge, there is no doubt she knows her stuff. But to translate that
into my classroom and the reality of how I can combine this activity and this one from day one, in
my classroom... I don’t think they have knowledge of how we implement this in the classroom. I
think that gets in the way of it.
Although teachers perceived the trainings to be helpful in preparing them for teaching, some suggested
that additional training time would be even more helpful in aiding their implementation of the activities.
Many of those who suggested the need for more training were either new to teaching or not regular
science classroom teachers. One teacher described his need for additional preparation: “I would happily
give up three days to go and feel completely confident going into it, versus feeling like I was taught to
swim in a bathtub. And that’s kinda the way I felt, especially not being a science content teacher.”

4.6 PI and Lead Staff Interviews

Interviews with Awardee PIs and Center SoI Leads (collectively referred to as PIs) provided further
insight into the preparation and recruitment activities as well as the implementation of SoI activities.
4.6.1 Recruitment

Students

Efforts to recruit students ranged widely. Some camps relied on distributing flyers and brochures, creating
or placing information on websites, placing phone calls, and word of mouth; across the camps, flyers and
brochures were distributed to thousands of students. In others, information about the SoI camp and how to
enroll was placed on Facebook, Twitter, and school and partner websites. For some camps, recruitment
efforts were fairly comprehensive. One principal investigator explained:
We’ve got our camp brochure … that went out to a distribution list of about 5,000. And then
we’ve got our Facebook and Twitter sites. We’ve got our program landings page, and we’re
always doing email blasts. And then in the community, different organizations taught information
sessions in the neighborhoods, in the school districts. The school districts got our information as
well, the charter schools, and then our partners – the [public library], they sent out information
to their branch locations and then just the recreation centers around the city.
While PIs did not indicate a general preference among recruitment methods, word of mouth was viewed
as an effective tool. One PI noted, “parents recommend the program to other parents, I feel like that’s our
largest recruitment effort: word of mouth.” PIs with past SoI experience indicated that word of mouth
was easier since SoI had become “institutionalized,” in that students who had experienced SoI were likely
to tell their peers about the program.
Despite the high levels of interest in SoI among partners and within communities, PIs noted a few
challenges; low enrollment numbers and students who had enrolled but did not attend camp were the
most common. Low registration numbers, though not widespread, were attributed to competing summer
activities. PIs also reported that in some cases, camps had high registration numbers but then students did
not show up at camp or to SoI “kick off” events. At some camps when teachers realized numbers were
low on the first day, efforts were made to follow up with families. One PI explained, “dealing with this
population, the numbers were not a valid number, we left messages to no return call, some said ‘I forgot.’
We had a few kids show up after follow up call. And being free, people just think that it’s ok, they don’t
pay for it, so they don’t have to come.” One of the consequences of students forgetting or choosing not to
attend is that other students miss the opportunity to participate—students who otherwise may have
enrolled. One PI warned:
We turn kids away saying camps are full. And then [students] come to the camp and it’s not full.
So we’ve got to find a better way for those kids that are signing up and the camps are full to say
‘show up on this day and we’ll put you in the camp,’ so when there is a camp that isn’t full we
have room for these extra kids.
Future funding for recruitment efforts was another concern, although it was mentioned less frequently.
One site relied heavily on teachers to recruit students, and there was concern that less funding would
mean fewer teachers and, in turn, fewer students.
That’s going to be our biggest challenge especially because when we first started we told them
recruit 20 kids. You have to have 20 kids. And now we’re having them recruit 30 kids so
that we can ensure that we have the 20-1 ratio. You know next year we won’t be able to pay this
many teachers and yet we still have to come up with the same numbers so I think that’s going to
be our biggest challenge for next year.
PIs also discussed some of the challenges related to recruiting students in the upper middle school grades.
Some recruitment strategies targeted older students, especially those in eighth and ninth grade, because
the younger grades were easier to recruit. One PI said, “we had to turn some fourth and fifth
graders away.” Another PI had the same experience, advising that students in the older grades have more
activities from which to choose.
When you’re doing programming for kids, you’re going to recruit kids who are in grades K-4
before you are going to recruit kids who are in 5-12, meaning like a concentrated number. And
really, you probably will top out somewhere around 5th grade. You’ll top out there, meaning
you’ll get your enrollment number there but 6th grade you’ll start getting a decline in numbers
because you’ve got so many competing activities.
One PI noted the low numbers of ninth graders: “ninth grade this year was really low, compared to past
years. I don’t know if kids are getting to a point where they hit a certain age and they don’t really want to
do a summer camp anymore. Because a lot of my collaborators told me they struggled with that age, and
in the past it wasn’t a problem.”
PIs noted the diversity of students they were able to recruit—there was variation in grade level, socioeconomic
background, and gender, as well as in the number of students new to SoI. One PI summarized the students in their SoI camps, “the
first camp was 70 percent free and reduced, second camp was about the same, maybe 65 percent, and On
the Moon about 75 to 80 percent, so we did reach those individuals that were [eligible for] free and
reduced [lunch]. We probably had about 50 percent were female.” Another PI noted, “…we reached well
over our target number this year. And again all of the students are underrepresented and underserved.”
PIs acknowledged their success in retaining students from previous years and in recruiting new students.
“We usually try and do a 50/50 split of returning students versus new students, and I think our summer
numbers, we gained more new students than we did returning, and of course that doesn’t have to be a
negative thing. It just means that you’ve reached a few new groups, which is great.”
Teachers

Teacher recruitment strategies varied, and many camps primarily used returning teachers, or teachers who
were already on the payroll running programs during the school year, so they had less to do in the way of
recruitment. Others reached out to district math and science departments or to returning teachers for
recommendations. At one site, the districts or other collaborators were responsible for their own teacher
recruitment. Some of the PIs talked about their criteria for hiring teachers. At one site, the city set criteria
for who could be hired; at others, the PIs discussed the importance of hiring STEM professionals and
certified teachers. One noted:
We’ve got a few science chairs who are well versed in the areas at different grade levels so that’s
definitely a skill set that we want to have at the table…You really want people who you can
present information to and have them see the big picture of how it should play out in the
classroom and they are definitely good at that.

The PIs stated that recruitment of teachers generally went well. As one noted, “I was very confident in
who I hired. They were all very good counselors.” In general, the PIs also felt that teachers seemed
enthusiastic about returning each year. The lack of teacher turnover was seen as a great success in the
majority of camps. One PI noted the eagerness of one teacher in particular who “brought her manual from
the year before with all of her notes and highlighters and her sticky notes.” Another PI felt that the
success of recruitment at their camp was at least partially a result of the camp’s clear expectations and
quick payment. She noted, “I think that they know exactly what’s required of them and what has to be
turned in in order for them to get paid. And how quickly we pay them because I will pay them as quickly
as I check their paperwork.”
When asked about challenges in recruiting teachers, two PIs reported that they did not have a large pool
of teachers to choose from. One of their camps had to bring in teachers from other districts because they
did not have enough of their own. Another PI noted that, although half of their teachers returned from the
previous year, it was difficult to get many applications for the vacant positions. A third site mentioned the
delay in the availability of funds as another challenge for recruitment.
The majority of the SoI camp instructors were certified teachers, primarily science and math teachers,
with a mix of both middle and elementary teachers. One PI reported that their camp also had some
instructors who were science professionals with advanced degrees, while another camp used primarily
college students who were education majors and had worked with children before.
4.6.2 Activities

Student Activities

PIs’ descriptions of student activities were similar to those given by the SoI teachers who participated in
focus groups. The activities were described as hands-on, focused on NASA content and engineering
design, and generally aligned with national and state standards. Teachers arrived at camps in the
morning to prepare for the day, and after the students arrived, the teachers introduced a new STEM topic
and activities. One PI summarized the experience by saying:
You know the teachers then became real facilitators and that’s the real beauty of it. They weren’t
telling [the students], ‘No that’s not right,’ or ‘Well you should do this.’ The conversation was
always, ‘Well what worked and how could you improve? Which one of these designs worked best
for our objective?’ So it was always focused on thinking and problem solving, working in groups.
There was never a time when the students weren’t active; they were always engaged.
In general, the hands-on aspect of the activities was popular among the students and teachers. The lack of
formal teaching offered a different approach to learning, and was well received by the students. Rocket
activities, roller coasters, videos, presentations, creative scenarios and problem solving worked better than
“lecturing,” as one PI stated. “We found that the activities we used were more hands-on and were not a lot
of worksheets, anything the kids could do that they could see or physically create, those activities were
much more successful.” The activities were so engaging in some cases that teachers allowed students to
spend more than the scheduled amount of time: “It was amazing when we gave them a large piece of
paper and… had them show their creativity. What we thought would take 10 minutes, they spent 45
minutes on.”
Similar to the teachers’ descriptions, PIs noted that while the hands-on component consistently engaged
students, the activities with formal components did not. One PI stated that when activities involved too
much writing or reading, “the kids said it was a lot like school and they were just finishing school and
they really didn’t want to do a lot of things like they’re sitting and doing school stuff . . . We did a lot
better job with ‘On the Moon,’ we took away the worksheets and . . . it was more successful in my
opinion.”
The PIs noted that the goals and outcomes of the activities were for students to learn content, but also to
understand engineering design and the work of STEM professionals. “Most of the teachers use the format
of putting their students into research groups . . . they basically became engineers in every classroom, and
even one of the students said (I use that in my Summer of Innovation story book), ‘I understand what it is
to be an engineer now.’” In addition, one PI felt that the activities exposed
students to NASA scientists. “[The students] are having fun while they’re learning and they feel
connected to NASA in a way that they would never be able to by just reading about it or hearing it on the
news.”
Several PIs noted that the curriculum content used for their SoI camps was almost exclusively from
NASA and included topics such as engineering, Mars exploration, rocketry, light and optics, and the solar
system. Some PIs explained that the curriculum varied by grade, and that sometimes the teachers added
new activities to fill time, but across grade levels the majority of the content was from NASA. Two PIs
explained that the content of activities was aligned with abilities of each grade level, using, for example,
“intermediate to moderate to an advanced level” for fourth graders through sixth graders.
Non-NASA content and materials were sometimes used to enhance topics or activities. “We may have
went through the internet, like one of the activities they have to design a rollercoaster so the teachers, I
went into the room and the teacher had a bunch of different YouTube sites of different rollercoasters so
they can get ideas on how to design their rollercoasters.”
Despite the additional content found on the web or, as one PI described, “helpful content provided by a
partner,” the SoI camps relied on NASA content.
It's all NASA content. I mean obviously an activity doesn't always run the same in each camp so
we always gave camps extra activities just in case they have extra time . . . But it's an add-on and
we have a number of add-on things that we add for kids who are advanced beyond the activities
after they've finished the main activity. Robotics and rocket training and that kind of stuff, those
are extras that we brought in just to go above and beyond the NASA curriculum once we finish
the main curriculum.
Professional Development

Most camps held either one or two days of training for instructors that covered the NASA content and
lessons and incorporated hands-on activities. Some sites provided their own professional development,
others had NASA educators come and deliver the training, and a few attended training at NASA facilities.
One of the PIs who had NASA educators at their site described it: “NASA, they came in and hosted I
think eight hours of training using engineering design as the topic. And they just taught the teachers how
to deliver the content to the students like different ways to engage and different ways to break up the
group, different ways to modify the activity.”
PIs reported that teachers were interested and engaged in the training they received. One PI described a
teacher who was supposed to attend a half-day of training and was so excited and enthused that “he said
‘you know gosh I really love this. Can I come back tomorrow?’ So he did and he came back the next day
even though he wasn’t paid to do it.” Another PI commented that the NASA specialist who ran the
training was “wonderful, extremely intelligent... has a great passion for the math and sciences and the
teachers greatly enjoy him.”
One PI reported that the different modalities used during the PD training were also very constructive.
The fact that [the teachers] actually get to do [the activities], they understand how to teach it to
their kids instead of us just saying here’s the lesson read it and do it. I think the fact that we did
the videos also because I had a lot of teachers ask me you know give me the website again so I
can go look at the video and stuff; I think that helped a lot also.
And in addition to the positive experience of the PD itself, one PI noted an added benefit for the teachers.
“Teachers want the professional development to maintain their certification. They’re always looking for
those units. We are able to provide those hours, so they actively come to us and it’s an expectation that we
provide it to them – that’s a positive.”
Although PIs generally described the PD sessions as engaging opportunities for the teachers, they also described
many of the challenges they encountered. Many of the PIs remarked that the lack of funding caused them
to offer less training than in years past. One PI noted:
For the past two years we had open content that we picked, so we picked whatever we wanted
and it was different content from year one to year two based on the standards. But this year with
an additional drop in funds having professional development was going to be a problem for
us…So in order to make it as efficient and as effective as we could we took activities that we had
already done in year one and two and they became the core for year three.
Another PI commented on the perceived impact of limited funding for professional development.
[This year] we could not give as much background information. It was more doing the hands-on
activities and if you want the background information you’re going to have to look for it in the
curriculum guide or look at a video or something. So we didn’t show videos we didn’t show the
NASA videos, we gave them the links to the NASA videos. In the past, during the 3-day training
we would say so here’s the video that would be really engaging and you should show your
students and here is a NASA brain bite so you can use something.
A third PI commented on the balance they had to strike between running a certain number of camps and
providing sufficient PD.
I think if we had more money, we would do more professional development. The bottom line is we
didn't have any more money to do anymore. Something had to go, we either had less camps and
do more professional development and if we had chosen to do all new activities we would have
less camps and less professional development too. So we try to maximize the number of camps
with quality instruction with adequate professional development to help teachers.
PIs also suggested that funding created challenges for ensuring that all teachers, especially those who
were new to SoI, received adequate training. According to one PI:
One of the complaints from teachers is that they felt for new teachers especially they wanted more
training. I think probably because we had trained for three days in the past and they knew that so
they wanted that. So we’ve got to figure out a way where we can make them feel that they haven’t
been slighted and confident in the content.
Lack of time was another issue for the Centers and Awardees. They mentioned the short amount of time
between receiving their funding and having to get the camps up and running, which limited time for
planning, preparation, and collaboration, including the time available for providing PD to their
instructors. One PI said:
We offer all of our collaborators professional development, but we didn’t have enough time to
reach all of our collaborators. [One of the partners] wanted to do some new NASA activities, for
example, but we have to prep and they were winding down their school year. And we were trying
to make sure teachers have enough time with their professional development before the camp
began. Our first camp began June 3 and then the first in our area began June 10. When you put
all of this together, I would have liked to have had more one-on-one interaction, but the time
didn’t allow for that.
Although few PIs commented on the content of the support teachers received, some described the need
for additional support around working with special populations of students. For example, PIs described
teachers who needed ideas for modifying activities. One of the PIs, who mentioned that the majority of
the students in their camps were on IEPs or 504 plans, stated, “Professional development is always good,
but maybe some additional things where prepping…Maybe something with how to modify activities to
meet the needs of a special needs student.”
To ensure that teachers received the training they needed, the camps developed different strategies to
overcome the lack of funding and time. One site chose to provide five hours of PD to their returning
teachers, and eight hours of PD to their new teachers who needed more support and were not familiar
with the activities. However, that particular site also commented that their teachers were well prepared
due to an additional external funding source that required additional hours of PD that were aligned with
SoI.
Another PI described using their website to further support their teachers and make up for the lack of
training time.
So what we did was we put on our website videos and clips and things like that that would help
the teachers because we knew that it was going to be difficult to get the training done in one day
for them to be prepared to teach for the week. And so what we would do is the hands-on activities
that would be a little more difficult that we had to do additional training on, those were the ones
that we selected to teach and have them do hands-on during the one-day training. And then if
there was something that was a simple activity then we would just do a simple demonstration of
that activity and then have it online.
Parent Events

PIs from four camps mentioned some form of parental or family involvement during the camps. Most of
the camps hosted a family science night toward the end of the program. The structure of this event at
many camps included parents working through a camp activity with their child and a showcase of student
work from the week. These events took place during camp time, generally on the last day, or, as reported
by one camp, during the last hour of the camp. One camp reported having a separate parent event at the
end of summer, which included SoI as well as other programs. Although these parent events took place
during the work day, all PIs reported that they were well attended and successful.
According to one PI, “I definitely believe that Family Involvement Day was a success. We had all but two
or three parents show for each one of the camps. It was a good thing.”
Most camps reported that they invited family members other than parents to these events.

We did have some camps that unfortunately because [the family event] was done during the day,
parents were at work so we did allow the kids to bring grandparents or another relative if parents
were working. We did have many camps that didn’t have any parents but we did have some
camps where almost every single parent could come and the kids had a blast.
Also, camps reported that these events were great formats for talking to parents, soliciting feedback about
the program, and increasing their commitment to SoI programming.
Getting the parents involved has, I believe, opened up the parents’ eyes as to what happens at
camp. And the parents have been excited to see that the kids are excited to come to these camps.
All PIs reported plans to continue their parent events and saw them as important aspects of the camp.
Some described plans to expand their parent events in the future.
One PI reported holding a parent night prior to the summer to recruit students. This event was supported
with external funding and took place in the spring. All eligible students and parents in the school district
were invited to attend.
4.6.3 Partnerships

Camp PIs tended to use the terms “partners” and “collaborators” interchangeably for any organization that
provided support to the SoI camps. All PIs reported having a partner, though the type of support provided
and the nature of the relationship between SoI and the partner varied greatly across sites. In some cases,
the partners were engaged at the Awardee level and in others at the camp level. Some PIs reported having
school districts or individual schools as partners or collaborators. Of these sites, the school districts were
primarily responsible for providing the space for the camps and for student recruitment. In
some cases, the school districts were responsible for recruiting teachers, and, less frequently, student
transportation. One PI also reported that the school district was able to provide breakfast and lunch to
students. However, several camps noted that the school partners provided space, but all other costs
associated with running the camp were covered by SoI. One PI described their relationship with the
school district: “They provide a space, the teachers come from their district unless we have to move some
extra teachers in, but all the costs associated with the grant other than the space are all of the Summer of
Innovation costs. They cover no other costs.”
Other PIs described partners that included the Boys and Girls Club, local colleges or universities, or local
companies and non-profit organizations. Support provided by these organizations was wide-ranging, but
included managing camps, offering facilities, assisting with student recruitment, providing STEM
mentors for students, or even leading activities or demonstrations during the camp. Many of these
partnerships were not specific to the SoI project; in fact, these partnerships were longstanding and cut
across many other types of summer programming offered to students. As explained by one PI, “They are
partners in our camp but they were not partners with the NASA camps. They’re part of our overall
partnership for our large offerings.”
Several PIs reported receiving outside funding or grants from additional public or private sources. These
additional funds supported specific aspects of the SoI project. For example, one PI reported that a grant
supported camp family events, while another reported that grant funds were used to support teacher
training and materials. One PI described their competitive process for soliciting and choosing partners
that allowed them to ensure that the partner would be able to provide the necessary financial support as
well as a commitment to the mission of the camp.

Overall, the PIs reported many benefits to working with partners, notably ensuring that the camps had
needed space, materials and supplies. Although many partners did not provide financial support to the SoI
project, they were instrumental in recruiting students. One camp reported that their relationship with the
Boys and Girls Club would be instrumental in expanding the number of students reached by the program in
the future. Their success in working with partners was attributed to ensuring a shared vision and goals.
One PI summarized this well, “Well I think all of our partners have a strong connection and want to have
a part in getting kids interested in STEM careers.”
Many of the challenges that PIs described in working with partners were related to the timeline for
funding and communication. A number of PIs reported that because they did not receive notification of
funding for the SoI camps until May, they did not have sufficient time to work with school districts to
plan for the camps, which limited their time available for student recruitment.
We weren’t able to give our collaborators enough time. We didn’t receive notification of funding
until early May and schools let out at the end of May. We’re not allowing … our collaborators to
have enough time to market and reach a large number of students.
Finally, some PIs described their partners, particularly school districts, as being spread out across a state,
at times far from each other. This made communication and scheduling a challenge. One PI described
being “a good 30-40 minutes away” from most partners and schools, creating some logistical challenges
for working with them.

5. Discussion
In the section below, we summarize the key findings and discuss how they relate to each of the five
implementation research questions.
5.1.1 What are the characteristics of SoI camps and their participants?

FY2013 SoI stand-alone camps consisted of sets of classes of NASA-themed activities that engaged
students in hands-on activities related to STEM content and the engineering design process. Most camps
divided students into grade-level classes, where students worked in small groups to complete design
challenges or other STEM activities. Teachers and PIs reported that the hands-on nature of the activities
was key to differentiating SoI activities from formal school work. Camps differed in the scientific content
addressed and the overall duration of the program for students.
About one-third of the students at study camps reported participating in SoI during previous summers and
camp leadership described word of mouth as their preferred method of recruitment among students and
families from prior summers. While many of the students at SoI study camps were from groups
traditionally underrepresented in STEM, the students were highly motivated to learn about STEM topics
and were, on average, slightly more interested in science than their peers. Not only were the students
motivated to learn about STEM, but they were also motivated to learn in general; over half expected to
complete a graduate degree. The students appeared to be engaged in a multitude of organized and
informal leisure activities, reporting participation in a variety of science-related activities both in school
and outside of school.
The documentation at baseline of high aspirations, interest in science, and engagement with science
leisure activities suggests that there may be limited room for SoI students to increase their interest in
measurable ways, given the ceilings these question scales impose on student responses.
One reason that these students are highly engaged in STEM topics may be that more than one third of
their parents either work in a STEM occupation or have an educational degree in a STEM field. The
parent reports regarding the reasons why their children attended SoI align with the students’ reports of
wanting to learn more about NASA, space and science. Perhaps parents’ awareness of their children’s
interest led them to seek information from friends and their community about summer STEM
opportunities as many reported hearing about SoI from personal communications.
However, data also indicate that SoI camps are reaching the intended audience. Approximately 70% of
students who participated in the evaluation were classified as under-represented in STEM. In addition,
teachers described the gender, socio-economic, and grade level diversity among their students. They
noted the need for the SoI camps among the populations they served; without these camps students would
have had little, if any, access to summer STEM programming. Teachers also described small numbers of
students in their classrooms with transportation challenges, and some who required extra learning support
within their classes.
5.1.2 To what extent do SoI camps meet program quality expectations as defined by the PEAR Dimensions of Success (DoS) rubrics?

About one-third of SoI participants in 2013 were previous campers, suggesting that many students enjoy
their SoI experiences so much that they return in subsequent summers. While student enjoyment is not
necessarily related to program quality, the site visit observations confirmed that students are receiving a
generally high quality camp experience.
Abt Associates Inc.

▌pg. 45

Overall, the observed classrooms rated highly on nine of the 12 dimensions, following the guideline that
ratings of “3” or “4” represent quality on a given dimension. A majority of the classrooms rated highly
(“3” or “4”) on:
• Relationships (94 percent)—the nature of the relationship between the facilitator and the students and
the students with their peers, gauging from conversations and actions whether or not the interactions
suggest warm, positive relationships. Based on observation and focus group data, it was clear that
relationships between students and between students and teachers were strong. Several teachers noted
that students enjoyed working together and that team building and collaboration were unintended
outcomes of attending SoI camps.

• Materials (84 percent)—the appropriateness and appeal of the materials used in the STEM learning
activity. While the majority of camps observed scored on the higher end of the DoS scale, some
teachers noted insufficient supplies due to budget constraints. Other teachers mentioned that having
organized activity boxes ready for use with their students was very helpful and reduced their planning
time.

• Organization (78 percent)—the availability of materials used in the activity, the facilitator’s ability to
adapt to changing situations, and the fluidity of transitions during the session. Qualitative data
revealed that teachers and camp leaders perceived the availability of material and time resources to be
among their greatest challenges. Some described their efforts to use both material and time resources
efficiently, including dividing up roles among a team of teachers, and reusing supplies when possible.
The high DoS ratings also indicate that teachers and camp leaders demonstrated flexibility and
creativity necessary to adapt to their perceived resource challenges.

• Engagement with STEM (78 percent)—the extent to which students are working in a way that is both
“hands-on” and “minds-on”, rating both the type of activities as well as the type of learning
experience. The qualitative data indicated that the hands-on nature of the activities was central to the
success of the camps. Hands-on activity was observed almost universally, although minds-on
engagement was less consistent.

• Inquiry (78 percent)—the use of activities that support STEM practices, such as making observations,
asking questions, developing and using models, planning and carrying out investigations, analyzing
and interpreting data, engaging in argument from evidence, and sharing findings with peers.
Engineering practices, which included gathering and interpreting data in order to meet a design
challenge, were commonly observed activities. Data gathering and recording were other STEM
practices that were often practiced as part of the activities.

• Purposeful activities (75 percent)—the structure of the learning activities, measuring the extent to
which the students understand the goals of activities and the connections between them, as well as the
amount of time spent on activities that relate to the STEM learning goals. In general, teachers felt
prepared and comfortable providing structure for the activities, and this was reflected in the
observations. Some teachers, however, felt less prepared; activities that had been covered in PD were
easier to structure.

• Participation (72 percent)—the extent to which students are visibly and audibly participating in the
activities. Students were typically engaged in the SoI classrooms. As documented in the observations
and in teachers’ comments, however, there were instances in which students were not engaged, and
some students who required more attention to keep them engaged. Although SoI specifies a student to
teacher ratio of 20 to 1, some teachers reported larger groups as well as challenges with implementing
hands-on activities with large groups.


• Space utilization (69 percent)—the extent to which the physical space in which the STEM activity is
held is conducive to out-of-school-time STEM learning. Many of the classrooms or spaces that were
used for SoI allowed students to engage in STEM in formats that do not mirror rigid classrooms.
Some of the spaces observed and described by teachers, however, were cramped or structured in a
way that did not readily support more informal STEM activities and learning.

• Youth voice (56 percent)—the ways in which the STEM activity allows students to use their voices
and fulfill roles that offer them personal responsibility. While teachers talked about the engagement
of students in hands-on activities, they less frequently expressed the importance of student choice and
voice in these activities. The active role of students in assuming responsibility for activities was most
evident in engineering design activities, where students made decisions about design, testing, and
redesign.

However, three dimensions received low ratings across the majority of sessions observed;
more than half of the sessions observed scored a “1” or “2” on these dimensions.
• STEM content learning (44 percent)—the support students receive to build their understanding of
STEM concepts through the activities implemented. Many teachers expressed comfort with the
STEM content of the lessons. Teachers noted that the PD provided them with additional content
knowledge, although some felt somewhat unprepared for activities that were not covered in the PD.
This variation was evident in the observations: STEM content learning was strong in some classrooms
and weak in others.

• Reflection (31 percent)—the amount of explicit reflection on STEM content during the activity, and
the degree to which student reflections are deep and meaningful, supporting connection-building
between concepts. Instances of explicit reflection that supported connections between concepts were
not commonly observed. Where observed, teachers used probes to get students to reflect more deeply
about their learning.

• Relevance (22 percent)—the extent to which the facilitator guides students in making connections
between the STEM activities and their own lives or experiences, situating their activity in a broader
context. Observers noted only a few instances where direct connections were made to the real-life
significance or related experiences of the activities. Despite the low DoS ratings, teachers in the focus
groups perceived students to have made connections between the activities and the work of scientists
and engineers. They described creating roles among groups of students that reflected real-life teams
and cited examples of students who, after participating in the design activities, expressed interest in
related careers.

Observations documented high ratings on most dimensions of the DoS tool, suggesting that most students
enjoy a quality learning experience across most dimensions. The dimensions that generally received lower
ratings suggest that more attention may be needed to extending concepts to students’ lives and to
intentionally engaging students in reflection.
5.1.3 What supports and challenges do Awardees/Centers face in implementing SoI curricula? How do they handle these challenges?

SoI teachers and PIs reported successful implementation of the student activities and credited the success
in large part to the hands-on nature of the activities, which departed from the format that students often
encountered in their school science classes. Teachers described the enthusiasm that students exhibited
while engaged in the activities, the collaboration and teamwork that students exhibited, and pointed to
some instances where students expressed interest in science-related careers.


However, despite these successes, the data also provided insight into ongoing challenges that
teachers and camp leaders faced. First, funding levels and the timing of funding distribution posed
challenges for some. In some cases, teachers tied limited funding to a lack of material resources and staff
time. The observation and qualitative data indicate that while the majority of lessons included appropriate
materials that were organized and ready for use, teacher planning time was limited, and during camps
some teachers were called upon to create last-minute backup plans in response to limited resources.
Further, although the 20 to 1 student to teacher ratio is an explicit requirement of the program, and on
paper camps achieve this ratio, teachers described situations in which their classrooms contained more
than 20 students, creating challenges for the hands-on activities.
In addition, while the PD was generally viewed as a valuable resource to assist teachers in preparing for
the implementation of student activities, teachers and PIs noted that the hours of PD this year were
reduced from previous years and camps prioritized PD for the least experienced teachers. As a result,
some teachers were less prepared for the full range of activities that they used with students. Teachers
acknowledged that available funds probably limited the PD offered, but most would have preferred to
have more time in PD working with the activities. In addition, the data suggested that even with PD
prioritized for less experienced teachers, successful implementation of camp programs may have been
more common among more experienced teachers.
5.1.4 What staff, materials, and NASA resources are necessary for successful SoI activities?

The data revealed that experienced and often certified teachers may be especially important for the
successful implementation of SoI camps. Across most camps, most SoI teachers were certified teachers,
although informal and other non-certified educators were also present as instructors in some camps.
Teacher and PI respondents described enthusiastic teachers who were generally capable of implementing
the SoI activities with the supports that were available. Where less experienced teachers were present, the
PD was especially important for ensuring that these teachers had adequate familiarity with the SoI
activities. However, some of these teachers described activities that were too short for the time allotted,
indicating the possible need for additional support for extending students’ thinking. Some camps reported
having only a small pool of teachers from which to draw, while others described the need for recruiting
additional teachers just prior to the start of camps. In some cases, low numbers of teachers resulted in
high student to teacher ratios. However, many teachers were returning teachers, which lessened the extent
to which camps had to recruit new teachers.
In most classrooms, the SoI materials served as the only source for the full set of classroom activities,
while in other classrooms, the SoI materials were supplemented by other materials. Materials for the SoI
camps were typically funded and provided through different mechanisms (at the time of PD, through
supply room requests, and through teacher purchases); however, there were some instances where key
tools for an activity were in limited supply. In addition, partners provided necessary resources for SoI ranging from
support in recruitment efforts to classroom space to donated materials. However, the qualitative data
provided examples of the ways in which teachers and camp leaders distributed supplies and staff roles,
indicating the need for creativity and flexibility in the distribution of resources.
While many activities observed were implemented as intended, some activities were modified prior to use
in the classroom. The degree to which modifications to the SoI materials enhanced the learning
experience varied. In some cases, modifications to the SoI activities undermined the intended inquiry
component of the hands-on activity. In other cases, teachers cited the need for modifications to the
materials either for planning purposes or to further support the students’ learning. These included more
accurate indications of the time required to complete the activities, ideas for adjustments or modifications
that could be made in instances when materials were limited, and a change in the reading materials or
expectations for students to engage in reading and writing during a summer STEM program.
5.1.5 How early and to what extent must plans and preparation begin for successful project implementation?

SoI camps benefited from an extended period for preparation in FY2013 compared with previous years.
However, even though funds were in place earlier than in some previous years, some PIs reported that
funds were still received later than would have been ideal. Extensive recruitment efforts generated
interest in the program, but some camps still faced low enrollment and registered students who did not
show up for camp. The technical review forum (TRF) noted that full enrollment can often be a challenge
for free, short summer programs and suggested disbursing funds as early as possible, possibly as early as
January, in order to allow camps sufficient time for planning and recruitment.


6. Recommendations
This evaluation reported on how the SoI stand-alone model was implemented at 11 camps in FY2013.
Based on the findings of this study, the following recommendations are for consideration as the SoI
project continues to move forward.
• Continue to encourage active engagement of parents and students in the community through outreach
events as these events are a successful mode of recruitment.

• Recruitment efforts that highlight the STEM content covered in SoI camps could be successful in
generating high camp enrollments, as the opportunity to learn about NASA, science, and space is a key
reason why students sign up for SoI. NASA should continue to monitor recruitment efforts in order to
determine the extent to which camp leaders are focusing on students in low-income neighborhoods as
well as students who may not have attended SoI camps in previous years.

• The Technical Review Forum (TRF) recommends broadening the recruitment to reach students not
already interested and involved in STEM activities. The SoI project team should consider if a change
in the target population is desired and feasible.

• Target earlier release of funds to camps, given the continued challenges camps face when funding is
received late in the spring. The TRF recommended releasing funds as early as January or February.
Early release of funds would ensure sufficient time for recruitment of students and educators, and
would allow for adequate planning of camp activities.

• Continue to provide educators with access to hands-on curricula and materials as these are key to
student engagement. Include a review of curricular materials in order to ensure that factors such as
reading level and time allotment are appropriate for the recommended grade level. Increase suggestions
for implementation, including explicit connections between the activities and current technology and
authentic scientific and engineering practices.

• Increase resources for professional development, which is key to successful implementation of SoI
curricula. Increased funding for professional development could increase the number of educators
trained, lengthen the training provided, and extend the content covered.

• Provide some professional development that focuses on the areas in which camps were rated lower on
the DoS. Including explicit strategies to improve how the camp activities encourage student
reflection, demonstrate the relevance of activities for students’ lives, and support student
understanding of STEM concepts in the professional development could result in increased camp
quality on DoS dimensions.

• Provide additional implementation strategies as part of the professional development. In addition to
suggestions for creating student roles within teams that mirror the roles of scientists and engineers
across project teams, strategies could also include ideas for saving materials and time, or sets of
activities that could use the same material resources.

• Increase teacher planning time prior to camp so that teachers can consider ways to incorporate
reflection into camp activities and ensure the activities link to learning goals. More planning time also
might allow teachers to collaborate prior to camps to figure out what materials they will need, how
they could obtain them, and to devise back-up plans, all of which would ensure ample and efficient
use of materials during the camps.


• Consider including student teamwork and collaboration as possible program outcomes and focus for
evaluation.

• Provide mechanisms for camp leadership to learn about possible methods of maximizing available
resources. This could include suggestions for where to obtain materials, ways to reuse materials, ways
to maximize teachers’ time, or possible types of partners who could provide additional resources. In
addition, consider establishing mechanisms for camp leaders to share successful recruitment and
resource management strategies.


Appendix A. Data Verification Checklist
Exhibit A.1 Performance Monitoring Data Elements and Supporting Documentation

For each category, the checklist records whether documentation was provided (Y/N), examples of documents, and the types of information we will look for in the documents.

Camp Materials (Y/N)
  Lesson plans: SoI content
  Daily camp schedules: hours of exposure to SoI content
  Curriculum materials: SoI content
  Class rosters: average classroom youth to educator ratio

Student Data (Y/N)
  Registration data: number of youth in total and by grade, gender, race/ethnicity, and free- and reduced-price lunch eligibility
  Daily attendance sheets: attendance rates

Educator Data (Y/N)
  Lists of educators: number of educators
  Documents with certification status: educator type
  Educator applications: educator experience with SoI

Educator Trainings (Y/N)
  Educator training agenda: hours of educator training
  Educator training schedule/calendar: SoI content in educator trainings
  Educator training sign-in sheets: number of participants

Partnership Information (Y/N)
  Names of partners: number of partners
  Partner agreements: partner/collaborator type

Family Involvement Events (Y/N)
  Agendas: activity type and length
  Activity plans: content of activities
  Sign-in sheets: number of participants

Recruitment Events (Y/N)
  Sign-in sheets: number of participants

Appendix B. Parent Survey

Summer of Innovation Parent Survey

We are delighted that your child will be part of NASA’s Summer of Innovation. Parents of youth
attending this program are being asked to complete this survey. NASA wants to learn more about the
youths and their parents taking part in NASA experiences so that we can improve what we offer in the
future. There are no “right” or “wrong” answers to any of the questions. The survey should take about 8
minutes to complete.
Your participation in the evaluation is voluntary. Your child can take part in the program even if you do
not take part in the survey.
Securing Your Responses
Protecting your and your child’s privacy is very important to us.
• NASA’s Office of Education, the research organizations doing the evaluation, and the
  program’s staff will follow strict rules to protect the information you provide.
• The evaluation reports will not include your name, your child’s name, or the name of
  your child’s school.
• We will not share information that identifies you or your child with anyone outside the
  evaluation team and the Summer of Innovation staff, except as required by law.
Questions about the Evaluation
• For questions about the evaluation, please email Dr. Patricia Moore Shaffer, NASA’s
  Office of Education Evaluation Manager, at [email protected] or call
  202-358-5230 (toll call).
• For questions about your child’s rights as a participant in this evaluation, please call
  Abt’s Institutional Review Board Administrator, Teresa Doksum, at 877-520-6835 (toll-free).
If you wish to participate in this study, please turn the page.

Paperwork Reduction Act Statement - This information collection meets the requirements of 44 U.S.C. § 3507, as amended by section 2 of the
Paperwork Reduction Act of 1995. You do not need to answer these questions unless we display a valid Office of Management and Budget
control number. The OMB control number for this collection is 2700-0150. We estimate that it will take 8 minutes to
read the instructions, gather the facts, and answer the questions. Send comments relating to our time estimate above to the NASA Office of
Education at [email protected].
NASA Privacy Policy - This notice provides NASA's policy regarding the nature, purpose, use and sharing of any information collected via this
form. The information you provide on a NASA-issued form will be used only for its intended purpose. NASA will protect your information
consistent with the principles of the Privacy Act, the E-Government Act of 2002, the Federal Records Act, and as applicable, the Freedom of
Information Act. Submitting information is strictly voluntary. By doing so, you are giving NASA your permission to use the information for the
intended purpose. If you do not want to give NASA permission to use your information, simply do not provide it. However, not providing certain
information may result in NASA's inability to provide you with the information or services you desire. For additional information please
visit NASA Privacy Policy and Important Notices at http://www.nasa.gov/about/highlights/HP_Privacy.html.


1. Child’s first name: ________________ Last name: __________________________
2. What is your child’s birthday (MM/DD/YYYY)?:

Month: __ Day:__ Year: ____

3. What grade level will your child enter in fall 2013?
4th 5th 6th 7th 8th 9th Other: ________________________
4. What is your child’s gender? Male Female
5. Is your child Hispanic or Latino/Latina?  Yes  No
6. What is your child’s race? Check one or more.
American Indian or Alaska Native
Asian
Black or African American
Native Hawaiian or Other Pacific Islander
White
7. What is the highest level of education you have completed?
 Less than high school (Skip to Question 10)
 High school diploma or GED (Skip to Question 10)
 Associate’s degree
 Bachelor’s degree
 Master’s degree
 Ph.D., M.D., law degree, or other high level professional degree
8. Do you have a degree in a science, technology, engineering, or mathematics field?
Yes No I don’t know
9. Do you work in a science, technology, engineering, or mathematics-related occupation?
Yes No I don’t know
10. During the last 12 months, has your child participated in any of the following activities
outside of school? Check all that apply.
 Music, dance, art, or theater
 Organized sports supervised by an adult
 Religious youth group or religious instruction
 Scouting or another group or club activity
 Academic instruction outside of school such as from a Saturday Academy, learning center,
personal tutor or summer school program
 A math or science camp
 Another camp
 None of these
Abt Associates Inc.

▌pg. 54

11. Why is your child attending Summer of Innovation? Check all that apply.
 To have fun
 To learn more about NASA and space
 To have something to do
 To learn more about science
 To learn about what scientists and engineers do
 To meet others with interests similar to my child
 Help my child to do well in school
 Not sure
12. How did you hear about Summer of Innovation? Check all that apply.
 A teacher
 A friend or family member
 Newspaper or other advertisement
 Web/Internet search
 Received something in the mail
 School or community center
 Other
13. We also ask that you provide contact information.
Your Contact Information
Your first name: ____________________ Your last name: ________________________
Telephone no.: ( ) ___________________ Alternative telephone no.: ( ) ______________________
Best time to call: _________________________________________________
Permanent email address (optional): ____________________________________________
Alternative email address (optional): ___________________________________________
Student mailing street address: ________________________________________________
City: _____________________ State: _________ Zip code: ______________

Emergency Contact (other than parent) Information
Please provide contact information for a responsible adult should you not be available.
First Name: __________________________ Last Name: ____________________________
Relationship to student: _______________________________________________________
Telephone no.: ( ) ____________________ Alternative telephone no.: ( ) ____________________
Best time to call: _____________________________________________________________
Thank you!

Appendix C. Student Baseline Survey

Summer of Innovation Youth Survey (Baseline)
Congratulations on taking part in NASA’s Summer of Innovation! To improve this program for the
future, all students who attend this program are being asked to complete a survey. There are no “right” or
“wrong” answers to any of the questions. We want your honest opinions. It should take about 6 minutes
to complete. Thank you very much for your help!
NASA and its research team follow strict rules to make sure that only they will see your answers to this
and future surveys for this program, except as required by law. No report will use your name or describe
you in any way that could identify you.

If you wish to participate in this study, please continue.

Paperwork Reduction Act Statement: This information collection meets the requirements of 44 U.S.C. § 3507, as amended by section 2 of the
Paperwork Reduction Act of 1995. You do not need to answer these questions unless we display a valid Office of Management and Budget
(OMB) control number. The OMB control number for this collection is 2700-0150. We estimate that it will take 6 minutes
to read the instructions, gather the facts, and answer the questions. Send comments relating to our time estimate above to: [email protected].
NASA Privacy Policy - This notice provides NASA's policy regarding the nature, purpose, use and sharing of any information collected via this
form. The information you provide on a NASA-issued form will be used only for its intended purpose. NASA will protect your information
consistent with the principles of the Privacy Act, the E-Government Act of 2002, the Federal Records Act, and as applicable, the Freedom of
Information Act. Submitting information is strictly voluntary. By doing so, you are giving NASA your permission to use the information for the
intended purpose. If you do not want to give NASA permission to use your information, simply do not provide it. However, not providing certain
information may result in NASA's inability to provide you with the information or services you desire. For additional information please
visit NASA Privacy Policy and Important Notices at http://www.nasa.gov/about/highlights/HP_Privacy.html.


Tell NASA about yourself
1. Your first name: ____________________

Your last name: _____________________________

2. What is your birthday (MM/DD/YYYY)?: Month: ___ Day:___ Year: ______
3. What grade level will you enter in fall 2013?
4th 5th 6th 7th 8th 9th Other: ____________________
4. As things stand now, how far in school do you think you will get?
 Less than high school
 Earn a high school diploma or GED
 Complete an Associate’s degree
 Complete a Bachelor’s degree
 Complete a Master’s degree
 Complete a Ph.D., M.D., law degree, or other high level professional degree
 I don’t know

5. Why did you sign up for Summer of Innovation? Check all that apply.
 To have fun
 To learn more about NASA and space
 To have something to do
 To learn more about different majors in college (e.g., engineering, science)
 To learn more about science
 To learn about what scientists and engineers do
 To make my parents/guardians happy
 To meet others with interests similar to mine
 To help me to do well in school
 None of these

6. Have you ever been in another Summer of Innovation camp?
 Yes
 No
 I don't know

Tell NASA about your science activities in and outside of school
7. What science class did you take last year?
 Science or General Science
 Life Science
 Earth Science
 Physical Science
 Integrated or Coordinated Science
 Other science course
 I don’t know


8. How much do you agree or disagree with the following statements about your 2012-13 science class?
(1 = Never, 2 = Rarely, 3 = Sometimes, 4 = Often)
a. I enjoyed this class very much   1 2 3 4
b. I thought this class was a waste of my time   1 2 3 4
c. I thought this class was boring   1 2 3 4

9. Since the beginning of the last school year (2012-2013), which of the following activities have you
participated in? Check all that apply.
 Science club
 Science competition
 Science camp
 Science study groups or a program where you were tutored in science
 None of these

10. Since the beginning of the last school year (2012-2013), how often have you done the following
science activities?
(1 = Never, 2 = Rarely, 3 = Sometimes, 4 = Often)
a. Read science books and magazines   1 2 3 4
b. Access web sites for computer technology information   1 2 3 4
c. Visit a science museum, planetarium, or environmental center   1 2 3 4
d. Play games or use kits or materials to do experiments or build things at home   1 2 3 4
e. Watch programs on TV about nature and discoveries   1 2 3 4

Tell NASA your opinions about science
The next series of questions contain a number of statements about science. You will be asked what you
think about these statements. There are no “right” or “wrong” answers. We just want your opinions. For
this survey, the word “science” covers a broad range of topics, including space and planets, animals and
plants, medicine, computer programming, and designing things like machines.

11. Please indicate the extent to which you agree or disagree with each of the following
statements. Select ONE in each row.
(1 = Strongly Disagree, 2 = Disagree, 3 = Agree, 4 = Strongly Agree)
11.1 Science is something I get excited about   1 2 3 4
11.2 I like to take things apart to learn more about them   1 2 3 4
11.3 I like to participate in science projects   1 2 3 4
11.4 I’d like to get a science kit as a gift (for example, a microscope, magnifying glass, a robot, etc.)   1 2 3 4
11.5 I like to see how things are made (for example, ice-cream, a TV, an iPhone, energy, etc.)   1 2 3 4
11.6 I like to watch programs on TV about nature and discoveries   1 2 3 4
11.7 I am curious to learn more about science, computers or technology   1 2 3 4
11.8 I like to work on science activities   1 2 3 4
11.9 If I have kids when I grow up, I will take them to a science museum   1 2 3 4
11.10 I would like to have a science or computer job in the future   1 2 3 4
11.11 I want to understand science (for example, to know how computers work, how rain forms, or how airplanes fly)   1 2 3 4
11.12 I enjoy visiting science museums or zoos   1 2 3 4
11.13 I get excited about learning about new discoveries or inventions   1 2 3 4
11.14 I like reading science magazines   1 2 3 4
11.15 I pay attention when people talk about recycling to protect our environment   1 2 3 4
11.16 I am curious to learn more about cars that run on electricity   1 2 3 4
11.17 I get excited to find out that I will be doing a science activity   1 2 3 4
11.18 I enjoy reading science fiction books   1 2 3 4
11.19 I like learning about science on the internet   1 2 3 4
11.20 I like online games or computer programs that teach me about science   1 2 3 4
11.21 Science is boring   1 2 3 4
11.23 I like science   1 2 3 4
11.24 Science is one of my favorite subjects   1 2 3 4
11.25 I take science only because I have to   1 2 3 4
11.26 I take science only because it will help me in the future   1 2 3 4

11.27 Before joining this program, I was interested in
science and science-related things

1

2

3

4

11.28 Before joining this program, I participated in
science activities outside of school

1

2

3

4

11.29 I like to design a solution to a problem.

1

2

3

4

11.30 I like to be part of a team that designs and
builds a hands-on project.

1

2

3

4

11.31 I’m curious to learn how to program a computer
game.

1

2

3

4

11.32 I like to design and build something mechanical
that works.

1

2

3

4

11.22

I do science-related activities that are not for
schoolwork.

Thanks for taking the time to complete this survey!


Appendix D. Dimensions of Success
1. Organization reflects the planning and preparation for the STEM activity, and considers the
availability of materials used in the activity, the facilitator’s ability to adapt to changing
situations, and the fluidity of transitions during the session. These features reflect the claim that
learning is promoted when the facilitator is well-prepared and uses activity time wisely, avoiding
wasted time and maximizing learning opportunities.
2. Materials assesses both the appropriateness and appeal of the materials used in the STEM
learning activity. Appropriateness includes consideration of how well matched the materials are
to students’ abilities as well as the extent to which they suit the learning goals of the activity.
This dimension provides important commentary on one feature of the activity, and offers the
opportunity to make relatively minor adjustments to the STEM activity by swapping out the
materials if the rating reveals they are not the best option for the audience or activity.
3. Space utilization gauges the extent to which the physical space in which the STEM activity is
held is conducive to out-of-school-time STEM learning. Rather than “teacher-directed”
approaches with desks in rows facing an instructor at the front of the room, informal STEM
spaces often have more fluid settings where students have space to move around, discuss with the
group, and have appropriate access to materials. A second important feature of the space
utilization dimension is the assessment of other distractions (such as noise from a different
after-school program) that impede student learning. These elements of the physical activity space
affect how readily and easily students can learn.
4. Participation reflects the extent to which students are visibly and audibly participating in the
activities, but does not extend to rating their participation in STEM thinking or inquiry practices
(which falls under the inquiry dimension). Instead, participation rates the extent to which students
are participating in the activities, following directions, and completing the activities as instructed
by the facilitator. An example of a student not participating would be someone “zoning out” or
chatting with peers about unrelated topics. Student participation is a key part of their learning
experiences, so more successful activities tend to have high levels of student participation.
5. Purposeful activities focuses on the structure of the learning activities, measuring the extent to
which the students understand the goals of activities and the connections between them, as well as
the amount of time spent on activities that relate to the STEM learning goals (versus time spent
on other less productive activities). When STEM activities are well-structured, facilitators
scaffold student thinking and allow them to deepen their learning.
6. Engagement with STEM measures the extent to which students are working in a way that is both
“hands-on” and “minds-on”, rating both the type of activities as well as the type of learning
experience (i.e., passive versus active learning). The goal of this dimension is to tap into students’
abilities to construct knowledge for themselves, as opposed to passively watching or listening to a
facilitator engage in a STEM activity or demonstrate knowledge.
7. STEM content learning considers the support students receive to build their understanding of
STEM concepts through the activities implemented. The dimension includes the consideration of
the accuracy of the STEM content presented, the connectedness of the STEM content, and
evidence of students’ accurate understanding of the concepts (as demonstrated by responses,
questions, and conversations).

8. Inquiry reflects the use of activities that support STEM practices, such as making observations,
asking questions, developing and using models, planning and carrying out investigations,
analyzing and interpreting data, engaging in argument from evidence, and sharing findings with
peers. The use of such practices typically helps students learn STEM content more deeply and gives
them the opportunity to engage with skills pertinent to the daily work of scientists,
mathematicians, and engineers.
9. Reflection measures the amount of explicit reflection of STEM content during the activity, and
the degree to which student reflections are deep and meaningful, supporting connection-building
between concepts. For example, activities that ask students to make sense of what they’ve learned
and discuss their ideas with a peer or a larger group allow for reflection.
10. Relationships assesses the nature of the relationship between the facilitator and the students and
the students with their peers, gauging from conversations and actions whether or not the
interactions suggest warm, positive relationships. Having a positive, respectful relationship helps
students and facilitators complete the STEM learning activities to the best of their abilities,
couching the experience in a friendly, positive environment that students and facilitators feel
comfortable in.
11. Relevance focuses on the extent to which the facilitator guides students in making connections
between the STEM activities and their own lives or experiences, situating their activity in a
broader context. The incorporation of relevance to a STEM activity can help students connect
their learning to potential careers and their communities at large.
12. Youth voice reflects the ways in which the STEM activity allows students to use their voices and
fulfill roles that offer them personal responsibility. The goal is to foster activities in which
students’ ideas, concerns, and opinions are acknowledged and acted upon by others (within
acceptable limits). The opportunity for students to have ownership over the activity and feel like
their voices are heard lends itself to a comfortable learning environment where youth can develop
as STEM learners.


Appendix E: Teacher Focus Group Protocol
Hello! Thanks so much for coming! My name is [Name], and this is my colleague [colleague name]. We
work for Abt Associates/EDC, research firms in Cambridge, MA and Washington, DC. Abt
Associates and its partner, Education Development Center (EDC), are conducting an evaluation of
NASA’s Summer of Innovation. As part of this study, we are talking with teachers who led Summer of
Innovation camps to learn about how this year went.
NASA is specifically interested in talking with teachers about these topics:

• The supports and the challenges faced in implementing SoI curricula
• The staff, materials, and NASA resources necessary for successful SoI activities
• The plans and preparation necessary for successful program implementation

Through this focus group discussion, and other information collections, NASA intends to document how
SoI was implemented across the nation to better understand what worked and what did not during this
summer. The results of this evaluation are intended to ultimately inform future decisions about program
requirements and supports.
We expect that our discussion will last about 50 minutes. We will be taking notes during our conversation
to ensure accuracy and we would like to audio-tape this conversation, with your permission. No
individuals will be identified by name. If you have any further questions that we may not be able to
answer about this evaluation or this conversation, please contact Katie Speanburg, Abt Associates’
Institutional Review Board Administrator, at (877) 520-6835, or Alina Martinez, the Abt project director
of this study at (617) 520-3516. Please note that these are toll calls.

Teacher Focus Group Protocol
1. Warm-Up
• What do you consider your greatest SoI experience so far this summer?
2. Student Activities
• Please describe the student activities/camps this summer. What did a typical day look like for
students? [What were the core components of the student activities?]
− Probe - How similar were the camps you taught? In what ways did the activities differ
across camps? Why?
− Probe - How was NASA curriculum used – alone or in conjunction with non-NASA
content? How was the non-NASA content used?
− Probe – What was most successful about the implementation of the NASA curriculum?
− Probe – What challenges did you face in implementing the SoI curriculum?
− Probe - How did you address these challenges?
• Please describe how you plan and prepare for the SoI camps.
− Probe – Who prepares the overall camp curriculum and is responsible for integrating the
SoI content into the overall curriculum?
− Probe – What challenges have you faced in planning and preparing for camp instruction?
− Probe - How did you address these challenges?


− Probe – Do you have any suggestions to improve the planning and preparation for SoI
camps?

• Did you have adequate resources to provide SoI experiences?
− Probe – Was there adequate staffing to maintain a teacher to student ratio of 1 to 20?
− Probe – Were there adequate materials for SoI activities?
− Probe – Were there sufficient NASA resources provided for SoI activities?
 Probe – What resources did NASA provide you with?
 Probe – Were sufficient NASA curricular materials provided?
 Probe – Was training provided by NASA staff sufficient?
− Probe – Do you have any suggestions to improve the support for SoI camps?

3. Professional Development
• Could you describe the training that you received to prepare you to lead SoI camps?
− Probe – What was the training content? What was the most valuable training that you
received?
− Probe - What did you gain from the PD?
− Probe - What types of training approaches were used? Were training sessions in person,
virtual, hands-on, demonstrations, etc.? Did you find that some modalities were more
helpful to you than others? Why?
− Probe - How would you like to see the training and support for teachers improve in the
future?

4. Closing
• Do you have any additional thoughts about this summer’s SoI activities that we have not
discussed?
Paperwork Reduction Act Statement - This information collection meets the requirements of 44 U.S.C. § 3507, as amended by section 2 of the
Paperwork Reduction Act of 1995. You do not need to answer these questions unless we display a valid Office of Management and Budget
control number. The OMB control number for this collection is 2700-0150, expiration 2/28/2015. We estimate that it will take about 50 minutes
to hear the instructions, gather the facts, and answer the questions. You may send comments on our time estimate by email to [email protected] or by mail to NASA Office of Education, 4U18, 300 E Street SW, Washington, DC 20546-0001.
NASA Privacy Policy - This notice provides NASA's policy regarding the nature, purpose, use and sharing of any information collected via this
form. The information you provide on a NASA-issued form will be used only for its intended purpose, which is to improve NASA’s Summer of
Innovation program based on participant feedback. Your responses will be made anonymous and aggregated for review by the Summer of
Innovation program management. NASA will protect your information consistent with the principles of the Privacy Act, the E-Government Act of
2002, the Federal Records Act, and, as applicable, the Freedom of Information Act. Submitting information is strictly voluntary. By doing so, you
are giving NASA (and its designated representatives) your permission to use the information for the intended purpose. If you do not want to give
NASA permission to use your information, simply do not provide it. However, not providing certain information may result in NASA's inability
to provide you with the information or services you desire. For additional information please visit NASA Privacy Policy and Important Notices at
http://www.nasa.gov/about/highlights/HP_Privacy.html.


Appendix F: PI Interview Protocol
PI Post-Summer Interview Protocol
Hello! My name is [Name], from [Abt Associates/EDC] and we are conducting an evaluation of NASA’s
Summer of Innovation program. As part of this study, we are collecting data from the national Awardees
and NASA Centers to learn about this summer’s SoI activities.
The evaluation of NASA’s SoI Project will provide insight into this year’s implementation and the
feasibility of NASA’s SoI requirements and appropriateness of NASA’s training and support of the
activities. The approach will allow NASA to document the implementation of the SoI activities across the
Awardees to better understand what worked and what did not during 2013. The results of this evaluation
are intended to ultimately inform future decisions about programmatic requirements and supports.
We expect that our discussion will last about 60 minutes. We will be taking notes during our conversation
to ensure accuracy and we would like to audio-tape this conversation, with your permission. Your
responses will be used to examine the implementation of SoI activities. No individuals will be identified by
name, and your specific responses will only be shared with the evaluation study team. Do you provide
your verbal consent to be audio-taped?

1. Big Picture
• Can you give me an overview of your program’s structure and format?
− How many camps did you run this summer?
• What content was addressed in the camps and how does it align with national/state standards?
− Probe – science content, engineering content, mathematics content, technology and
investigation skills

• Were your program choices informed by any particular data or research? If so, what?
− On what evidence of effectiveness did you base your program design?

2. Partnerships
• Who are your partners? What roles did they play in the camp activities?
• Which aspects of working with partners worked well during the summer (and why)? What would
you like to change and why?
• How are parents or community members involved in your activities?
3. Summer Implementation
Educator Activities
• Please describe the recruitment efforts for teachers.
− How did you recruit your camp teachers? What specific strategies did you use?
− What challenges did you face?
− What experience (science/math/instruction) did your teachers have?
• Please describe the professional development that was offered to teachers.
− How did the sessions vary?


− What types of modalities were used? Were they in person, virtual, hands-on,
demonstrations, etc.? Did you find that some modalities were more successful in
supporting the teachers than others? How?
− How would you like to see the support for teachers improve in the future?

Student Activities
• Please describe the recruitment efforts for students.
− How did you recruit your campers? What specific strategies did you use?
− What challenges did you face?
− Were there any prerequisites for camper participation?
• Please describe the student activities/camps this summer. What did a typical day look like for
students? [What were the core components of the student activities?]
• How similar were the camps across all locations? In what ways did the activities differ across
locations?
− Probe – duration of camps; student grade range served; how many students were served;
what was the student-teacher ratio; describe the camp facilities (school/community
center/church etc.)
− Probe – What was the content covered in the camps? How was NASA content used –
alone or in conjunction with non-NASA content?
Successes & Challenges
• Overall, what successes did you have in meeting NASA’s requirements, regarding:
− Reaching and retaining underserved and underrepresented students
− Reaching and retaining classroom teachers
− Providing professional development to classroom teachers
− Using NASA curriculum materials and other resources

• Overall, what challenges did you have in meeting NASA requirements? How did you address
these challenges?
− Reaching and retaining underserved and underrepresented students
− Reaching and retaining classroom/certified teachers
− Providing professional development to classroom teachers
− Using NASA curriculum materials and other resources

• What other challenges did your program face?
• How, if at all, did the successes and challenges vary across all of your locations?
• If you could start over, what would you do the same? What would you do differently?
• What staff, materials, and NASA resources are necessary for successful SoI activities?
• How early and to what extent must plans and preparation begin for successful program
implementation?

4. Closing (interview target: PI or camp coordinator)
• What do you consider your greatest achievements of this summer?
• Do you have any additional thoughts about this summer’s SoI activities that we have not discussed?


Appendix G: Parent Characteristics from the Full and Analytic
Samples
Exhibit G.1  Parent Characteristics by Sample

                                              Analytic Sample   Full Sample
                                                     %               %
Highest Level of Education Completed            (n = 964)       (n = 1075)
  Less than high school                            8.8              9.1
  High school diploma or GED                      28.8             29.3
  Associate's degree                              13.4             13.8
  Bachelor's degree                               28.3             27.3
  Graduate degree                                 20.6             20.5
Degree in STEM Field (a)                        (n = 592)       (n = 650)
  Yes                                             36.0             36.8
  No                                              64.0             63.2
Work in a STEM Occupation (a)                   (n = 584)       (n = 641)
  Yes                                             39.7             40.2
  No                                              60.3             59.8

Notes:
The analytic sample does not include the 109 parents who did not give consent for their children to be
surveyed.
a: These questions were asked only of parents who indicated they completed an Associate's degree or higher.
Source: Summer of Innovation 2013 Parent Survey


