Data Collection for the Innovative Technology Experiences for Students and Teachers Program Evaluation (ITEST)

OMB: 3145-0222
Innovative Technology Experiences for Students and
Teachers (ITEST) Program Evaluation

Program Evaluation Plan and
Data Collection Instruments
February 3, 2012

PREPARED FOR:

Monya Ruffin
National Science Foundation
COTR, Room 805N
4201 Wilson Boulevard
Arlington, VA 22230
PREPARED BY:

Center for Education Policy
SRI International
333 Ravenswood Avenue
Menlo Park, CA 94025
NSF CONTRACT NUMBER #NSFDACS10C1652
SRI PROJECT NUMBER P19831

CONTENTS

I. INTRODUCTION
II. LOGIC MODEL, RESEARCH QUESTIONS, AND EVALUATION OVERVIEW
   LOGIC MODEL
   RESEARCH QUESTIONS
   OVERVIEW OF THE EVALUATION
III. PORTFOLIO REVIEW
   PROGRAM DESCRIPTION
   PROJECT DESCRIPTIONS
   PROJECT EVALUATIONS
IV. CASE STUDIES
   CASE STUDY SAMPLING
   CASE STUDY DATA COLLECTION AND INSTRUMENT DEVELOPMENT
   CASE STUDY DATA ANALYSIS
   INTEGRATED ANALYSIS
VI. INSTITUTIONAL REVIEW BOARD
VII. TIMELINE
REFERENCES
APPENDIX: DRAFT DATA COLLECTION INSTRUMENTS

I. INTRODUCTION
NSF’s Innovative Technology Experiences for Students and Teachers (ITEST) program aims
to strengthen the formal and informal learning experiences of K-12 students to cultivate their
interests in and capacities for science, technology, engineering and mathematics (STEM) careers,
with a special emphasis on technology. It also seeks to improve the leadership and knowledge of
the community that conducts outreach for STEM learning. Since 2003, ITEST has funded nearly
150 projects to implement, study, and scale up effective strategies to motivate and equip young
students to pursue STEM careers. Yet, evidence about the program’s overall impact is thin, and
data are lacking on more and less effective project features—what works and what does not.
To address the lack of evaluative program data, SRI International and Inverness Research
Associates have undertaken an evaluation of the ITEST program. This document presents the
evaluation’s design. We first present the logic model underlying the ITEST program and the
research questions driving the evaluation. Then, we present the data collection and analysis
plans. Finally, we present a timeline for the program evaluation.

II. LOGIC MODEL, RESEARCH QUESTIONS, AND EVALUATION OVERVIEW
The ITEST program invests in projects that vary widely by type (i.e., research, convenings,
scale-up, and strategies), context (e.g., urban, rural), content focus (e.g., bioscience, engineering,
computer science), setting (e.g., informal, formal), and target population (e.g., teachers,
students), among other dimensions. Despite these variations, all ITEST projects share a common goal: to
expand the STEM workforce by increasing students’ capacity for and interest in STEM studies.
We begin this section with an overview of the theory of change that specifies how ITEST seeks
to attain this goal. We then use the logic model to specify a set of overarching research
questions. Note that the logic model represents the program and not the evaluation. Later in the
document we specify the aspects of the logic model that the evaluation will be measuring.
Logic Model
Given the complex and multifaceted nature of ITEST, we have developed a logic model to
aid in decisions about key aspects of the program that merit research and evaluation. The model
is a systematic and visual way to present the relationships among the resources available to
create and deliver the program, the activities the program offers, and the anticipated changes or
results. It identifies the ultimate outcomes the program seeks to achieve and the assumptions
regarding how the program is hypothesized to contribute to those outcomes. The logic model
also identifies intermediate outcomes.
Exhibit 1 portrays the logic underlying the ITEST program, beginning with the program's goal of increasing participation in the STEM workforce, shown at the bottom right of the graphic.
Increasing workforce participation generally requires that students successfully attain STEM
degrees. We use this term to refer to graduate and undergraduate degrees in a STEM discipline
(whether through traditional or nontraditional pathways) (Malcolm et al., 2005) as well as to
technical degrees requiring fewer than 4 years of postsecondary schooling. Successful attainment
of such degrees generally requires students to leave secondary school with the dispositions and
capacities for future STEM study.
Our review of the research points to six central dispositional constructs. Students must have
an interest in STEM learning and STEM careers (Girod, 2005; Appleton et al., 2008). Students
must possess a self-identity as science learners (Girod, 2005; Kozoll & Osborne, 2004) as well as
a sense of self-efficacy (Pintrich & de Groot, 1990; Thomas, Anderson, & Nashon, 2008; Kind,
Jones, & Barmby, 2007). They must have a favorable disposition toward their future
participation in science (Kind, Jones, & Barmby, 2007). They must be aware of the value of
science in society (Kind, Jones, & Barmby, 2007; Davis-Kean, 2007; Messersmith et al., 2008).
Finally, they must believe that they have knowledge of STEM careers (Hurtado et al., 2009).
In terms of student capacities for STEM learning, we rely on the National Research Council
consensus documents on learning in formal and informal settings (NRC, 2007; 2009). Learners
practice and develop scientific capacities as they (1) “come to generate, understand, remember,
and use concepts, explanations, arguments, models, and facts related to science;”
(2) “manipulate, test, explore, predict, question, observe, and make sense of the natural and
physical world”; and (3) “participate in scientific activities and learning practices with others,
using scientific language and tools” (NRC, 2009, p. 4).
For students to develop these dispositions and capacities, they must have access to productive
learning opportunities. Students must have opportunities to engage with and develop their
fluencies with scientific concepts, explanations, arguments, and models (Chinn & Malhotra,
2002) as well as with scientific practices of reasoning, metacognition, data analysis, and mastery
of the use of scientific language (Brown, Reveles, & Kelly, 2004). In our own work, we have
defined productive experiences as those comprising “rigorous investigation informed by the
scientific process that results in new content knowledge and knowledge of how to learn, relevant
to students’ lives, and providing the opportunity for inspiration that translates into motivation”
(American Museum of Natural History, 2010, p. 11).
For teachers to create active, STEM-rich environments, they need opportunities to learn
deeper science content and to participate in scientific inquiry themselves (Basista & Mathews,
2002). Many ITEST projects focus on creating just such productive learning opportunities for
teachers. The overarching goal is to build teachers' capacity to develop coherent learning plans, to guide student learning through authentic experiences, to assess their teaching and student learning on an ongoing basis, to design and manage authentic learning environments, and to develop a community of learners in their classrooms (see NRC, 1996).
Most ITEST projects provide direct service—meaningful learning opportunities—for youth
and their teachers. Scale-up projects implement models in a large-scale setting such as across a
state, region, or the nation. Other projects involve improving the knowledge base on effective
STEM experiences by conducting research or convening researchers and practitioners. Through
its convenings, communication strategies, and the work of the Learning Resource Center (LRC),
ITEST also seeks to build a professional community that shares lessons learned and works in
concert to build the nation’s STEM workforce. As the professional community develops and
more teachers are trained, the capacity of the system as a whole will expand to foster better
learning opportunities even in the absence of ITEST.

Exhibit 1. Logic Model for the ITEST Program Evaluation


Research Questions
The model suggests three main research questions:
1. What are the projects’ impacts? What are the achieved outcomes in key areas for
students and teachers in ITEST projects? Do youth who participate in ITEST projects
demonstrate greater interest in STEM activities and careers than nonparticipants? To
what extent are project evaluations rigorous?
2. What project models are most effective in delivering desired student and teacher
outcomes? What project characteristics contribute to these models’ success?
3. How can we best characterize and describe ITEST projects? What do the projects do?
Who do the projects serve? Where and when are ITEST project activities taking
place?
Overview of the Evaluation
We have designed a comprehensive, multimethod evaluation to address these research
questions. The design incorporates a portfolio review and case studies of ITEST projects and
outcomes. Exhibit 2 indicates the data sources that we will use to address each research question.


Exhibit 2. Research Questions and Data Sources
[Matrix mapping each research question to the data sources that address it. Portfolio analysis sources: program description, project descriptions, and project evaluations. Case study sources: project background, project strategies, project administration, and project outcomes. Research questions: (1) What are the projects' impacts? What are the achieved outcomes in key areas for students and teachers in ITEST projects? Do youth who participate in ITEST projects demonstrate greater interest in STEM activities and careers than nonparticipants? To what extent are project evaluations rigorous? (2) What project models are most effective in delivering desired student and teacher outcomes? What project characteristics contribute to these models' success? (3) How can we best characterize and describe ITEST projects? What do the projects do? Who do the projects serve? Where and when are ITEST project activities taking place?]

III. PORTFOLIO REVIEW
To create a comprehensive description of the ITEST portfolio, we will use the logic model
presented in Exhibit 1 as the framework for describing the ITEST projects and their evaluations.
We will use this framework to describe what the ITEST projects do and in what content areas,
who the projects serve, where and when the ITEST projects take place, and how they have
changed over time. We also will characterize the intended and achieved outcomes in key areas
for students and teachers and assess the level of rigor of the project evaluations. The review will
include three components: (1) a description of the ITEST program that details program intent and changes over time and offers a contextual lens through which to understand individual projects;
(2) descriptions of ITEST projects that characterize important descriptive features of the projects
and identify project “models;” and (3) a review of project evaluations that characterizes and
assesses the quality of project evaluations. Our approach for each of the three components is
described in detail below.
Program Description
The first component of the portfolio review is a description of the ITEST program as a
whole, including its evolution over time. This description will be derived from a review of
ITEST program solicitations, interviews with NSF ITEST program officers, and analyses of the
LRC.
Examine ITEST solicitations

From 2003 through 2011, NSF funded eight consecutive cohorts of ITEST projects. While the overall structure and purpose of the ITEST program have remained intact, the program solicitations have contained changes that affected the types of projects funded, the populations served by ITEST projects, and the project evaluation requirements. Because the solicitations have changed over time, it is necessary to understand and describe these changes in order to provide a comprehensive picture of the ITEST program. To do so, we reviewed all ITEST program solicitations released between 2003 and 2009, and for each solicitation, we identified:

• NSF's stated goals for awarded projects;

• Desired or required characteristics of ITEST projects (i.e., populations and activities that the solicitation recommends or requires); and

• Requirements or recommendations for project evaluation methods.

We then compared and contrasted the program solicitations in order to describe changes in the
ITEST program over time. Our findings from this analysis are presented in the September 30th
annual report. As our program evaluation continues, we will review the more recent solicitations.
Findings from all solicitations will be incorporated into future reports on the portfolio analysis.


Interview program officers

In order to get a broad perspective on the ITEST program, we interviewed five ITEST
program officers who have been involved with the program since its inception. In our interviews,
we asked program officers to provide information about the background of the ITEST program,
the changes they have perceived since the program began, and how they would characterize the
ITEST projects. We also solicited their input with regard to perceived gaps in the ITEST
portfolio, important outcomes from ITEST projects, and examples of good formative and
summative evaluations. The program officers provided us with a wealth of descriptive
information, and we will use this information in our general description of the ITEST program,
as well as our review of how the program and projects have changed over time.
Characterize the role of the Learning Resource Center (LRC)

Funded by NSF in conjunction with the first cohort of ITEST projects, the LRC is charged
with providing technical assistance and support to ITEST projects. As shown in the logic model,
the LRC plays an important role in the ITEST community. In this role, the LRC disseminates
information about ITEST projects, convenes annual meetings of ITEST principal investigators
(PIs), and strives to build community among ITEST grantees. In order to describe the role of the
LRC within the ITEST community, we will review the LRC website, examine external
evaluations of the LRC, and interview LRC staff.
Analysis

Data will be analyzed in order to draw a general picture of the ITEST program, including NSF's intent and the program's evolution over time. To do so, we will take a qualitative analytic approach that draws on the principles of content analysis (e.g., Holsti, 1969) to identify broad
themes across the data and track changes over time. In particular, we will detail the evolution of
program goals, desired or required project characteristics, and evaluation expectations. This
approach will offer an important contextual background to guide direct data collection and to
ascertain whether program intent was realized.
Project Descriptions
The second component of the portfolio review is a description of the entirety of projects
funded through the ITEST program. This review will include all project types: strategies, scale-ups, research projects, and convenings. To describe the portfolio of projects, we will gather and
review data from the Management Information System (MIS) maintained by the LRC and the
NSF eJackets in order to describe project goals, activities, partnerships, populations served, and
dissemination strategies. We will then use these data from the MIS and eJackets to identify
project models. Throughout the portfolio analysis task, we will consider where we may add the
most value to existing information collected and presented by the LRC given our access to the
eJacket system.


Gather and review data from the MIS

Over the past 8 years, the LRC has collected a great deal of descriptive information on
ITEST projects through its project profiles and annual questionnaire of project PIs (starting in
2009). The project profiles contain short descriptions of ITEST projects in all eight cohorts,
which describe project goals and activities. They also contain basic descriptive data, which can
be used to categorize ITEST projects across an array of dimensions such as project type (e.g.,
convening, research, strategies), populations served (e.g., students, teachers), geographic location
(e.g., urban, rural, suburban) and area of focus (e.g., engineering, computer science, geography).
The annual survey of project PIs delves deeper into who participates in ITEST projects, how
often they participate, and in what kind of activities they participate. The LRC maintains these
data in its MIS. So as not to duplicate data collection efforts, we requested and received access to
these data from NSF and have begun preliminary analyses to characterize and categorize ITEST
projects.
Gather and review data from eJackets

NSF’s eJackets are electronic file folders that contain project documents such as project
proposals, annual and final reports (including evaluation reports), and proposal funding
justifications. We have begun to review each project’s eJacket in order to supplement the
descriptive data from the MIS, identify project partners and participating organizations, and
identify dissemination strategies.
Coding Scheme

Our coding scheme is consistent with the taxonomy presented in LRC’s MIS. Since the LRC
has had considerable interaction with the projects, we will use this knowledge base as a starting
point for basic descriptive data for the ITEST program. The field variables we will track, and
their corresponding codes, are summarized in Exhibit 3 below.


Exhibit 3. Variables and Codes

Grade Served: K-2; 3-5; 6-8; 9-12
Topic Area: Bioscience; computer science; engineering; environmental science; mathematics
Project Components: Tech-based; class-work; career skills; field work; participation of scientists; mentoring; career planning; engaging STEM researchers
Technologies Employed: Geospatial technologies; programming tools; multi-media tools; data analysis/computation tools; visualization/computer modeling; hand-held devices; social networking tools; game development; electronics/robotics tools; engineering/design tools; virtual reality; communication tools; imaging tools; other tech tools
Use of Technologies: Learning tool; data collection (e.g., probes); data sharing (e.g., databases, social media); dissemination of findings (e.g., web page, social media)
Intervention Target and Numbers Involved: Student Focused [no count; fewer than 25; 25-50; 51-100; more than 100]; Teacher Focused [no count; 1-10; 11-25; 26-50; more than 50]; Hybrid [no count; 1-10; 11-25; 26-50; more than 50]
Project Format: Students [summer (<2 weeks); summers (>2 weeks); in school; after-school; weekends; youth employment; distance learning; online networking]; Teachers [after-school; weekends; PD days; during school; summer institute with youth; summer program; distance learning; online networking]
Award Size: Dollar amount
Grantee Organization: College/university; government; industry; K-12; museum; non-profit; youth organization
Project Evaluators: Firm; individual; university institute; not specified
Evaluation Budget: As % of total budget
Dissemination: Presentation [conference; workshop; media; training; meeting]; Publication [conference paper; journal article; book; book chapter; other]; Product [website; curricula; software/hardware]
Partnerships: K-12; industry; college/university; non-profit; government; youth organization; community/tribal organization; museum; professional association
Region: New England; Middle Atlantic; East North Central; West North Central; South Atlantic; East South Central; West South Central; Mountain; Pacific
Urbanicity: Urban; rural; suburban; multiple
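
To make the coding scheme concrete, the sketch below shows one way coded project records could be represented and checked against the allowed codes for a small subset of these variables. The variable names, code values, and example record are illustrative assumptions, not fields or data drawn from the actual MIS.

```python
# Illustrative sketch only: part of the Exhibit 3 coding scheme as a data
# structure, with a check that a hypothetical coded record uses valid codes.
ALLOWED_CODES = {
    "grade_served": {"K-2", "3-5", "6-8", "9-12"},
    "topic_area": {"bioscience", "computer science", "engineering",
                   "environmental science", "mathematics"},
    "urbanicity": {"urban", "rural", "suburban", "multiple"},
}

def invalid_codes(record):
    """Return (variable, value) pairs in a coded record that fall outside the scheme."""
    return [(variable, value)
            for variable, values in record.items()
            for value in values
            if value not in ALLOWED_CODES.get(variable, set())]

# Hypothetical coded record for a single project.
example = {
    "grade_served": ["6-8", "9-12"],
    "topic_area": ["engineering"],
    "urbanicity": ["urban"],
}
print(invalid_codes(example))  # an empty list means every code is in the scheme
```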

Analysis of MIS and eJacket Data

We will analyze the MIS and eJacket data in three sequential steps. The first step will be to
calculate descriptive statistics such as frequencies and variances for each of the variables
identified in Exhibit 3 above. These calculations will provide a description of the entire
population of projects in the ITEST program. The second step of the analysis will involve the
creation of a set of cross tabulations through which we can explore the relationship between
different project characteristics. For example, if NSF is interested in whether there has been a
discernible shift in the type of partnerships across cohorts, we would create a table as illustrated
in Exhibit 4. Similarly, if NSF is interested in how technologies are used in conjunction with
specific project components (e.g., data collection probes in conjunction with participation of scientists), we could explore that cross tabulation. Such analyses would also allow for
descriptions of change over time and between projects in different location types (urban vs.
rural), for example.
Exhibit 4. Illustrative Table Shell: ITEST Project Partnerships by Cohort
[Table shell with one row per partnership type (K-12; college/university; non-profit; government; youth organization; community organization; museum; professional association) and one column per cohort (Cohort 1 through Cohort 8).]
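
As an illustration of the cross-tabulation step, the sketch below builds an Exhibit 4-style table of partnership types by cohort from coded project records. The column names and sample values are hypothetical placeholders rather than fields from the actual MIS or eJacket data.

```python
# Illustrative sketch only: an Exhibit 4-style cross tabulation of partnership
# types by cohort, computed from hypothetical coded project records.
import pandas as pd

projects = pd.DataFrame({
    "cohort": [1, 1, 2, 3, 3, 3],
    "partnership": ["K-12", "Museum", "K-12", "Non-profit", "K-12", "Government"],
})

# Counts of each partnership type (rows) within each cohort (columns).
partnerships_by_cohort = pd.crosstab(projects["partnership"], projects["cohort"])
print(partnerships_by_cohort)
```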

The final step of the analysis will be model development, for which we will identify a set of
common project models. This process involves identifying a set of project characteristics that
“cluster” together. An example of a model might be urban-based projects involving partnerships
between a school system and museums or other science-rich institutions to provide after-school
experiences to K-12 students. The project models will be identified inductively from the data in
the MIS and eJackets. Thus, the degree to which we will be able to create a set of unique models
across settings and cohorts will depend on the empirical data.
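
One way the clustering idea could be explored computationally is sketched below: categorical project characteristics are converted to indicator variables and grouped with an off-the-shelf clustering routine. The features, example values, algorithm, and cluster count are assumptions for illustration only; the plan identifies models inductively rather than prescribing a particular method.

```python
# Illustrative sketch only: grouping projects into candidate "models" by
# clustering one-hot encoded project characteristics.
import pandas as pd
from sklearn.cluster import AgglomerativeClustering

projects = pd.DataFrame({
    "urbanicity": ["urban", "urban", "rural", "suburban"],
    "partnership": ["museum", "museum", "college/university", "K-12"],
    "format": ["after-school", "after-school", "summer program", "in school"],
})

# Convert categorical codes into binary indicator columns before clustering.
features = pd.get_dummies(projects)

# Group projects into a small number of candidate models.
projects["candidate_model"] = AgglomerativeClustering(n_clusters=2).fit_predict(features)
print(projects)
```
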
Project Evaluations
The third component of the portfolio review is an analysis of the project evaluations. The
goal of the evaluation review is to broadly characterize the types of evaluations conducted by
ITEST project evaluators, assess the quality and rigor of these evaluations, and document the
achieved outcomes in key areas for students and teachers. Data on project evaluations will be
extracted from the eJacket system. For completed projects, we will review the project
evaluations included in the final reports. For on-going projects, we will review the evaluation
plans described in the proposals and the evaluation reports included in the annual reports.
While the ITEST solicitations require that project PIs conduct an evaluation of their project, there are few requirements that guide the type of evaluation to be conducted. As such, project evaluations employ disparate evaluation methodologies and serve a wide array of purposes. Given this diversity, we will begin our review of each project evaluation by identifying its purpose (formative or summative) and characterizing the methodologies employed (e.g., experimental, quasi-experimental, one-group pre/post, qualitative). Next, using the evaluation plans and evaluation
reports, we will identify the outcomes of interest for students and/or teachers in each project
evaluation. We will then categorize the outcomes and present a summary of the types of
outcomes targeted by the evaluations, as well as the relative success of projects in achieving the
outcomes across all ITEST projects. Finally, we will assess the quality and rigor of the ITEST
project evaluations. We will draw on the criteria used by the NSF-funded Online Evaluation Resource Library (OERL), which defines quality criteria for sound project evaluation plans, instruments, and reports based on best-practices information from the Joint Committee on Standards for Educational Evaluation and the American Evaluation Association's Guiding Principles for Evaluators. The following questions, based on the OERL, will guide our review of each ITEST project evaluation:1
1. Does the evaluation reflect an understanding of the project logic model, including
elements such as the goals/objectives, short-term and long-term outcomes, activities,
stakeholders, and context?
2. Does the evaluation make clear what questions are to be answered?
3. What is the evaluation design? Is the evaluation design in line with the goals of the
evaluation?
4. What is the sampling frame for the evaluation? What types of data are collected? How
are the data collected? When are the data collected?
5. What instruments are used to collect data? What is the quality of the data collection
instruments?
6. How are the data analyzed? Are the analysis methods appropriate, given the research
questions and purpose of the evaluation? Are the interpretations of the results supported
by the data?
7. What conclusions and recommendations were reported? Were the recommendations and
conclusions backed by the data?
We will summarize answers to these questions to provide a description of the level of quality
and rigor of project evaluations across all ITEST projects.
Data Collection and Coding Scheme

To assess the evaluations, we will code each project’s evaluations along a number of
dimensions that will enable us to answer the above questions based on the OERL. Exhibit 5
shows the specific variables we are interested in and the coding categories we will utilize, based
on the pilot work done so far.

1 Note that these criteria apply to projects that have served youth or educators, not convenings, workshops, or research projects, nor the LRC.


Exhibit 5. Variables and Codes for Evaluation Review

Evaluation type: Formative; summative
Evaluation goal: Implementation; impact
Student data sources: Attitude survey; evaluation survey; background survey; knowledge survey; observation; interview; focus group; standardized tests; ITEST attendance; grades; course enrollment; student work; logs; school attendance
Teacher data sources: Attitude survey; evaluation survey; background survey; knowledge survey; interview; observations; logs
Parent and staff data sources: Survey; interview
Evaluation designs: Experiment; quasi-experiment; pre-post with control; pre-post no control; post only with control; post only no control; qualitative
Instrumentation: Project-constructed; externally-constructed; reliability reported; validation of measures; psychometric properties
Analyses: Descriptive statistics; T-test; ANOVA; chi-square; regression; ANCOVA; qualitative only
Quality of evaluation: Integrity of comparison groups; sampling frame; coherence with project logic model; appropriateness of analytic methods; transparency (e.g., limitations of reported findings are adequately presented)

Analysis

Similar to the analysis of the project descriptions, the first step of the evaluation analysis will
focus on descriptive statistics. This step will provide a descriptive summary of the range of
evaluation purposes, methodologies, and outcomes assessed. The second step of the analysis will
apply the evaluative criteria discussed above to assess the rigor of each evaluation and the value
of the information it offers for the individual project and for the program overall. This evaluative
lens also will enable us to identify exemplary evaluation practices and key challenges that
projects encounter in conducting rigorous evaluations.2
In order to ensure the reliability of findings, we piloted our coding scheme and assessment
criteria across a sample of evaluation reports using stratified random sampling. Using a statistical
analysis program, we randomly selected three projects from each of cohorts 1 through 7 for
review, for a total of 21 evaluation reports. As a part of the piloting process, two researchers
independently read and reviewed the 21 reports, entering their evaluation codes into separate
Excel databases. Upon completion of the reviews, the two Excel databases were combined and
researchers’ reviews were assessed for comparability. Inter-rater reliability was assessed for two
questions: 1) are the interpretations supported by the data; and 2) what were the evaluation
challenges.3 This pilot enabled us to revise the instrument so that it adequately captures salient information from the full spectrum of evaluation reports. It also will enable us to train researchers in the use of the instrument so that data capture is consistent across coders. Moving forward, we will apply our coding scheme and assessment criteria to the full set of evaluation reports. We also will assess inter-rater reliability, as described above.

2 As discussed in the “Integrated Analysis” section below, these findings will be considered together with case study data to offer insight into NSF’s ongoing efforts to make the programs it funds more evaluable.
3 Researchers achieved very high inter-rater reliability on these items, achieving 100% agreement on whether the interpretations in the report were supported by the data and 97% agreement on evaluation challenges.
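
As a minimal illustration of the agreement check used in the pilot described above, the sketch below computes simple percent agreement between two coders' ratings on a shared set of reports. The ratings shown are hypothetical examples, not actual pilot data, and percent agreement is only one of several possible reliability statistics.

```python
# Illustrative sketch only: percent agreement between two coders who rated the
# same set of evaluation reports (hypothetical ratings, not pilot data).
def percent_agreement(coder_a, coder_b):
    """Share of items on which the two coders assigned the same code."""
    if len(coder_a) != len(coder_b):
        raise ValueError("Both coders must rate the same set of items.")
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

coder_a = ["supported", "supported", "not supported", "supported"]
coder_b = ["supported", "supported", "not supported", "not supported"]
print(f"Agreement: {percent_agreement(coder_a, coder_b):.0%}")  # 75%
```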

IV. CASE STUDIES
Over the course of the study, we will conduct 24 case studies to understand how ITEST projects function, what characteristics contribute to a model's success, and what impact the projects have on participants. We discuss the general approach to the 24 case studies in this section.
Case Study Sampling
Case study projects will be selected from “strategies” and “scale-up” projects. We are
excluding convening and research projects from the case studies because they are not a central
data source to answer the research questions of the evaluation (listed in Exhibit 2). Further, we
will only include a scale-up project if we are able to collect the full complement of data at a
single location.
We will employ a two-step strategy for selecting projects for case studies.
1. We will select 12 promising projects to enable us to answer the research questions: What
project models are most effective in delivering desired student and teacher outcomes?
What project characteristics contribute to a model’s success? Promising projects will be
identified through our review of project evaluations, discussed in the previous section,
and recommendations from project officers. The goal will be to identify projects that
have demonstrated the greatest impact on participants through a credible evaluation. Note
that the sample may include a small number of projects that no longer receive ITEST
funding, but which have been identified as highly successful and which have adequate
data from which to draw.
2. After identifying 12 promising projects, we will attempt to select the remaining 12 cases
so that the entire sample of 24 case study projects reasonably approximates the full range
of ITEST projects along the following dimensions: topic area, project components, and
project format.4 While the small number of cases precludes a sample that is statistically
representative along these dimensions, every effort will be made to achieve reasonable
face validity for a sample of 24 cases that reflects the breadth of the ITEST portfolio. All
active projects will be eligible for selection as part of this strategy with the exception of
notably underperforming projects, since data from these projects will not help answer the
research questions.

4 Please see Exhibit 3 for codes detailing the range of categories for each dimension.


Case Study Data Collection and Instrument Development
Two-person evaluation teams will visit each project for 3 days—sufficient time to become
immersed in a project. Projects that are larger in scope may require up to 2 additional days on
site; projects that are no longer active may require less time on site. On-site activities will
include interviews with the principal investigator, project staff, local evaluators, and partners as
well as focus groups of participants. When practicable, we will schedule site visits at times that
enable us to observe project activities such as a teacher professional development session or an
afterschool class.
We considered three primary factors in developing interview protocols: (1) what we want to measure, (2) whom we will ask, and (3) what types of questions we want to ask.
First, the logic model sets forth the constructs to be measured—what we want to know about.
The case studies will focus on key elements of the theory of change outlined in the logic model,
including the supports provided by the ITEST projects, improvements in the learning
opportunities for students and teachers, and outcomes for participants.
Second, for each of the constructs, we determined the type of respondent with the knowledge
to answer the question reliably. For example, the principal investigator and other project staff
have information about the project context, funding streams, and project curriculum. The local
evaluator will be able to describe the evaluation design, findings, and challenges encountered.
Participants have information about their frequency of participation in the project and their
intentions for seeking other similar opportunities. Where appropriate, we will ask multiple
respondents questions to measure a single construct. This strategy enables us to triangulate
findings across respondents and increase the reliability of the data collection. Examples of
interview and focus group topics by respondent can be found in Exhibit 6. Draft interview
protocols are included in the Appendix.


Exhibit 6. Illustrative Interview and Focus Group Topics by Respondent
[Matrix of interview and focus group topics by respondent type. Respondents: PI/Co-PI; local evaluator; project partner; teacher (focus group); student (focus group); parent (focus group). Topics: project history, goals, and context; project structure, partners, and funding streams; model components; implementation successes and challenges; perceived short-term and long-term impacts; aspects of the project perceived to be most/least beneficial; lessons learned for STEM learning models; evaluation design, goals, methods, and challenges; partner goals; reasons for involvement; frequency of involvement; intentions of seeking similar opportunities; prevalence of and efficacy in using technology; project sustainability.]

Third, the types of questions to pose depend on the information being collected. Most
interview questions are open-ended to encourage respondents to talk about the issues that are
important to them and their projects. Some questions, however, were designed to be closed-ended to enable the research team to count instances of a construct. We designed the protocols
according to widely accepted principles that increase the reliability and validity of the
instruments (see, for example, Patton, 2002).
To maximize the reliability of case study data, site visitors will be trained before going into
the field and will receive a manual containing all materials relevant to case study data collection
(e.g., selection criteria for respondents, protocols, consent forms, debriefing forms). The training
will help team members develop a common understanding of the data collection and analysis
goals. SRI has used a similar training model in other evaluations and has found that it increases
the reliability of data collected by multiple researchers because shared understanding maintains
consistency in data collection across projects and facilitates cross-project comparisons.

Case Study Data Analysis
The goal of the case study data analysis will be to first analyze the interview and focus group
data within each project in an effort to understand the goals, structures, implementation, and
outcomes as they are occurring in the context of each project. The evaluation team will then look
across projects to identify cross-cutting themes and patterns related to the implementation and
effectiveness of the program more broadly.
We will follow an iterative approach to analyzing the qualitative data, one that begins before
each site visit, continues while on-site, and proceeds through the drafting of internal case study
debriefing guides to cross-project analysis (see the Appendix for the draft debriefing guide). Before
we conduct the case studies, we will collect and review relevant documents (e.g., project
proposals, annual reports, websites). The formal analytic process will begin as we sketch the
outlines of each project on the basis of the documents we collect. Analysis will continue during
the case studies themselves. Two researchers will conduct each case study, and throughout the visit, they will discuss their initial impressions of key features of the ITEST project and the degree to which the emerging story matches study hypotheses (drawn
from the logic model). More formally, the researchers will meet each day of the visit to go
through the case study debriefing form and formulate preliminary responses. Researchers will
discuss with each other what they learned in their interviews and, if necessary, fill in any gaps
and examine initial hypotheses in subsequent interviews. Researchers also will discuss themes
that emerge that they may not have anticipated. Engaging in this analytic process while on-site
serves to tailor and refine data collection to capture the most important features of local
implementation. It also allows researchers to generate and test hypotheses while still in the field.
Once each visit is completed, researchers will draft case study reports. Drafting such reports
requires the researchers to reduce their field notes to descriptive prose within the structure of a
formal debriefing form. This translation of field notes to a case study report involves sorting all
the data collected in each site (interviews, observations, and document reviews) by the topic
areas that define the sections of the debriefing form (e.g., project context, project components,
implementation successes and challenges, and project effectiveness). Within each section or
major topic area, the researchers will code for information on specific subtopics. The researchers
then will use the sorted data to draft each section of the case study report. Because the
researchers will draw on information from a variety of respondents, they will use the case study
report to synthesize their findings and note apparent contradictions. As they translate their field
notes into the case study report, they will use specific examples and quotes as evidence for their
assertions. Distilling field notes into a case study report in this way serves three purposes. First,
it reduces the amount of data we must manage for further analysis. Second, it establishes a
consistent within-case analytic process across projects. Third, it anticipates the cross-project analysis by ensuring that each pair of researchers addresses the topics we expect to focus on as we
look across projects. The debriefing form that will guide the case study reports reflects the
evaluation questions and topic areas discussed in this document.
The case study reports are meant to facilitate cross-project analysis. Once the individual
reports are completed, formal cross-project analysis will begin. The goal of the analysis is to
compare, contrast, and synthesize findings and propositions from the single projects to make
statements about the sample or segments of the sample (e.g., informal or formal projects). We
will begin the cross-project analysis process with a debriefing meeting. A debriefing of this type
is an efficient means of developing themes for cross-project analyses. Individual researchers,
assigned to specific topics, then will conduct more fine-grained analyses and report back to the
larger group before we begin the process of integrating findings from across the data sources.
Integrated Analysis
The previous sections describe our analytic approaches for each of the data sources. We also
will conduct an integrated analysis that looks across the multiple data sources. The integrated
analysis will focus on three lines of inquiry critical to ITEST and to NSF’s ongoing support of
similar programs: 1) through what mechanisms, and under what circumstances, are the ITEST
projects resulting in changes in formal and informal learning opportunities, teacher outcomes,
and student dispositions and capabilities; 2) which project models are most effective in achieving
desired goals; and 3) how can the ITEST program, and similar NSF-funded programs, be made
more evaluable?
To understand through what mechanisms, and under what circumstances, ITEST projects
result in positive outcomes, we will expand on findings from the portfolio review by drawing on
the rich and highly contextualized case study data. Findings from the portfolio review will
provide program-level data on project components and outcomes. Case study data will provide
details about the particular contexts in which the projects are operating and the learning
experiences provided, and specifics on how project implementation evolves over time. By
analyzing these data together, we will be able to identify the characteristics of successful projects
across the ITEST portfolio.
To understand what project models are most effective in achieving project goals, we will use
case study data to test and elaborate the project learning models that emerge from the portfolio
review. Whereas the portfolio review will offer broad outlines of the STEM learning models
employed across the ITEST program, case study data will enable us to refine these models in
light of a nuanced understanding of on-the-ground practices. In order to refine the learning models,
we will use an inductive analytic strategy (Erickson, 1986). We will begin the task by analyzing
data from the portfolio analysis to develop working theories of potential models. We will then
test and refine these models with iterative passes through the more descriptive and explanatory
case study data. Once we are satisfied that additional passes through the data will yield no new
insights, we will apply a comparative lens across models to better understand which are most
effective and which are most problematic in achieving desired goals.
To understand how the ITEST program, and similar NSF-funded programs, can be made
more evaluable, we will scrutinize data about ITEST evaluation practices across the portfolio,
using case study data to better understand the successes and challenges projects encounter in
conducting rigorous evaluations. In this effort, we will seek to understand what kinds of
information are most useful to individual projects and to NSF, as well as how that information
may be best collected, analyzed, and communicated. Moreover, we will explore ways in which
the needs of individual projects for timely and tailored information may be balanced with NSF’s
need for program-level information that is commensurable across projects.


Taken together, the components of the integrated analysis will help NSF tailor solicitations in
ways that lead to projects with a higher likelihood of success, fund projects employing more
promising models, and rigorously evaluate the effectiveness of funded projects. As such, this
evaluation will help NSF to continue making investments likely to lead to desired outcomes.

VI. INSTITUTIONAL REVIEW BOARD
It is SRI's policy that investigators respect and protect the rights and welfare of individuals
recruited for or participating in research conducted by or under the auspices of SRI. SRI strictly
adheres to the Federal Policy for the Protection of Human Subjects, or the “Common Rule,” as
codified in separate regulations by a number of Federal departments and agencies. SRI's
involvement of human subjects in research comes under the terms of a formal assurance with the
Office for Human Research Protections of the Department of Health and Human Services. All
SRI staff or contractors who conduct, support, or review research involving human subjects must
comply with the regulations identified in that assurance, as well as applicable state and
institutional policies and standards of professional conduct and practice. SRI’s Institutional
Review Board (IRB) has the primary responsibility for the oversight of the protection of human
subjects involved in any SRI research project in accordance with such regulations.

VII. TIMELINE
Exhibit 7 details the timeline for project activities and associated deliverables.
Exhibit 7. Evaluation Timeline
[Quarterly timeline of tasks across Year 1 (fall 2010 through summer 2011), Year 2 (fall 2011 through summer 2012), and Year 3 (fall 2012 through summer 2013). Tasks: revise design; submit OMB package; analyze portfolio; convene expert panel; conduct case studies; create reports.]
F = October, November, December; W = January, February, March; Sp = April, May, June; S = July, August, September


REFERENCES
American Evaluation Association (2004). American Evaluation Association: Guiding principles
for evaluators. Retrieved from http://www.eval.org/Publications/GuidingPrinciples.asp
American Museum of Natural History (AMNH). (2010). The Science Learning City: A new
model of middle school science education. New York, NY: Author.
Appleton, J. J., Christenson, S. L., & Furlong, M. J. (2008). Student engagement with school:
critical conceptual and methodological issues of the construct. Psychology In The Schools,
45, 369–386.
Basista, B., & Mathews, S. (2002). Integrated science and mathematics professional development programs. School Science and Mathematics, 102(7), 359–370.
Brown, B., Reveles, J., & Kelly, G. (2004). Scientific literacy and discursive identity: A
theoretical framework for understanding science learning. Science Education, 89(5),
779-802.
Chinn, C. A., & Malhotra, B. A. (2002). Epistemologically authentic inquiry in schools: A
theoretical framework for evaluating inquiry tasks. Science Education, 86(2), 175–218.
Davis-Kean, P. (2007). Educating a STEM Workforce: New Strategies for U-M and the State of
Michigan. Paper presented at Educating a STEM Workforce Summit, Ann Arbor, May 21.
Erickson, F. (1986). Qualitative methods in research on teaching. In M. C. Wittrock (Ed.), The handbook of research on teaching (3rd ed., pp. 119–161). New York: Macmillan.
Girod, M. (2005). Tech Report Attitudes Measure. [White Paper]. Lawrence Hall of Science:
University of California, Berkeley.
Holsti, O. R. (1969). Content analysis for the social sciences and humanities. Reading, MA:
Addison-Wesley.
Hurtado, S., Cabrera, N. L., Lin, M. H., Arellano, L., & Espinosa, L. L. (2009). Diversifying science: Underrepresented student experiences in structured research programs. Research in Higher Education, 50(2), 189–214.
Kind, P., Jones, K., & Barmby, P. (2007). Developing Attitudes towards Science Measures. International Journal of Science Education, 29(7), 871–893.
Kozoll, R., & Osborne, M. (2004). Finding Meaning in Science: Lifeworld, Identity, and Self. Science Education, 88, 157–181.
Malcolm, S., Teich, A. H., Jesse, J. K., Campbell, L. A., Babco, E. L., & Bell, N. E. (2005). Preparing Women and Minorities for the IT Workforce: The Role of Nontraditional Educational Pathways. Washington, DC: American Association for the Advancement of Science.
Messersmith, E. E., Garrett, J. L., Davis-Kean, P. E., Malanchuk, O., & Eccles, J. S. (2008). Career Development from Adolescence through Emerging Adulthood: Insights from Information Technology Occupations. Journal of Adolescent Research, 23(2), 206–227.
National Research Council. (1996). National Science Education Standards. Washington, DC:
National Academies Press.

National Research Council. (2007). Taking science to school: Learning and teaching science in
grades K-8. Washington, DC: National Academies Press.
National Research Council. (2009). Learning science in informal environments: People, places,
and pursuits. Washington, DC: The National Academies Press.
Online Evaluation Resource Library (n.d.). Quality criteria for evaluation plans. Retrieved from
http://oerl.sri.com/plans/planscrit.html
Patton, M. Q. (2002). Qualitative Research & Evaluation Methods (3rd ed.). Thousand Oaks, CA: Sage Publications, Inc.
Pintrich, P. R., & de Groot, E. V. (1990). Motivational and self-regulated learning components of classroom academic performance. Journal of Educational Psychology, 82(1), 33–40.
Thomas, G. P., Anderson, D., & Nashon, S. (2008). Development of an instrument designed to investigate elements of students’ metacognition, self-efficacy and learning processes: The SEMLI-S. International Journal of Science Education, 30(13), 1701–1724.


APPENDIX:
DRAFT DATA COLLECTION INSTRUMENTS


Principal Investigator Interview Protocol
Personal Background
1) Please tell me about yourself.
a. What is your current role? How long have you had this position?
b. How long have you been involved with this ITEST project?
Project Background
2) Tell me a little bit about how the ITEST project came to be?
a. What was the motivation to apply for the award?
b. Was the project based on a previously funded ITEST project?
3) Tell me a little bit about the planning process?
a. Who was involved in the planning process? Probe for:
i. Key staff from your institution
ii. Key external partners
iii. Key stakeholders
b. If your project works with schools, was there a process for selecting schools to participate?
4) Is there additional funding for the project beyond the NSF ITEST funding?
a. What are those specific funding sources?
b. How have they been used?
5) What were the project’s key goals as conceived at the beginning of the project?
a. Have those goals changed since the beginning of the project?
6) Were there any other external/contextual factors that had a large influence on the design of
the project? Probe for:
a. Resources (or lack thereof) in the local community or schools
b. Local school district initiatives
c. Priorities of your institution
d. Priorities of other local institutions
Project Participants
7) Who participates in the program? Probe for:
a. Special characteristics of participants
b. Numbers of participants


8) How are participants recruited to the program?
a. Which methods are most successful (and how do you know)?
b. What do you think motivates participants to become involved?
9) Do most participants persist through the entire program?
Project Activities
10) Briefly describe what your project does.
a. Include activities and number of contact hours.
b. Is there any variation in the experience of the participants?
c. What technologies are used (or learned) by participants or by the project at large?
d. How are technologies used (e.g., as a learning tool, for data collection, for data sharing, to disseminate findings)?
11) Which strategies have been implemented well and what has facilitated implementation?
12) Which strategies have been the most difficult to implement?
a. What is being/has been done to address these challenges?
13) How has the project evolved over time?
a. What changes were made? Why?
b. Have these changes been effective?
c. What changes do you see (if any) going forward?
Evaluation and Outcomes
14) Do you have any data on the effectiveness of your project?
15) Describe your external evaluation.
16) How are the evaluation results used?
17) What have been the key short-term and long-term outcomes for the teachers involved in
your project?
a. How do you know?
b. Do outcomes vary based on group (e.g., cohort, grade level, school)?
18) What have been the key outcomes for the students involved in your project?
a. How do you know?
b. Do outcomes vary based on group (e.g., cohort, grade level, school)?
19) What have been the key contributions and benefits to project staff (e.g., knowledge,
relationships, connections and networks, etc.)?

20) What have been the key contributions and benefits to the host institution (e.g., increased
capacity, shifts in mission, etc.)?
a. What about for external partners involved in the project?
Sustainability and Scale-up
21) What are the current plans for the project when the grant period ends?
a. What resources are needed to continue the project?
b. If you plan to continue in some way, what is the plan for providing the resources to
support the project?
22) Has the project model been implemented at other institutions or in other communities?
a. If yes: What part(s) of the model was implemented? Were there any challenges in
scaling-up the model?
b. If no: What would it take to scale up and reach multiple institutions or communities?
What do you see as the barriers to scaling up this type of work?
The ITEST Community
23) To what extent does your project interact with the LRC?
a. What do you see as the most important role or roles of the LRC?
b. How would you describe your interactions with the LRC?
c. What have you gained, if anything, from the LRC?
24) To what extent do you work with, share advice, or otherwise interact with other ITEST
projects?
a. What do you interact about? How often? Where?
b. What have you gained, if anything, from these interactions?
c. Do you know of any examples where insights you’ve shared about your project have
helped to improve other projects?
25) To what extent are project staff contributing to the larger ITEST or STEM education community? Probe on:
a. Attending ITEST PI meetings
b. Participating in LRC working groups
c. Uploading instruments to the LRC
d. Publishing journal articles or presenting papers at conferences.
e. Creating products that can be of use to STEM educators


Closing Questions
26) What have been the major successes of your project? What about major challenges?
27) What are the key features and lessons learned from this project that might be of interest to
others engaged in similar work with teachers or youth?
28) What is the likelihood that participating in this project will influence the STEM academic and
career outcomes for youth participants, or students of teacher participants?


Project Partner Interview Protocol
Personal Background
1) Please tell me about yourself and your organization.
a. What is your current role? How long have you had this position?
b. Have you done work related to STEM education in the past?
c. Do you have experience working with the targeted participant group (e.g., teachers, middle school girls, etc.)?
2) When did you start working with this ITEST project (e.g., in the design phase, after it was
operational)?
a. Did you have an existing relationship with the PI or project staff? If not, how did you become involved in the project?
Project Background
3) Were you involved in the planning process? If so, please tell me a little bit about it.
4) How would you characterize your organization’s level of involvement in this project?
a. Number of staff members involved
b. Types of project roles filled by your organization’s staff.
c. Other resources contributed to the project
5) What are the key goals of this project?
a. Have these goals changed since the beginning of the project?
b. How do they relate to your organization’s goals?
6) Were there any external/contextual factors that had a large influence on the design of the
project? Probe for:
a. Resources (or lack thereof) in the local community or schools
b. Local school district initiatives
c. Priorities of your organization
d. Priorities of other local institutions
Project Participants
7) What is your role, if any, in recruiting participants?
a. Which methods are most successful (and how do you know)?
b. What do you think motivates participants to become involved?

Project Activities
8) Briefly describe the project’s interventions that you are involved with.
a. Include activities and number of contact hours.
b. Is there any variation in the experience of the participants?
c. What technologies do you use and for what purposes?
9) Which strategies have been implemented well and what has facilitated implementation?
10) Which strategies have been the most difficult to implement?
a. What is being/has been done to address these challenges?
11) How has the project evolved over time?
a. What changes were made? Why?
b. Have these changes been effective?
c. What changes do you see (if any) going forward?
Evaluation and Outcomes
12) Do you have any data on the effectiveness of the project?
13) How are the evaluation results used?
14) What have been the key short-term and long-term outcomes for the teachers involved in
your project?
a. How do you know?
b. Do outcomes vary based on group (e.g., cohort, grade level, school)?
15) What have been the key outcomes for the students involved in your project?
a. How do you know?
b. Do outcomes vary based on group (e.g., cohort, grade level, school)?
16) What have been the key benefits of this work for your organization (e.g., increased capacity,
shifts in mission, etc.)?
Sustainability and Scale-up
17) What are the current plans for your organization’s involvement in the project when the
grant period ends?
a. What resources are needed to continue the project?
b. If you plan to continue in some way, what is the plan for providing the resources
to support the project?

18) Has the project model been implemented at other institutions or in other communities?
a. If yes: What part(s) of the model was implemented? Were there any challenges in scaling up the model?
b. If no: What would it take to scale up and reach multiple institutions or
communities? What do you see as the barriers to scaling up this type of work?
The ITEST Community
19) To what extent do you think this project contributes to the STEM education community? Probe on:
a. Publishing journal articles or presenting papers at conferences.
b. Creating products that can be of use to STEM educators
20) To what extent do you work with, share advice, or otherwise interact with other ITEST
projects?
a. What do you interact about? How often? Where?
Closing Questions
21) What have been the major successes of the project? What about major challenges?
22) What are the key features and lessons learned from this project that might be of interest to
others engaged in similar work with teachers or youth?
23) What is the likelihood that participating in this project will influence the STEM academic and
career outcomes for youth participants, or students of teacher participants?

Evaluator Interview Protocol
1) Please tell me about yourself.
a. What is your current role? How long have you had this position?
2) When did you start working with the ITEST project (e.g., in the design phase, after it was
operational)?
a. What is your relationship with the project staff?
3) Tell me a little bit about the planning process for the evaluation.
a. (If involved in design phase) Was evaluation a significant concern during the
planning of the project?
b. How did you determine the focus of the evaluation?
c. What were your considerations when designing the methodology?
4) What are the purposes of the evaluation (e.g., formative, summative)?
5) Can you describe the design of your evaluation? (e.g., quasi-experimental, qualitative with control group, etc.)
a. What are your evaluation questions?
b. What types of data do you collect? Probe for:
i. Surveys
ii. Focus groups/interviews
iii. Knowledge assessments
iv. Attitude assessments
v. Standardized test scores
vi. Observation data
vii. Student records (including future enrollment)
c. Have you developed your own instruments?
i. If so, have you shared those with the ITEST community?
ii. If not, where did you get your instruments?
d. How often are data collected?
6) Have there been any challenges in collecting data? (e.g., if the project involves external partners, have they been cooperative with the evaluation; if the evaluation involves a control group, has it been easy or difficult to identify and collect data from them?)
7) Do you plan on making (or have you already made) changes to your evaluation design?
Why? Please describe the changes.
8) At this point in the project, do you have data on the outcomes of interest? What have you
found?

9) How do you share results with project staff?
a. Do you share your results more widely (e.g., web site, conferences, publications)?
10) Do you see any evidence that the project is using the evaluation results?
a. Give an example.
11) To what extent do you work with, share advice, or otherwise interact with other ITEST
project evaluators?
a. What have you gained, if anything, from these interactions?
12) To what extent do you as an evaluator interact with the LRC?
a. What is the nature of most of your interaction with the LRC?
b. What have you gained, if anything, from the LRC?

Teacher Focus Group Protocol
1) Let’s begin by going around and saying your name and your position(s) at the school.
2) How did you hear about the ITEST project?
a. What motivated you to participate?
b. Do you participate with other teachers from your department, school, or district?
c. Have you ever done anything similar to this type of project before?
3) What activities are you engaged in as part of the project?
a. How often do you attend?
4) As a result of your work with this project, have you gained specific knowledge or skills?
Probe for each of the following:
a. Use of technology
b. Curriculum or instructional plans
c. Hands-on activities for the classroom
d. Creating a community of learners
e. Content knowledge
f. Other?
5) How, if at all, have you implemented what you learned in your classrooms? Ask for
SPECIFIC EXAMPLES.
6) As a result of your work with this project, do you think differently about your future as a
STEM educator?
7) What activities have been most valuable to you in your development as a STEM teacher?
Why?
8) What activities have been least valuable to you in your development as a STEM teacher?
Why?
9) What, if any, new roles have you taken on at your school as a result of your participation in
this project?
10) Have you been able to share what you learned with colleagues who did not participate?
11) Would you participate in another similar project if you had the chance? Why?


Student Focus Group Protocol
1) Let’s begin by going around and saying your name and your grade level.
2) How did you hear about this project?
a. Why did you decide to participate?
b. Have you done anything like this before?
c. Did you know many of the other people in the program before it started?
3) What kinds of things do you do in this program?
a. What technologies or devices did you use?
b. How often do you attend?
4) What do you like most about the program?
5) What do you dislike about the program?
6) What kinds of things have you learned from this program? Get specific examples.
7) How many [science/math/technology] classes do you plan to take in high school? Do you
plan on taking any AP [science/math/technology] classes?
8) What do you want to do after high school? If they want to go to college, ask:
a. What do you want to study in college?
b. What do you want to do when you graduate college?
9) Has this program changed what you want to do in or after high school? What about after
college? If yes:
a. How has it changed?
b. Why did it change?
10) Have you learned about a career that you didn’t know existed, or didn’t know much about,
before you participated in this program?
11) Do you know how to do things on a computer or with other technology that you didn’t know how to do before you participated in this program?
12) Does participating in this program make a difference in how much you like your
[science/math/technology] class during the school day?
a. Does participating make a difference in how much (or what kind of) [science/math/technology] you plan to take in high school?
13) Would you want to do something like this again?
14) Have you done anything since participating in this project that you think is related to the
kinds of things you did with the project? Probe for:
a. Invented, designed or built something on your own.
b. Participated in a science or engineering fair or event.

ITEST Program Evaluation
Case Study Debrief Guide
Project title:
Grantee institution:
Principal investigator:
Year awarded:
Award type:
Dates of site visit:
Site visitors:
1. Summary
Provide a 1-page story of what we learned, focusing on a brief overview of the project, the
environment in which it operates, key strengths, and struggles.
2. Project Background
What factors affect the design, implementation, and impact of the ITEST project?
2a. Describe how the ITEST project came about. What was the motivation or the impetus for the project? Who were the early stakeholders/partners? Is this a new project, or were grant monies used to expand/modify an existing project? If based on an existing project, was the existing project ITEST funded?

2b. What STEM areas are addressed in the project? What are the goals of the project (e.g.,
building content knowledge, providing research experiences, serving underserved
communities)?
2c. Describe the local context and the implications for the project (e.g., Have resources or lack of resources available in the local community affected the design or implementation of the project? Have priorities of the school districts or local institutions affected the project?)

2d. How does the ITEST project cohere or conflict with other salient initiatives of the
participating institutions?
3. Project Strategies
Provide a comprehensive description of the project and its participants.
3a. Who does the project serve? Provide numbers of teacher and student participants and their characteristics.
3b. What recruitment strategies are used? Which are most successful (and how do they
know)? From participants’ perspectives, what motivates them to become involved?
3c. Describe the core strategies the project is using. What are the major programmatic elements for teachers and/or students? How often do participants meet? What is the duration of each strategy? Is there any variation in experiences of participants?
3d. What technologies are used in the project? For what purposes?
3e. What is the environment in which the project occurs and what resources are available
(e.g., technology)?
3f. Which strategies have been implemented well and what has facilitated implementation?
3g. Which strategies have been the most difficult to implement? Why? What is being/has been done to address these challenges?
3h. Have there been any changes to the project over time? What changes were made and
why?
4. Project Administration
Provide a comprehensive description of the project staff and funding mechanisms.
4a. Describe the structure of the project staffing. What organizations are project staff from and what are their various roles? Describe the relationship among staff from different organizations. What enables them to work well together, if they do? Are there any challenges to working together?
4b. Has the project leveraged funding from other sources to support the project? Describe
specific funding sources and how these additional resources have been used.
5. Project Outcomes
Provide an overview of the outcomes measured by the project for both formative and
summative purposes.
5a. Describe the data they collect, how they collect it, and what they do with it. Include formative evaluations and summative evaluations of outcomes. How useful have the evaluations been to the PI and other project staff?
5b. What short- and long-term outcomes have been identified by project staff? Are there
differences in outcomes across participants (e.g., by grade level, or length of
participation)? To what do they attribute those differences?
5c. What did the participants report getting out of the project and to what did they attribute those outcomes? Are there differences across participants?
5d. What have the PI and other project staff reported getting out of the project (e.g., increased knowledge and experience, a wider professional network)?
5e. Have there been any institutional changes due to the project (e.g., increased capacity,
refocusing of resources)? What prompted these changes?

6. Sustainability and Scale-up
What are the prospects for project sustainability and scale-up?
6a. What do project staff anticipate will come of the project when the grant runs out? (e.g., Will it be sustained as is, end, be modified, or shifted into another grant?) What would be needed to ensure sustainability? What are the barriers to sustaining the project?
6b. Has the project model been implemented at other institutions or in other
communities? What would be necessary to make scale-up possible? What are the
barriers to scale-up?
7. The ITEST Community
What has been the contribution of the ITEST community, including the LRC, to the project? How has the project contributed to the larger ITEST community?
7a. To what extent does this project connect with the LRC? With other ITEST projects?
7b. Have these connections had an effect on the design or implementation of this project?
7c. Have these connections contributed to building the capacity of the project to achieve
its overarching goals?
7d. To what extent are project staff contributing to the larger ITEST community (e.g.,
attending PI meetings, participating in LRC working groups, uploading instruments to
the LRC)?
8. Other Interesting Points
Are there any other issues that are important for our understanding of the implementation or
outcomes of this project?
