Data Collection for the Innovative Technology Experiences for Students and Teachers Program Evaluation (ITEST)

OMB: 3145-0222

Request for Clearance: National Science Foundation, Directorate for Education and Human Resources, Division of Research on Learning in Formal and Informal Settings

Data Collection for the Innovative Technology Experiences for Students and Teachers Program Evaluation (ITEST)

SUPPORTING STATEMENT A
Introduction
This request for Office of Management and Budget (OMB) clearance asks for a 3-year
clearance for new data collection for the Innovative Technology Experiences for Students and
Teachers (ITEST) Program Evaluation, which is administered by the National Science
Foundation (NSF)’s Directorate for Education and Human Resources (EHR), Division of
Research on Learning in Formal and Informal Settings (DRL).
NSF’s ITEST program aims to strengthen the formal and informal learning experiences of
K-12 students in order to cultivate their interests in and capacities for science, technology,
engineering and mathematics (STEM) careers, with a special emphasis on technology. It also
seeks to improve the leadership and knowledge of the community that conducts outreach for
STEM learning. Since 2003, ITEST has funded nearly 200 projects to implement, study, and
scale up effective strategies to motivate and equip young students to pursue STEM careers. Yet
evidence about the program’s overall impact is thin, and data are lacking on more and less
effective project features—what works and what does not.
To address the lack of evaluative program data, NSF has contracted with SRI International
and Inverness Research to perform an evaluation of the ITEST program. The evaluation will
consist of two distinct components. The first component is a comprehensive review of the
existing project files – referred to as eJackets – collected by NSF. The second component of the
evaluation consists of case studies of 24 ITEST projects. The case studies will be based on face-to-face interviews with respondents at ITEST project sites or telephone interviews with off-site
respondents. The two data sources – the portfolio review and case study – will then inform an
integrated analysis in an effort to address the research questions.
This request for clearance is for the case study component of the study. This introduction
provides details on elements not specifically included in this data collection request in order to
provide context on the role of the case studies within the broader program evaluation. Specific
details of the case study portion of the study are included in the body of Supporting Statements A
and B.

Logic Model
The ITEST program invests in projects that vary widely by type (i.e., research, scale-up, and
strategies), context (e.g., urban, rural), content focus (e.g., bioscience, engineering, computer
science), setting (e.g., informal, formal), and target population (e.g., teachers, students), among
others. Despite these variations, all ITEST projects share a common goal: to expand the STEM
workforce by increasing students’ capacity for and interest in STEM studies. We begin this
section with an overview of the theory of change that specifies how ITEST seeks to attain this
goal. We then use the logic model to specify a set of overarching research questions. Note that
the logic model represents the program and not the evaluation. Later in the document we specify
the aspects of the logic model that the evaluation will be measuring.
Given the complex and multifaceted nature of ITEST, we have developed a logic model to
aid in decisions about key aspects of the program that merit research and evaluation. The model
is a systematic way to present the relationships among the resources available to create and
deliver the program, the activities the program offers, and the anticipated changes or results. It
identifies the ultimate outcomes the program seeks to achieve and the assumptions regarding
how the program is hypothesized to contribute to those outcomes. The logic model also identifies
intermediate outcomes.
Exhibit 1 portrays the logic underlying the ITEST program, beginning with the program’s
goal of increasing participation in the STEM workforce at the far right bottom of the graphic.
Increasing workforce participation generally requires that students successfully attain STEM
degrees. We use this term to refer to graduate and undergraduate degrees in a STEM discipline
(whether through traditional or nontraditional pathways) (Malcolm et al., 2005) as well as to
technical degrees requiring fewer than 4 years of postsecondary schooling. Successful attainment
of such degrees generally requires students to leave secondary school with the dispositions and
capacities for future STEM study.
Our review of the research points to six central dispositional constructs. Students must have
an interest in STEM learning and STEM careers (Girod, 2005; Appleton et al., 2008). Students
must possess a self-identity as science learners (Girod, 2005; Kozoll & Osborne, 2004) as well as
a sense of self-efficacy (Pintrich & DeGroot, 1990; Thomas, Anderson, & Nashon, 2008; Kind,
Jones, & Barmby, 2007). They must have a favorable disposition toward their future
participation in science (Kind, Jones, & Barmby, 2007). They must be aware of the value of
science in society (Kind, Jones, & Barmby, 2007; Davis-Kean, 2007; Messersmith et al., 2008).
Finally, they must believe that they have knowledge of STEM careers (Hurtado et al., 2009).
In terms of student capacities for STEM learning, we rely on the National Research Council
consensus documents on learning in formal and informal settings (NRC, 2007; 2009). Learners
practice and develop scientific capacities as they (1) “come to generate, understand, remember,
and use concepts, explanations, arguments, models, and facts related to science;”
(2) “manipulate, test, explore, predict, question, observe, and make sense of the natural and
physical world”; and (3) “participate in scientific activities and learning practices with others,
using scientific language and tools” (NRC, 2009, p. 4).

For students to develop these dispositions and capacities, they must have access to productive
learning opportunities. Students must have opportunities to engage with and develop their
fluencies with scientific concepts, explanations, arguments, and models (Chinn & Malhotra,
2001) as well as with scientific practices of reasoning, metacognition, data analysis, and mastery
of the use of scientific language (Brown, Reveles, & Kelly, 2004). In our own work, we have
defined productive experiences as those comprising “rigorous investigation informed by the
scientific process that results in new content knowledge and knowledge of how to learn, relevant
to students’ lives, and providing the opportunity for inspiration that translates into motivation”
(American Museum of Natural History, 2010, p. 11).
For teachers to create active, STEM-rich environments, they need opportunities to learn
deeper science content and to participate in scientific inquiry themselves (Basista & Mathews,
2002). Many ITEST projects focus on creating just such productive learning opportunities for
teachers. The overarching goals are to build teachers’ capacity to develop coherent
learning plans, to guide student learning through authentic experiences, to assess their teaching
and student learning on an ongoing basis, to design and manage authentic learning environments,
and to develop a community of learners in their classrooms (see NRC, 1996).
Most ITEST projects provide direct service—meaningful learning opportunities—for youth
and their teachers. Scale-up projects implement models in a large-scale setting such as across a
state, region, or the nation. Other projects involve improving the knowledge base on effective
STEM experiences by conducting research or convening researchers and practitioners. Through
its convenings, communication strategies, and the work of the Learning Resource Center (LRC),
ITEST also seeks to build a professional community that shares lessons learned and works in
concert to build the nation’s STEM workforce. As the professional community develops and
more teachers are trained, the capacity of the system as a whole will expand to foster better
learning opportunities even in the absence of ITEST.

Exhibit 1. Logic Model for the ITEST Program Evaluation

Overview of the Evaluation
The model suggests three main research questions, as shown in Exhibit 2.
Exhibit 2. Research Questions

1. What are the projects’ impacts? What are the achieved outcomes in key areas for students and teachers in ITEST projects? Do youth who participate in ITEST projects demonstrate greater interest in STEM activities and careers than nonparticipants? To what extent are project evaluations rigorous?

2. What project models are most effective in delivering desired student and teacher outcomes? What project characteristics contribute to these models’ success?

3. How can we best characterize and describe ITEST projects? What do the projects do? Who do the projects serve? Where and when are ITEST project activities taking place?

We have designed a comprehensive, multimethod evaluation to address these research
questions. The design incorporates a portfolio review and case studies of ITEST projects and
outcomes. Exhibit 3 indicates the data sources that we will use to address each research question.
OMB clearance is not required for the portfolio review since there is no original data collection,
but a description of this method is included in this introduction given its critical role in
addressing the research questions.

Exhibit 3. Research Questions and Data Sources

[Matrix indicating, for each research question and sub-question in Exhibit 2, which data sources address it. Data sources are grouped under Portfolio Review (program description, project descriptions, project evaluations) and Case Studies (project background, project strategies, project administration, project outcomes).]

Portfolio Review
To create a comprehensive description of the ITEST portfolio, we will use the logic model
presented in Exhibit 1 as the framework for describing the ITEST projects and their evaluations.
We will use this framework to describe what the ITEST projects do and in what content areas,
who the projects serve, where and when the ITEST projects take place, and how they have
changed over time. We also will characterize the intended and achieved outcomes in key areas
for students and teachers and assess the level of rigor of the project evaluations. The review will include three components: (1) a description of the ITEST program that details program intent and changes over time and offers a contextual lens through which to understand individual projects; (2) descriptions of ITEST projects that characterize the projects’ important features and identify project “models”; and (3) a review of project evaluations that characterizes them and assesses their quality. Our approach for each of the three components is
described in detail below.
Program Description
The first component of the portfolio review is a description of the ITEST program as a
whole, including its evolution over time. This description will be derived from a review of
ITEST program solicitations, interviews with NSF ITEST program officers, and analyses of the
LRC. A portion of this review has already been completed and has informed changes to the study
design.
Examine ITEST solicitations. From 2003 through 2011, NSF funded nine consecutive cohorts of ITEST projects. While the overall structure and purpose of the ITEST program have remained intact, the program solicitations have contained changes that affected the types of projects funded, the populations served by ITEST projects, and the project evaluation requirements. Because the solicitations have changed over time, it is necessary to understand and describe these changes in order to provide a comprehensive picture of the ITEST program. To do so, we reviewed all ITEST program solicitations released between 2003 and 2009, and for each solicitation we identified:

• Stated goals for awarded projects;

• Desired or required characteristics of ITEST projects (i.e., populations and activities that the solicitation recommends or requires);

• Requirements or recommendations for project evaluation methods.
We then compared and contrasted the program solicitations in order to describe changes in
the ITEST program over time. As our program evaluation continues, we will review the more
recent solicitations.
Interview program officers. In order to get a broad perspective on the ITEST program, we
interviewed five ITEST program officers who have been involved with the program since its
inception. In our interviews, we asked program officers to provide information about the
background of the ITEST program, the changes they have perceived since the program began,
and how they would characterize the ITEST projects. We also solicited their input with regard to
perceived gaps in the ITEST portfolio, important outcomes from ITEST projects, and examples
of good formative and summative evaluations. The program officers provided us with a wealth
of descriptive information, and we will use this information in our general description of the
ITEST program, as well as our review of how the program and projects have changed over time.
Characterize the role of the Learning Resource Center (LRC). Funded by NSF in
conjunction with the first cohort of ITEST projects, the LRC is charged with providing technical
assistance and support to ITEST projects. As shown in the logic model, the LRC plays an
important role in the ITEST community. In this role, the LRC disseminates information about
ITEST projects, convenes annual meetings of ITEST principal investigators (PIs), and strives to
build community among ITEST grantees. In order to describe the role of the LRC within the
ITEST community, we will review the LRC website, examine external evaluations of the LRC
(included in the NSF eJacket system), and interview LRC staff.
Analysis. Data will be analyzed in order to draw a general picture of the ITEST program,
including NSF’s intent and the program’s evolution over time. As such, we will take a qualitative
analytic approach that draws on the principles of content analysis (e.g., Holsti, 1969) to identify
broad themes across the data and track changes over time. In particular, we will detail the
evolution of program goals, desired or required project characteristics, and evaluation
expectations. This approach will offer an important contextual background to guide direct data
collection and to ascertain whether program intent was realized.
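
To make the change-tracking step concrete, the sketch below shows one simple way coded solicitation features could be compared year over year; it is illustrative only, and the years, codes, and feature labels are hypothetical placeholders rather than actual findings.

# Illustrative sketch only: comparing hypothetical coded features across solicitation years.
solicitations = {
    2003: {"youth focus", "IT emphasis", "external evaluation required"},
    2006: {"youth focus", "teacher professional development", "external evaluation required"},
    2009: {"youth focus", "teacher professional development", "evaluation rigor guidance"},
}

years = sorted(solicitations)
for earlier, later in zip(years, years[1:]):
    added = solicitations[later] - solicitations[earlier]
    dropped = solicitations[earlier] - solicitations[later]
    print(f"{earlier} -> {later}: added {sorted(added)}; dropped {sorted(dropped)}")

A comparison of this kind simply surfaces which coded features appear or disappear between solicitations; the substantive interpretation of those changes remains a qualitative task.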
Project Descriptions
The second component of the portfolio review is a description of the entirety of projects
funded through the ITEST program. This review will include all project types: strategies, scale-ups, and research projects. To describe the portfolio of projects, we will gather and review data
from the Management Information System (MIS) maintained by the LRC and the NSF eJackets
in order to describe project goals, activities, partnerships, populations served, and dissemination
strategies. We will then use these data from the MIS and eJackets to identify project models.
Throughout the portfolio analysis task, we will consider where we may add the most value to
existing information collected and presented by the LRC given our access to the eJacket system.
Gather and review data from the MIS. Over the past 8 years, the LRC has collected a great
deal of descriptive information on ITEST projects through its project profiles and annual
questionnaire of project PIs (starting in 2009). The project profiles contain short descriptions of
ITEST projects in all eight cohorts, which describe project goals and activities. They also contain
basic descriptive data, which can be used to categorize ITEST projects across an array of
dimensions such as project type (e.g., research, scale-up, strategies), populations served (e.g.,
students, teachers), urbanicity (e.g., urban, rural, suburban) and area of focus (e.g., engineering,
computer science, geography). The annual survey of project PIs delves deeper into who
participates in ITEST projects, how often they participate, and in what kind of activities they
participate. The LRC maintains these data in its MIS. So as not to duplicate data collection
efforts, we requested and received access to these data and have begun preliminary analyses to
characterize and categorize ITEST projects.
Gather and review data from eJackets. NSF’s eJackets are electronic file folders that
contain project documents such as project proposals, annual and final reports (including
evaluation reports), and proposal funding justifications. We have begun to review each project’s
eJacket in order to supplement the descriptive data from the MIS, identify project partners and
participating organizations, and identify dissemination strategies.
Analysis. The final step of the analysis will be model development, for which we will
identify a set of common project models. This process involves identifying a set of project
characteristics that “cluster” together. An example of a model might be urban-based projects
involving partnerships between a school system and museums or other science-rich institutions
to provide after-school experiences to K-12 students. The project models will be identified
inductively from the data in the MIS and eJackets. Thus, the degree to which we will be able to
create a set of unique models across settings and cohorts will depend on the empirical data.
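
To make the notion of characteristics that “cluster” together concrete, the illustrative sketch below shows one way descriptive project features could be grouped computationally. It is not part of the evaluation plan; the feature names and values are hypothetical placeholders, and the actual models will be identified through the inductive process described above.

# Illustrative sketch only: grouping hypothetical project descriptors.
import pandas as pd
from sklearn.cluster import KMeans

projects = pd.DataFrame({
    "setting":    ["urban", "urban", "rural", "rural"],
    "partner":    ["museum", "museum", "university", "university"],
    "format":     ["after-school", "after-school", "summer", "summer"],
    "population": ["students", "students", "teachers", "students"],
})

# One-hot encode the categorical descriptors so that projects with similar
# characteristics sit close together in feature space.
features = pd.get_dummies(projects)

# Group projects whose characteristics cluster together; the number of
# candidate models would be chosen empirically, not fixed in advance.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(features)
projects["candidate_model"] = kmeans.labels_
print(projects)

In this toy example, the two urban, museum-partnered, after-school projects form one candidate cluster and the two rural, university-partnered summer projects form the other; the actual analysis would draw on the full set of MIS and eJacket descriptors.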
Project Evaluations
The third component of the portfolio review is an analysis of the project evaluations. The
goal of the evaluation review is to broadly characterize the types of evaluations conducted by
ITEST project evaluators, assess the quality and rigor of these evaluations, and document the
achieved outcomes in key areas for students and teachers. Data on project evaluations will be
extracted from the eJacket system. For completed projects, we will review the project
evaluations included in the final reports. For on-going projects, we will review the evaluation
plans described in the proposals and the evaluation reports included in the annual reports.
Characterizing the Evaluations. While the ITEST solicitations require that ITEST project
PIs conduct an evaluation of their ITEST project, there are few requirements that guide the type
of evaluation to be conducted. As such, project evaluations employ disparate evaluation
methodologies and serve a wide array of purposes. Given this diversity, we will begin our
evaluation of each project evaluation by identifying the purpose of the evaluation (formative or
summative) and characterizing the evaluation methodologies (e.g., experimental, quasi-experimental, one group pre/post, qualitative) employed in the project evaluations. Next, using
the evaluation plans and evaluation reports, we will identify the outcomes of interest for students
and/or teachers in each project evaluation. We will then categorize the outcomes and present a
summary of the types of outcomes targeted by the evaluations, as well as the relative success of
projects in achieving the outcomes across all ITEST projects. Finally, we will assess the quality
and rigor of the ITEST project evaluations. We will draw on the criteria used by the NSF-funded
Online Evaluation Resource Library (OERL), which defines quality criteria for sound project
evaluation plans, instruments, and reports, based on best-practice guidance from the Joint Committee on Standards for Educational Evaluation and the American Evaluation Association’s
Guiding Principles for Evaluators. The following questions, based on the OERL, will guide our
review of each ITEST project evaluation (note that these criteria apply to projects that have served youth or educators, not research projects):
1. Does the evaluation reflect an understanding of the project logic model, including
elements such as the goals/objectives, short-term and long-term outcomes, activities,
stakeholders, and context?
2. Does the evaluation make clear what questions are to be answered?

3. What is the evaluation design? Is the evaluation design in line with the goals of the
evaluation?
4. What is the sampling frame for the evaluation? What types of data are collected? How
are the data collected? When are the data collected?
5. What instruments are used to collect data? What is the quality of the data collection
instruments?
6. How are the data analyzed? Are the analysis methods appropriate, given the research
questions and purpose of the evaluation? Are the interpretations of the results supported
by the data?
7. What conclusions and recommendations were reported? Were the recommendations and
conclusions backed by the data?
We will summarize answers to these questions to provide a description of the level of quality
and rigor of project evaluations across all ITEST projects.
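
As a purely illustrative aid, the short sketch below shows one way ratings against review questions of this kind could be tallied across projects; the project names, criterion keys, and rating labels are hypothetical placeholders, not part of the evaluation plan.

# Illustrative sketch only: tallying hypothetical rubric ratings across project evaluations.
from collections import Counter

reviews = {
    "Project A": {"reflects_logic_model": "met", "clear_questions": "met", "design_aligned": "partially met"},
    "Project B": {"reflects_logic_model": "not met", "clear_questions": "met", "design_aligned": "met"},
    "Project C": {"reflects_logic_model": "met", "clear_questions": "partially met", "design_aligned": "met"},
}

# Count how often each rating appears for each criterion across all projects.
summary = {}
for ratings in reviews.values():
    for criterion, rating in ratings.items():
        summary.setdefault(criterion, Counter())[rating] += 1

for criterion, counts in summary.items():
    print(criterion, dict(counts))

A tally of this kind would feed the descriptive summary of evaluation quality and rigor, while the judgments behind each rating remain qualitative.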
Analysis. Similar to the analysis of the project descriptions, the first step of the evaluation
analysis will focus on descriptive statistics. This step will provide a descriptive summary of the
range of evaluation purposes, methodologies, and outcomes assessed. The second step of the
analysis will apply the evaluative criteria discussed above to assess the rigor of each evaluation
and the value of the information it offers for the individual project and for the program overall.
This evaluative lens also will enable us to identify exemplary evaluation practices and key
challenges that projects encounter in conducting rigorous evaluations. As discussed in the “Integrated Analysis” section below, these findings will be considered together with case study data to offer insight into NSF’s ongoing efforts to make the programs it funds more evaluable.

Case Studies
Over the course of the study, we will conduct case studies of 24 ITEST projects to
understand how the projects function, what characteristics contribute to a model’s success, and
the impact of the projects on participants. We discuss in detail our approach to the 24 case
studies in the body of the OMB package since they are the topic of this request for clearance.

Integrated Analysis
Once the portfolio review and case study analysis are completed, we will conduct an
integrated analysis that looks across the multiple data sources. The integrated analysis will focus
on three lines of inquiry critical to ITEST and to NSF’s ongoing support of similar programs:
1) through what mechanisms, and under what circumstances, are the ITEST projects resulting in
changes in formal and informal learning opportunities, teacher outcomes, and student
dispositions and capabilities; 2) which project models are most effective in achieving desired
goals; and 3) how can the ITEST program, and similar NSF-funded programs, be made more
evaluable?
To understand through what mechanisms, and under what circumstances, ITEST projects
result in positive outcomes, we will expand on findings from the portfolio review by drawing on
the rich and highly contextualized case study data. Findings from the portfolio review will
provide program-level data on project components and outcomes. Case study data will provide
details about the particular contexts in which the projects are operating and the learning
experiences provided, and specifics on how project implementation evolves over time. By
analyzing these data together, we will be able to identify the characteristics of successful projects
across the ITEST portfolio.
To understand what project models are most effective in achieving project goals, we will use
case study data to test and elaborate the project learning models that emerge from the portfolio
review. Whereas the portfolio review will offer broad outlines of the STEM learning models
employed across the ITEST program, case study data will enable us to refine these models in
light of a nuanced understanding of on-the-ground practices. In order to refine the learning models,
we will use an inductive analytic strategy (Erickson, 1986). We will begin the task by analyzing
data from the portfolio analysis to develop working theories of potential models. We will then
test and refine these models with iterative passes through the more descriptive and explanatory
case study data. Once we are satisfied that additional passes through the data will yield no new
insights, we will apply a comparative lens across models to better understand which are most
effective and which are most problematic in achieving desired goals.
To understand how the ITEST program, and similar NSF-funded programs, can be made
more evaluable, we will scrutinize data about ITEST evaluation practices across the portfolio,
using case study data to better understand the successes and challenges projects encounter in
conducting rigorous evaluations. In this effort, we will seek to understand what kinds of
information are most useful to individual projects and to NSF, as well as how that information
may be best collected, analyzed, and communicated. Moreover, we will explore ways in which
the needs of individual projects for timely and tailored information may be balanced with NSF’s
need for program-level information that is commensurable across projects.
Taken together, the components of the integrated analysis will help NSF tailor solicitations in
ways that lead to projects with a higher likelihood of success, fund projects employing more
promising models, and rigorously evaluate the effectiveness of funded projects. As such, this
evaluation will help NSF to continue making investments likely to lead to desired outcomes.

A. Justification
A.1. Circumstances Requiring the Collection of Data
NSF’s Innovative Technology Experiences for Students and Teachers (ITEST) program aims
to strengthen the formal and informal learning experiences of K-12 students to cultivate their
interests in and capacities for science, technology, engineering and mathematics (STEM) careers,
with a special emphasis on technology. It also seeks to improve the leadership and knowledge of
the community that conducts outreach for STEM learning. Since 2003, ITEST has funded nearly
150 projects to implement, study, and scale up effective strategies to motivate and equip young
students to pursue STEM careers. Yet evidence about the program’s overall impact is thin, and
data are lacking on more and less effective project features—what works and what does not.
To address the lack of evaluative program data, NSF has contracted with SRI International
and Inverness Research to perform an evaluation of the ITEST program. The evaluation will
consist of two distinct segments as described in the introduction. The first segment is a
comprehensive review of the entire portfolio of ITEST projects that does not require original
data collection. The review will be conducted using existing electronic files, known as eJackets,
maintained by NSF. The second segment of the evaluation consists of case studies of 24 ITEST
projects based on three-day site visits.

A.2. Purposes and Uses of the Data
The overall purpose of the data collection is program evaluation. The data obtained will be used to document the effectiveness and outcomes of the ITEST program and to assess achievement of program goals. Documenting the short- and long-term impacts of
the ITEST program will inform future program policy decisions and contribute to the wider NSF
discussion on the future of science, technology, engineering, and mathematics (STEM)
education.
Specifically, the evaluation of the ITEST program is designed to answer the research
questions shown in Exhibit 2 above. The findings from the integrated analysis also will help NSF
tailor solicitations in ways that lead to projects with a higher likelihood of success, fund projects
employing more promising models, and rigorously evaluate the effectiveness of funded projects.
As such, this evaluation will help NSF to continue making investments likely to lead to desired
outcomes.

A.3. Use of Information Technology to Reduce Burden
This collection will involve face-to-face and telephone interviews during site visits to
institutions with ITEST projects. While speaking with individual respondents in person is preferred, telephone interviews will be scheduled when doing so would reduce burden on the respondent.

A.4. Efforts to Identify Duplication
The eJacket system at NSF and the management information system (MIS) database at the
ITEST Learning Resource Center (LRC) contain data on all ITEST-funded projects. All
previously collected data have been gathered and catalogued by the research team as part of the portfolio review task; as a result, the case studies will collect data only in areas not already covered. Site visitors will be trained on the existing data sources and will
tailor protocols to ensure maximum efficiency while on site.

A.5. Efforts to Minimize Burden on Small Businesses or Other Entities
It is unlikely that this program evaluation will have an impact on small businesses. Site visits
will include speaking with ITEST grantee partners, who may represent large companies, small
businesses, K-12 school districts, higher education institutions, government offices, non-profits,
informal institutions, and professional membership organizations. Partners will be asked
questions about their ITEST project, how it is being implemented, and the extent to which
various organizations and stakeholders have been involved in and affected by the project. If
the program ultimately succeeds in increasing youth interest in STEM careers, many small
businesses may benefit in the longer term.

A.6. Consequences of Not Collecting the Information
If the information is not collected, NSF will not be able to document the effectiveness and
outcomes of the ITEST program. Moreover, it will not be able to assess the degree to which the
program is meeting its goals. This lack of information may hamper program management and
monitoring capabilities. In addition, NSF will be unable to comply fully with the Congressional
mandate that NSF evaluate its science, technology, engineering, and mathematics (STEM)
education programs.

A.7. Special Circumstances Justifying Inconsistencies with Guidelines in 5
CFR 1320.6
The data collection will comply with 5 CFR 1320.6.

A.8. Consultation Outside the Agency
One notice has been published to solicit comments from the public. The notice was published
in the Federal Register on January 24, 2011 (Volume 76, Number 15, pages 4137–38). A copy of
the text of the notice is included in Appendix B. No substantive public comments were received
in response to the notice.
The evaluation design was developed in consultation with NSF staff in the Directorate for
Education and Human Resources (EHR) and the Division of Research on Learning (DRL)
through which the evaluation of the ITEST program is funded, LRC staff, and a panel of
consultative experts selected for their experience and content knowledge on evaluation design
and STEM. The panel of consultative experts is shown in Exhibit 4.
Exhibit 4: Panel of Consultative Experts

Dr. Melvin Mark – Department Head, Department of Psychology, Pennsylvania State University
Mr. Jason Lee – Executive Director, Detroit Area Pre-College Engineering Partnership
Dr. Nichole Pinkard – Visiting Associate Professor of Interactive Media, School of Computing, DePaul University
Dr. Karen Peterman – Independent evaluation consultant; former external evaluator of multiple ITEST projects
Dr. Gerald Knezek – Professor of Learning Technologies and Director of the Institute for the Integration of Technology into Teaching & Learning, University of North Texas

A.9. Payments or Gifts to Respondents
No payments or gifts will be provided to participants in any data collection activities.

A.10. Assurance of Confidentiality
Interviewees will be advised that any information on specific individuals will be maintained
in accordance with the Privacy Act of 1974. The data that are collected will be available only to NSF officials and staff and to the evaluation contractor. The data will be processed according to
Federal and State privacy statutes. Detailed procedures for making information available to
various categories of users are specified in the Education and Training System of Records (63
Fed. Reg. 264, 272 January 5, 1998). That system limits access to personally identifiable
information to authorized users. The data will be used in accordance with criteria established by
NSF for monitoring research and education grants and in response to Public Law 99-383 and 42
USC 1885c. The information requested may be disclosed to qualified researchers and contractors
in order to coordinate programs and to a Federal agency, court or party in a court, or Federal
administrative proceeding, if the government is a party.
Participants in the case studies will be assured that the information they provide will not be
released in any form that identifies them as individuals except as may be required by law.
Evaluation findings about the ITEST projects will be reported in aggregate form in all reports.
The contractor, SRI International, has extensive experience in collecting information and
maintaining the confidentiality, security, and integrity of data.
The following standards and procedures will safeguard the privacy of interviewees and the
security of the data that are collected, processed, stored, and reported.
• Project team members will be educated about the Privacy Act of 1974, the need to assure study participants of the confidentiality of their responses, and the ways data and other sensitive materials are to be handled. They will be cautioned not to discuss interview results with others outside the evaluation. Within the evaluation team, discussions will be restricted to the essential needs of a particular set of case studies.

• All individuals will be informed that their participation in the ITEST evaluation study is voluntary and that, if they are willing to participate, their privacy will be assured, except as may be required by law. Participants will also be informed of the purposes of the data collection and the potential uses of the data collected.

• Prospective interviewees will be given a Consent Form (Appendices C and D) that includes the same assurance of confidentiality, as well as the purposes of the study, potential risks and discomforts, and benefits of participation.

• Personal information (names, addresses, phone numbers, email addresses) will be collected solely for the purpose of identifying and contacting study participants, and will not be distributed outside the evaluation team, except as required by law.

• All recordings of interviews, interview notes, and other project-related documents will be stored on secure servers that are accessible only to authorized staff members. Access to response databases, as well as to other electronic and hard-copy materials used to record collected data, will be limited to Patrick Shields (PI) and only those researchers who are granted access by the PI.

• All interview results recorded on paper containing identifiable data will be shredded as soon as the need for the hard copies no longer exists.

• All basic computer files will be duplicated on backup servers to allow files to be restored in the event of unrecoverable loss of the original data. These backup files will be stored under secure conditions in an area separate from the location of the original data.

• Reports to NSF will include participants’ responses only in aggregate form. Responses will not be associated with any specific institution or individual. No information that could be used to identify individuals or their institution will be revealed to anyone outside the study team, except as may be required by law. The primary analysis for case studies will be a cross-case analysis; individual projects will be described only to support cross-case themes or as exemplars.

A.11. Questions of a Sensitive Nature
There are no questions of a sensitive nature in the data collection. All respondents will be
informed that providing the requested information is voluntary. Respondents may choose not to
provide information that they feel is privileged.

A.12 Estimates of Response Burden
In this clearance request, the evaluation study relies on interviews with ITEST Principal
Investigators and Co-PIs, project staff, project partners, and evaluators, and focus groups with
teachers, students, and parents. The interview and focus group protocols used in this data
collection appear in Appendix A. This section provides estimates for the response burden of the
case studies.
We are seeking OMB approval for seven different types of interview protocols contained in
Appendix A. (There are eight types of respondents, but the PI and Co-PIs share a protocol; thus, there are only seven protocols.) For all respondent groups, except for the Principal Investigator (PI), burden
consists of the time spent being interviewed at their sites. Interviews are expected to last no
longer than one hour. Principal Investigators will spend an additional 3 hours beyond the
interview time, working with their staff on: compiling lists of faculty, staff, students, and
partners to be interviewed; helping to arrange interviews; gathering documents; and meeting with
the site visitors. Respondents will not incur any equipment, postage, or travel costs. Exhibit 5
provides the estimated burden by each respondent type.


Exhibit 5: Burden Hours by Respondent Type

Respondent Type            Respondents per Site*   Total Respondents*   Burden Hours per Respondent              Total Burden Hours
Principal Investigators    1                       24                   4 (includes 3 hours preparation time)    96
Co-PIs                     2                       48                   1                                        48
Project Staff              2                       48                   1                                        48
Evaluators                 1                       24                   1                                        24
Project Partners           3                       72                   1                                        72
Parents                    5                       120                  1                                        120
Teachers                   12                      288                  1                                        288
Students                   12                      288                  1                                        288
TOTAL, All Interviewees    38                      912                                                           984

*The number of respondents is an estimate of the maximum burden amount, not an absolute value.

For the entire duration of this data collection activity, the total number of respondents is
estimated to be 912, with a total burden of 984 hours. This burden estimate represents a
maximum possible amount. The actual burden is likely to be somewhat smaller, depending on
the types of projects visited.
The table below gives the overall cost, based on labor burden, for all respondents and also the
cost for each type of respondent. The total cost for all interviews is estimated to be $22,178.64.
The cost for each type of respondent is calculated by multiplying the total annual burden hours
by their average hourly rate. Exhibit 6 displays the calculation by respondent type.

Exhibit 6: Cost to Respondents for Burden Hours, by Respondent Type

Respondent Type            Number of Respondents   Burden Hours per Respondent   Total Burden Hours   Average Hourly Rate*   Estimated Total Costs
Principal Investigators    24                      4                             96                   $36.09                 $3,464.64
Co-PIs                     48                      1                             48                   $36.09                 $1,732.32
Project Staff              48                      1                             48                   $36.09                 $1,732.32
Evaluators                 24                      1                             24                   $36.09                 $866.16
Project Partners           72                      1                             72                   $36.09                 $2,598.48
Teachers                   288                     1                             288                  $24.09                 $6,937.92
Students                   288                     1                             288                  $7.25                  $2,088.00
Parents                    120                     1                             120                  $22.99                 $2,758.80
TOTAL, All Interviewees    912                                                   984                                         $22,178.64

*The estimated hourly rate for PIs, Co-PIs, evaluators, project partners, and project staff is based on national median salaries for associate professors in computer and information sciences, education, engineering, engineering technologies, and mathematics and statistics. The average median salary of these five job titles combined is $75,062. Divided by the 2,080 hours in a standard work year, this calculates to an average hourly rate of $36.09. The source of this information is the 2010/2011 National Faculty Salary Survey, conducted by the College and University Professional Association for Human Resources (CUPA-HR), www.higheredjobs.com/salary. The rate for teachers is based on national median salaries for the job titles of elementary school, middle school, and high school teacher. The source of this information is the Department of Labor Educational Services 2010–2011 Edition, earnings as of May 2008, which can be found at http://stats.bls.gov/oco/cg/cgs034.htm#earnings. The rate for parents is based on average hourly earnings in June 2011 for all employees as reported by the U.S. Bureau of Labor Statistics. The source of this information can be found at http://stats.bls.gov/bls/newsrels.htm#OCWC. The hourly rate for students is based on minimum wage information effective July 24, 2011, obtained from the U.S. Department of Labor at http://www.dol.gov/dol/topic/wages/minimumwage.htm.
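
As a check on the arithmetic behind Exhibits 5 and 6, the short sketch below recomputes the totals by multiplying each respondent type's total burden hours by its average hourly rate; the figures are taken directly from the exhibits, and the code is included for illustration only.

# Recomputing the Exhibit 6 totals from the figures shown above.
burden = {
    # respondent type: (total burden hours, average hourly rate in dollars)
    "Principal Investigators": (96, 36.09),   # $36.09 = $75,062 / 2,080 hours (see note to Exhibit 6)
    "Co-PIs": (48, 36.09),
    "Project Staff": (48, 36.09),
    "Evaluators": (24, 36.09),
    "Project Partners": (72, 36.09),
    "Teachers": (288, 24.09),
    "Students": (288, 7.25),
    "Parents": (120, 22.99),
}

total_hours = sum(hours for hours, _ in burden.values())
total_cost = sum(hours * rate for hours, rate in burden.values())

print(f"Total burden hours: {total_hours}")          # 984
print(f"Estimated total cost: ${total_cost:,.2f}")   # $22,178.64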

A.13 Estimate of Annualized Capital and Maintenance Costs to Respondents
There are no respondent costs associated with these data collections beyond those included in
the estimates presented in Section A.12.

A.14 Estimates of Annualized Costs to the Federal Government
The estimated total cost to the Federal government of all data collection, analysis, and
reporting activities associated with the ITEST program evaluation is $2,113,534. The average
annual cost to the Federal government is estimated at $704,511. This estimate includes costs
already invoiced, plus budgeted future costs that will be charged to the government for the
portfolio review, data collection, analysis, and reporting.
The ITEST contract period covers three years, from FY 2010 to FY 2013. The portfolio
analysis task will be conducted in 2011, 2012, and 2013. The case studies will be conducted in
2012 and 2013. The final report will be delivered in September 2013.

A.15 Changes in Burden
There are no changes in burden as this is the initial request for clearance.

A.16 Schedule and Plans for Data Collection and Reports
The evaluation of the ITEST program is being conducted over the course of three fiscal
years, FY 2010 through FY 2013. Work on this evaluation began in late 2010 with a review of
existing ITEST project reports and project evaluations and development of a logic model and
evaluation plan, in consultation with ITEST program officers and other NSF EHR and DRL
staff. The portfolio review began in the summer of 2011 and will continue through the summer of 2013. The 24 case studies will begin immediately following OMB clearance and end during the
summer of 2013 at the latest. Analysis will be ongoing from the beginning of data collection
through September 2013, when the final report will be completed.
Exhibit 7: Schedule of ITEST Data Collections and Reports

Task                       Schedule
Analyze portfolio          Summer 2011 through summer 2013
Conduct case studies**     Upon OMB clearance, during 2012 and 2013, ending by summer 2013
Final Report               September 2013

** Pending OMB approval

A.17 Approval to Not Display Expiration Date
Not applicable.

A.18 Exceptions to Item 19 of OMB Form 83-I
No exceptions apply.
