MATH AND SCIENCE PARTNERSHIP
PROGRAM EVALUATION (MSP-PE)

Supporting Statement for
Continuation of Data Collection

November 8, 2010

National Science Foundation
Division of Undergraduate Education
Math and Science Partnership Program

GLOSSARY OF TERMS AND ABBREVIATIONS

ED: U.S. Department of Education

ED-MSP: Mathematics and Science Partnerships Program administered by the U.S.
Department of Education; a counterpart to NSF's MSP Program

Focused and Not-Focused Schools: One of MSP-PE's substudies examines trends for two
groups of schools: 1) those schools on which the partnership had focused its activities
(focused schools), and 2) those schools in the same district and grade level that had not
been the subject of any partnership activity (not-focused schools)

IHE: Institutions of higher education

LEA: Local education agency

MSP-MIS or MIS: Math and Science Partnership Program's Management Information
System. The MIS is designed to obtain annual data from each MSP Program-funded
awardee; the data describe the implementation and progress of the individual awardees.

MSP-PE: Math and Science Partnership Program Evaluation

MSP Program or NSF-MSP: Math and Science Partnership Program administered by the
National Science Foundation

Partnerships: Math and science partnerships funded by the National Science Foundation
under the MSP Program

PD: Professional development

PIs or co-PIs: Principal investigators or co-principal investigators

R&D: Research and development

RETA: Research, Evaluation, and Technical Assistance

RQ: Research question

SEA: State educational agency

STEM (education): Science, technology, engineering, and mathematics (education)

Substudy: MSP-PE consists of a series of substudies, with each substudy covering a
different facet (e.g., student achievement) of the MSP Program.


INTRODUCTION TO THE SUPPORTING STATEMENT
This document presents the supporting statement for the continuation of data
collection as part of an evaluation of the National Science Foundation’s (NSF) Math and
Science Partnership (MSP) Program. OMB approved the original clearance on June 20, 2006
(OMB No. 3145-0200), and the three years allotted under that clearance expired on
June 30, 2009. The present document seeks to extend the original clearance so that data
collection can resume and then continue through June 2014 (a target date that assumes a
three-year re-approval starting in June 2011).
NSF supports research and education in science, technology, engineering, and
mathematics (STEM) through extramural awards (grants, contracts, and cooperative
agreements) to over 2,000 institutions of higher education (IHEs) and other research and
education institutions in all parts of the United States. The awards serve NSF's broader
mission: to help the United States maintain a position of eminence at the global frontier
of "fundamental and transformative research" and sustain a "world class science and
engineering workforce," while also fostering the scientific literacy of all citizens
(National Science Board, 2005). The workforce includes not only practicing scientists and
engineers but also teachers, in particular K-12 teachers of mathematics and science. To
support current and future generations, a successful workforce must draw from students
who have received a strong mathematics and science education, and K-12 systems are
critical because they stand at the beginning of that education.


The Math and Science Partnership Program
Within NSF, the MSP Program is administered by the Directorate for Education and
Human Resources (EHR), which is responsible for the continued vitality of the nation’s
STEM education as well as its improvement. The MSP Program is among many within
EHR devoted to this quest, which covers not just K-12 but also undergraduate and graduate
education programs.
The MSP program is distinct from other EHR programs in that it fosters math and
science partnerships (partnerships) between STEM discipline departments and K-12
school districts.[1] The partnerships require extended participation by STEM discipline
faculty (faculty having a STEM field as their primary field of research, as distinct from
faculty who may teach mathematics or science at the IHE level but whose field of
research is some other field, such as education). One result of this
requirement has been the presence in the MSP Program of many Research I and Research
II universities, as categorized under the Carnegie Classification system. More often than
not, these universities serve as the partnerships’ lead organizations.
Awards Made by the MSP Program. The MSP Program began making awards to
support the partnerships in 2002. From 2002 to 2004, NSF made five-year awards to 48
partnerships (many later receiving no-cost extensions or supplemental awards that
lengthened their awards to six or seven years). From 2006 through 2009, NSF made new
rounds of awards, adding 32 partnerships. Under the most recent solicitation for
proposals (NSF 10-556), NSF made yet another round of awards during the late summer
and fall of 2010.

[1] Technically, most school districts cover the preschool grade and are "pre-K-12"
systems, and the activities of the MSP Program can include this full grade range.
However, for the sake of convenience, the districts are referred to as "K-12" systems
throughout this document.
Exhibit 1 summarizes the total number of awards made by the MSP Program through
2009. The exhibit shows that the IHE-K-12 partnerships receive the bulk of the program’s
funds, but the MSP Program also makes other kinds of awards related to the partnerships.
For example, the Research, Evaluation, and Technical Assistance (RETA) awards are
devoted to conducting related research, providing technical assistance to the partnerships,
or fostering dissemination and communication among the partnerships.
Awardees’ Acceptable Educational Activities. For the partnerships, the congressional
legislation authorizing the MSP Program (National Science Foundation Authorization Act,
P.L. 107-368, 2002) identified a broad range of acceptable educational activities. The
flexibility was intended to suit local education conditions, which differ because of the
decentralized nature of the U.S. education system for grades K-12: State and local school
boards, the latter usually consisting of elected officials, establish each local system’s
instructional methods and curricula, and the local boards also are responsible for hiring and
firing all school personnel, including the K-12 system's superintendent. The acceptable
activities could cover K-12, undergraduate, and graduate education, as well as inservice
training for existing K-12 teachers (see Exhibit 2). Equally important, each partnership was
permitted to undertake more than a single activity, and every partnership has done so.


In its solicitations for proposals (e.g., NSF 10-556 as well as earlier solicitations),
NSF has defined five key features that are intended to embrace the broad range of
acceptable activities. The partnerships are to incorporate all of the features in their work:
1. Being partnership-driven (collaboration between IHE and K-12 systems,
including significant roles by IHE-STEM disciplinary faculty);
2. Aiming to enhance and sustain the quality, quantity, and diversity of K-12
teachers of mathematics and/or the sciences;
3. Ensuring K-12 students’ preparation for, access to, and encouragement in
succeeding in challenging courses and curricula;
4. Using evidence-based design and outcomes to contribute new knowledge
about teaching and learning in mathematics and science; and
5. Ensuring the sustainability of project work, reflected by comprehensive and
coordinated institutional change at both the college/university and the local
school district levels.
Nevertheless, the overall guidance does not specify any particular set of interventions or
activities to be implemented by awardees. As a result, the awardees' projects more
closely resemble those in a "field-initiated" research program, reflecting considerable
heterogeneity, than awards implementing a centrally specified intervention that is
defined consistently from grant to grant. On top of these variations, the awards may cover
different grade levels and emphasize different academic subjects within mathematics and
science.
The MSP Program as an R&D Program. The MSP Program has one other distinctive
facet. According to the first sentence in the introduction of its solicitations for proposals,
the program considers itself, first and foremost, an R&D program:


The MSP Program is a major research and development effort designed
to improve K-12 student achievement in mathematics and science
(NSF 09-507).
This designation fits with the fourth of the preceding program features. Awardees are
therefore urged to develop new ideas and innovations in mathematics and science
education, not just to implement acceptable activities (Yin, Hackett, & Chubin, 2008).
U.S. Department of Education Counterpart Program. NSF’s MSP Program has a
counterpart program, the “Mathematics and Science Partnerships (ED-MSP),”
administered by the U.S. Department of Education (ED) under its own authorizing
legislation (No Child Left Behind Act, P.L. 107-110, 2002). A U.S. House Committee
report (Committee on Science, 2003) describes the complementarity of the two programs
as follows: Whereas the NSF-MSP Program is to fund “innovative programs to develop
and establish new models of education reform, thereby remedying the lack of knowledge
about math and science research,” the ED-MSP Program is aimed at “broadly
implementing and disseminating new teaching materials, curricula, and training programs.”
The ED-MSP Program focuses on professional development for mathematics and
science teachers, to improve their content knowledge and pedagogical skills. In
implementing the program, ED first makes formula allocations to the states, which in turn
make awards to school districts, for up to a three-year period and ranging from $25,000 to
$2.5 million per award. Evaluating the ED-MSP Program falls outside of the scope of the
evaluation of the NSF-MSP Program. However, a few local sites have received both
NSF-MSP and ED-MSP awards, and the planned data collection will attend to the nature of
the relationships between the NSF and ED awards at these sites.
Summary of Request to OMB. The request for OMB review asks to extend the
clearance for three instruments to be administered in face-to-face interviews with the
staffs of the NSF-MSP partnerships, including accessing and reviewing the partnerships'
records and documents. One instrument calls for interviews of a
partnership’s principal investigator and project coordinator; a second calls for interviews of
co–principal investigators and partners; and the third calls for interviews of the local
partnership evaluator. The three instruments are slightly updated but are essentially the
same instruments that were the subject of the original OMB clearance.
Under the original OMB clearance, the original versions of all three instruments were
used to collect data from the 48 partnerships awarded from 2002 to 2004. Some of the
findings from this data collection are reported in the next section of this introduction.
The proposed data collection will use the slightly revised versions of the instruments
to cover 32 additional partnerships: the newly awarded partnerships from 2006-09 as
well as a few earlier partnerships receiving Phase II awards.[2] Tables 1 and 2 at the end
of the supporting statement describe the differences between the original clearance and
this request.

[2] The earlier partnerships were all eligible to apply for Phase II awards to continue or
extend some aspect of their original activities, and six of them received awards between
2006 and 2009.

The Evaluation of the Math and Science Partnership Program
The data to be collected with these field instruments are for the ongoing evaluation of
the NSF-MSP Program. This program evaluation started in 2004 and is known as the
Math and Science Partnership Program Evaluation (MSP-PE).
The purpose of the evaluation has been to assess the progress and accomplishments of
the MSP Program as a whole, not focusing on any specific awardee.[3] The program
evaluation has been addressing three research questions (RQs):
RQ1. How has the MSP Program affected, influenced, or been associated
with changes in: a) K-12 student achievement in math and science;
b) the K-12 math and science teaching force; and c) other outcomes
associated with the program?
RQ2. How have STEM disciplinary faculty from institutions of higher
education (IHEs) participated in the MSP Program, and what has
been their role in the program’s achievements?
RQ3. What factors or attributes appear to have accelerated or
constrained progress in the MSP Program’s achievements?
To address these questions, and to accommodate the diversity of the partnerships’
activities within the MSP Program described earlier, the MSP-PE has had to develop a
distinctive evaluation design. The diversity of acceptable partnership activities has
precluded the use of any single evaluation design, such as an experimental or
quasi-experimental study. Such single designs assume either a single
activity undertaken by a single awardee (the conventional “project” evaluation) or the same
kind of activity undertaken by multiple awardees (the conventional “program” evaluation).
[3] The individual partnerships are the subject of separate, "local"-level evaluations that
perform both formative and summative functions, but only in relation to their specific
partnerships.

In the MSP Program, in contrast, each awardee has undertaken multiple activities, and
these activities differ widely from awardee to awardee.
The nature of the partnerships' activities has also precluded the use of a
theory-based design, typically involving the development and testing of some sort of logic
model or theory of change (e.g., Kellogg Foundation, 2004). By definition, the
partnerships operate in a multi-institutional environment that includes school districts,
IHEs (community colleges, 4-year colleges, as well as universities), business and
community groups, and other institutional partners (e.g., science centers and museums).
A series of pathways is hypothesized among these institutions whereby support for K-20
mathematics and science education can result in the desired career outcomes, including the
ultimate goal of enhancing the STEM workforce in this country (see the far right portion of
Exhibit 3).
In principle, the logical flow through these pathways could serve as the needed logic
model. However, the multi-institutional environment effectively creates an open system
rather than the closed set of “input-activity-output-outcome-impact” components of any
desired logic model. Such a closed set is usually dominated by a single institution or
organizational environment. In contrast, evaluations of open systems, rendered as if they
were following a logic model, are notoriously difficult to design and implement, if not
impossible to conduct. The difficulty lies in the complexity and importance of external
conditions in an open system: such conditions cannot easily be tracked, yet they can swamp
any expectation of an orderly input-activity-output-outcome sequence.


Given the inappropriateness of an experimental (or quasi-experimental) design, as
well as the difficulties faced by any theory-based design, NSF defined the needed MSP-PE
evaluation as one that would consist of a series of substudies, with each substudy covering
a different facet of the MSP Program. To date, and with data collected under the original
OMB Clearance (OMB No. 3145-0200), the MSP-PE has completed 39 substudies, of
which 19 have been published in peer-reviewed journals.
The substudies have mainly aimed at the first and second of the three research
questions. In effect, the initial set of substudies collectively emulates an outcome-based
evaluation (e.g., Schalock, 1995; Newcomer, 1997; Hatry, Cowan, & Hendricks, 2004;
Morley & Lampkin, 2004). Such a strategy therefore represents a viable means of
assessing the MSP Program, in the following sense: by highlighting outcomes, the strategy
covers the summative orientation implied by the first research question; and by relying on
multiple substudies covering multiple outcomes, the strategy accommodates the diverse
nature of the MSP Program. As the MSP-PE progresses into the future, other substudies
can now begin to examine the third research question more closely.

Brief Summary of Findings for Five Outcomes
To highlight briefly the findings from the MSP-PE to date, the remainder of this
introduction discusses five outcomes from the MSP Program that have been assessed by


one or more of the 39 substudies.[4] Two of the outcomes are related to research question
one (RQ1), involving the progress (or not) shown by the MSP Program with regard to:
1) K-12 student achievement trends among students whose teachers participated in the
partnerships’ professional development and related activities; and 2) changes in teacher
content knowledge after the teachers had participated in such activities. A third outcome is
related to RQ2: 3) the extent and nature of participation in the MSP Program by
IHE-STEM faculty. The fourth outcome deals with: 4) the progress of the MSP Program as
an R&D program; and the fifth outcome deals with: 5) the early signs of sustainability of
the partnerships' activities beyond the period of formal funding by the MSP Program.
1. K-12 Student Achievement Outcomes. One substudy examined student
achievement on state assessment tests in mathematics and science at the elementary,
middle, and high school levels (grades 5, 8, and 11), for the four-year period from 2003-04
to 2006-07 (Dimitrov, 2009). The substudy contained three analyses.
The first analysis tested the correlation between achievement trends and school
participation in the MSP Program. The findings showed that:
The more years that schools met the participation criterion during
the four-year period from 2003-04 to 2006-07, the higher were their
proficiency scores in 2006-07.
This correlation was statistically significant at the p<.01 level for three of the test
combinations (two academic subjects times three grade levels) and therefore suggested a

positive relationship between participation in the MSP Program and K-12 student
achievement.

[4] The discussions of the five outcomes are accompanied by references to the relevant
substudies. For a synthesis of these substudies and the five outcomes, including a
presentation of the original data from the substudies, see Yin, 2010.
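To make the form of this first analysis concrete, the sketch below correlates each
school's number of participation years with its 2006-07 proficiency score. It is a minimal
illustration only: the file name, column names, and data layout are hypothetical, and the
substudy's actual data and modeling choices (Dimitrov, 2009) are not reproduced here.

    # Minimal sketch of the first analysis: correlating schools' years of
    # MSP participation (0-4) with 2006-07 proficiency scores. The file and
    # column names are hypothetical illustrations, not the substudy's data.
    import pandas as pd
    from scipy.stats import pearsonr

    schools = pd.read_csv("schools.csv")  # hypothetical: one row per school

    r, p_value = pearsonr(schools["years_participating"],
                          schools["proficiency_2006_07"])
    print(f"r = {r:.3f}, p = {p_value:.4f}")  # substudy reported p < .01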
The second analysis examined the trends for two groups of schools: those schools on
which the partnership had focused its activities (Focused schools) and those schools in the
same district and grade level that had not been the subject of any partnership activity
(Not-focused schools). The results showed that:

Within partnering school districts, Focused schools had statistically
significant and positive achievement trends from 2003-04 to 2006-07
at the p<.05 level for all six test combinations (two academic
subjects times three grade levels), whereas

Within the same partnering districts, the trends for the Not-focused
schools were negative at the p<.05 level in elementary and middle
school mathematics; had no significant changes in middle and high
school science; and were positive at the p<.05 level in high school
mathematics and elementary school science.
Because the predicted pattern appeared in four of the six combinations of subjects and
grade levels, the findings were interpreted as providing further support for a positive
relationship between participation in the MSP Program and K-12 student achievement.
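A comparable sketch for the second analysis estimates a linear achievement trend
separately for the two school groups; again, the file layout and column names are
hypothetical stand-ins rather than the substudy's own data.

    # Minimal sketch of the second analysis: 2003-04 to 2006-07 achievement
    # trends estimated separately for Focused and Not-focused schools.
    # The file and column names are hypothetical illustrations.
    import pandas as pd
    from scipy.stats import linregress

    scores = pd.read_csv("school_year_scores.csv")  # one row per school per year

    for group, subset in scores.groupby("school_group"):  # "Focused"/"Not-focused"
        trend = linregress(subset["year"], subset["proficiency"])
        print(group, f"slope = {trend.slope:.2f}, p = {trend.pvalue:.4f}")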
The third analysis covered the Focused schools alone. Within these schools, the
trends among four subgroups of students showed greater gains for African-American
students in comparison to three other groups (Asian American students, White students,
and Hispanic students). These results suggested that achievement gaps between the
African-American and other subgroups were diminishing during the four years, also a
desired educational outcome.


2. K-12 Teacher Content Knowledge. Such knowledge represents one of the
important outcomes dealing with teachers of mathematics and science. The evaluation
assessed changes in teachers’ content knowledge by conducting a synthesis of the annual
reports submitted by the partnerships that had provided findings on teachers’ content
knowledge before and after participating in a partnership’s professional development
activities (Moyer-Packenham & Westenskow, in press).[5] The synthesis showed that:
The vast majority of the reports (covering 63 percent of the teachers in
mathematics and 78 percent in science) reported statistically significant
gains on tests administered after a partnership’s professional development
activities, compared to performance on similar tests prior to the activities.
These results were reinforced by the findings from another substudy, which covered only
the eight "Institute" awards and found that six of the eight reported statistically significant
gains (Davis, 2009).
3. Involvement in Partnerships’ Activities by STEM Discipline Faculty. The
evaluation examined the extent of involvement by STEM discipline faculty, who were
defined as those faculty having a STEM field as their primary field of research, compared
to other faculty who might have been teaching mathematics or science at the IHE level but
whose field of research might have been some other field, such as education (Alligood,
Moyer-Packenham, & Granfield, 2009).

[5] Unlike the data on K-12 student achievement, the reports did not provide the actual
data representing the teachers' original scores, but only gave the partnerships' own findings
on teacher content knowledge. Therefore, and again unlike the analysis of the K-12 student
achievement data, the cross-awardee analysis of teacher content knowledge consisted of a
research synthesis of the partnerships' findings.

Because the purpose of the faculty’s involvement was to deepen the partnerships’
mathematics and science content, the desired involvement needed to take the form of
services provided by the STEM discipline faculty, not just their participation in partnership
functions. The relevant services could cover: leading or assisting in the professional
development for K-12 teachers; offering programs and courses for preservice teachers;
assisting K-12 students with science projects, math nights, and science fairs; and assisting
school districts in making curriculum selections or defining district assessments and lesson
plans.
Based on a review of data reported by the partnerships and the site visits conducted by
the evaluation team, the assessment found that (Moyer-Packenham et al., 2009):
Whether categorized by grade span or subject, STEM discipline faculty
were more involved in every kind of partnership activity than any other single
kind of service provider (including IHE education faculty and
K-12 teacher/leaders).
Of the roughly 900 IHE faculty involved in the MSP Program, 55 percent were STEM
discipline faculty and 45 percent were education faculty. These findings support the
conclusion that the MSP Program has successfully engaged STEM discipline faculty in its
activities. Part of the success may be attributed to the program’s requirement that every
partnership have a STEM discipline faculty member as its project director. However, to
their credit, the partnerships have successfully extended such top leadership to the
recruitment of many other STEM discipline faculty into the partnerships’ work.
4. Advances as an R&D Program. Because the MSP Program is considered an R&D
program, NSF has strongly urged the partnerships to contribute new ideas in mathematics
and science education and not just to deliver educational services. To assess this outcome,
the evaluation used a proxy measure: the extent and nature of articles authored by the
awardees and that had appeared in peer-reviewed journals (Yin, Hackett, & Chubin, 2008).
Such articles were taken as indicators of R&D contributions because each journal, by
agreeing to publish a paper through its peer-review process, in effect had vouched for the
quality and newness of the ideas in the paper, as well as the soundness of the paper’s
research methods (Davis & Yin, 2009).
As part of this assessment, the relevant publications had to appear during the life
cycle of the awardees, and therefore from 2004 to 2009. Reviewing these publications, the
substudy (Davis & Yin, 2009) found that:
The 77 awardees reported 304 published works, with 172 or
57 percent of these appearing in 83 peer-reviewed journals.
Further examination revealed that no small group of awardees dominated the output: the
172 articles had been produced by 39 (or 51 percent) of the 77 awardees. The findings led
to the conclusion that the
MSP Program has been successfully meeting its objective of serving as an R&D program,
especially since the initial search had identified over 628 candidate items (including the
172 articles), and many of these other items took the form of presentations that may appear
as additional publications beyond 2009. At the same time, the evaluation was not able to
identify any comparative benchmark, from other R&D programs, to help define the
expected number of peer-reviewed publications that might emanate from an R&D program.
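The reported shares follow directly from the substudy's counts; the quick check below
(illustrative code only) reproduces the percentages cited above.

    # Reproducing the reported shares from the counts in Davis & Yin, 2009.
    published_works = 304     # works reported by the 77 awardees
    peer_reviewed = 172       # of these, articles in 83 peer-reviewed journals
    awardees_total = 77
    awardees_publishing = 39

    print(f"{peer_reviewed / published_works:.0%}")       # 57% of reported works
    print(f"{awardees_publishing / awardees_total:.0%}")  # 51% of awardees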


5. Signs of Sustainability of the Partnerships’ Work beyond the Period of Formal
Funding by the MSP Program. Truly sustainable partnerships require a symbiotic
relationship whereby the partners contribute and derive mutual benefits—e.g., partners
must collaborate to produce a joint product that neither can produce alone, and the
availability of the product must then benefit all the collaborators (Yin, 2009). The benefits
will support a partnership in the long run, independent of external sources of support.
A review of the partnerships’ activities revealed that most of the activities benefited
either the IHE or K-12 partner, but not both. For instance, when STEM faculty assist K-12
teachers and students, the STEM faculty do not necessarily derive any particular benefit in
relation to their STEM careers. Conversely, when preservice teachers enroll in an IHE
partner’s courses and programs, the partnering district does not necessarily derive any
predictable benefit, because the preservice graduates may become employed anywhere, and
not necessarily at the partnering district.
The review did uncover two kinds of activities that may produce mutual benefits and
therefore hold promise as the basis for sustained partnership. The first kind involves
changes in IHE tenure and promotion rules—to recognize IHE faculty for their
participation in K-12 education activities (e.g., Kutal, Rich, Hessinger, & Miller, 2009).
The desirability of this type of initiative for fostering partnerships is well-known, but only
a small number of partnerships have taken such steps, and the desired changes are likely to
follow an uncertain path that also takes a long time to occur. First, the rules may pertain to
some academic departments but not others. Second, even after favorable rules are in place,
the changes then need to be translated into concrete actions in reviewing the work and
advancement of an individual faculty member.
The second kind of activity has received less recognition, possibly because it has not
been frequently found in the partnerships between IHE-STEM faculty and K-12 systems
that predated the MSP Program. Yet, within the MSP Program, the activity has been
widespread. It involves STEM faculty designing, modifying, or enhancing courses in
STEM departments (Yin, 2009):
From 2003-04 to 2005-06, the partnerships in the MSP Program
offered 257 courses through STEM discipline departments at 57 IHEs.
This second kind of activity differs from the conventional preservice activity in that
existing K-12 teachers, not just preservice and undergraduate students, may enroll in the
courses (see Exhibit 4 for examples among the partnerships). Yet, the courses also differ
from the conventional inservice activities that usually take place in workshops, summer
institutes, or at school sites—but outside of the formal IHE curriculum.
From a sustainability perspective, to the extent that a large number of existing K-12
teachers enroll in the courses, the activity appears to offer the desired mutual benefits.
The IHEs and STEM faculty benefit from the increased enrollment in their programs, and
the K-12 districts and their teachers benefit from lessons in mathematics and science that
are more intense than those in traditional professional development workshops or summer
institutes.[6] Local school districts may divert their professional development funds
toward enrollment in these courses, and many of the teachers also become candidates for
advanced degrees. The IHE courses for existing K-12 teachers may therefore become
self-sustaining and the basis for lasting IHE (STEM)-K-12 district partnerships.[7]

[6] The assumption about this favorable comparison of the intensity and quality of the
lessons is based on the fact that formal course offerings at a university require
departmental review, which serves as a quality control measure not present in the offering
of off-site K-12 workshops and institutes (see Shapiro et al., 2006, p. 7).

[7] The most desirable situation would be one in which a local university (or group of
collaborating universities) offered an array of courses meeting the substantive needs of the
local K-12 teachers, and the K-12 district(s) then restricted professional development
options (and limited the use of their resources), at least in mathematics and science, to the
university courses. Barring intermittent cutbacks in districts' professional development
budgets, the local resources would then support the arrangement on a sustaining basis.

A. JUSTIFICATION
A.1. Circumstances Requiring the Collection of Data
The information collection for which OMB clearance is being sought is part of a
program evaluation of the National Science Foundation’s (NSF) Math and Science
Partnership (MSP) Program. The Program is one of 11 programs authorized under the NSF
Authorization Act of 2002 (P.L. 107-368, December 19, 2002) and is administered by
NSF’s Directorate for Education and Human Resources’ (EHR). EHR prepared and
competed a Statement of Work (SOW) for the program evaluation. COSMOS Corporation,
teamed with scholars at George Mason University and Vanderbilt University,8 was
awarded the evaluation contract.
The program evaluation started in 2004 and collected site visit data from the MSP
Program's initial set of 48 partnership awards. The data
collection used instruments and procedures approved under OMB No. 3145-0200 (June 20,
2006 and expiring on June 30, 2009). Since that time, the MSP Program has made 32
additional partnership awards, covering 26 new partnerships and the Phase II work of 6 of
the earlier partnerships. Site visits are to be made to these 32 additional awards, and the
clearance requested in this package covers the instruments and procedures related to this
new set of site visits.

[8] Two of the scholars at the two universities subsequently relocated to Utah State
University and Brown University but are still part of the evaluation team.

The MSP Program is recognized as an important research and development effort at
NSF for integrating the work of higher education, especially that of STEM disciplinary
faculty, in support of the development, implementation, and sustainability of partnerships
among institutions of higher education (IHEs), K-12 schools and school systems, and other
important stakeholders.
The evaluation addresses three primary questions, all of which appear in the original
Statement of Work used by NSF in commissioning the MSP Program Evaluation
(MSP-PE). Other questions may arise during the course of the evaluation, so the
evaluation includes but is not limited to the following:
RQ1. How has the MSP Program affected, influenced, or been associated
with changes in: a) K-12 student achievement in math and science;
b) the K-12 math and science teaching force; and c) other outcomes
associated with the program?
RQ2. How have STEM disciplinary faculty from institutions of higher
education (IHEs) participated in the MSP Program, and what has
been their role in the program’s achievements?
RQ3. What factors or attributes appear to have accelerated or
constrained progress in the MSP Program’s achievements?
NSF started the program evaluation in 2004, making an award to a team of evaluators
led by COSMOS Corporation, an external contractor. Some of the evaluation’s findings
from this earlier period have been described in the introduction to this supporting
statement.
The information from the program evaluation already has provided and should
continue to provide an understanding of the MSP Program's outcomes. The contractor's
earlier reports to NSF included informal periodic reporting, formal quarterly and annual
reports, and separate substudies. NSF used the analyses to make mid-course modifications
in support of the MSP Program, to prepare and publish its own reports, and to respond to
requests from Committees of Visitors, Congress, and the Office of Management and
Budget, particularly as related to the Government Performance and Results Act (GPRA)
and the Program Assessment Rating Tool (PART).
A.2. Purposes and Uses of the Data
The primary purpose for this information is program evaluation. The program
evaluation will answer the research questions enumerated in A.1. The evaluation’s major
purpose is to provide summative assessments of the outcomes of the MSP Program. These
include the program’s contributions to K-12 student achievement; to the strengthening
(e.g., quantity, quality, and diversity) of the K-12 teaching force; and to other outcomes
such as the role and participation by IHE STEM faculty in the program’s activities. The
evaluation also will assess the program’s role and contributions as an R&D program as
well as the prospects for the sustainability of the partnerships.
Besides providing summative assessments, the evaluation aims to contribute to the
identification of the processes that influence or interfere with the outcomes of the features
studied, including the conditions that account for the demonstrated quality and
innovativeness of the program.


A.3. Use of Information Technology To Reduce Burden
The evaluation will collect only the minimum information necessary for addressing
the evaluation questions. The data collection procedures minimize respondent burden and
will use reporting formats that are best suited for the type of information to be gathered. In
compliance with OMB directives, paper data collection instruments in this evaluation will
be supplemented with an electronic version as an option.
A.4. Efforts to Identify Duplication
The MSP-PE evaluation does not duplicate other NSF efforts. For example, project
data on program funding are drawn from the NSF administrative database called the
FastLane Project Reports system (OMB Control Number 3145-0058). Project monitoring
data for the MSP Program are gathered via the Program’s Monitoring Surveys cleared
under OMB 3145-0199 and have been made available to the COSMOS team. Data from
these collections are used to pre-fill items, where possible, to further minimize the overall
response burden.
Neither the FastLane nor Monitoring Surveys involves site visits to the partnerships.
To the extent possible, the evaluation will use the data from these preceding sources and
pre-fill or delete items from its own site visit instruments as appropriate, to avoid
redundancy and reduce burden on respondents. Similarly, no other national databases
capture completely the information sought by this evaluation.


A.5. Impacts on Small Businesses
No small businesses are known to be partners of any of the MSP Program’s
partnerships. Therefore, no data will be collected from any small business organizations.
A.6. Consequences of Not Collecting the Information
If the information is not collected, NSF will not have independent, external
documentation of the outcomes of the MSP Program and thus will not be able to meet its
accountability requirements. Moreover, NSF will be unable to comply fully with the
congressional mandate that the Foundation evaluate its MSP Program.
A.7. Special Circumstances Justifying Inconsistencies with Guidelines in 5 CFR
1320.6
The data collections will comply with 5 CFR 1320.6.
A.8. Consultation Outside the Agency
Federal Register Notice. A 60-day notice to solicit public comments was published
in the Federal Register on July 7, 2009 (see Appendix A). Only one comment was
received, but it had no substantive content and also did not address issues of cost or hour
burden.
Consultation Outside of the Agency. Consultations on the research progress have
occurred throughout the evaluation work and will continue to take place as the evaluation
progresses. In particular, the evaluation team has engaged a small group of experts who
have not been involved in the data collection and who have provided, and will continue to
provide, their expert opinions. The purpose of such consultation is to ensure the technical
soundness of the evaluation and the relevance of its findings, as well as to verify the
importance, relevance, and accessibility of the information sought in the evaluation. The
members of the expert group represent the nation’s leading researchers and scholars in
mathematics and science education as well as the broader field of evaluation. During the
earlier phase of the program evaluation, the group included:

• Robert Boruch, Ph.D., is University Trustee Chair Professor in the Graduate School
of Education and the Statistics Department of the Wharton School at the University of
Pennsylvania, Co-Director of the Center for Research and Evaluation of Social Policy
(CRESP), and Co-Director of the Policy Research, Evaluation, and Measurement
Program (PREM).

• Sharon Johnson Lewis is the Director of Research for the Council of the Great City
Schools. In that role, Ms. Lewis has been responsible for developing and maintaining a
research program that articulates the status, needs, attributes, operations, and challenges
of urban public schools and the children whom they serve.

• Douglas Osheroff, Ph.D., is the J.G. Jackson and C.J. Wood Professor of Physics in
the School of Humanities and Sciences and the Gerhard Casper University Fellow in
Undergraduate Education at Stanford University.

• Charles S. Reichardt, Ph.D., is Professor of Psychology at the University of Denver.
A self-described methodologist and statistician, Dr. Reichardt focuses his research on the
logic and practice of causal inference in both laboratory and field settings.

• Warren Simmons, Ph.D., directs the Annenberg Institute for School Reform at Brown
University. The Institute was established in 1993 to generate, share, and act on
knowledge that improves conditions and outcomes in American schools, particularly in
urban areas and in schools serving disadvantaged students.

• Mary Lee Smith, Ph.D., is Regents' Professor in Arizona State University's Division
of Educational Leadership and Policy Studies. At ASU, Dr. Smith worked with Gene V.
Glass as he developed meta-analysis methodology and published with him a book and
numerous scholarly articles about the effects of psychotherapy (one of the most
often-cited studies in psychology).

• Philip Uri Treisman, Ph.D., is Professor of Mathematics at the University of Texas at
Austin and Executive Director of the Charles A. Dana Center. Dr. Treisman has received
numerous honors and awards for his efforts to strengthen American education.

• Alan Tucker, Ph.D., is Director of Undergraduate Studies and Distinguished Teaching
Professor in the Department of Applied Mathematics and Statistics at SUNY Stony
Brook. His current professional service includes: Chair of the MAA Education Council;
Chair of the MAA Metropolitan New York Section; lead author of the 1994 MAA
evaluation, Assessing the Calculus Reform Movement; Director of the MAA project,
Case Studies in Exemplary Undergraduate Mathematics Programs; and membership on a
dozen MAA, AMS, and NRC committees.

All of these individuals will be asked to serve in a similar capacity in relation to the new
data collection. If any of them are unable to serve, comparable replacements will be found.
In addition, the partnership awardees from whom the site visit data are to be collected
have had explicit knowledge of the planned data collection. These awardees were required,
as part of their conditions of award, to collaborate with the evaluation’s data collection.
A.9. Payments or Gifts to Respondents
No payments or gifts will be provided to respondents.
A.10. Assurance of Confidentiality
Respondents will be advised that any information on specific individuals will be
maintained in accordance with the Privacy Act of 1974. Data collected are available to
NSF officials and staff and other contractors hired to manage the data and data collection
software. Data are processed according to Federal and state privacy statutes. Detailed
procedures for making information available to various categories of users are specified in
the Education and Training System of Records (63 Fed. Reg. 264, 272, January 5, 1998).
That system limits access to personally identifiable information to authorized users. Data
submitted will be used in accordance with criteria established by NSF for monitoring
research and education grants and in response to Public Law 99-383 and 42 USC 1885c.
The information requested may be disclosed to qualified researchers and contractors in
order to coordinate programs and to a Federal agency, court or party in a court, or Federal
administrative proceeding, if the government is a party. Confidentiality issues are
addressed in the cover letter announcing the evaluation and its site visits to the partnerships
(see Appendix B).
A.11. Questions of a Sensitive Nature
In some cases, the instruments request information from respondents including their
name and title. These data are collected in order to monitor the site visit procedures and to
check for consistent data collection across the partnerships. Any individualized data that
are collected are provided only to the evaluation staff who are conducting the studies, using
the data as authorized by NSF. Any public reporting of data is in aggregate form.
A.12. Estimates of Hour Burden
This evaluation includes site visit instruments and the review of field documents and
other records maintained by the partnerships being evaluated. Because the evaluation covers
a census of the MSP Program's partnerships, the result will be a comprehensive description of the range
and variety of the partnerships supported by the MSP Program. Such a descriptive profile
complements the more numeric profile cumulated through the program’s management
information system (MSP-MIS) and inquires more intensely about key claimed linkages,
e.g., the relationship between the partnerships’ activities and expectations about K-12
student performance (see the probe for Q. A1b in Appendix C).
The MSP-PE site visit instruments contain a set of open-ended questions, to be used
in the field, both in interviewing respondents and examining documents and archival
evidence. A distinctive part of the instruments is reflected by questions asking for copies
of the partnerships’ own data. Overall, the instruments follow established procedures for
collecting field-based evidence in a systematic manner (Yin, 2009). To present these
questions and collect responses in the most efficacious manner, site visit team members
need to have been trained and prepared adequately regarding the various topics of inquiry,
and such training and preparation will be a formal part of the data collection procedures.
The three MSP-PE site visit instruments (see Appendix C) include: 1) interviews
with the lead partnership staff (principal investigator and project coordinator), 2)
interviews with other partnering staff (co-principal investigators and partners), and 3)
interviews with the partnership evaluator. A description of the site visit procedures, site
visit training, and site visit reporting is provided in Appendix D. Among other items, the
procedures identify the persons to be interviewed and the amount of time estimated for the
interviews. In turn, these estimates become the basis for estimating the burden rates and
costs of the data collection, as discussed next.
A.12.1 Number of Respondents, Frequency of Responses, and Annual Hour Burden
The total number of respondents is 352. The frequency of responses is one time only.
The estimated total response burden is 960 person-hours. Because the data collection
occurs over a three-year period, the annual figures are the preceding estimates divided by
three: 117.3 respondents and 320 person-hours.


Respondents include principal investigators, project coordinators, co-principal
investigators, other partners, and partnership evaluators. A total of 32 partnerships will be
site visited over a three-year period, and an estimated 11 respondents for each partnership
will be asked to provide information. The burden was calculated using the total number of
partnerships to be covered, the number of interviewees per partnership, and the number of
interview hours per interviewee (see Exhibit 5).
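A short computation makes the burden arithmetic explicit. The figures mirror Exhibit 5;
the code itself is purely illustrative.

    # Burden arithmetic per Exhibit 5: 32 partnerships, three respondent types,
    # expressed as (respondents per partnership, burden hours per respondent).
    partnerships = 32
    respondent_types = {
        "lead partnership staff": (2, 7),
        "other partnering staff": (8, 1),
        "partnership evaluator": (1, 8),
    }

    total_respondents = sum(n * partnerships for n, _ in respondent_types.values())
    total_hours = sum(n * hours * partnerships
                      for n, hours in respondent_types.values())
    print(total_respondents, total_hours)                    # 352 960
    print(round(total_respondents / 3, 1), total_hours / 3)  # 117.3 320.0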
A.12.2. Hour Burden Estimates by Each Form and Aggregate Hour Burdens
The requested clearance covers three site visit instruments or forms. The first
instrument requires 448 burden hours; the second 256; and the third 256. Across all
instruments, the total burden is 960 person hours (see Exhibit 6).
A.12.3 Estimate of Cost to Respondents for Hour Burdens
The total cost to the respondents is estimated to be $43,456. Over a three-year period,
the annualized cost is estimated to be $14,485.
The following estimated hourly wage rates (in constant 2008-09 dollars), drawn from
the U.S. Department of Education's National Center for Education Statistics Integrated
Postsecondary Education Data System and the U.S. Department of Labor's Bureau of
Labor Statistics, were used to estimate the respondents' wages: principal investigator and
project coordinator, $49/hr.; co-principal investigators and partners, $49/hr.; and
partnership evaluator, $35/hr. The calculations
are shown in Exhibit 7.
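The respondent cost estimate is simply burden hours multiplied by the hourly rates above,
summed across the three respondent groups; a brief illustrative sketch:

    # Cost arithmetic per Exhibit 7: (respondents, hours each, hourly rate).
    groups = [
        (64, 7, 49),   # lead partnership staff
        (256, 1, 49),  # other partnering staff
        (32, 8, 35),   # partnership evaluators
    ]

    total_cost = sum(n * hours * rate for n, hours, rate in groups)
    print(total_cost)             # 43456
    print(round(total_cost / 3))  # 14485, annualized over three years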


A.13. Estimate of Total Capital and Startup Costs and of Operation, Maintenance,
and Purchase Costs to Respondents or Recordkeepers
There is no overall annual cost burden to respondents or recordkeepers that results
from the evaluation other than the time spent responding to questions in the site visit
instruments that are attached to this request.
It is usual and customary for individuals involved in K-12 and postsecondary
education activities in the United States to keep descriptive records. The information being
requested is, in part, from records that are maintained as part of normal educational
practice. Furthermore, the respondents are active participants in programs or projects
funded by NSF. In order to be funded by NSF, institutions must follow the instructions in
the NSF Grant Proposal Guide (GPG) that is cleared under OMB 3145-0058. The GPG
requires that all applicants submit requests for NSF funding and that all active NSF
awardees do administrative reporting via FastLane, an Internet-based forms system. Thus,
the primary respondents to the data collection tasks for this evaluation make use of
standard office equipment (e.g., computers), Internet connectivity that is already required
as a startup cost and maintenance cost under OMB 3145-0058, and free software (e.g.,
Netscape or Microsoft Internet Explorer) to respond.
A.14. Estimates of Total Costs to the Federal Government
The estimated total cost to the government of all data collection, analysis, and
reporting activities for this evaluation is $1,238,797. The estimated costs are shown in
Exhibit 8.


A.15. Changes or Adjustments
Not applicable.
A.16. Plans for Tabulation and Publication
A team of evaluators led by COSMOS Corporation is conducting this third-party
evaluation of the MSP Program on behalf of NSF and will only publish results after NSF
completes a review of the document proposed to be published. In short, all products of the
collections are the property of NSF. After the products are delivered, NSF determines how
the quality of the products can be improved to merit later publication. For NSF’s own
publications, it is often only after seeing the quality of the information delivered by the
evaluation that NSF decides the format (raw or analytical) and manner (in the NSF-numbered product Online Document System (ODS) or simply a page of the NSF Web site)
in which to publish. NSF classifies its formal publications as reports, not statistical reports.
Presentations of data and project-related information will be made at relevant venues such
as meetings of the principal investigators and at national conferences.
Before the conclusion of the evaluation, both NSF and the partnerships may use
preliminary data to improve management and performance. For example, data generated
by this evaluation may appear as inputs to other internal and external NSF reports (e.g., the
GPRA Annual Performance Plan). At this time, NSF has not set a timeline for publishing
interim reports from this evaluation. As a general matter, and as with many agencies, NSF
is reducing its reliance on formal (i.e., traditional) publication methods and publication
formats.


A.17. Approval to Not Display Expiration Date
Not applicable.
A.18. Exceptions To Item 19 of OMB Form 83-I
Not applicable.


Exhibit 1
SUMMARY OF MSP PROGRAM AWARDS ($ in millions)

                                      Cohorts 1-3            Cohorts 4-6           TOTAL, Cohorts 1-6
                                       (2002-04)              (2006-09)               (2002-09)
Type of Award                    Awardees Awards Amount  Awardees Awards Amount  Awardees Awards Amount

Partnerships:
  Comprehensive                     12      13    282.4     0       0      0        12      13    282.4
  Targeted                          28      29    228.6    11      11     55.3      39      40    283.8
  Institute                          8       8     44.2    15      15     64.7      23      23    108.9
  Phase II                           0       0      0       6       6     12.5       6       6     12.5
  Subtotal                          48      50    555.1    32      32    132.5      80      82    687.6

Related Awards:
  Research, Evaluation, and
    Technical Assistance (RETAs)    29      35     60.3    12      14     16.6      41      49     76.9
  Innovation through
    Institutional Integrity (I3)     0       0      0       2       2      1.4       2       2      1.4
  Start (Planning Awardees)          0       0      0      19      19      5.5      19      19      5.5
  Other Awards (e.g., workshops
    and conferences)                 8       8     13.3    13      13     12.5      13      21     25.8
  Subtotal                          37      43     73.6    46      48     36.0      83      91    109.6

GRAND TOTAL                         85      93    628.6    78      80    168.5     163     173    797.2

Source: National Science Foundation, "Awards Database: Program Information," downloaded from
NSF's Web site on April 26, 2010. The MSP-PE is the source for the number of awardees, which is
tracked because several awardees may have received more than one award.


Exhibit 2
RELEVANT ACTIVITIES FOR MATH AND SCIENCE PARTNERSHIPS (MSPs)

(A) recruiting and preparing students for math and science education careers;
(B) offering PD to strengthen the capabilities of math and science teachers;
(C) offering innovative preservice and inservice programs on using technology;
(D) developing distance learning programs;
(E) developing a cadre of master teachers to promote reform and improvement in schools;
(F) offering preparatory and certification programs for existing STEM professionals to start teaching careers;
(G) developing tools to evaluate activities conducted under this subsection;
(H) developing or adapting curricular materials incorporating contemporary research on the science of learning;
(I) developing initiatives to increase quantity, quality, and diversity of K-12 math and science teachers;
(J) using STEM professionals in private businesses to help recruit and train math and science teachers;
(K) developing and offering math and science enrichment programs (e.g., afterschool and summer);
(L) providing research opportunities for students and teachers; and
(M) bringing STEM professionals into K-12 classrooms.

Source: National Science Foundation Authorization Act of 2002 (P.L. 107-368).


Exhibit 3
THE PARTNERSHIPS' ACTIVITIES WITHIN A K-20 FRAMEWORK

[Graphic exhibit not reproduced in this text rendering.]

Exhibit 4
STEM COURSES OFFERED TO K-12 TEACHERS:
EIGHT ILLUSTRATIVE MATH AND SCIENCE PARTNERSHIPS
Partnership 1:
At one of the partnering IHEs, faculty in the mathematics, science, and education departments
developed 11 new math and 10 new science graduate-level courses for existing middle grade teachers of
math and science. STEM faculty taught those courses, which can lead to a master’s degree offered by
the College of Arts and Sciences with a specialization in mathematics and science for middle school
teachers (grades 6-8). In 2007-08, 98 teachers from the partnering district participated.
Partnership 2:
The partnership supported existing K-12 teachers to take graduate-level mathematics courses.
The faculty teams also formed workgroups to review and approve new curricula for elementary and
secondary mathematics education at one partnering IHE and revised one new course in
elementary mathematics education at a second IHE.
Partnership 3:
The partnership initiated a fellows program for existing teachers to participate in a master’s of
mathematics for teaching program or to earn a certificate of advanced graduate study. Enrolling in
2004, the first cohort included 14 existing teachers, of whom 8 were to graduate in the fall of 2008.
Partnership 4:
Design teams, primarily composed of faculty from the partnering IHE, developed new courses on
various mathematics topics. Four of the courses became part of a new minor, and existing middle
school teachers were eligible to enroll. By the summer of 2006, 53 teachers had enrolled, with 41
completing at least two courses. In addition, by 2006-07, 341 existing teachers had taken other
graduate courses at the IHE.
Partnership 5:
The partnership has supported its five IHE partners in collaboratively developing a common
science education course sequence. At one of the partnering IHEs, 25 undergraduate students and 27
existing K-12 teachers were the latest cohort of students enrolled in the course sequence.
Partnership 6:
The partnership helped one of its partnering IHEs to offer five courses. The courses can lead to a
mathematics endorsement, and 17 existing K-12 teachers enrolled in them during the spring of 2007.
The partnership also supported other IHE faculty to redesign science and mathematics courses, to
encourage students to pursue teaching careers.
Partnership 7:
The partnership supported the provision of mathematics methods courses co-taught by IHE
faculty and district coaches. In 2005-06, eight K-12 teachers enrolled, to gain a higher level of
mathematics knowledge and to move toward a credential to meet No Child Left Behind requirements.
Partnership 8:
The partnership supported four of its IHE partners to create new or redesigned undergraduate
science courses. Both existing and aspiring science teachers have enrolled in these courses.
Source: MSP’s Annual Reports; and MSP-PE Site Visits, 2006-08 (Yin, 2009).

Exhibit 5
CALCULATIONS USED TO ESTIMATE BURDEN

                            Total No. of            Burden                 Person Hours
                            Respondents Across      Hours per     --------------------------------
Respondent Type             Partnerships            Respondent    Total                     Annual
- Lead Partnership Staff    32 x 2 staff = 64       7 hrs.        448 (64 x 7 = 448)        149.3
- Other Partnering Staff    32 x 8 staff = 256      1 hr.         256 (256 x 1 = 256)       85.3
- Partnership Evaluator     32 x 1 staff = 32       8 hrs.        256 (32 x 8 = 256)        85.3
Total                       352 (32 partnerships                  960                       319.9
                            x 11 staff = 352)
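
For readers who wish to verify the burden arithmetic, the short sketch below (illustrative only; the constants and variable names are ours, not part of the original clearance materials) reproduces the Exhibit 5 calculation: staff per partnership times 32 partnerships gives total respondents, total respondents times burden hours gives total person hours, and dividing by the three-year clearance period gives the annual figure. Note that the exhibit’s 319.9 annual total comes from summing the individually rounded row values (149.3 + 85.3 + 85.3); dividing the 960 total hours by 3 directly yields 320.0.

    # Illustrative check of the Exhibit 5 burden arithmetic
    # (not part of the original supporting statement).
    PARTNERSHIPS = 32
    CLEARANCE_YEARS = 3

    # respondent type -> (staff per partnership, burden hours per respondent)
    RESPONDENT_TYPES = {
        "Lead Partnership Staff": (2, 7),
        "Other Partnering Staff": (8, 1),
        "Partnership Evaluator": (1, 8),
    }

    total_hours = 0
    for name, (staff, hours_each) in RESPONDENT_TYPES.items():
        respondents = PARTNERSHIPS * staff       # e.g., 32 x 2 = 64
        hours = respondents * hours_each         # e.g., 64 x 7 = 448
        total_hours += hours
        print(f"{name}: {hours} hrs total; {hours / CLEARANCE_YEARS:.1f} annual")

    print(f"All types: {total_hours} hrs total; "
          f"{total_hours / CLEARANCE_YEARS:.1f} annual")  # 960 hrs; 320.0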

Exhibit 6
ESTIMATED BURDEN HOURS FOR MSP-PE SITE VISIT INSTRUMENTS

                            No. of         Total No. of           Burden               Person Hours
                            Respondents    Respondents Across     Hours per     ------------------------------
Respondent Type             at Each        Partnerships           Respondent    Total                   Annual
                            Partnership
- Lead Partnership Staff    2              64 (32 x 2 = 64)       7 hrs.        448 (64 x 7 = 448)      149.3
- Other Partnering Staff    8              256 (8 x 32 = 256)     1 hr.         256 (256 x 1 = 256)     85.3
- Partnership Evaluator     1              32 (1 x 32 = 32)       8 hrs.        256 (32 x 8 = 256)      85.3
Total                       11             352 (32 x 11 = 352)                  960                     319.9


Exhibit 7
ESTIMATED COST TO RESPONDENTS

                            Total No. of     Burden Hours      Average
Respondent Type             Respondents      per Respondent    Hourly Rate    Total Cost
- Lead Partnership Staff    64 (2 x 32)      7 hrs             $49/hr         $21,952 (64 x 7 x $49)
- Other Partnering Staff    256 (8 x 32)     1 hr              $49/hr         $12,544 (256 x 1 x $49)
- Partnership Evaluator     32 (1 x 32)      8 hrs             $35/hr         $8,960 (32 x 8 x $35)
Total                                                                         $43,456
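
The cost estimate follows the same pattern: each row multiplies total respondents by burden hours and an average hourly rate. The sketch below (again illustrative, with our own names) reproduces the Exhibit 7 totals.

    # Illustrative check of the Exhibit 7 cost arithmetic
    # (total respondents x burden hours x average hourly rate).
    ROWS = [
        # (respondent type, total respondents, hours per respondent, $/hr)
        ("Lead Partnership Staff", 64, 7, 49),
        ("Other Partnering Staff", 256, 1, 49),
        ("Partnership Evaluator", 32, 8, 35),
    ]

    total_cost = 0
    for name, respondents, hours, rate in ROWS:
        cost = respondents * hours * rate
        total_cost += cost
        print(f"{name}: ${cost:,}")

    print(f"Total cost to respondents: ${total_cost:,}")  # $43,456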

Exhibit 8
TOTAL COSTS TO THE FEDERAL GOVERNMENT FOR THE DATA COLLECTION

Personnel                                  $473,263
Other Direct Costs
  Travel and Per Diem                      $133,920
  Communication and Supplies               $15,000
Indirect Costs
  Fringe Benefits                          $179,840
  Overhead                                 $360,839
Fee                                        $75,935
TOTAL COST                                 $1,238,797
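
As a quick consistency check, the Exhibit 8 line items sum exactly to the stated total; the illustrative sketch below (our own construction) verifies this.

    # Illustrative check that the Exhibit 8 line items sum to the total.
    LINE_ITEMS = {
        "Personnel": 473_263,
        "Travel and Per Diem": 133_920,
        "Communication and Supplies": 15_000,
        "Fringe Benefits": 179_840,
        "Overhead": 360_839,
        "Fee": 75_935,
    }

    total = sum(LINE_ITEMS.values())
    assert total == 1_238_797
    print(f"TOTAL COST: ${total:,}")  # $1,238,797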

Table 1 (Draft, 1/19/11)
Differences Between June 2006 Clearance and November 2010 Extension
(INTRODUCTORY TEXT, SAMPLE, AND BURDEN)

Original OMB Clearance (OMB: 3145-0200; Exp. Date: 6/30/09)
  - Introductory Text: Updated and revised as of June 2005
  - Time Period Covered by the Clearance: June 2006 to June 2009
  - No. of Site Visits: 48 over a 3-year period
  - Description of Partnerships to be Site Visited: 48 Cohort 1-3 Partnerships
  - No. of Respondents at Each Site Visit: 18 per Partnership: PI and Coordinator (n=2); Co-PI/Advisors/Partners (n=15); Evaluator (n=1)
  - No. of Annual and Total Respondents: 288 annual; 288 x 3 yrs. = 864 total
  - Annual and Total Response Burden: 608 annual person-hours; 608 x 3 yrs. = 1,824 total hours
  - Annual and Total Cost to All Respondents: $6,720 annual; $6,720 x 3 yrs. = $20,160 total
  - Costs to the Gov't for the Data Collection: $2,244,256

Request for Extension of Original Clearance (11/8/10)
  - Introductory Text: Updated and revised as of June 2010
  - Time Period Covered by the Clearance: June 2011 to June 2014
  - No. of Site Visits: 32 over a 3-year period
  - Description of Partnerships to be Site Visited: 32 Cohort 4-6 Partnerships
  - No. of Respondents at Each Site Visit: 11 per Partnership: PI and Coordinator (n=2); Co-PI/Advisors/Partners (n=8); Evaluator (n=1)
  - No. of Annual and Total Respondents: 117.3 annual; 117.3 x 3 yrs. = 352 total
  - Annual and Total Response Burden: 320 annual person-hours; 320 x 3 yrs. = 960 total hours
  - Annual and Total Cost to All Respondents: $14,485 annual; $14,485 x 3 yrs. = $43,456 total (reflects a 41% hourly rate increase over the earlier rates for each respondent class, per U.S. Bureau of Labor Statistics 2009 data)
  - Costs to the Gov't for the Data Collection: $1,238,797

Table 2 (Draft, 1/19/11)
Differences Between June 2006 Clearance and November 2010 Extension
(SITE VISIT INSTRUMENT)

Substantive Changes (New, Deleted, or Replaced Questions), by Section of the Site Visit Instrument:

A. Partnerships
   1. Insert new q. 1a ("Partners")
   2. Replace q. 2b ("Extent of Sharing") with q. 2b ("Creation and Maintenance")
   3. Insert new probe for new q. 2b
   4. Replace q. 4c ("Formal Evaluation") with new q. 4c ("Explanation of Partnership Processes")

B. Evidence-based Design and Outcomes
   1. Replace q. 2c ("Data Sharing with MSP-PE") with new q. 2c ("Review of Data")
   2. Replace q. 3 ("Formal Presentations and Publications") with new q. 3 ("Evaluation Management")

C. Teacher Quality, Quantity, and Diversity
   1. Replace q. 2a ("Teacher Quality, Quantity, and Diversity Activities") with new q. 2a ("In-depth Description of Two Main Activities"), including the addition of the new Exhibit B
   2. Addition of an illustrative example to q. 1b
   3. Insert new q. 2f ("Implementation Outcomes")

D. Challenging Courses and Curricula
   1. Addition of an illustrative example to q. 1b
   2. Replace q. 2a ("Course and Curriculum Activities") with new q. 2a ("In-depth Description of Main Activity")
   3. Insert new q. 2c ("Instructional Practices")
   4. Insert new q. 2f ("Implementation Outcomes")
   5. Insert new q. 5c ("Family and Parental Involvement")

E. Role of IHE Disciplinary Faculty
   1. Addition of an illustrative example to q. 1b
   2. Insert new q. 2f ("Implementation Outcomes")

F. Explanations Regarding the Partnerships' Work (old section title: Rival Explanations)
   1. Insert new q. 1 ("Building and Maintaining a Math and Science Partnership")
   2. Replace 3 original questions on rival explanations with 2 new, re-worded questions on rival explanations

G. Background Information on "Discovery" and its Processes
   Section G deleted

Minor Copyedits (all sections):
   - Minor edits to correct verb tense, voice, and punctuation; and
   - Change in terminology as follows: partnership (instead of MSP); partnership activities (instead of MSP projects); and evaluation activities (instead of data collection activities).

REFERENCES

Alligood, K. T., Moyer-Packenham, P. S., & Granfield, P. G., “Research Mathematicians’
Participation in the MSP Program,” Journal of Educational Research & Policy
Studies, 9(2): 23-42, Fall 2009.
Committee on Science, Subcommittee on Research, “Implementation of the Math and
Science Partnership Program: Views from the Field,” Hearing, U.S. House of
Representatives, 108th Congress, 1st Session, October 30, 2003.
Davis, D., “Placing the MSP Program’s Institute Partnerships in a Career Development
Context,” MSP-PE, COSMOS Corporation, Bethesda, MD, October 2009.
Davis, D., & Yin, R. K., “Articles Published in Peer-Reviewed Journals: Progress by
MSPs and RETAs in Contributing to Education Research and Practice,” MSP-PE,
COSMOS Corporation, Bethesda, MD, September 2009.
Dimitrov, D., “Longitudinal Trends in Math and Science Partnership-Related Changes in
Student Achievement with Management Information System Data,” MSP-PE, George
Mason University, May 2009.
Hatry, H. P., Cowan, J., & Hendricks, M., “Analyzing Outcome Information: Getting the
Most from Data,” The Urban Institute, Washington, DC, 2004.
Kellogg Foundation, “Using Logic Models to Bring Together Planning, Evaluation, and
Action: Logic Model Development Guide,” Battle Creek, MI, 2004.
Kutal, C., Rich, F., Hessinger, S. A., & Miller, H. R., “Engaging Higher Education Faculty
in K-16 STEM Education Reform,” in J. S. Kettlewell and R. J. Henry (Eds.),
Increasing the Competitive Edge in Math and Science, Rowman & Littlefield
Education, Lanham, MD, 2009.
Morley, E., & Lampkin, L. M., “Using Outcome Information: Making Data Pay Off,” The
Urban Institute, Washington, DC, 2004.
Moyer-Packenham, P. S., Kitsantas, A., Bolyard, J. J., Huie, F., Irby, N., & Oh, H.,
“Participation by STEM Faculty in Math and Science Partnership Activities for
Teachers,” Journal of STEM Education, 10(3-4): 17-36, July-December 2009.


Moyer-Packenham, P. S., & Westenskow, A., “Processes and Pathways: How Do
Mathematics and Science Partnerships Measure and Promote Growth in Teacher
Content Knowledge?” School Science and Mathematics, in press.
National Science Board, “2020 Vision for the National Science Foundation,” NSB 05-142,
Arlington, VA, December 28, 2005.
National Science Foundation, “Award Search: Program Information,” downloaded from
NSF’s Web site on April 26, 2010.
National Science Foundation, “Math and Science Partnership (MSP),” Program
Solicitation, NSF 02-061, Arlington, VA, 2002a.
National Science Foundation, “Math and Science Partnership Program (MSP),
Comprehensive and Targeted Projects,” Program Solicitation, NSF 02-190,
Arlington, VA, 2002b.
National Science Foundation, “Math and Science Partnership: Research, Evaluation, and
Technical Assistance (MSP-RETA),” Program Solicitation, NSF 03-541, Arlington,
VA, 2003a.
National Science Foundation, “Math and Science Partnership Program (MSP): Targeted
Projects, Institute Partnerships: Teacher Institutes for the 21st Century, and Research,
Evaluation, and Technical Assistance (RETA),” Program Solicitation, NSF 03-605,
Arlington, VA, 2003b.
National Science Foundation, “Math and Science Partnership (MSP),” Program
Solicitation, NSF 06-539, Arlington, VA, 2006.
National Science Foundation, “Math and Science Partnership (MSP),” Program
Solicitation, NSF 08-525, Arlington, VA, 2008.
National Science Foundation, “Math and Science Partnership (MSP),” Program
Solicitation, NSF 09-507, Arlington, VA, 2009.
National Science Foundation, “Math and Science Partnership (MSP),” Program
Solicitation, NSF 10-556, Arlington, VA, 2010.
National Science Foundation, “Privacy Act of 1974: Revisions to Systems of Records;
New Systems,” Federal Register, 63(2): 264-273, January 5, 1998.


National Science Foundation Authorization Act of 2002 (P.L. 107-368), 42 U.S.C. 1861,
December 19, 2002.
Newcomer, K. E. (Ed.), Using Performance Measurement to Improve Public and Nonprofit
Programs, Jossey-Bass, San Francisco, CA, 1997.
No Child Left Behind Act of 2001 (P.L. 107-110), Secs. 2201-03, Title II, Part B, January
8, 2002.
Schalock, R. L., Outcome-based Evaluation, Plenum, New York, NY, 1995.
Shapiro, N., Benson, S., Maloney, P., Frank, J., Dezfooli, N. A., Susskind, D., & Muñoz,
M., “Report on Course and Curriculum Changes in Math and Science Partnership
(MSP) Programs,” prepared for the National Science Foundation by the CASHÉ
(Change and Sustainability in Higher Education) Project Team, University System of
Maryland, June 2006.
Yin, R. K., Case Study Research: Design and Methods, 4th ed., Sage Publications,
Thousand Oaks, CA, 2009.
Yin, R. K., “Establishing Long-Term Partnerships between K-12 Districts and Science,
Technology, Engineering, and Mathematics (STEM) Faculty,” MSP-PE, COSMOS
Corporation, Bethesda, MD, July 2009.
Yin, R. K., “Evaluation of NSF’s Math and Science Partnership (MSP) Program: A
Summary After Five Years,” MSP-PE, COSMOS Corporation, Bethesda, MD, March
2010.
Yin, R. K., Hackett, E. J., & Chubin, D. E., “Discovering ‘What’s Innovative’: The
Challenge of Evaluating Education R&D Efforts,” Peabody Journal of Education,
83(4): 674-690, 2008.


