MATH AND SCIENCE PARTNERSHIP PROGRAM EVALUATION (MSP-PE)

INTERVIEW INSTRUMENT FOR SITE VISITS
TO PARTNERSHIPS IN THE MSP PROGRAM
I. Interview with Lead Partnership Staff
(Principal Investigator and Project Coordinator)

Information from this interview will be retained by the National Science Foundation,
a federal agency, and will be an integral part of its Privacy Act System of Records in
accordance with the Privacy Act of 1974 and maintained in the Education and
Training System of Records 63 Fed. Reg. 264, 272 (January 5, 1998). These are
confidential files accessible only to appropriate National Science Foundation (NSF)
officials, their staffs, and their contractors responsible for monitoring, assessing, and
evaluating NSF programs. Only data in highly aggregated form, or data explicitly
requested as “for general use” will be made available to anyone outside of the
National Science Foundation for research purposes. Data submitted will be used in
accordance with criteria established by NSF for monitoring research and education
grants, and in response to Public Law 99-383 and 42 USC 1885c.
Submission of the requested information is voluntary. The public reporting burden
for this collection of information is estimated to average 7 hours including the time
for reviewing instructions. Send comments regarding this burden estimate or any
other aspect of this collection of information, including suggestions for reducing this
burden, to Suzanne Plimpton, Reports Clearance Officer for OMB Collection 3145-0200,
Facilities and Operations Branch, Division of Administrative Services, National
Science Foundation, 4201 Wilson Blvd., Suite 295N, Arlington, VA 22230.

Conducted by:
COSMOS Corporation
3 Bethesda Metro Center, Suite 700
Bethesda, MD 20814

Conducted for:
National Science Foundation
4201 Wilson Boulevard
Arlington, VA 22230

COSMOS, November 8, 2010


OMB No. 3145-0200
Expiration Date: XX/XX/XX

INTERVIEW INSTRUMENT FOR SITE VISITS
TO PARTNERSHIPS IN THE MSP PROGRAM
I. Interview with Lead Partnership Staff
(Principal Investigator and Project Coordinator)
7 hours each
Name and Title of Respondent(s):
Institutional Affiliation:
Date of Interview:
Interviewer:
Hello, my name is ____________, and I work for the MSP-PE project team. We are
carrying out a series of site visits as part of the evaluation of the MSP Program. The evaluation,
like the program, is sponsored by the National Science Foundation (NSF).
My questions will cover the partnership and its activities in math and science education.
Overall, the interview should take no more than 7 hours of your time during the three-day site
visit. All of your responses will be confidential.

A. PARTNERSHIPS

The first portion of the interview will cover the partnership’s priorities, activities,
outcomes, and content. Wherever possible, please note the timing or dates of the
activities.
1. Shared Partnership Vision and Priorities
a. Partners. Please identify the lead partner, core partners, and other partners in this
partnership. Within the partnering institution of higher education (IHE), which academic department(s)
are the leaders? [Obtain a list or handout giving the formal names of these departments.]
b. Creation and Maintenance. To what extent has the partnership created and maintained a
common vision and set of priorities? To what extent, and why, does each of the core partners understand
and maintain the same broad vision and priorities? What conditions have hindered or helped these
processes? [Probe for awareness that a major objective is to strengthen K-12 student performance, as well
as for the partnership to be an R&D project.] [Obtain documentation, including logic models that reflect
vision and priorities.]
2. Pre-existing Partnerships
a. Identity. What portions, if any, of the partnership had a pre-existing relationship that dealt
with goals similar to those of the current partnership? Likewise, had key partnership leaders from different
institutions previously collaborated?
[If none, skip Q. A2b. and Q. A2c.]

b. Relationship to Partnership Startup and Implementation. Describe how each pre-existing
relationship might have helped or hindered the startup and implementation of the partnership (e.g.,
the length of time of the prior relationship and its possible overlap with the time of the MSP award;
sources of funding; activities; and accomplishments).
c. Relationship to Institutional Change and Sustainability. Indicate how any pre-existing
relationships also may be contributing to institutional change and the sustainability of the present
partnership.
3. Partnership’s Component Activities
a. Multiplicity of Activities. Defining an activity as a separate “project” or “program” with its
own distinct set of goals, resources, and timelines, and possibly locations, describe the array of activities
under the partnership (e.g., some partnerships have “sub”partnerships, which alone would be considered
separate activities).1 Is there only a single project or program, or are there two or more?
[If only one, skip Q. A3b. and Q. A3c.]
b. “Project” or “Program” Priorities. What is the relationship, if any, among the multiple
projects or programs, and which have greater staffing or dollar investments?
[Probe for specific funding levels for each project or program within the partnership.]
c. Coordination of “Projects” or “Programs.” How are the multiple projects or programs
coordinated? Will the partnership’s ultimate accomplishments mainly be the sum of their
accomplishments or will they represent some greater whole, and if so, how?

4. Partnership Processes
a. Joint Activities. What kind of partnership activities would have been impossible to carry
out without the formal partnership, compared to a single partner working alone or to more informal
arrangements? [Probe for specific examples.]
b. New or Modified Institutional Arrangements. Besides joint activities, has the partnership
been associated with new or modified institutional arrangements between or among the partners, and if so,
how? [Probe for K-16 vertical integration; cross-campus acceptance of courses for credit, at either K-12
or undergraduate levels; joint K-16 academic appointments; and mergers or reorganizations of
departments.]
c. Explanation of Partnership Processes. What conditions have facilitated, impeded, or
otherwise influenced the partnership’s progress in pursuing its goals and objectives in: 1) the formation
and continuing maintenance of the partnership; 2) designing and implementing the partnership’s activities
(projects and programs), such as the recruitment and retention of the IHE faculty or the K-12 teachers
participating in the activities; and 3) striving to achieve the relevant substantive outcomes, such as
increased teacher knowledge or student performance in mathematics and science education?
[Probe for conditions within the partnership, to avoid repeating Q A5d. below.]

1 The objective is not to list all of a partnership’s sponsored events (e.g., individual workshops) but to
understand how these events might be clustered under a coherent activity or “project” or “program” (e.g.,
summer institutes). Typical activities would therefore include a professional development institute or degree
program (preservice or inservice), a project to review and modify a district’s curriculum standards, internship
programs for K-12 teachers or students, a program to make subgrants to schools or districts for self-proposed
activities, or a program to encourage university-based research on mathematics or science education.
5. Community and Other Contextual Conditions
a. Community Support. How has the partnership engaged the local community (e.g.,
businesses, residents, and community organizations) in supporting its vision and priorities or the
implementation of its activities?
b. Related STEM Activities with Other External Support. In what ways does the
partnership’s work include or overlap with related STEM activities supported by other external sources
(e.g., other NSF awards, ED-MSP awards, or foundation awards)? If related STEM activities exist, what is
their relationship, and how does coordination take place?
c. Family and Parental Involvement. What role does family and parental involvement play
in the partnership’s organization or activities? How has such involvement been encouraged?
d. External Policy and Other Conditions Relevant to the Partnership. How, if at all, have
external policies or other conditions influenced the partnership’s work and outcomes?
[Probe for:
1) community conditions (e.g., population shifts affecting student enrollment; IHE or K-12
labor relations; fiscal conditions; and community economic conditions);
2) state and local education conditions (e.g., state standards and policies such as
curriculum, assessment, graduation, or re-certification policies; state or local court
rulings; the role of state or local school boards or elected officials; changes in local
practices or policies such as new class-size restrictions; and changes in IHE
requirements or practices);
3) federal education conditions (e.g., conflicts and complementarities related to the No
Child Left Behind Act of 2001); and
4) K-12 student conditions (e.g., continuation or influx of racial, ethnic, or ELL students).]

B. EVIDENCE-BASED DESIGN AND OUTCOMES

Now let’s talk about evidence-based design and outcomes. I would like to discuss:
your array of data collection activities; two specific data collection activities in greater
detail; and how the evaluation has been managed.
1. Array of Data Collection Activities
a. Match with “Projects” or “Programs.” Define the data collection activities associated
with each of the “projects” or “programs” previously defined under Section A, including the evaluation
methods used. To what extent are the evaluation activities covering every facet of the partnership’s
operations and its activities?
b. Instrumentation and Sample Record. For each evaluation activity, obtain a copy of the
full instrumentation and a sample record. [See Exhibit A for examples of instruments on teacher quality.]

SKIP NEXT QUESTION (Q. B2)
2. Selected Data Collection Activities
a. Two Important Data Collection Activities. Select the two most important evaluation
activities focusing on: 1) K-12 student performance outcomes, and 2) the K-12 teaching force or K-16
infrastructure. For each of the two activities, what findings have been formally reported by the
partnership, and to whom? [Probe for reports, tables and exhibits, and slide presentations, and whether
the activities might be tracking the emergence of any innovations or new ideas for mathematics and
science education.]
b. Nature of Formal Research Designs. What is the research design or formal
methodological logic underlying each of these two evaluation activities? [Distinguish among:
longitudinal and cross-sectional designs; the use of any comparison or control groups; and true and other
forms of experimental or quasi-experimental designs.]
c. Review of Data. For the same two data collection activities, review the data supporting
the findings and conclusions to understand the strengths and weaknesses of the evidence. Describe any
cautions to be exercised in using the findings. [Suggestion: Review specific tables and exhibits that
present the key findings, discussing in detail the variables, measures, data, and design logic that led to the
findings.]
3. Evaluation Management
a. Implementation. How did the partnership solicit the evaluation and select the evaluators,
and to what extent did the partnership define the evaluation’s design and data collection activities? Has
the evaluation received guidance or instruments from the evaluators of other partnerships or from other
colleagues, and how useful has such guidance been?
b. Changes in Evaluation Direction. In what ways, if any, have the evaluation design or
data collection activities changed from those originally planned? What have been the main reasons for
these changes, if any?
c. Practical Lessons. What procedures or techniques have been used to overcome common
barriers in doing field-based evaluations, such as gaining the needed access to data, or overcoming missing
data or low response rates when surveys are conducted?
Exhibit A
EXAMPLES OF INSTRUMENTS ON TEACHER QUALITY

I. Content Tests
a. Praxis II
b. Best Practice Measure Test
c. Conceptual Understanding Test
d. Mathematics Problem
II. Pedagogical Content Test
a. Learning Mathematics for Teaching
III. Surveys/Questionnaires
a. Horizon Online Survey
b. Test of Science Related Attitudes Survey
c. Survey of Enacted Curriculum
d. Survey of Classroom Practices
IV. Interview Protocols
a. DEMC Math Course Interview Protocol
b. Mathematics Teacher Leader Interview Protocol
V. Observation Instruments
a. Observation Protocol
b. Teaching and Learning Protocol
c. Horizon Classroom Observation Protocol
VI. Scoring Rubrics
a. Peer Classroom Observation Protocol Rubric
b. Reformed Teacher Observation Protocol Rubric
VII. Logs
a. Online Teacher Log
b. Facilitation Log
c. Activity Log

C. TEACHER QUALITY, QUANTITY, AND DIVERSITY

Next, I would like to discuss the priorities, implementation, and outcomes for
activities aimed at teacher quality, quantity, and diversity. Throughout this discussion,
please clarify the grade levels and academic subjects involved, distinguishing especially
among elementary, middle, and high schools and between mathematics and science.
1. Partnership’s Priorities
a. Selection Rationale. How did the partnership select its K-12 teacher-oriented activities?
[Probe for: continuation of a pre-existing activity; use of a formal needs assessment (if so, obtain data); a
derived consensus among original partners; or use of a research and education theory.]
b. Relationship to Other Partnership Activities. As implemented, how have the activities fit
within the partnership’s overall work? [An illustrative relationship would be where the teacher training
aligns with a district’s mathematics curriculum that might have been newly adopted with the partnership’s
assistance.]
c. Assumed Relationship to K-12 Student Performance. In principle, how are the teacher-oriented
activities related to K-12 student performance? [Probe for whether the relationship is “proximal”
(serving current K-12 students) or “distal” (serving some future group if not generation of K-12
students).]
2. Implementation and Outcomes
a. In-depth Description of Two Main Activities. Describe two main activities, 1) a
preservice activity (training of teachers-to-be), and 2) an inservice activity (training of existing K-12
teachers). The descriptions should cover the activities as enacted (e.g., the actual service providers,
curriculum, instructional methods, and logistical arrangements) [see Exhibit B for examples]. How do
these activities deal with teacher quality, quantity, and diversity? [Probe for whether the level of detail of
the descriptions is sufficient to replicate the activity at some other site, as well as the nature of the
partnership’s records of such detailed descriptions.] [Note: If the partnership only has preservice or
inservice activities but not both, cover two activities within preservice or inservice; if there is only one
activity, limit the discussion to it.] [Obtain copies of the training or curriculum materials and syllabi used
in these activities.]
b. Math and Science Content. How are these activities maintaining quality control over
their math and science content?
c. Instructional Practices. How are these activities maintaining quality control over the
quality of their instructional practices?
d. Implementation Progress. To what extent has each activity progressed toward its
originally-stipulated outcomes?
e. Implementation Conditions. What conditions have facilitated, impeded, or otherwise
influenced progress in implementing the activities? [Probe for over- or under-recruitment and conflicts
with teachers’ other requirements or work.]
f. Implementation Outcomes. What have been the results of the activities in terms of K-12
teachers’ knowledge or practices, or in terms of K-12 student performance?
Exhibit B
DESCRIPTIVE PROFILE OF ENACTED PROFESSIONAL DEVELOPMENT (PD)
A. Participants
1. Role being targeted by PD (teacher, teacher leader, principal)
2. Part of a school-based team attending the PD (or not), with PD supporting a structured professional
learning community
3. Part of a team operating at the district level, even if representing different schools
4. Whether volunteer or mandated by district to attend PD
5. Amount of stipend, if any (assumes that everyone’s expenses are fully paid)
6. If PD was grade and subject specific, whether participant returned to same venue after the PD
B. Providers:
1. Individual profile of each provider, covering experience and expertise as a provider
2. Training and experience in the content of the PD (could be curricular or leadership content)
3. Whether the provider has offered this particular PD before, and if so, how many times or seasons
4. Any kudos or feedback from prior participants about performance as a provider
C. PD Setting:
1. Length of PD (how many consecutive hours over how many consecutive days)
2. If off-site, length and mode of travel
3. Convenience: timing of PD in relation to participants’ normal school and vacation schedules
4. Size of PD classes, length of classes, and no. of classes per day
5. If residential PD, quality of accommodations and whether other family members are present
6. Nature of institution sponsoring the PD (IHE, state or district, or independent nonprofit)
D. PD Curriculum:
1. Subject, topics, and levels covered by PD "courses"
2. Texts and materials used in each "course"
3. What previous research, if any, was found about the course
4. Whether course was previously offered, and feedback reported from the experiences
5. Whether course covers specific ways of teaching the same material in the participant’s K-12 classroom
6. Whether course includes preparation of any kind of plan or action plan for later adapting what was
learned for the participant’s K-12 classroom
E. PD Instruction:
1. Whether a single provider, a pair, or a team of providers delivers the same course (the same class?)
2. Amount of didactic versus participatory instruction
3. Extent and nature of cooperative or peer-peer learning
4. Use of hands-on exercises and materials
5. For inquiry-based contents, actual demonstration and practice in inquiry-based methods
6. Frequency and amount of PD homework, classroom assessments, and group projects
7. Whether participants receive feedback about or discuss their performance
F. PD Assessment:
1. Group scores on content knowledge tests
2. Group scores on any other tests of participants
3. Results of any observations of PD classes (e.g., to check for fidelity, time-on-task, etc.)
4. Assessment of PD providers’ performance (even if only based on participants’ ratings)


D. CHALLENGING COURSES AND CURRICULA

I would like to discuss some similar issues, except by changing the main topic to
challenging courses and curricula. Throughout the discussion, please again clarify the
grade levels and academic subjects involved, distinguishing especially among
elementary, middle, and high schools and between math and science.
1. Partnership’s Priorities
a. Selection Rationale. How did the partnership select its K-12 curriculum-oriented
activities? [Probe for: continuation of a pre-existing activity; use of a formal needs assessment (if so,
obtain data); a derived consensus among original partners; or use of research and education theory.]
b. Relationship to Other Partnership Activities. As implemented, how have the activities fit
within the partnership’s overall work? [An illustrative relationship would be where the partnership has
helped to develop district or classroom assessments or lesson planning and also has provided related
teacher training.]
c. Assumed Relationship to K-12 Student Performance. In principle, how are the
curriculum activities related to K-12 student performance? [Probe for whether the relationship is
“proximal” (serving current K-12 students) or “distal” (serving some future group if not generation of
K-12 students).]
2. Implementation and Outcomes
a. In-depth Description of Main Activity. Describe the main curriculum-oriented activity.
How is the desire to attain a “challenging” curriculum operationalized or otherwise expressed in this
activity? [Probe for operational definitions of “challenging” curricula; obtain a formal list of the
curricula being used and a sample of the curriculum material.]
b. Math and Science Content. How is this activity maintaining quality control over its math
and science content?
c. Instructional Practices. How is this activity maintaining quality control over the quality
of the instructional practices used in the activity?
d. Implementation Progress. To what extent has the activity progressed toward its
originally-stipulated outcomes?
e. Implementation Conditions. What conditions have facilitated, impeded, or otherwise
influenced progress in implementing the activity?
[Probe for alignment with standards and assessments and changes in district curricula.]
f. Implementation Outcomes. What have been the results of the activity in terms of K-12
teachers’ classroom practices, or in terms of K-12 student performance?


E. ROLE OF IHE DISCIPLINARY FACULTY

Again, I would like to explore the same issues as just discussed, except now
focusing on the role of IHE STEM discipline faculty (faculty who do their main research
in a STEM discipline field and who are members of a STEM discipline department).
1. Partnership’s Priorities
a. Selection Rationale. How did the partnership select its discipline faculty activities?
[Probe for: continuation of a pre-existing activity; use of a formal needs assessment (if so, obtain data); a
derived consensus among original partners; or use of research and education theory.]
b. Relationship to Other Partnership Activities. As implemented, how have the discipline
faculty’s activities fit within the partnership’s overall work? [An illustrative relationship would be where
the partnership supports discipline faculty research that also involves K-12 teachers or students as
interns.]
c. Assumed Relationship to K-12 Student Performance. In principle, how are the discipline
faculty’s activities related to K-12 student performance? [Probe for whether the relationship is
“proximal” (serving current K-12 students) or “distal” (serving some future group if not generation of
K-12 students).]
2. Implementation and Outcomes
a. Discipline Faculty’s Activities. To date, what have been the amount and quality of the
activities, and how have they met or not met the partnership’s original expectations? [Probe for the array
of activities and obtain relevant materials.]
b. Disciplinary Faculty Products/Contributions. Thus far, how have the discipline faculty
contributed to new ideas or innovations in mathematics and science education? [Probe for specific
products/contributions.]
c. Professional Advancement. How has involvement with K-12 work counted toward
professional advancement in the discipline faculty’s home department? Have there been any changes in
formal IHE policies, and if so, why or why not? [Probe for new research opportunities and career tracks,
as well as differences, if any, between junior and senior faculty.]
d. Implementation Progress. To what extent have the activities progressed toward their
originally-stipulated outcomes?
e. Implementation Conditions. What conditions have facilitated, impeded, or otherwise
influenced progress in implementing the activities? [Probe for the spanning of boundaries across IHE
departments or schools, identifying the widest spans and how they have been managed or coordinated.]
f. Implementation Outcomes. What have been the results of the activities in terms of K-12
teachers’ or students’ performance?

F. EXPLANATIONS REGARDING THE PARTNERSHIP’S WORK

The earlier sections have already asked about the conditions that have possibly
facilitated, impeded, or otherwise influenced progress in implementing the partnership
and its activities (see Qs. A4c, C2e, D2e, and E2e). Without repeating those responses, I
would like to get a fuller and broader view of the workings of the partnership.
1. Building and Maintaining a Math and Science Partnership
a. Influential Conditions. How would you best describe the conditions that have most
influenced the running of the partnership and its ability to support high-quality math and science education
activities? [Probe for internal and external conditions.]
b. Other Implementation Conditions. What kinds of conditions would make it easier or
harder for the partners to work together and to implement the partnership’s activities?
2. Rival Explanations
a. Prior Relationships. To what extent did the partnership start afresh, compared to having
had collaborative relationships between key individuals or organizations that predated the start of the MSP
award?
b. Non-MSP Relationships. To what extent have non-MSP conditions (e.g., new district,
university, state, or federal education policies) been a driving force in promoting the partnership and its
activities?


