MEMORANDUM
TO: Jasmeet Seehra, Office of Management and Budget (OMB), Desk Officer
CC: Leland Melvin, Associate Administrator NASA Office of Education,
James L. Stofan, Deputy Associate Administrator, NASA Office of Education,
Shelley Canright, Outcome Manager, NASA Office of Education
Mary F. Sladek, Outcome Manager, NASA Office of Education
Rob Lasalvia, Project Manager, NASA Education
Rick Gilmore, NASA Senior Education Programs Specialist
Shelly Martinez, Office of Management and Budget
Lori Parker, NASA Reports Clearance Officer
Frances Teel, NASA OCIO
FR: Brian L. Yoder, Ph.D., Evaluation Manager, NASA Office of Education
RE: NASA Education Responses to OMB inquiries about the NASA Explorer Schools Project and the related PRA package
DATE: 1 April, 2011
Dear Jasmeet:
NASA Education is pleased to submit to you answers to your questions about the NASA Explorer Schools (NES) project and the associated PRA package from our meeting at NASA Headquarters on Monday, 28, 2010. We look forward to speaking with you about the project and the package.
Respectfully yours,
Brian Yoder
1. NES is not a new study. This is a proposal to reinstate, with change, an expired study; therefore, how does the proposed study relate to last year’s pilot study? What were the results from the pilot research study, and how did/does the NES program use them? How do they inform the design of the proposed study?
In 2008, an expert panel report from the National Research Council recommended that the NASA Explorer Schools (NES) model be redesigned. The report recommended that the new NES model reach more schools and students and focus on motivating students around NASA themes so that they are exposed to and consider STEM careers. In response, NASA redesigned the NES project. The redesigned NES project was pilot-tested over a three-month period during the spring of 2010 with 57 teachers from 48 schools. Data collection during the project pilot, which was conducted under an emergency clearance, informed project modifications for the full implementation. During the 2010-2011 academic year, the NASA Explorer Schools project entered its first full year of operations under the new model of implementation. The newly launched version of the NASA Explorer Schools project is intended to be NASA's classroom-based gateway for middle and high school students, providing authentic learning experiences inspired by NASA's unique missions.
The pilot project implementation and data collection were intended to test core project features, including curriculum support modules and electronic teacher professional development. The data collected during the pilot period (teacher and student surveys, interviews, and focus groups) helped identify barriers to participation and provided feedback that was used to refine the project.
Based upon the pilot study, project management and the evaluation team identified key research questions that are the basis of a formative evaluation phase intended to be implemented during the first full year of the project. The pilot study looked at discrete usage of features and products during a short period of time, and did not track usage patterns or barriers over the course of a full year. The 2011 evaluation design focuses on understanding usage of teaching materials, professional development and student engagement opportunities over the course of the school year.
Pilot data have been instrumental in refining first-year operations and improving project usability for teachers. For example, the pilot revealed that teachers were using electronic professional development video segments in their classrooms to engage students, demonstrate activities, and present information in the modules. In response, NES created video segments that are meant to be used in the classroom directly with students. In another example, in response to teachers’ feedback that they would like more assistance in identifying which modules would be appropriate for their classrooms, NES has included identifiers and links to assist teachers in selecting appropriate modules. On each curriculum module homepage, NES has identified the subject(s) covered, topic(s) covered, classification of activity type, targeted grade level, instructional objective, estimated time required to complete the activity, a list of materials needed, and alignment to national content standards. In addition, for lengthy modules, NES has selected featured lessons within these products.
The following passage from the current supporting statement explains the proposed research study in the context of the pilot study and future research efforts.
“To help fulfill the evaluation needs of the NES project, NASA developed an evaluation plan that started with data collection and program feedback during the pilot. The information collected during the pilot was used by the NES project to make project modifications prior to full implementation. The current stage of the NES evaluation plan involves formative feedback and collection of data on related outcomes; OMB approval is being sought for these components. The NES evaluation plan extends to include a future impact study, not included within the scope of the current request, if warranted by the findings of the current evaluation.
In the current stage of project development, NASA plans to collect data through a formative feedback process that is designed to explore the structures and processes of NES and the implementation of project components, and to begin to explore outcomes related to project activities. The primary methods of data collection will include a review of program data, teacher surveys, teacher logs, student surveys, teacher interviews, and student focus groups. There are a limited number of respondents within the general public who will be affected by this research, including teachers participating in NES and their students. NASA will use the NES project evaluation data analyses to inform project modifications as necessary and to begin to explore whether there is preliminary evidence of intended program outcomes. Should NES determine that an impact study is warranted, a separate request for OMB clearance would be made in the future.”
2. What does NASA anticipate learning from the revised data collection? Who will use the results, and how will the results be used? Will NASA or Abt publish the results? How will the study participants learn of the study results?
The NES project redesign process included the development of an evaluation plan to ensure that project development and modification would be informed by data and that the NES project team is using data to guide its decisions. In order to continue to modify and improve the NES project, additional formative feedback is necessary. The next evaluation phase for NES, the formative evaluation beginning in spring 2011, will allow for continued project improvements based on implementation data. This formative evaluation will investigate whether the project overall and its individual components are being implemented as planned. The formative evaluation will document additional lessons from the full NES implementation that can inform program improvement. The evaluation strategy for fall 2012 is to obtain real-time implementation data on the program via teacher logs. This will complement the teacher interviews from spring 2012, in which teachers will be asked to reflect on usage from the past academic year. Additionally, we will begin to investigate whether there is evidence that program participants are exhibiting changes in the intended outcomes, including pre-post gains on outcomes of interest. Although the collection of outcomes data is not designed to measure project impact, it will provide some preliminary evidence on whether there are observable changes in outcomes of interest. This information can help inform future decisions about whether to make the investment that would be required for a rigorous impact evaluation. Abt Associates, Inc. will prepare a report summarizing findings from the evaluation with NASA, and NASA will share evaluation findings with project participants through the Virtual Campus website. NASA will consider publishing results more broadly as the project matures.
3. Have you considered adding a disclaimer to project reporting about the proposed study that can be easily understood that speaks to the limited use of data for program planning/management and as a program evaluation?
The following text can be added to data reports as a disclaimer: This evaluation was designed to provide feedback for program planning and management decisions and to begin to test program theory. The design does not test the impact of the NES project, nor does it warrant causal claims.
4. Can NASA better explain 1) the limitations of the proposed study’s research design and 2) future phases of the study with estimated timeline? Will the overall study questions get more rigorous over time? If so, how and when?
The NASA Explorer Schools project is committed to using evaluation methodologies appropriate to the maturity of the project design. In the first year of full implementation, the project is focused on formative evaluation techniques, with the intention of characterizing usage patterns, implementation best practices, and barriers to sustained participation to ensure consistent delivery of project resources against the project design. After this verification of the project model, the project will transition to a data collection effort that includes gathering outcomes data to understand whether there is evidence that the project’s intended outcomes are present. Note that, because of the nature of the design, a causal link between the project implementation and intended outcomes cannot be established, and no causal claims will be made. Instead, the data collection on outcomes, which is scheduled to begin in FY 2012, will be used to inform decisions about whether a more rigorous impact study should be designed and pursued. Note that the impact study is NOT included in the current approval request. The design of an impact study would be informed by the currently proposed data collection. A tentative timeline for the evaluation, including a possible impact study, is included in the table below.
Fiscal Year | Academic Year | NES Project Year | Evaluation Stage/Activity
FY 2010 | 2009-10 | Pilot | Program pilot and data collection (completed)
FY 2011 | 2010-11 | Year 1 | Formative assessment/qualitative data collection
FY 2012 | 2011-12 | Year 2 | Formative assessment, including collection of outcome (NOT impact) data; survey data on implementation and teacher and student outcomes
FY 2013 | 2012-13 | Year 3 | Review of formative evaluation findings and decision about a future impact study; if warranted, design of impact study
FY 2014-2015 | 2013-15 | Years 4-5 | Conduct impact study as warranted
The currently proposed data collection is designed to answer the following research questions:
Evaluation Research Questions (FY 2011 and FY 2012)
- What are the characteristics of schools, teachers, and students that participate?
- What does NASA provide as part of NES?
- What components of NES do teachers access and use?
- How is NES being implemented in schools and classrooms?
- How are teachers supporting their use of NES?
- What are barriers to implementation?
- What are reasons for partial participation?
- What are users’ impressions of materials?
- What best practices do teachers use in the areas of curriculum integration, student engagement, technology use, community outreach, and family involvement?
- Are NES teachers collaborating with one another?
- What are teachers’ comfort and confidence with NES products?
- Does teachers’ comfort with STEM topics change with participation in NES?
- What are the levels of student engagement in NES and STEM activities?
- Do students associate perceived changes with NES activities?
- Is there a change in student attitudes towards STEM before and after the implementation of NES?
- Is there a change in student interest in other NASA activities?
- Is there a change in student interest in NASA-related STEM careers?
If warranted, a future impact study would be designed to answer questions regarding the effect of participation in the project on teachers’ knowledge of NASA opportunities, knowledge of NASA-STEM content, and comfort with the STEM substantive knowledge necessary to teach NES modules, and students’ interest and engagement in STEM.
5. Explain how data collected will be analyzed this summer. How does the NES program expect to use the analyses to improve program delivery next year? What are the research questions?
The table below presents the study’s research questions and maps them to the proposed data collection activities; a description of the planned analyses follows.
Table A.2 Map of Research Questions to Data Sources

Research Question | Data Sources Year 1 | Data Sources Year 2

Participants
What are the characteristics of schools, teachers, and students that participate? | Program data | Teacher pre/post surveys, student survey, program data

Implementation
What does NASA provide as part of NES? | NES staff interviews | NES staff interviews
What components of NES do teachers access and use? | Teacher interviews, student interviews, program data | Teacher post survey and logs
How is NES being implemented in schools and classrooms? | Teacher interviews, student interviews | Teacher logs
How are teachers supporting their use of NES? | Teacher interviews | Teacher logs
What are barriers to implementation? | Teacher interviews | Teacher pre/post surveys
What are reasons for partial participation? | Teacher interviews |
What are users’ impressions of materials? | Teacher interviews, student interviews | Teacher pre/post surveys and logs
What best practices do teachers use in the areas of curriculum integration, student engagement, technology use, community outreach and family involvement? | Teacher interviews | Teacher logs
Are NES teachers collaborating with one another? | Teacher interviews | Teacher logs

Teacher outcomes
What are teachers’ comfort and confidence with NES products? | Teacher interviews | Teacher pre/post surveys and logs
Does teachers’ comfort with STEM topics change with participation in NES? | Teacher interviews | Teacher pre/post surveys

Student outcomes
What are the levels of student engagement in NES and STEM activities? | Teacher interviews, student interviews | Teacher logs
Do students associate perceived changes with NES activities? | Student interviews |
Is there a change in student attitudes towards STEM before and after the implementation of NES? | Student interviews, teacher interviews | Student pre/post surveys
Is there a change in student interest in other NASA activities? | Student interviews | Student pre/post surveys
Is there a change in student interest in NASA-related STEM careers? | Student interviews | Student pre/post surveys
Data collected in year 1 will be qualitative and will be analyzed by coding the interview and focus group transcripts. Analysis will consist of an iterative process in which the data are culled for themes that emerge about the components of NES that teachers access and use, the modifications they make, supports they use in implementing NES in their classroom, challenges they encounter in their use of the materials, and the best practices they employ in their use of the materials.
The findings will be presented to NES project management so that they can decide whether modifications to the NES project are warranted prior to implementation in Year 2. For example, NES project management will review the emergent themes from users’ impressions of materials to identify content gaps in its resources. If teachers express a desire for a greater number of resources to support a technology curriculum, NES can modify its selection of resources to address this need. NES project management also expects to closely analyze barriers to participation and to design and implement strategies to remove any barriers that the project can directly affect. The findings from year 1 will enable NES to make data-driven improvements to project offerings and delivery mechanisms to address the classroom and educational needs of its target audience.
For data collected in year 2, descriptive and correlational analyses are planned. Analyses of quantitative survey data will include a detailed summary that uses appropriate descriptive statistics. For survey items using continuous scales, the study will calculate means and standard deviations to describe both central tendency and variation. Frequency distributions and percentages will be used to summarize answers given on ordinal scales. Correlational analyses will be used to investigate associations between project components and outcomes of interest. Statistical tests, such as chi-square analyses or t-tests, will be used to test for differences between pre- and post-time points. The analyses of the interview and focus group data will include simple frequencies as well as descriptive summaries of emergent themes.
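As a minimal illustrative sketch only, and not the evaluation team's actual analysis code, the following Python fragment shows the kinds of descriptive, correlational, and pre/post analyses described above; the data file and column names are invented for illustration.

import pandas as pd
from scipy import stats

# Hypothetical matched pre/post teacher survey records (invented file and column names).
df = pd.read_csv("teacher_prepost_survey.csv")

# Descriptive statistics (means, standard deviations) for continuous-scale items.
print(df[["comfort_pre", "comfort_post"]].describe())

# Frequency distribution and percentages for an ordinal item.
print(df["usage_frequency"].value_counts(normalize=True) * 100)

# Correlation between a project-component measure and an outcome of interest.
print(df["modules_used"].corr(df["comfort_post"]))

# Paired t-test for pre/post change on a continuous-scale outcome.
t_stat, p_value = stats.ttest_rel(df["comfort_post"], df["comfort_pre"])
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")

# Chi-square test comparing the distribution of a categorical response at pre
# versus post (pre and post treated as separate groups of responses).
pre_counts = df["engaged_pre"].value_counts()
post_counts = df["engaged_post"].value_counts()
table = pd.concat([pre_counts, post_counts], axis=1).fillna(0)
chi2, p_value, dof, expected = stats.chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")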
Findings will be presented to NES project management so that they can make judgments about fidelity of implementation, that is, whether, based on the descriptions of teacher use from the teacher logs, the NES materials are being used with the frequency and in the manner for which the project was designed. The descriptive data on change in teacher and student outcomes will be reviewed to determine whether there is preliminary evidence that the intended outcomes of the project are being observed. As in year 1, these results will be reviewed and will inform decisions on project modifications. For example, if it is found that teachers are not using all of the individual components of the NES program (e.g., they use classroom modules without viewing the associated electronic professional development), we may concentrate communication campaigns on the advantages of using all the components of NES together and draw explicit links between the associated components.
6. Teacher survey: a) Add more of an explanation about the study to teachers in the introductions to the survey, teacher interviews, and teacher logs. There should be more of an explanation to teachers as to why this information is being collected and how it will be used. Something like --- “the feedback we receive from you will be used to…”
The initial materials that will be provided to recruit teachers into the data collection activities will provide information about the type of data being collected and its uses. A brief description will also be added at the beginning of each survey and log. Please note that the teacher invitation information can be found in Appendix H (IRB-approved notification materials) of the NES Evaluation supporting statement.
7. Change the parental consent forms to ask the student’s demographic data questions. Improve the introductory statement to ensure it is an active consent form.
Because the project activities will take place in a commonly accepted educational setting, involve regular education practices, and investigate instructional curricula, we ask that OMB allow the currently proposed parental notification protocol. The study protocol was reviewed by the Institutional Review Boards at both EDC and Abt.
The EDC IRB has exempted the data collection activities because they meet one or more of the criteria for exempt research provided for in 45 CFR 46.101(b)(1):
(b) Unless otherwise required by department or agency heads, research activities in which the only involvement of human subjects will be in one or more of the following categories are exempt from this policy:
(1) Research conducted in established or commonly accepted educational settings, involving normal educational practices, such as (i) research on regular and special education instructional strategies, or (ii) research on the effectiveness of or the comparison among instructional techniques, curricula, or classroom management methods.
The study received Abt IRB approval for a parent notification protocol, and we request permission to proceed with the IRB-approved protocol that involves parent notification. Please note that the parent notification information can be found in Appendix H (IRB-approved notification materials) of the NES Evaluation supporting statement.
8. Student Survey. Remove questions on race/ethnicity. Reduce the 7-point scale on the student survey to a 5-point scale or explain the research behind the 7-point scale. Explain why the title on the student survey is broader than grades 5-9—the NES target grades. Was this a typo or is there an expectation that students outside that target area are participating?
The race/ethnicity questions have been removed from the student survey.
A 7-point scale was used instead of a 5-point scale to try to make the scales more sensitive to change. In last year’s Summer of Innovation evaluation, surveys used a 5-point scale rather than a 7-point scale, and no pre/post differences were found. There is some concern that this lack of difference reflected the insensitivity of the instrument rather than the stability of participants’ opinions. Further, surveys used in the NES pilot evaluation employed a 7-point scale, and though those analyses were done on aggregate rather than paired data, we were able to detect pre/post differences.
The title on the student survey is broader than grades 5-9 because NES targets grades 4-12. NES is NASA's classroom-based gateway for middle school (grades 4-8) and high school (grades 9-12) students, which provides authentic learning experiences designed around NASA's unique missions while promoting student engagement in science, technology, engineering, and mathematics, or STEM. NES uses a Virtual Campus website (http://explorerschools.nasa.gov) to provide professional development and support for educators and allows students to participate in NASA's missions of discovery and exploration. NES offers cross-cutting NASA STEM content modules for middle school and high school teachers to implement in their classrooms. To be a participant in NES, you must be a U.S. citizen and have a current, valid education certification as an administrator or educator in a state or nationally accredited education institution (grades 4-12) in the United States or a U.S. territory, or in a Department of Defense or State Department school.
9. How does NES establish a sustained commitment from teachers and students? What are we learning about teachers’ long-duration participation from our evaluation?
The student outcomes of NES relate to interest and engagement. NASA’s precollege education programs serve a complementary role to the efforts of the National Science Foundation and the U.S. Department of Education, which play a lead role in the nation’s precollege federal STEM education efforts. Most of NASA’s K-12 efforts align with the national goal of engagement, defined by the Academic Competitiveness Council as to “increase students’ engagement in STEM and their perception of its value to their lives.”1 The National Research Council (NRC) report that reviewed NASA’s precollege education programs commented specifically on NASA’s precollege programs with this focus:
Recommendation 2: The exciting nature of NASA’s mission gives particular value to projects whose primary goal is to inspire and engage students’ interest in science and engineering, and NASA’s education portfolio should include projects with these goals. Because engineering and technology development are subjects that are not well covered in K-12 curricula, projects aimed at inspiring and engaging students in these areas are particularly important.2
The NES project is designed to contribute to NASA’s education goals by increasing student interest and engagement with NASA resources, STEM activities, and STEM careers. The outcomes that will be investigated in this evaluation follow directly from the NES project logic model, as is the practice in theory-based evaluations and as recommended by the NRC report. Please note that the NES logic model can be found in Appendix A (NES Logic Model) of the NES Evaluation supporting statement.
As noted above, NES uses the Virtual Campus website to provide professional development and support for educators and offers cross-cutting NASA STEM content modules for middle school and high school teachers to implement in their classrooms. In practice, NES engages students through participation in classroom experiences led by teachers. NASA works to develop and maintain long-term engagement with teachers in the NES project in a variety of ways:
Incentivizing Participation: NES designed a recognition structure that incentivizes in-depth, long-duration participation in the project. High performing teachers who demonstrate innovative and sustained use of NES resources both inside and outside the classroom are offered exclusive opportunities to participate in educator research experiences at the various NASA centers across the country. NES identifies those teachers through an application process where participants provide narratives regarding their participation in the project, impact of NES resources, improved trends in student interest in STEM, and integration of NES with existing classroom technologies and curriculum. These narratives also support NES’s efforts to ensure that its resources effectively bolster teachers’ efforts to enhance student achievement in STEM topics and will be accessed and utilized on a continuous basis.
Credit-Bearing Electronic Professional Development and Ongoing Support: NES offers professional development opportunities focused on supporting teachers in the use of NASA resources. These opportunities provide Continuing Education Units (CEUs) to teachers who participate in recurring sessions over the course of an academic year. CEUs are provided by and administered through NES partner Oklahoma State University.
Customer-Focused Content: NES consistently develops and posts new content on the Virtual Campus throughout the year to maintain the relevance and dynamism of the project. NES will also continually analyze participant feedback to make data-driven decisions regarding the replacement of less popular resources with new content to address any gaps in current NES offerings. NES has retained a large number of teachers from the pilot effort and the historic model of the NES project, and these “alumni” teachers continue to exhibit some of the highest resource usage rates this year, demonstrating the enduring value of NES resources and offerings over time.
Learning Communities: NES engages participants through a variety of social media tools to build a community of NES STEM teachers. NES has begun to facilitate discussions surrounding participant-submitted stories of classroom best practices and strategies for the implementation of NES resources. These discussions will enable NES to identify opportunities to better align its offerings with classroom needs and to facilitate integration of the NES model into established curricula. This community will also serve to strengthen teachers’ investment in the project, their self-identification as part of the NES community, and their sustained participation by ensuring that the NES project continually offers opportunities to collaborate with other educators and further develop their classroom expertise.
1 U.S. Department of Education, Report of the Academic Competitiveness Council, Washington, D.C. 2007, p.18.
2 National Research Council. (2008). NASA’s Elementary and Secondary Education Program: Review and Critique. Committee for the Review and Evaluation of NASA’s Precollege Education Program, Helen R. Quinn, Heidi A. Schweingruber, and Michael A. Feder, Editors. Board on Science Education, Center for Education. Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press, p.6.