NASA Explorer Schools

OMB: 2700-0152

Request for Clearance

NASA Explorer Schools (NES) Evaluation


List of Appendices related to this Supporting Statement

Appendix A – NES Program Logic Model

Appendix B – Student Focus Group Guide

Appendix C – Teacher Surveys (Pre and Post) – screenshots and draft Word versions

Appendix D – Teacher Log – screenshots and draft Word versions

Appendix E – Student Survey (Grades 6-12 and 4-5)

Appendix F – Map of Research Questions to Data Sources

Appendix G – IRB-Approved Notification Materials

Appendix H – 60-day FRN published October 14, 2010

Appendix I – Parent Consent Forms



Section A


Introduction


The National Aeronautics and Space Administration (NASA) Office of Education requests that the Office of Management and Budget (OMB) approve, under the Paperwork Reduction Act of 1995, a two-year clearance for NASA to conduct data collection to inform the monitoring and improvement of the NASA Explorer Schools (NES) project. NES is designed to provide supplemental curricular materials that offer authentic classroom learning experiences for students in grades 4 through 12.


The NES project is still in the early stages of implementation, and this formative study is designed to collect information that will be used for program improvement and modification and to begin to explore whether there is preliminary evidence that desired outcomes are being observed. The formative study is being conducted for NASA by its contractor Abt Associates Inc. (Abt) and Abt’s subcontractor Education Development Center (EDC), with assistance from Dillon-Goodson Research Associates.


The formative study will gather information about project implementation from NES participants through teacher surveys and teacher logs, and will measure baseline and post-program values on outcomes of interest through teacher surveys and student surveys. In addition, focus group interviews will be conducted with students in a small subset of classrooms.


NES provides authentic learning experiences, based on NASA’s missions, for middle school (grades 4-8) and high school (grades 9-12) students. Responding to recommendations from the National Research Council committee that reviewed NASA’s elementary and secondary education projects,1 NASA embarked on a redesign of the NASA Explorer Schools (NES) project in 2008.2 In its recent report, Prepare and Inspire: K-12 Education in Science, Technology, Engineering, and Math (STEM) for America’s Future, the President’s Council of Advisors on Science and Technology (PCAST) concluded that to improve STEM education, the country needed to focus on both the preparation and the inspiration of students.3 The NES model aligns with the focus of the PCAST report: the project represents a coherent effort by NASA to help prepare students in STEM and inspire them to pursue STEM careers or, at a minimum, become part of a STEM-literate citizenry.


The development of the new NES model involved a working group composed of individuals from NES, the Mission Directorates, staff from NASA’s Office of Education (OE), strategic partners, teachers, administrators, and leading members of the national STEM education community. The redesigned NES model includes four core elements: (1) STEM curriculum support materials (modules); (2) electronic professional development (ePD); (3) virtual NASA news events (NASA Now); and (4) teacher, student, and school recognition opportunities. See Appendix A for a depiction of the program logic model.


The new NASA Explorer Schools (NES) model focuses on implementing high-quality NASA content and curricular support resources. These curricular modules were selected through a systematic process involving a partnership among the International Center for Leadership in Education (ICLE), the International Technology and Engineering Educators Association (ITEEA), the National Science Teachers Association (NSTA), and teacher practitioners. Twenty core products are available, representing all NASA Mission Directorates and current NASA missions in Earth and space science, mathematics, chemistry, and physics. Resources include curriculum support guides, design challenges, problem-based learning sets, mission-based educational support materials, lesson plans, multimedia resources, and hands-on engagement opportunities. The NES Virtual Campus website provides short-duration professional development experiences for educators. In addition to training on specific STEM topics, the Virtual Campus provides teachers with social networking opportunities with peers, NASA educators, and subject matter experts. Special tools and resources enable students and educators to make connections between areas of study and real-life applications; these include video clips of “teachable moments” and design challenges as well as links to external resources available to the STEM education community.


To help fulfill the evaluation needs of the NES project, NASA developed an evaluation plan that started with data collection and program feedback during the pilot. The information collected during the pilot was used by the NES project to make project modifications prior to full implementation. The current stage of the NES evaluation plan involves formative feedback and collection of data on related outcomes; OMB approval is being sought for these components. The NES evaluation plan extends to include a future impact study, not included within the scope of the current request, if warranted by the findings of the current study.


In the current stage of project development, NASA plans to collect data through a formative feedback process that is designed to explore the structures and processes of NES and the implementation of project components, and begins to explore outcomes related to project activities. The primary methods of data collection will include a review of program data, teacher surveys, teacher logs, student surveys, and student focus groups. There are a limited number of respondents within the general public who will be affected by this research, including teachers participating in NES and their students. NASA will use the NES project evaluation data analyses to inform project modifications as necessary and to begin to explore whether there is preliminary evidence of intended program outcomes. Should NES determine that an impact study is warranted, a separate request for OMB clearance would be made in the future.


The evaluation components proposed under this clearance begin with formative data collection on NES, which includes preliminary measures of intended project outcomes. For new programs such as NES, process or formative studies are a logical first step, as a formative evaluation can determine whether or not an intervention is being implemented as intended. For example, if the intervention requires that teachers participate in training (e.g., NES’s online PD), provide program materials to their students (e.g., NASA curricular modules), and present particular instructional content or employ particular instructional strategies with their students (e.g., NASA Now events), then this process study can determine whether those things can be, and are being, implemented. The NES formative evaluation will look at the implementation of the various components of NES.


Data on the outcomes of interest will be gathered to investigate whether changes in the intended outcomes of the program are present among project participants. In addition to implementation data, which include real-time data on the program via teacher logs, the evaluation will use a pre-post design to gather data from teacher and student participants to see whether there are changes in intended outcomes as measured before and after participation in the NES project. The gathering of outcomes data will lay the groundwork for decisions about additional evaluation of NES. For example, a look at the related outcomes of NES may reveal that program participants show pre-post gains on the science attitudinal outcomes that the program is designed to boost. If the intended outcomes are present, a more rigorous impact evaluation could be designed to test whether the changes are due to the NES project.


At the end of the proposed formative evaluation, NES will determine whether a more rigorous evaluation is appropriate given the findings from this evaluation. NES may determine it is ready for an impact evaluation if the process and outcome studies have shown that the NES project is being implemented as intended, and that there is evidence that teachers and students are exhibiting the intended outcomes. This request for clearance does not cover an impact study. If the current evaluation suggests that an impact study is warranted, a separate OMB Supporting Statement and request for OMB approval would be submitted at the appropriate time, not expected before Spring 2013.


A.1 Circumstances Requiring the Collection of Data


The National Aeronautics and Space Administration (NASA) Office of Education (OE) seeks clearance to administer teacher surveys, teacher logs, and student surveys, and to conduct focus groups with students in a subgroup of classrooms as part of the formative evaluation of the redesigned NES. The formative study will utilize extant program documents and data where available. However, because the current NES project was just launched in September 2010, there are limited existing documents and data, making new data collection necessary. Current authorization for NASA’s research and information dissemination activities is contained in the National Aeronautics and Space Act of 1958, as amended.


The NES project redesign process included the development of an evaluation plan to ensure that project development and modification would be informed by data, and the NES project team is using data to guide its decisions. The initial data collection during the pilot informed modifications made to NES before the full project implementation in fall 2010. For example, the pilot revealed that teachers were using ePD video segments in their classrooms to engage students, demonstrate activities, and present information from the modules; in response, NES created video segments that are meant to be used in the classroom. In another example, in response to teachers’ feedback that they would like more assistance in identifying which modules would be appropriate for their classrooms, NES has included identifiers and links to assist teachers in selecting appropriate modules. On each curriculum module homepage, NES has identified the subject(s) covered, topic(s) covered, classification of activity type, targeted grade level, instructional objective, estimated time required to complete the activity, a list of materials needed, and alignment to national content standards. In addition, for lengthy modules, NES has selected featured lessons within these products.


In order to continue to modify and improve the NES project, additional formative feedback is necessary. The next evaluation phase for NES, the formative evaluation beginning in fall 2011, will allow for continued project improvements based on implementation data. This formative evaluation will investigate whether the project overall and its individual components are being implemented as planned. The formative component will document additional lessons from the full NES implementation that can inform program improvement. Data on outcomes will explore whether there is evidence that program participants are exhibiting changes in the intended outcomes, including pre-post gains on outcomes of interest. Although the collection of outcomes data is not designed to measure project impact, it will provide some preliminary evidence on whether there are observable changes in outcomes of interest. This information can help inform future decisions about whether to make the investment that would be required for a rigorous impact evaluation.


A.2. Purposes and Uses of the Data


The purpose of this study is to collect data that support the formative assessment of the NES project. The information will be used to modify project components and to investigate whether there is preliminary evidence of intended outcomes that would warrant further evaluation. The goals of NES are to engage teachers and schools in delivering unique NASA experiences that inspire middle and high school students and interest them in NASA-related STEM content and careers.


This evaluation is designed to provide formative feedback to the project and preliminary evidence of changes in intended project outcomes. The data collected will help inform NES about what possible modifications to the current NES model might be necessary based on evaluation findings. It also will provide preliminary evidence of outcomes related to project implementation, and it will help form recommendations for the possible extension of the evaluation to examine impacts of NES. The data collected for evaluation will address the following research questions about project implementation and related outcomes for teachers and students:


Participants

  • What are the characteristics of schools, teachers, and students that participate?

Implementation

  • What does NASA provide as part of NES?

  • What components of NES do teachers access and use?

  • How is NES being implemented in schools and classrooms?

  • How are teachers supporting their use of NES?

  • What are barriers to implementation?

  • What are reasons for partial participation?

  • What are users’ impressions of materials?

  • What best practices do teachers use in the areas of curriculum integration, student engagement, technology use, community outreach, and family involvement?

  • Are NES teachers collaborating with one another?

Teacher outcomes

  • What are teachers’ comfort and confidence levels with NES products?

  • Do teachers’ comfort levels with STEM topics change with participation in NES?

Student outcomes

  • What are the levels of student engagement in NES and STEM activities?

  • Do students associate perceived changes with NES activities?

  • Is there a change in student attitudes towards STEM before and after the implementation of NES?

  • Is there a change in student interest in other NASA activities?

  • Is there a change in student interest in NASA-related STEM careers?


The table below summarizes the key research questions and the means by which data will be collected for each question in the study’s first and second years. Drafts of the proposed instruments are included as Appendices B through E; the mapping of research questions to data sources is also provided in Appendix F.


Table A.2 Map of Research Questions to Data Sources

Research Question | Data Sources

Participants

What are the characteristics of schools, teachers, and students that participate? | Teacher pre/post surveys, student survey, program data

Implementation

What does NASA provide as part of NES? | NES staff interviews
What components of NES do teachers access and use? | Teacher post survey and logs
How is NES being implemented in schools and classrooms? | Teacher logs, student focus group
How are teachers supporting their use of NES? | Teacher logs
What are barriers to implementation? | Teacher pre/post surveys
What are reasons for partial participation? | Teacher logs
What are users’ impressions of materials? | Teacher pre/post surveys and logs, student focus group
What best practices do teachers use in the areas of curriculum integration, student engagement, technology use, community outreach, and family involvement? | Teacher logs
Are NES teachers collaborating with one another? | Teacher logs

Teacher outcomes

What are teachers’ comfort and confidence levels with NES products? | Teacher pre/post surveys and logs
Do teachers’ comfort levels with STEM topics change with participation in NES? | Teacher pre/post surveys

Student outcomes

What are the levels of student engagement in NES and STEM activities? | Teacher logs, student focus group
Do students associate perceived changes with NES activities? | Student focus group
Is there a change in student attitudes towards STEM before and after the implementation of NES? | Student pre/post surveys, student focus group
Is there a change in student interest in other NASA activities? | Student pre/post surveys, student focus group
Is there a change in student interest in NASA-related STEM careers? | Student pre/post surveys, student focus group



The NES project tracks teacher participants through an online system that also contains background characteristics of participants, their schools, and their districts. In addition, extant NES project data includes NES website usage data and participant feedback on individual project components.


The evaluation will capitalize on the data in these sources. However, because these data are limited, additional data collection is necessary to answer the research questions and provide feedback to the project. The evaluation data will be collected via teacher surveys, teacher logs, student surveys, and student focus group interviews. These instruments are described in greater detail below, and copies are included in Appendices B, C, D, and E. A table of the evaluation questions along with the data sources for each question is presented in Appendix F.


In the first full year of NES implementation, there are just over 1,000 active teacher profiles in NES (as of March 2011, the number was 1,051). NES allows for rolling enrollment in the project, so this number continues to increase each week, and additional teachers are expected to enroll in subsequent years. Approximately 2,000 teachers are projected to enroll in the 2011-2012 academic year. The sample of teachers for inclusion in the data collection activities will be drawn from those with active profiles.


To gain an in-depth understanding of project implementation, data will be collected from teachers and students through a variety of means. A sample of 400 teachers, approximately 20 percent of the expected 2,000 registrants, will be recruited to complete teacher surveys before their participation in the NES project and at the end of the academic year, as well as teacher logs every month. Students in these classrooms will be administered pre- and post-surveys; assuming 18 students per classroom, this is estimated to be 7,200 students. In addition, at the end of the academic year, focus groups will be conducted with students in eight classrooms. Assuming that on average 18 students from each classroom will participate, 144 students are estimated to participate in these focus groups.


Teacher Survey – Teacher surveys will be completed by 400 sampled teachers. The teacher survey is designed to gather information from teachers about their comfort teaching NASA-related STEM content and their familiarity with NES materials. The baseline teacher survey will be administered to the teachers when they register for NES, prior to their use of the NES materials. A follow-up teacher survey will be administered at the end of the academic year. These surveys will allow the evaluation team to begin to explore whether there are variations in outcomes of interest related to variations in implementation.


Teacher Log – Monthly teacher logs also will be completed by the 400 sampled teachers. Teachers will be prompted with e-mail reminders to access an online survey that asks about their use of NES materials over the past month. The decision to use teacher logs is influenced by the work of Rowan, Correnti, and colleagues on the Study of Instructional Improvement, which provides important information on the reliability and validity of logs compared to more limited in-person classroom observations and annual surveys on teaching practice.4 Teacher logs are a valid and cost-effective method for adequately sampling instructional practices. The logs are designed to be completed electronically each month and will record information on experiences with specific content modules as well as electronic professional development offerings.


Student Survey – Pre- and post-program surveys will be administered to an estimated 7,200 students. These students will be in the STEM classrooms of the 400 teachers who are completing the teacher logs and surveys. Questions on the survey will measure student outcomes related to NES. Abt’s IRB has approved parent notification for student survey participation; the notification sheet can be found in Appendix G.


Student Focus Groups – Toward the end of the academic year, student focus groups will be held to understand students’ experiences with the NES products. Eight classrooms will be sampled from among the teachers selected for the study; assuming 18 students per classroom, a total of 144 students are expected to participate. Data gathered from these student focus groups will help us understand students’ experiences with NES, their familiarity with the project, and their perceptions of its influence on their interest in NASA-related STEM content and careers. Abt’s IRB has approved parent notification for focus group participation; the notification sheet can be found in Appendix G.


The data collection efforts were submitted for review to the Institutional Review Boards (IRB) at EDC and Abt. Under EDC IRB review, the research activities were determined to be exempt from IRB oversight because they meet one or more of the criteria for exempt research provided for in 45 CFR 46.101(b)(1). The Abt IRB conducted a review of the study protocol and approved a process for parental notification that is included in the OMB submission in Appendix G, along with the approved teacher invitations. Should local school IRBs require active parental consent, the parental consent forms in Appendix I will be submitted for their approval.


Information collected during the evaluation will be used in multiple ways. First, the data will provide NASA with feedback regarding what components are being implemented by project participants and whether they are being implemented as intended by NES. Further, the data will provide feedback on potential areas for project modification. Data collected on the intended outcomes of NES will provide preliminary evidence about whether NES-intended outcomes can be observed among NES participants. The combined results of the data collection efforts will assist NASA in making program modifications and in beginning to test the program theory underlying NES. Further, the evaluation findings will inform decisions about whether the program is being implemented as designed and whether a more rigorous impact evaluation is appropriate.


A.3 Use of Information Technology to Reduce Burden


To minimize burden, information that could be obtained through extant data sources has been identified and reviewed. However, extant data are limited.


To reduce respondent burden among teachers, internet-based surveys will be used to collect information from teacher participants. Web-based systems facilitate data entry across computer platforms, and information, once entered into the system, can be presented to the respondent for verification, further reducing burden. Another valuable feature is thorough editing of all submitted data for completeness, validity, and consistency. Editing is performed as data are entered: most invalid data cannot enter the system, and questionable or incomplete entries are called to respondents’ attention before they submit their survey. The surveys have user-friendly features (e.g., custom controls such as check boxes). The survey system complies with Section 508, the 1998 amendment to the Federal Rehabilitation Act, which mandates that the electronic and information technology used by Federal agencies be made accessible to people with disabilities.


Unfortunately, the opportunity for automated information technology use with students is limited in this study. The study will collect baseline and follow-up data from self-administered surveys of students in classrooms. Because most U.S. classrooms do not have individual student computers, and access to school computer labs may be limited, the student surveys will be administered on paper. To reduce burden on respondents, the surveys have been developed from existing protocols with known administration times and reliability.5


A.4 Efforts to Identify Duplication


The information to be supplied does not duplicate any other information collection. Since NES is a newly redesigned project, no data on the new NES project currently exist. The limited data collected during the pilot period were used to inform project modifications but are not sufficient to understand the full-scale implementation of the project.


A.5 Small Business

Not Applicable


A.6 Consequences of Not Collecting the Information


This data collection fits into the cycle of development informed by data that has served as the basis for the NES redesign and continued improvement. Absent the data collected through the proposed activities, NES would be faced with either continuing to implement the project as currently structured or making modifications without the data necessary to inform decisions. Failure to collect the information proposed in this request would prevent NASA from assessing the degree to which the NES project is being implemented as intended. Because the project is in its early stages, it is important to get early feedback about how it is functioning within classrooms. Without this data collection, NASA would lack the information needed to make well-informed planning and management decisions related to the NES project.


A.7 Special Circumstances Justifying Inconsistencies with Guidelines in 5 CFR 1320.6


A subset of active NES teachers will be prompted to complete online logs every month about their use of NES materials over the past month. The decision to use teacher logs is influenced by the work of Rowan and colleagues, who used logs to document teacher practice instead of in-person classroom observations or an annual survey on teaching practice.6 Initial plans were to collect these logs every two weeks; however, the results of the pilot testing indicated that monthly collection was sufficient to balance the need to capture accurate implementation data against the burden on teachers.


Other components comply with the guidelines of 5 CFR 1320.5.


A.8. Consultation Outside the Agency


Comments on this data collection effort were solicited through a 60-day notice published in the Federal Register on October 14, 2010 (Vol. 75, No. 198, p. 63207). A copy of the 60-day Federal Register Notice is provided with this application in Appendix H. No comments were received in response to the notice.


In addition, NASA has consulted with national educational organizations to help shape the evaluation research study, to keep data collection burden to a minimum, and to keep data collection relevant. The partners include the International Center for Leadership in Education (ICLE), the International Technology and Engineering Educators Association (ITEEA), and the National Science Teachers Association (NSTA). Consultation on the study design was conducted by the research firm Abt Associates Inc. and its subcontractor Education Development Center. In addition, staff from Booz Allen Hamilton familiar with NES were consulted. Ricky Takai, former Associate Commissioner, National Center for Education Evaluation, Institute of Education Sciences, U.S. Department of Education, was also consulted on modifications to the study approach.


A.9. Payments or Gifts to Respondents


Not Applicable


A.10. Assurance of Confidentiality

Any required assurances of confidentiality will be provided in writing at the top of each survey or log and in the introduction of the focus group interview protocol. Prior to any data collection, participants will be advised of the purpose and use of the data collection, and the fact that participation is voluntary. Parents will be notified prior to data collection about the study. The IRB-approved study parent notification and teacher invite materials are included in Appendix G.


The NASA HQ Information Technology and Communication Division performed a risk-based assessment of the primary contractor’s IT systems and processes for NASA data. It issued an authorization to operate (ATO) declaring that adequate security controls are implemented in the information system and that a satisfactory level of security is present.


In addition, the contractors conducting the study will be required to adhere to the following procedures:


  • Access to the electronic files shall be controlled by user ID and by group membership. All paper files (such as handwritten focus group notes and completed surveys) shall be stored in locked cabinets. All electronic and paper files shall be destroyed two years after the end of the contract.

  • Names and other identifiable information shall be redacted in all primary data (focus group notes, survey results) and replaced with numerical identifiers. A separate file shall be created that links names to the identifiers.

  • All data shall be reported in aggregate and will not contain any identifying information (such as respondent’s name, address, or affiliation with a school or district).


A.11. Questions of a Sensitive Nature

Not applicable


A.12. Estimates of Response Burden


The requested burden for this evaluation is 4,744 hours for 7,600 respondents, which includes teachers and students involved with NES. This is based on the following assumptions:


  • 400 teachers, 20 percent of the projected registrants, will be recruited from registered participants in NES to complete teacher surveys and logs. Surveys will be administered at the beginning and end of the year. Teacher logs will be completed every month, estimated to be 9 times from the beginning to the end of the academic year.

  • The students of teachers participating in the study will be invited to participate in the student surveys. Assuming 18 participants per classroom, this is estimated to be 7,200 students.

  • A subset of students will participate in hour-long focus group interviews at their schools. Students in the classrooms of 8 of the participating teachers will be invited; assuming 18 students participate per classroom, 144 students are expected to take part.


Table A.12 presents the calculations used for the estimated hours and respondents.


Table A.12. Number of Respondents, Data Collection Activities, Frequency of Response, and Annual Hour Burden

Respondent | Data Collection Activity | Number of Respondents | Mean Time per Collection (Hours) | Number of Collections | Responses | Mean Time per Activity (Hours) | Total Respondent Time (Hours)

Teachers | Baseline survey | 400 | 0.25 | 1 | 400 | 0.25 | 100
Teachers | Teacher logs | 400 | 0.17 | 9 | 3,600 | 1.50 | 600
Teachers | Start-of-course survey | 400 | 0.25 | 1 | 400 | 0.25 | 100
Teachers | End-of-course survey a | 400 | 0.50 | 1 | 400 | 0.50 | 200
Students | Baseline survey | 7,200 | 0.25 | 1 | 7,200 | 0.25 | 1,800
Students | End-of-course survey | 7,200 | 0.25 | 1 | 7,200 | 0.25 | 1,800
Students (subset) | Focus group interview a | 144 | 1.00 | 1 | 144 | 1.00 | 144
Total |  | 7,600 b |  | 15 | 19,344 |  | 4,744

a The end-of-course burden calculation does not adjust for attrition, which is discussed in Section B.1.

b The total number of respondents is the sum of teacher and student survey respondents (400 + 7,200); focus group participants are a subset of the surveyed students.
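As a cross-check, the totals in Table A.12 follow directly from the row values above. The following is a minimal sketch (figures are taken from the table; no additional assumptions):

```python
# Reproduce the totals in Table A.12.
# Each row: (number of respondents, collections per respondent,
#            total hours per respondent across all collections)
rows = {
    "teacher baseline survey":       (400,  1, 0.25),
    "teacher logs":                  (400,  9, 1.50),
    "teacher start-of-course":       (400,  1, 0.25),
    "teacher end-of-course survey":  (400,  1, 0.50),
    "student baseline survey":       (7200, 1, 0.25),
    "student end-of-course survey":  (7200, 1, 0.25),
    "student focus group interview": (144,  1, 1.00),
}

responses = sum(n * k for n, k, _ in rows.values())
hours = sum(n * h for n, _, h in rows.values())
respondents = 400 + 7200  # focus-group students are a subset of the 7,200

print(responses)    # 19344
print(hours)        # 4744.0
print(respondents)  # 7600
```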

There is no cost to respondents other than the time it takes to respond to the survey.

A.13 Estimate of Total Capital and Startup Costs/Operation and Maintenance Costs to Respondents or Record Keepers

There are no annualized capital/startup or ongoing operation and maintenance costs involved in collecting the information. Other than the costs represented by the time to complete the surveys, there are no direct monetary costs to respondents.

A.14. Estimates of Costs to the Federal Government


This small-scale formative evaluation research study will occur once over the course of three years. The annualized cost to the Federal government for the data collection activities is $236,326 and includes material development, recruitment, site visits for focus groups, survey collection, analysis, and report preparation.


A.15 Change in Burden


This is a new collection.


A.16 Plans for Publication, Analysis, and Schedule


An interim report and project briefings presenting findings from the study will be prepared. NASA will use the information from the study for planning and management purposes: to identify areas for improvement and to make revisions so that the program more fully meets the needs of students and teachers. A final report will be submitted to NASA by the contractor in year two of the evaluation. The information will be used in a NASA Office of Education internal report for planning and management purposes for the implementation of the NES project. Senior leadership in the NASA Office of Education at NASA Headquarters and the NES project manager will use the report.


Findings also will be presented to NES project management so that they can make judgments on fidelity of implementation: whether, based on the descriptions of teacher use from the teacher logs, the NES materials are being used with the frequency and in the manner for which the project was designed. The descriptive data on change in teacher and student outcomes will be reviewed to determine whether there is preliminary evidence that the intended outcomes of the project are being observed. These results will inform decisions on project modifications. For example, if it is found that teachers are not using all of the individual components of the NES program (e.g., they use classroom modules without viewing the associated electronic professional development), NES may concentrate communication campaigns on the advantages of using all the components of NES together and draw explicit links between the associated components. NES project management also expects to closely analyze barriers to participation and to design and implement strategies to remove any barriers that the project can directly affect. The findings will enable NES to make data-driven improvements to project offerings and delivery mechanisms to address the classroom and educational needs of its target audience.


Descriptive and correlational analyses are planned for survey data. Analyses of quantitative survey data will include a detailed summary that utilizes appropriate descriptive statistics. For survey items using continuous scales, the study will calculate means and standard deviations to describe both central tendency and variation. Frequency distributions and percentages will be used to summarize answers given on ordinal scales. Correlational analyses will be used to investigate associations between project components and outcomes of interest. Statistical tests, such as χ² analyses or t-tests, will be used to test for differences between pre- and post-time points. The analyses of the focus group data will include descriptive summaries of emergent themes.
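For illustration only, a pre-post comparison of a single continuous-scale item could be run along the following lines. This is a hypothetical sketch with placeholder data, not study data, and the variable names are illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Placeholder pre/post responses (5-point scale) for the same 300
# respondents; real analyses would use matched survey records.
pre = rng.integers(1, 6, size=300).astype(float)
post = np.clip(np.rint(pre + rng.normal(0.2, 1.0, size=300)), 1, 5)

# Descriptive statistics: central tendency and variation.
print(pre.mean(), pre.std(ddof=1), post.mean(), post.std(ddof=1))

# Paired-samples t-test for the pre-post difference.
t_stat, p_value = stats.ttest_rel(post, pre)
print(t_stat, p_value)
```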



Table A.16-1 Project Time Schedule

Activity | Schedule
Teacher surveys | 1-18 months after OMB approval
Student surveys | 1-18 months after OMB approval
Teacher logs | 1-18 months after OMB approval
Student focus group interviews | 6-12 months after OMB approval
Analyze data | 10-24 months after OMB approval
Report findings | 12-24 months after OMB approval


A.17 Approval to Not Display Expiration Date

The data collection instruments will display the expiration date.


A.18 Exceptions to Item 19 of OMB Form 83-I

No exceptions are sought.


Section B. Statistical Methods


B.1. Respondent Universe and Sampling Methods


The universe for the study comprises the teachers participating in NES and the students in their classrooms. Currently, there are over 1,000 registered teacher participants (1,051 as of March 2011). NES allows for rolling enrollment in the project, so this number continues to increase each week, and the number of teachers registered for the project next year is projected to be 2,000. The sample of teachers for inclusion in the evaluation will be drawn from those who have registered.


Four hundred teachers will be recruited from the population of teachers who have registered for the program to complete the baseline teacher survey. We will also administer the baseline student survey to their 7,200 students (assuming an average of 18 students per teacher) prior to implementation of the program. Assuming 25% attrition from baseline to post-program administration, our final sample of teachers with two waves of data will include 300 teachers. This 25% attrition rate accounts for both teacher turnover and non-response. A 25% attrition rate is also assumed for the students of the 300 teachers who remain in the study at the post timepoint. The result is a final student sample of 4,050 (300 teachers × 18 students per classroom = 5,400 students; 5,400 × .75 = 4,050). Further, because students are clustered within teachers, we assume a design effect of 1.5 for the student analyses. With the pre-post design we will use paired samples t-tests to test whether there are any pre-post differences in survey responses.
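As a quick check, the sample sizes in the preceding paragraph follow from the stated assumptions (a minimal sketch, restating the text; no additional design inputs):

```python
# Teacher sample after 25% attrition.
teachers = int(400 * 0.75)            # 300

# Student sample: 18 students per remaining teacher, then 25% attrition.
students = int(teachers * 18 * 0.75)  # 4050

# Effective student sample size under a design effect of 1.5
# (students are clustered within teachers).
effective_students = students / 1.5   # 2700.0

print(teachers, students, effective_students)
```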


The result of a statistical power analysis is often expressed as a minimum detectable effect (MDE), which represents the smallest difference between two population means that can be detected with sufficient statistical power given specific design parameters (e.g., sample size). Based on the MDE formulae presented in Schochet (2008),7 the formula for a paired samples t-test can be expressed as follows:

MDE = (t[df, 1−α/2] + t[df, β]) × SE(d̄)

where:

α = significance level,

β = power,

df = degrees of freedom,

SE(d̄) = standard error of the difference in means between the two time periods, and

t[df, q] = the q-th quantile of the t distribution with df degrees of freedom.

To calculate SE(d̄) we must estimate the standard deviation of scores. Both the teacher and student survey item responses are measured on a 5-point Likert scale. We assume that the distribution of responses on the Likert scale will be approximately normal, such that the mode of the distribution occurs at 3, with an equal frequency of responses at scores 1 and 5, and also at scores 2 and 4. Under this assumption, the variance of scores with this distribution is 1.33, which results in a standard deviation of 1.15. Because our design also involves surveying the same respondents at two time points, we must also consider the correlation ρ between pre and post scores when calculating the standard error of the difference in means. For the purpose of these analyses we will assume ρ = .4. Using these assumptions, the standard error of the difference in means for the teacher survey (n = 300) is 0.073. The standard error of the difference in means for the student survey (n = 4,050, design effect = 1.5) is 0.024.
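These standard errors can be reproduced as follows. This is a sketch; the expression for the standard error of a paired mean difference with common score SD σ, pre-post correlation ρ, and design effect deff is an assumption consistent with the figures in the text:

```python
import math

def se_paired(n, sd=1.15, rho=0.4, deff=1.0):
    """SE of the mean pre-post difference, assumed here to be:
    sd * sqrt(2 * (1 - rho)) / sqrt(n / deff)."""
    return sd * math.sqrt(2 * (1 - rho)) / math.sqrt(n / deff)

print(round(se_paired(300), 3))             # 0.073 (teacher survey)
print(round(se_paired(4050, deff=1.5), 3))  # 0.024 (student survey)
```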


With SE(d̄) calculated, we can now determine the MDE. For both the teacher and student surveys we will calculate the MDE for a well-powered, two-tailed paired t-test with significance level set to .05 and power set to 80 percent (α = .05 and β = .8).


MDE for Teacher Survey

Degrees of freedom for the teacher analyses equal 299 (n − 1), so the multiplier (t[df, 1−α/2] + t[df, β]) equals 2.81, using Table 1 (p. 65) provided in Schochet (2008). Therefore the MDE for the teacher analyses equals 2.81 × 0.073 ≈ .20. This means that there is at least an 80% chance (β = .8) that we will be able to detect a difference of .20 between pre and post means on the teacher survey items. Therefore the teacher survey, with an initial sample size of 400 and allowing for 25% attrition, is well-powered to detect a pre-post difference in the population as small as .20.


MDE for Student Survey

Degrees of freedom for the student analyses equal 2,699 ((n/design effect) − 1), so the multiplier is again 2.81 (Schochet, 2008). Thus, the MDE for the student analyses equals approximately .06. This means that there is at least an 80% chance (β = .8) that we will be able to detect a .06 difference between the pre and post means on the student survey items. Therefore the student survey, with an initial sample size of 7,200 and allowing for 25% attrition, is well-powered to detect a pre-post difference in the population as small as .06.



Sample | N | ρ | MDE
Teachers | 300 | .4 | .20
Students | 4,050 (design effect = 1.5) | .4 | .06


 

B.2 Information Collection Procedures/ Limitations of the Study


The data collection is intended to inform the NES project planning life cycle and to inform project management and modification. It will help test program theory by looking at project implementation and intended outcomes, but it is not designed to test the impact of the NES project. The data collection uses web-based surveys, paper surveys, and in-person focus group interviews.


Selected teachers will complete teacher surveys and teacher logs. Students of NES teachers will complete pre- and post-program surveys, and a subset of students will be involved in focus group interviews. Data will be collected from the selected teachers and from all of their students.


The data collection instruments have been developed to gather information about the implementation of NES and participants’ experiences with the NES materials, reasons for partial implementation, and suggestions for improvements. Data from the monthly teacher logs will be used to collect implementation information that is proximal to actual use. Surveys gather information on outcomes for students and teachers that have been identified in the program theory as intended outcomes. The student surveys have been constructed using existing instruments that measure the following constructs: attitude toward science/attitude toward engineering (modified from the School and Social Experiences Questionnaire; Singh, Chang, & Dika, 2006; α = .92); self-efficacy in STEM (modified Attitudes toward Science Inventory, dimension: self-concept of science; α = .72); and leisure interest in science, technology, and engineering (modified from the Test of Science-Related Attitudes; Fraser, 1981; Lott, 2002; α = .91 and .81).


Over 22 instruments were reviewed and considered for this study. No individual instrument gathered information on all the specific student-related measures that are intended outcomes of the NES program. Thus, the student surveys were constructed by drawing on multiple existing instruments as detailed below.


The attitude toward science scale was taken from the School and Social Experiences Questionnaire.8 A similar scale was not available for engineering, so a parallel set of items, based on the science items, was created for engineering; the engineering items have not been used previously. Items for self-efficacy in STEM were drawn from the modified Attitudes toward Science Inventory (mATSI), which was designed to measure six constructs related to science (e.g., perception of the science teacher, anxiety toward science); only the items measuring self-efficacy toward science were included.9 The leisure interest in STEM scale was taken from the Test of Science-Related Attitudes (TOSRA). The career interest in STEM scale from the TOSRA was modified to include career choices related to NASA STEM fields, instead of STEM more broadly.10


Given the descriptive nature of the information sought, the use of simple descriptive statistics, such as counts, ranges, and frequencies, is most appropriate for the analyses of the data. The study will provide useful information that can be used to investigate whether the program theory holds and to inform project modifications. It will also allow investigation of associations between implementation and outcomes of interest. The single-group, pre-post design for measuring outcomes, although it can provide evidence of whether there are changes in outcomes of interest, will not allow us to rule out the possibility that something other than the program is causing the intended outcomes. However, this information is critical to the program in making decisions about program improvement and about whether a more rigorous impact evaluation should be undertaken.


The NASA Explorer Schools project is committed to using evaluation methodologies appropriate to the maturity of the project design. Because the project is in the early phases of implementation, it is focused on formative evaluation techniques intended to characterize usage patterns, implementation best practices, and barriers to sustained participation, and to ensure consistent delivery of project resources against the project design. As part of the verification of the project model, the project will gather outcomes data to understand whether there is evidence that the project’s intended outcomes are present. However, because of the nature of the design, a causal link between project implementation and intended outcomes cannot be established, and no causal claims will be made. Instead, the data collected on outcomes will be used to inform decisions about whether a more rigorous impact study should be designed and pursued.


The following text will be included in data reports as a disclaimer: This evaluation was designed to provide feedback for program planning and management decisions and to begin to test program theory. The design does not test the impact of the NES project, nor does it warrant causal claims.


Note that an impact study is NOT included in the current approval request. The design of an impact study would be informed by the currently proposed data collection. If warranted, a future impact study would be designed to answer questions regarding the effect of participation in the project on teachers’ knowledge of NASA opportunities, knowledge of NASA-STEM content, and comfort with the STEM substantive knowledge necessary to teach NES modules, and students’ interest and engagement in STEM.


B.3. Methods for Maximizing the Response Rate and Addressing Issues of Nonresponse


Several methods will be used to maximize response rates and to deal with non-response. These include:

  • Sending an email from NASA to inform selected teachers about the teacher logs and surveys;

  • Providing a sufficient timeframe for data collection;

  • Identifying a local site liaison who can assist in gathering parent consent, if necessary, and completed student surveys;

  • Providing a toll-free number that participants can call to ask questions and verify the legitimacy of the evaluation;

  • Following up with teacher non-respondents via email reminders and phone calls.

Nonresponse may be a problem in our analyses if it introduces bias into our population estimates. Bias occurs if the students who refuse to participate or who leave the study would have given systematically different responses to the survey (had they responded) than the students who completed the surveys.


Poor response rates alone do not guarantee a biased estimate, as the decision not to participate or to leave the study could be completely unrelated to survey answers. We will examine bias in estimates due to nonresponse by following the two steps described below. Based on this analysis, we will adjust the weights of responding students to account for student nonresponse.

1. Examination of Response Rates. The first step will be to monitor the overall response rate and the response rates for relevant subgroups (e.g., by grade level or by class topic). High response rates (over 80 percent) for the entire sample as well as for subgroups might indicate no need for further analysis of bias due to nonresponse. Large differences in response rates by strata and for subgroups serve as indicators that potential biases may exist. For example, if the response rate for an important subgroup is very low, then any difference in the characteristic of interest between this subgroup and other subgroups would bias the estimates. From the survey results we will examine whether there are differences in characteristics across subgroups, especially in strata where the response rate is low.

2. Nonresponse Propensity Model. Should the response rate fall below 80 percent, we will construct a propensity model to estimate the probability of responding to the survey for both responding and nonresponding students; this probability is called a propensity score. The propensity scores will be estimated with a logistic regression model based on variables that are available for both nonresponding and responding students. Students will be grouped using the estimated propensity scores, and within each group we will compare the frame characteristics of responding and nonresponding students. In addition to assessing bias, this grouping provides a method of forming weighting classes for adjusting the weights of responding students to reduce the bias due to nonresponse, as sketched below.
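A nonresponse-propensity adjustment of this kind might be implemented along these lines. This is a hypothetical sketch: the frame variables and data are illustrative placeholders, and statsmodels/pandas are one possible toolset, not a project requirement:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Frame data available for ALL sampled students, responders or not.
# 'grade' and 'class_topic' are illustrative frame variables.
frame = pd.DataFrame({
    "grade": rng.integers(4, 13, size=1000),
    "class_topic": rng.choice(["math", "science", "tech"], size=1000),
    "responded": rng.integers(0, 2, size=1000),
})

# Logistic regression of response status on frame characteristics.
X = sm.add_constant(
    pd.get_dummies(frame[["grade", "class_topic"]], drop_first=True)
).astype(float)
fit = sm.Logit(frame["responded"], X).fit(disp=0)

# Estimated response propensity for every sampled student.
frame["propensity"] = fit.predict(X)

# Form weighting classes from propensity quintiles; within each class,
# respondents' weights are inflated by 1 / (class response rate).
frame["cell"] = pd.qcut(frame["propensity"], 5, labels=False, duplicates="drop")
frame["nr_adjustment"] = 1.0 / frame.groupby("cell")["responded"].transform("mean")
```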


B.4. Tests of Procedures or Methods

Experts and practitioners in the field, including faculty from Oklahoma State University and members of the National Science Teachers Association, reviewed the draft and final instruments. In developing the student surveys, existing instruments with established psychometric characteristics were selected after an extensive literature review (School and Social Experiences Questionnaire: Singh, Chang, & Dika, 2006; modified Attitudes toward Science Inventory: Weinburgh & Steele, 2000; Test of Science-Related Attitudes: Fraser, 1981; Lott, 2002). These instruments were modified to reduce the amount of time necessary to complete the survey.


In addition, the teacher survey and logs were pilot tested with six NES teachers. The teacher logs took between 6 and 25 minutes, with an average of 11 minutes; the surveys took between 10 and 30 minutes, with an average of 16 minutes. The student surveys were pilot tested with 8 students and took an average of 7 minutes, with a maximum of 14 minutes.


B.5. Names and Telephone Numbers of Individuals Consulted

The contractors for collection and analysis of data in this study are Abt Associates Inc., Cambridge, MA, and Education Development Center, Newton, MA. Staff from these organizations have knowledge of statistical methods, experience in evaluation of research programs, and expertise in scientific research. The evaluation has been developed under the oversight of Brian Yoder, Evaluation Manager, Office of Education, NASA HQ, 202-358-7338; Rob LaSalvia, NASA Explorer Schools Project Manager, NASA Glenn Research Center, 216-433-8981; and Rick Gilmore, NASA Explorer Schools Senior Education Program Specialist, NASA Glenn Research Center, 216-433-5493.

Key personnel involved in the design include:

Evaluation Contractors



Ricky Takai

Abt Associates, Practice Leader

301-634-1765

Alina Martinez

Abt Associates, Project Director

617-349-2312

Sarah Sahni

Abt Associates, Director of Analysis

617-520-2881

Johnny Blair

Abt Associates, Senior Survey Methodologist

301-634-1825

Barbara Goodson

Dillon-Goodson Research Associates

617-595-7045

Sheila Kirby

Abt Associates, Affiliate Scholar

703-533-743

Jackie DeLisi

Education Development Center

202-261-5409

Other



Jodie Rozzell

National Science Teachers Association

703-312-9295

Ben Jones

Booz Allen Hamilton

202-560-2239

Katie Rae Mulvey

Booz Allen Hamilton

908-578-9902

Cathy Graves

Oklahoma State University

216-433-5615

Richard Adams

Oklahoma State University

405-334-1869

Al Byers

National Science Teachers Association


Kendall Starkweather

International Technology and Engineering Education Association


Elise Russo

International Center for Leadership in Education



1 National Research Council. (2008). NASA’s Elementary and Secondary Education Program: Review and Critique. Committee for the Review and Evaluation of NASA’s Precollege Education Program, Helen R. Quinn, Heidi A. Schweingruber, and Michael A. Feder, Editors. Board on Science Education, Center for Education, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.

2 Launched in 2003, the original NASA Explorer Schools (NES) project consisted of three-year partnerships between NASA and selected schools. The project focused on whole schools and provided financial investment, professional development, and curricular support designed to provide engaging student STEM educational experiences and sustained professional development, and to enhance family involvement in science education. The NRC (2008) report recommended that the NASA Explorer Schools (NES) model be redesigned. The report stated that the original NES model was too ambitious in scope and used too many resources in too few schools. The report recommended that the new NES model reach more schools and students, and focus on motivating students around NASA themes so that they are exposed to and consider STEM careers.

3 President’s Council of Advisors on Science and Technology (2010). Prepare and Inspire: K-12 Education in Science, Technology, Engineering, and Math (STEM) for America’s Future. http://www.whitehouse.gov/sites/default/files/microsites/ostp/pcast-stemed-report.pdf

4 Camburn, E., and Barnes, C. (2004). Assessing the validity of a language arts instruction log through triangulation. Elementary School Journal, 105, 49-74.

Correnti, R. & Rowan, B. (2007). Opening Up the Black Box: Literacy Instruction in Schools Participating in Three Comprehensive School Reform Programs. American Educational Research Journal, v.44 (2), pp.298-338.

Correnti, R. (2007). An Empirical Investigation of Professional Development Effects on Literacy Instruction Using Daily Logs. Educational Evaluation and Policy Analysis, v.29, pp.239-261.

5 School and social experiences questionnaire: Singh, K, Chang, M., Dika, S. (2006). Affective and motivational factors in engagement and achievement in science. The International Journal of Learning, 12(6), 207-218. Singh, K, Chang, M. & Mo, Y. (2006). Science Achievement: Effect of Self and Engagement Variables. Paper presented at the meeting of the Asia-Pacific Education Research Association conference, Hong Kong, November 28-30, 2006.

Modified Attitudes toward Science Inventory. Weinburgh, M., Steele. D. (2000). The modified attitudes toward science inventory: Developing an instrument to be used with fifth grade urban students. Journal of Women and Minorities in Science and Engineering, 6, 87-94.

Test of Science Related Attitudes: Fraser, B. J. (1981). TOSRA: Test of Science-Related Attitudes Handbook. Hawthorn, Victoria: Australian Council for Educational Research. Lott, K. (2002, April). The evaluation of a statewide in-service and outreach program: Preliminary findings. Paper presented at the annual meeting of the National Association for Research in Science Teaching, New Orleans, LA.

6 Camburn, E., and Barnes, C. (2004). Assessing the validity of a language arts instruction log through triangulation. Elementary School Journal, 105, 49-74. Correnti, R. & Rowan, B. (2007). Opening Up the Black Box: Literacy Instruction in Schools Participating in Three Comprehensive School Reform Programs. American Educational Research Journal, v.44 (2), pp.298-338. Correnti, R. (2007). An Empirical Investigation of Professional Development Effects on Literacy Instruction Using Daily Logs. Educational Evaluation and Policy Analysis, v.29, pp.239-261.

7 Schochet, P. Z. 2008. Statistical Power for Random Assignment Evaluations of Education Programs. Journal of Educational and Behavioral Statistics, 33(1), 62-87.

8 School and social experiences questionnaire. Singh, K, Chang, M., Dika, S. (2006). Affective and motivational factors in engagement and achievement in science. The International Journal of Learning, 12(6), 207-218. Singh, K, Chang, M. & Mo, Y. (2006). Science Achievement: Effect of Self and Engagement Variables. Paper presented at the meeting of the Asia-Pacific Education Research Association conference, Hong Kong, November 28-30, 2006.

9 Modified Attitudes toward Science Inventory. Weinburgh, M., Steele, D. (2000). The modified attitudes toward science inventory: Developing an instrument to be used with fifth grade urban students. Journal of Women and Minorities in Science and Engineering, 6, 87-94.

10 Test of Science Related Attitudes. Fraser, B. J. (1981). TOSRA: Test of Science-Related Attitudes Handbook. Hawthorn, Victoria: Australian Council for Educational Research. Lott, K. (2002, April). The evaluation of a statewide in-service and outreach program: Preliminary findings. Paper presented at the annual meeting of the National Association for Research in Science Teaching, New Orleans, LA.
