
Summer of Innovation #2 (Surveys)

OMB: 2700-0151







SUPPORTING STATEMENT

FOR OMB CLEARANCE

PART A



NASA Summer of Innovation FY2011


STUDENT SURVEY, TEACHER SURVEY, AWARDEE PLANNING AND IMPLEMENTATION DATA COLLECTION





National Aeronautics and Space Administration




May 24, 2011













Contents



Part A: Justification

A.1 Explanation of Circumstances That Make Collection of Data Necessary

The National Aeronautics and Space Administration (NASA) Office of Education requests that the Office of Management and Budget (OMB) approve, under the Paperwork Reduction Act of 1995, an emergency clearance for NASA to collect student survey data, teacher survey data, and awardee planning and implementation data as part of the formative evaluation of NASA’s Summer of Innovation (SoI) Project FY2011.


In 2010, NASA’s Office of Education launched the SoI pilot, a NASA-infused summer experience for middle school students who are underperforming, underrepresented, and underserved in science, technology, engineering, and mathematics (STEM) fields. The SoI pilot utilized a multi-faceted approach to reach and engage middle school students in STEM learning with NASA content and experiences. The topics ranged broadly and included activities concerning robotics, rocketry, engineering design, meteorology, space science, and climate science. Evaluation data were collected from various sources during the pilot to produce lessons learned regarding program design, implementation, and program evaluation. The pilot evaluation produced valuable insight into the program and was used to redesign SoI for this year. However, it was limited in its ability to generate hypotheses about promising practices; in most cases, the pilot evaluation was not able to field the baseline surveys necessary for assessing change in the program’s outcomes of interest.


Drawing heavily upon the lessons learned identified in the evaluation of last summer’s SoI pilot, NASA modified its approach to focus on expanding the capacity of community and school-based organizations to engage youth in STEM learning activities. In FY2011, NASA is implementing a three-tiered solicitation and award structure that is designed to provide selected awardees with different levels of funding and access to NASA staff, facilities, and technology to engage 4th through 9th grade students in intensive, high-quality, inquiry-based content learning experiences in STEM during the summer and the school year. As it continues to develop and refine the program’s design, NASA has contracted with Abt Associates to conduct a formative evaluation. This evaluation intends to describe the different approaches taken by national awardees to meet the SoI requirements. It will also explore whether change occurs in key outcomes between baseline and follow-up surveys for participants at the national awardees and NASA Center SoI sites, where NASA is investing the majority of the SoI project funds.


NASA has revised SoI’s expected outcomes to better reflect the nature and objectives of the new SoI model (see Appendix 1 for the revised logic model).1 Given that the activities are short in duration, SoI 2011 has shifted the focus of the program from attempting to impact student achievement directly to inspiring and engaging middle school students in NASA STEM content. To accomplish this goal, NASA has set programming requirements as follows: national awardees are required to provide 40 hours of student STEM activities utilizing NASA content over the summer and an additional 25 hours by March 2012, while NASA Center partnerships must provide 20 hours of student STEM activities utilizing NASA content during the summer and an additional two STEM activities integrating NASA content by March 2012. For the national awards, organizations receiving SoI funding are required to provide middle school classroom teachers with 40 hours of professional development by March 2012 and to use them as part of their summer staff delivering NASA content; NASA Center partnerships are not expected to provide professional development for classroom teachers.


Expected outcomes for students in FY2011 include increased interest in STEM topics, careers, and leisure activities in the short term; ultimately, the program aims to increase the overall number of students pursuing STEM degrees and related careers and, more specifically, increase the proportion of underrepresented students who pursue these paths. NASA also seeks specific outcomes for classroom teachers in FY2011: to increase their access to and use of NASA content and resources in their classrooms so that over the long term, they have a better understanding of NASA content, increased confidence in teaching NASA topics, and improved ability to teach NASA topics. Furthermore, NASA seeks to build the sustainability of the awardees’ programs by supporting the development of partnerships with formal and informal STEM institutions so that they can eventually operate high-quality STEM programs independently at scale, as SoI funding diminishes in subsequent years.


This emergency clearance request pertains to the portions of the data collection that will occur between June 1, 2011 and November 30, 2011 for the formative evaluation. It includes draft versions of the following instruments:


  • Awardee and NASA Center student surveys (Appendix 2 for the 4th & 5th grade version and Appendix 3 for the 6th-9th grade version; note that baseline and first follow-up surveys are identical);

  • Awardee classroom teacher baseline and first follow-up surveys (Appendices 5 and 6);

  • Awardee summer implementation reporting forms (Appendices 9 and 11);

  • Awardee fall focus group protocol (Appendix 14);

  • Awardee school-year planning form (Appendix 8); and,

  • Awardee school-year implementation reporting form for students and professional development activities as well as the school-year teacher implementation form (Appendices 10, 11, and 12).


As described in more detail below, the student and teacher outcome data as well as the planning and implementation data are not available elsewhere unless collected through the national evaluation. The teacher and student instruments will be used to gather data prior to and after the summer activities in order to assess change in SoI’s key short-term outcomes. Information about implementation will be gathered from numerous sources, including the awardee planning and implementation reporting forms and fall focus groups. These data will allow the national evaluator to describe the different approaches taken by the awardees, as well as compare awardees’ original plans with what they actually implement. In addition, the national evaluation intends to collect information about the awardees’ plans for the school-year activities in late summer 2011 to provide a baseline against which school-year implementation data can be compared.


The national evaluation is an important opportunity to collect information needed to develop and refine the program’s design. However, the evaluation is not intended to address questions of program impact on students or classroom teachers. It will explore whether NASA’s requirements, as now defined, are feasible and appropriate, and continue to generate lessons learned for future implementations of SoI and NASA’s education activities more broadly. Finally, it will help NASA identify SoI practices and models that may have promise (those correlated with desired outcomes and perceived by NASA as worthy of further investment) and that may be ready for more rigorous examination in a future summative study.


A.2 How the Information Will Be Collected, by Whom, and For What Purpose

How Information Will Be Collected

Data will be collected using several methods. Student and classroom teacher data will be collected through survey instruments, while planning and implementation data for summer activities will be collected via awardee reporting forms and focus groups with the awardees’ principal investigators. As the structure of the programs changes for the school-year activities, implementation data will be collected through school-year planning forms, school-year implementation forms, and teacher implementation forms. Because of different programmatic expectations and varying levels of funding, as well as the plans for internal monitoring of the Center activities, all types of data will be collected from the national awardees, while only student survey data will be collected at NASA Center partnerships.



Student Surveys

Approximately 3,450 students across the national awards and 3,450 students across the NASA Center partnership sites will be sampled and asked to complete SoI surveys. As discussed in more detail later, there are two versions of the student survey instruments, one for younger middle school students (4th and 5th grades; see Appendix 2 for the draft version) and one for older middle school students (6th through 9th grades; see Appendix 3 for the draft version) to address concerns regarding the need to use age-appropriate measurement instruments that take into account children’s and youth’s rapid development. A crosswalk that describes how the survey items link to the research questions, their purpose, and their sources is included in Appendix 4. The version for younger students does not include items regarding career interests, which results from last year’s pilot evaluation indicated may not be appropriate for students at that level of development.


Paper surveys will be administered to students at baseline (prior to the start of the SoI summer activities) and immediately at the end of the summer SoI activities; a follow-up survey will be mailed to students in spring 2012. A third wave will not be administered to students who participated in summer activities at NASA Centers, where students are minimally engaged in follow-on activities that are likely insufficient to affect their interest in science. To document changes over time on the outcome measures, the follow-up surveys will contain the same items as the baseline instruments.2


Teacher Surveys

Teacher surveys will be administered to the 1,500 classroom teachers participating at national award sites at baseline (prior to the start of their summer involvement in SoI; see Appendix 5 for the draft version), at the end of the summer SoI activities in August 2011 (Appendix 6A for the draft paper version and Appendix 6B for the draft online version), and again at the conclusion of the school-year activities (spring 2012). A crosswalk that describes how the survey items link to the research questions, their purpose, and their source is included in Appendix 7. To capture changes over time on outcome measures, the follow-up survey will contain the same items as the baseline instrument; it also contains a few additional items to gather feedback regarding the SoI professional development.3


Teachers will be provided paper surveys when they agree to participate, as part of their employment/registration paperwork, and will be asked to return them to their coordinator prior to participating in the summer activities. The follow-up surveys will be online and emailed to teachers immediately following the completion of the summer activities (August 2011) and the school-year activities (spring 2012); if teachers do not have access to computers, paper surveys will be mailed. As NASA Center partnerships are not required to engage classroom teachers in their SoI activities, surveys will only be administered to teachers participating at the national awardee sites.


Planning Forms

Once the awards have been announced, NASA will distribute summer (in the first OMB package) and school-year planning forms (draft version included as Appendix 8) to awardees. The forms ask about awardees’ plans regarding the structure of their summer and school-year implementations, including dates, number of camps or events, locations, hours, expected participants and key partners as well as the content they intend to use. The national evaluator will pre-populate these forms based on the awardee’s proposal; awardees will be required to review these forms to ensure their accuracy and fill in any missing data. As it is likely that awardees will not begin detailed planning of their school-year activities this spring, the national evaluator will collect a second planning form in August (draft version included as Appendix 8) that gathers the planning information for the school-year activities.


Awardee Implementation Reporting Forms

Links to electronic implementation forms will be sent to the evaluation coordinators at each awardee to collect information about the awardees’ professional development and student activities that are actually implemented. The coordinator will be responsible for ensuring that each lead instructor of a summer activity completes a student implementation form at the conclusion of each student summer classroom session (draft version included as Appendix 9) and each student school-year event coordinated by the awardee (draft version included as Appendix 10), as well as a professional development implementation form at the end of each summer and school-year professional development session (draft version included as Appendix 11).


These forms ask awardees to report the actual dates of implementation, the content used, the number of contact hours, the number of hours during which NASA content was used, the number of participants enrolled and attending, reasons why participants did not complete the activity, and who led the activities. The data collected through these forms will allow the evaluators to describe the different approaches taken by awardees to meet the NASA requirements. The data will also be compared with the information collected from the planning forms (submitted in the first package) to identify where awardees deviated from their original plans, which may in turn highlight a “lesson learned.”


School Year Teacher Implementation Form

After reviewing the finalists’ proposals, it became clear that the national awardees may not always be involved in the delivery of school-year SoI activities. Specifically, one finalist serving 150 teachers intends to provide professional development for classroom teachers during the summer and expects teachers to use its “experiments of the month” and a “science launch kit,” as well as independently select and share additional NASA resources in their classrooms during the school year. This structure necessitates collecting data directly from teachers rather than from the awardees’ coordinators. To accomplish this, the national evaluation will use the school-year teacher implementation form (draft version included as Appendix 12), which is designed to be completed electronically each month; teachers associated with awardees not providing structured school-year activities will receive a monthly reminder via e-mail. This form is a shorter version of the implementation reporting forms; it asks teachers when they used the experiments of the month, science launch kits, and NASA resources, the number of hours of NASA content they provided, and the number of students who participated. Collecting these data will allow NASA to learn how school-year activities are implemented when an awardee does not coordinate them.


Focus Groups

All ten principal investigators of the SoI national awards will be asked to participate in focus groups at three points in time: at the kick-off meeting in May 2011 (as described in the first package), at the lessons learned meeting in early fall 2011 (Appendix 13 for the draft consent script and Appendix 14 for the draft protocol), and again at the year-end meeting in summer 2012 (as will be described in the third package). At these meetings, PIs will break into small groups of 3 to 4 individuals; at the kick-off meeting, they will discuss their plans for the upcoming summer implementation. In the fall, they will be asked about the summer activities, their capacity-building efforts, and their plans for the upcoming school year. These focus groups will allow the evaluators to collect detailed qualitative descriptions of the awardees’ plans and activities that will complement the quantitative planning and implementation data collected through the reporting forms.


Review of Draft Online Forms

The teacher surveys and implementation forms included in this package are in draft form. Once their content has been finalized and approved by OMB, they will be programmed and reviewed by Abt SRBI’s expert staff, Courtney Kennedy, PhD (Vice President, Advanced Methods Group), and Robb Magaw (Senior Project Director with over 20 years of experience conducting survey research, including Web surveys), to make sure they meet the highest standards supported by the literature and best practices; Mr. Chintan Turakhia (Senior Vice President of the Social and Public Policy research group, with over 20 years of survey research experience) will serve as project consultant. At a minimum, the online version will address the following:


  • Employ a Paging Design: The forms contain skip patterns. That is, some items should only be answered by a subset of the respondents, and eligibility for these items is conditional on information entered earlier in the form. Forms featuring skips are best administered with a paging design rather than a scrolling design because skip patterns can be executed automatically without respondents needing to determine which items they should answer (Couper 2008). Automated skips can improve data quality by reducing errors of omission and errors of commission (Redline and Dillman 2002). Automated skips have also been shown to reduce the time required to complete the form, particularly for respondents who are not eligible for certain items (Peytchev et al. 2006).


  • Eliminate Requests for Information That Can Be Captured Passively: The draft versions will be reviewed to remove any items that can be collected passively, eliminating the need for the respondent to report this information. Best practices of online data collection involve using some kind of authentication process to link each respondent to their form and no other (Couper 2008). The research literature supports using a “semiautomatic authentication” approach, which has the respondent enter a personal PIN when accessing the online form (Heerwegh and Loosveldt 2003). For example, under this approach it is not necessary for respondents to enter their username because it is embedded in the URL. Each PIN will be associated with a unique SoI awardee, so asking for both the PIN and the awardee name on the form would be redundant. Similarly, the date on which the form is completed can be captured automatically by the web survey software.


  • Present Definitions of Key Terms Effectively: The Word document version presents five key definitions on the introductory page of the Planning Form, which research literature indicates is not an effective approach for presenting definitions. For example, eye-tracking research demonstrates that respondents are reluctant to invest effort in reading text that is not on the “critical path” to completing the form (Galesic et al. 2008). In other words, the likelihood that respondents completing the form will read and process definitions presented on an introductory page is quite low. Rather than presenting this information on the first page, the programmed version will include hyperlinks in the wording of the relevant items. For example, in items mentioning the “key partner”, we recommend creating a hyperlink for the phrase “key partner” that, if clicked, will lead the respondent to a new page containing the definition. This way, the definition is presented in the item itself and is available exactly when the respondent may need to reference it. That said, hyperlinks are not a perfect solution because research indicates that some respondents are unwilling to expend the effort required to click on the link and read the definition (Conrad et al. 2006). Ideally, the definition would be integrated into the wording of the item itself or the item would be rewritten so that the definition is less necessary (Couper 2008, p. 289).

In addition, Abt SRBI’s expert team will enhance the aesthetics and usability of the forms to create a professional-looking format, ensure consistency in how items are presented, and improve the usability of navigation buttons. It is critical that respondents are able to navigate through the online form easily. Central to this goal is providing clear navigational buttons that stand out and are strategically located, adhering to the U.S. Department of Health and Human Services Web design guidelines (2006), which recommend that, “If one pushbutton in a group of pushbuttons is used more frequently than the others, put that button in the first position.”



Who Will Collect the Information

As part of the solicitation, national awardees have been notified that they will be required to identify an on-site national evaluation coordinator who will assist in the evaluation’s data collection. During the kick-off meeting, the national evaluator will present the purpose of the evaluation to the principal investigators and outline their responsibilities. Right before fielding begins, a mandatory webinar will train coordinators and classroom teachers in how to administer the surveys so that the data are collected consistently across sites. Student survey data at NASA Centers will be collected by NASA staff, also trained through a webinar by the national evaluation team for consistency in administration. The national evaluator will provide the awardees/Centers with printed (so that content cannot be modified) and scannable surveys to administer and return to the national evaluator, who will prepare data files for analysis. Classroom teacher surveys will be administered online; should teachers not have access to email, paper surveys will be distributed by the awardees.


The coordinators will also complete the planning forms and ensure that each teacher and professional development leader completes an implementation form at the conclusion of their class/activity. Finally, focus groups with the awardees’ principal investigators will be conducted by the national evaluation team at the lessons learned meeting in early fall 2011.


For What Purpose

The purpose of this data collection effort is to support the national evaluation of the SoI project. The goal of the national evaluation is formative, that is, to gather data that will inform NASA’s continued development of the program as well as to assess whether evidence supports the progression to a more rigorous, summative, impact evaluation. As such, the evaluation will focus on describing SoI’s implementation and associated outcomes, but will not determine whether there is a causal link between the program and outcomes. The formative work will develop a description of the awardee models and possible linkages to desired outcomes, enable NASA to assess the fidelity of implementation, and generate lessons learned to improve future SoI activities.


Exhibit 1 below outlines the research questions for the SoI national evaluation, data sources, and outcome measures.


Exhibit 1: National Evaluation Research Questions

| Research Questions | SoI Tier of Interest | Data Sources | Outcomes |
| --- | --- | --- | --- |
| 1. Who participates in SoI FY2011? | National awardees and NASA Centers | Parent consent forms/surveys | Participant demographic information |
| 2. Does student interest in science change significantly between the baseline and follow-up surveys? If so, are these changes larger at some awardees/NASA Centers than others? | National awardees and NASA Centers | Student surveys | Overall interest in science, career interest in science, leisure interest in science |
| 3. Does comfort in teaching NASA topics and access/use of NASA resources change between baseline and follow-up surveys? If so, are these changes larger at some awardees than others? | National awardees only | Teacher surveys | Access and use of NASA content and resources; comfort in teaching NASA topics |
| 4. How do awardees plan and implement their summer and school-year activities? What are the similarities and differences across the approaches? Are there any apparent relationships between the approaches and desired outcomes? | National awardees only | Planning forms; implementation forms; focus groups | Program scheduling, activities, duration, content, delivery methods, participants |
| 5. What supports and challenges do awardees face in implementing their SoI programs? How do they negotiate these challenges? | National awardees only | Focus groups | Implementation challenges and successes |
| 6. How are awardees preparing to operate independently of SoI funding? | National awardees only | Focus groups | Sustainability planning |


The first research question will be answered using the parent consent form/survey (included in the first OMB package). The student surveys will address the second research question and allow the national evaluators to explore changes associated with participation in SoI in student interest in STEM (including overall interest, career interest, and leisure interest). NASA focuses measurement specifically on students’ interest in science. While NASA is certain science will be addressed by all programs, technology, engineering, and mathematics may not be a focus of the summer programs across all awardees/Centers. The teacher surveys, which inquire about teacher access and use of NASA resources, as well as comfort in teaching NASA content (all of which address science-related topics), will inform the third research question.


As mentioned earlier, while measuring outcomes at multiple points in time can provide evidence of whether the outcomes of interest change, it will not allow us to rule out the possibility that something other than the program is affecting this change. However, it will support investigation into associations between implementation and outcomes of interest to inform future program strategy, as well as inform the future decision about whether a more rigorous impact evaluation should be undertaken.


Planning and implementation data will be collected through reporting forms and focus groups to answer questions four, five, and six. Not only will these data be vital in understanding the context in which any change in key outcomes is identified, but they will also serve as a resource to support additional research on STEM learning as it relates to informal and K-12 education by academic researchers and others interested in STEM engagement.


A.3 Use of Improved Information Technology to Reduce Burden

The data collection plan reflects sensitivity to issues of efficiency, accuracy, and respondent burden. The national evaluator will provide training and support to all awardees/Centers to assist in obtaining systematic and consistent data. Surveys were designed to require minimal effort, including only questions not available elsewhere. In particular, the student surveys were designed to be easy to read, with straightforward questions and minimal skip patterns. Student survey data will be collected on paper distributed by the national evaluation coordinator at each awardee/Center during the summer and by mail in spring 2012. Teacher baseline survey data will be collected on paper, and follow-up surveys will be conducted online. Awardee planning and implementation forms will be administered online. The national evaluator’s electronic mail address and toll-free telephone number will be included on the first page of the teacher survey for participants who have questions. Taken together, these procedures are all designed to minimize the burden on respondents.


A.4 Efforts to Identify and Avoid Duplication

This effort will yield data to assess SoI implementation and outcomes; no similar evaluation is being conducted, and there is no alternative source for collecting the information. NASA has identified technical representatives who will be responsible for coordinating the requests for information from the national evaluation team and the contractor responsible for collecting compliance information from awardees. In addition, the national evaluation team has shared all data collection instruments, including the planning and implementation forms, with NASA’s compliance contractor, who is including them in a “tool kit” for the awardees. This will enable awardees to complete one form for both the national evaluator and the compliance contractor. Furthermore, NASA will identify a single point of contact for the awardees who will ensure that duplicative data collection is avoided. This year, no student surveys will be administered by NASA’s Office of Education Performance Management system, further reducing duplicative data collection.


A.5 Efforts to Minimize Burden on Small Business or Other Entities

No small businesses will be involved as respondents. The primary survey entities for data collection efforts described in this package are students, teachers, and awardees. Burden is minimized for all respondents by requesting only the minimum information required to meet study objectives. All primary data collection will be coordinated by the site administrators in partnership with the national evaluator, so as to reduce the burden on the SoI awardees and NASA Centers.


A.6 Consequences of Less-Frequent Data Collection

If the proposed student and teacher survey data were not collected, NASA would not fulfill its objectives in measuring change in important student and teacher outcomes that may be associated with participation in SoI. Without the planning and implementation data, NASA would not understand how the program models were intended to work or were actually implemented. In addition, NASA would not know what would be required to replicate the models, should they be associated with promising outcomes. Thus, by not collecting survey, planning, and implementation data, federal resources would be allocated and program decisions would be made in the absence of information about the actual activities provided by the SoI awardees and lessons learned.


A.7 Special Circumstances Requiring Collection of Information in a Manner Inconsistent with Section 1320.5(d)(2) of the Code of Federal Regulations

There are no special circumstances associated with this data collection.


A.8 Federal Register Comments and Persons Consulted Outside the Agency

In accordance with the Paperwork Reduction Act of 1995, NASA published a notice in the Federal Register announcing the agency’s intention to request an OMB review of data collection activities. The notice was published on January 10, 2011, in volume 76, number 6, page 1461, and provided a 30-day period for public comments. To date, no comments have been received.


The student and teacher survey instruments were developed by the national evaluators, Abt Associates, Inc. and staff from the Education Development Center (EDC): Ricky Takai, Principal Investigator; Hilary Rhodes, Project Director; Alina Martinez, Principal Associate; Kristen Neishi, Deputy Project Director; and Melissa Velez, Survey Analysis Task Manager; and Jacqueline DeLisi, Abigail Levy, and Yueming Jia at EDC. These surveys are based on the theory of change depicted in the SoI logic model and informed by the evaluators’ knowledge of the program. Items were selected from previously validated instruments where feasible. Feedback on the instruments was solicited from staff at NASA’s Office of Education. Surveys were then pilot tested, first during the SoI pilot in 2010 and then again in 2011, to ensure items were unambiguous and had face validity, that is, to learn whether they measure outcomes as intended.


Student Surveys

The Abt-EDC team revised the student surveys based on the lessons learned from last year’s pilot administration to alleviate respondent burden, clarify items, and ensure the inclusion of items that measure NASA’s outcomes of interest for SoI in FY2011. Given reports that younger middle school students had difficulty with certain items in the pilot study, as well as the results of a factor analysis, two student surveys were created, one for younger middle school students (4th and 5th grades) and one for older middle school students (6th through 9th grades). Two versions were prepared because children in different age groups differ substantively in terms of their cognitive ability and life experience, which may lead to different levels of understanding of the questions in a measure. For instance, children aged 7 to 10 may not be able to answer questions designed for older youth because of their limited reading skills or limited understanding of the content.4 In addition, questions designed for children in one age group may not be relevant to, or may be interpreted differently by, children in other developmental stages.5 Ignoring these age-related differences could undermine both the reliability and validity of the measures.


In the current evaluation, three well-established measures of attitude and interest were used to create the pilot student survey: an attitude measure adapted from the School and Social Experiences Questionnaire6 and two interest measures from the Test of Science Related Attitudes.7 All three measures were developed and validated only in samples of students above 6th grade. To make the measures appropriate for the younger students in our sample, they were revised based on two criteria, language level and life relevancy, and two different versions of the survey were created, one for 4th and 5th graders and another for 6th, 7th, 8th, and 9th graders. For the attitude and leisure interest measures, the same questions were used in both surveys, as all questions in the two measures required a 3rd grade language skill level and were relevant to the life experience of students in both age groups.


The career interest measure was revised substantively for 4th and 5th graders because many questions in it were beyond the life experience of children in this age group (e.g., “Working in a science lab would be an interesting way to earn a living.”). Only three questions are used to assess the career interest of 4th-5th graders (e.g., “I would like to be a science teacher when I grow up”; “I do not want to be a scientist when I grow up”); the wording of these questions was modified to be appropriate for younger children (the 6th-9th grade version uses the item, “I do not want to be a scientist when I leave school”). These revised items were piloted with 4 students in 3rd through 5th grade, and no problems in comprehension were reported. Furthermore, the career items on the 4th-5th grade student survey version will be analyzed as individual items and not as a scale.


Several additional modifications that altered the length of the survey were also made after the pilot in 2010. First, the questions about students’ contact information (e.g., name of parent/guardian), gender, ethnicity, and race were removed; these items are now included as part of the parent consent form. Second, as NASA is confident that all SoI programs will address science but is not sure whether all will address math, the items regarding interest in STEM were limited to science topics only, dropping the pilot items regarding math. Finally, as the outcomes of interest have changed since the pilot in 2010, a few items were added to the student survey regarding students’ interest in participating in future informal science activities. These items had been previously piloted as part of the NASA Explorer Schools evaluation.


The Abt-EDC team then tested the revised student surveys with seven students (four students in 3rd through 5th grade and three students in 6th through 9th grade) to estimate the time for completion and the understandability of the text. It took students between 4 and 15 minutes, averaging 7.9 minutes, to complete the surveys. The student surveys were further revised so that the instructions were shorter and included more age-appropriate wording.


Teacher Surveys

Teacher surveys were also substantially shortened from the version piloted in 2010 as NASA refined its expectations for their role. Questions about personal science teaching efficacy, science teaching outcome expectancy (the extent to which teachers believe that certain behaviors lead to improved student outcomes), use of traditional teaching practices, use of strategies to develop students’ abilities to communicate ideas, use of laboratory activities, and current need for professional development in STEM areas were removed, as they are no longer the outcomes of interest for SoI in FY2011. Instead, NASA needs data for its monitoring and compliance responsibilities to learn whether teachers access and use NASA content and resources outside of the SoI program and whether they are more comfortable teaching the NASA topics.


The Abt-EDC team then tested the revised surveys with six former and current secondary STEM teachers to estimate the time for completion and the understandability of the text. It took between 5 and 20 minutes, averaging 8 minutes, to complete the surveys. Based on the pilot, we learned that all teachers interpreted “how comfortable you are teaching” as a combination of their understanding of the material and their ability to use it in their classrooms. Slight modifications were made to the teacher survey based on the pilot test, specifically to clarify what is meant by “NASA resources.”

A.9 Payments to Respondents

There will be no payments to respondents.


A.10 Assurance of Confidentiality

Every effort will be made to maintain the privacy of respondents to the extent provided by law, including the use of several procedural and control measures to protect the data from unauthorized use. Collected data will not be released with individual identifiers, and results will be presented only in aggregated form. A statement to this effect will be included on the first page of each teacher survey and will be read to students before administering the survey; it will also be read to awardees prior to conducting focus groups and interviews (See Appendices 2, 3, 15 & 17 for draft versions of the consent form read to students prior to survey administration, and the focus group consent script). Respondents will be assured that all information identifying them will be kept private.


The procedures to protect data during information collection, data processing, and analysis activities are as follows:


  • All respondents included in the study sample will be informed that the information they provide will be used only for the purpose of this research. Individuals will not be cited as sources of information in prepared reports.

  • Hard-copy data collection forms will be delivered to a locked area at the contractor’s office for receipt and processing. The contractor will maintain restricted access to all data preparation areas (i.e., receipt, coding, and data entry). All data files on multi-user systems will be under the control of a database manager, with access limited to project staff on a “need-to-know” basis only.

  • Individual identifying information will be maintained separately from completed data collection forms and from computerized data files used for analysis.


The national evaluation team will also have the data collection protocols and surveys reviewed by Abt’s Institutional Review Board (IRB), a process which has already been initiated and will be completed once the awardees have been selected. Prior to their use, Abt’s IRB will approve the data collection instruments, including student and teacher surveys, the focus group protocol, the consent script, and the planning and implementation forms. The IRB will assure that the data collection protocols and procedures, including consent forms, abide by strict privacy procedures.


A.11 Questions of a Sensitive Nature

The questions included on the data collection instruments for this study do not involve sensitive topics, and respondents may skip items if they so wish.


A.12 Estimates of Respondent Burden

Exhibit 2 presents estimates of the reporting burden for the student surveys, the teacher surveys, and the planning and implementation reporting. We estimate that the annualized response burden for the entire evaluation is 1,725 hours for students at awardees (including the time associated with the second follow-up survey, which will occur outside the cleared period and thus will be part of a third package), 1,150 hours for students at NASA Centers, 925 hours for teachers (750 hours for the baseline and two follow-up surveys, including time associated with the second follow-up survey that will be part of the third package, and 175 hours associated with the school-year teacher implementation form), 3,333 hours for parents related to the parent consent form (addressed in the first package), and 330.3 hours for awardees. The awardee burden is the sum of the burden for the summer and school-year planning forms (10 hours), the student summer and school-year implementation forms (268 hours), the summer and school-year professional development implementation forms (12.3 hours), the spring focus groups (20 hours; addressed in the first package), and the fall focus groups (20 hours). The total burden associated with this evaluation is 7,463.67 hours.8


This estimate assumes that it will take both students and teachers about 10 minutes to read the survey’s introduction and answer the questions. Estimates for the student hour burden are based on time requirements from similar surveys conducted on comparable evaluations and this spring’s pilot testing, where the seven participating students took as little as 3 minutes and as much as 15, averaging 8 minutes to complete this year’s surveys. Estimates for the teacher hour burden are based on time requirements from similar surveys conducted on comparable evaluations and this spring’s pilot test, where the 6 participating teachers took between 5 and 20 minutes, averaging 8 minutes to complete the surveys.


Burden associated with the collection of implementation data is estimated as follows. Given the time it took to complete planning and implementation forms last year, and what the national evaluator has observed in similar evaluations, we expect each of the summer and school-year planning forms to require 30 minutes from each awardee (a total burden of 10 hours for both planning forms across awardees). Assuming that each awardee has 150 student classroom sessions during the summer, the maximum number indicated in the proposals, and that the implementation form takes 10 minutes, the total summer classroom reporting burden is 250 hours. Assuming that no more than two professional development sessions will be held at each site during the summer (as discussed in the proposals), and that these forms take 10 minutes, the professional development implementation reporting in the summer will require 3.3 hours.


Implementation reporting for the school year will occur through two channels. For the nine awardees that plan to coordinate their school-year activities, the burden estimate for the school-year implementation reporting forms assumes no more than 12 student activities and 6 professional development sessions per awardee. As the forms are essentially the same as the summer forms, each should take no more than 10 minutes, for a burden of 18 hours for student activity reporting and 9 hours for professional development reporting across all sites. One likely awardee, however, does not appear to offer structured activities or professional development sessions during the school year; instead, the teachers who are trained in the summer are expected to lead SoI activities independently. To collect implementation data, we will send the 150 teachers at this site 7 monthly teacher implementation forms, which, as a shortened version of the implementation reporting form, should take no more than 10 minutes each (a total burden of 175 hours).
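
As a quick check, the implementation reporting burdens described in the preceding two paragraphs follow directly from the stated assumptions (10 national awardees, 9 of which coordinate school-year activities):

```latex
\begin{align*}
&\text{Planning forms (summer and school year): } 10 \times 2 \times 30\ \text{min} = 600\ \text{min} = 10\ \text{hours}\\
&\text{Summer classroom reporting: } 10 \times 150 \times 10\ \text{min} = 15{,}000\ \text{min} = 250\ \text{hours}\\
&\text{Summer PD reporting: } 10 \times 2 \times 10\ \text{min} = 200\ \text{min} \approx 3.3\ \text{hours}\\
&\text{School-year student activity reporting: } 9 \times 12 \times 10\ \text{min} = 1{,}080\ \text{min} = 18\ \text{hours}\\
&\text{School-year PD reporting: } 9 \times 6 \times 10\ \text{min} = 540\ \text{min} = 9\ \text{hours}\\
&\text{Monthly teacher implementation forms: } 150 \times 7 \times 10\ \text{min} = 10{,}500\ \text{min} = 175\ \text{hours}
\end{align*}
```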


Qualitative implementation data will also be collected through focus groups. Principal investigators will be asked to participate in focus groups scheduled for 2 hours during the kick-off meeting in May 2011 and 2 hours in the fall. Assuming that all attend, the total burden of these focus groups is 20 hours per session (40 hours in total).


A.13 Estimates of the Cost Burden to Respondents

We estimate that the annualized cost burden is $12,506.25 for students at awardees (the baseline and two follow-up surveys), $8,337.50 for students at NASA Centers, $18,694.25 for teachers ($15,157.50 for the baseline and two follow-up surveys and $3,536.75 for the school-year teacher implementation forms), $79,766.67 for parents, and $8,557.28 for awardees, which includes the cost burden associated with the summer and school-year planning forms ($239.30), the student summer ($5,982.50) and school-year ($430.74) implementation forms, the summer ($79.77) and school-year ($215.37) professional development implementation forms, and the spring and fall focus groups ($1,609.60). Please see Exhibit 2.


The cost burden associated with the surveys is estimated as follows: for students, we used the federal minimum wage; for teachers, we used the median income of middle school teachers (as of April 15, 2011); and for parents, we used the 2009 national median income.


For the cost burden associated with the collection of implementation data, we assumed that:


  • Awardee evaluation coordinators will complete the planning and implementation forms;

  • Teachers will complete teacher school-year implementation forms; and

  • Awardee principal investigators will participate in the focus groups.


The cost per hour burden for filling out the implementation and school-year planning forms, as well as participating in the interviews, is calculated using the 2009 national median income. The cost per hour burden for PIs’ participation in focus groups is based on the assumption that this year’s PIs will be similar to those who participated in the pilot, several of whom were associate professors at baccalaureate institutions. As such, we calculated the burden based on the national average annual salary of associate professors. There are no annualized capital/startup costs or ongoing operation and maintenance costs associated with collecting the data. Other than their time to complete the surveys and forms, as well as the time to participate in interviews and focus groups, which is estimated in Exhibit 2, there are no direct monetary costs to respondents.
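
To make the wage conversions explicit, the hourly rates used in Exhibit 2 follow from the annual figures cited in its notes, assuming a 40-hour work week over 52 weeks (2,080 hours per year):

```latex
\begin{align*}
\text{Parents (2009 national median income): } & \$49{,}777 \div (52 \times 40) \approx \$23.93\ \text{per hour}\\
\text{Teachers (median middle school teacher income): } & \$42{,}033 \div (52 \times 40) \approx \$20.21\ \text{per hour}\\
\text{Principal investigators (average associate professor salary): } & \$83{,}700 \div (52 \times 40) \approx \$40.24\ \text{per hour}\\
\text{Students (federal minimum wage): } & \$7.25\ \text{per hour}
\end{align*}
```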


A.14 Estimates of Annualized Government Costs

This information collection activity has been developed in the performance of Contract Number NNH08CD70Z (Task Order NNH11CC54D). Under this contract, it will cost approximately $40,795 to update SoI’s pilot survey instruments. The awardees’ evaluation coordinators will collect the survey and planning/implementation data.


A.15 Changes in Hour Burden

This is a new collection of information.


Exhibit 2. Estimates of Annualized Burden Hours and Cost for Data Collection

| Data Collection Sources | OMB Package Number | Number of Respondents | Frequency of Response | Total Minutes per Respondent | Total Response Burden in Hours | Estimated Cost Per Hour | Total Cost Burden |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Awardee & Center Parent Surveys | 1 | 40,000 | 1 | 5 | 3,333 | $23.93 a | $79,766.67 |
| Awardee Student Surveys | 2 & 3 | 3,450 b | 3 | 30 | 1,725 | $7.25 c | $12,506.25 |
| NASA Center Student Surveys | 2 | 3,450 | 2 | 20 | 1,150 | $7.25 | $8,337.50 |
| Awardee Teacher Surveys | 2 & 3 | 1,500 | 3 | 30 | 750 | $20.21 d | $15,157.50 |
| Awardee Planning Forms (summer & school year) e | 1 & 2 | 10 f | 2 | 60 | 10 | $23.93 | $239.30 |
| Student Summer Implementation Form | 2 | 1,500 g | 1 | 10 | 250 | $23.93 | $5,982.50 |
| Student School-Year Implementation Form | 2 & 3 | 108 h | 1 | 10 | 18 | $23.93 | $430.74 |
| Summer PD Implementation Form | 2 | 20 i | 1 | 10 | 3.3 | $23.93 | $79.77 |
| School-Year PD Implementation Form | 2 & 3 | 54 j | 1 | 10 | 9 | $23.93 | $215.37 |
| School Year Teacher Implementation Form | 2 & 3 | 150 k | 7 | 70 | 175 | $20.21 | $3,536.75 |
| Awardee Planning Focus Groups (May 2011) | 1 | 10 | 1 | 120 | 20 | $40.24 l | $804.80 |
| Awardee Lessons Learned Focus Groups (fall 2011) | 2 | 10 | 1 | 120 | 20 | $40.24 | $804.80 |
| Total Burden for Evaluation | | 50,262 | | | 7,463.67 | | $127,861.94 |

Notes:

a Estimated cost per hour for parents is calculated based on the national median income of $49,777 (~$23.93 per hour, assuming a 40-hour work week) for 2009, according to the Current Population Survey (http://www.census.gov/prod/2010pubs/p60-238.pdf, retrieved on March 9, 2011).

b Number of respondents based on estimated sample size, according to power calculations discussed in Supporting Statement B.

c Estimated cost per hour for students is calculated based on federal minimum wage of $7.25 per hour effective July 24, 2009.

d Estimated cost per hour for teachers is calculated by median income of middle school teachers of $42,033 (as of April 15, 2011), or $20.21 per hour (http://www.payscale.com/research/US/All_K-12_Teachers/Salary).

e Assumes that the evaluation coordinator – not the PI – completes the implementation forms. Estimated cost per hour based on the national median income.

f NASA intends to fund 10 NASA Centers and 10 national awards. However, only national awardees are expected to participate in focus groups and complete planning & implementation forms.

g Not all proposals identify how many classroom sessions will be implemented; of those that do discuss their structure, the greatest number of classroom sessions noted is 150; accordingly, we assume this value across all awardees, recognizing that it may exceed what is actually implemented, to provide a conservative estimate of burden. Furthermore, we assume that the class instructor will complete the reporting forms.

h Not all proposals identify how many school-year student activities will be implemented; of those that do, the greatest number of student activities is 12; accordingly, we assume this value across all awardees recognizing that it may exceed what is actually implemented, to provide a conservative estimate of burden. Furthermore, we assume that the class instructor will complete the reporting forms. Finally, note that the burden calculations reflect that only 9 awardees are planning school-year activities.

i Not all proposals identify how many PD sessions will be provided in the summer; of those that do discuss their structure, the greatest number of PD sessions is 2; accordingly, we assume this value across all awardees recognizing that it may exceed what is actually implemented, to provide a conservative estimate of burden. Furthermore, we assume that the class instructor will complete the reporting forms.

j Not all proposals identify how many PD sessions will be implemented in the school year; of those that do, the greatest number of PD sessions is 6; accordingly, we assume this value across all awardees, recognizing that it may exceed what is actually implemented, to provide a conservative estimate of burden. Furthermore, we assume that the class instructor will complete the reporting forms. Finally, note that the burden calculations reflect that only 9 awardees are planning coordinated school-year activities.

k Note that only teachers associated with one awardee will fill out the teacher school-year implementation forms, as the other 9 are expected to provide coordinated school-year activities for which the awardees will submit implementation forms.

l Estimated cost per hour for PIs is calculated based on the assumption that, as last year, PIs will likely be associate professors, whose national average annual salary is $83,700 (~$40.24 per hour, assuming a 40-hour work week) for 2009-2010 at baccalaureate institutions, as calculated using American Association of University Professors survey results (http://chronicle.com/article/Searchable-Database-AAUP/64231/, retrieved on March 4, 2011).

A.16 Time Schedule, Publication, and Analysis Plan

The schedule shown in Exhibit 3 displays the sequence of activities required to conduct the information collection activities and includes key dates for activities related to data collection, analysis, and reporting. Two evaluation reports based on findings from the surveys and implementation data will be prepared; one following the completion of summer activities (fall 2011) and one after the completion of the school-year activities (summer 2012).


Exhibit 3. SoI Schedule

| Activities and Deliverables | Responsible Party | Date |
| --- | --- | --- |
| Development & refinement of instruments | National evaluator | January – April 2011 |
| Parent consent form & associated short survey collection | National evaluator & site administrators | May – June 2011 |
| Student survey data collection | National evaluator & site administrators | June – August 2011; March 2012 * |
| SoI kick-off meeting and planning focus groups | NASA & national evaluator | May 2011 |
| SoI planning forms submission | Awardees | May – June 2011 |
| Teacher survey data collection | National evaluator & site administrators | June – August 2011; March 2012 * |
| SoI implementation forms submission | Awardees | July – August 2011 |
| Data analysis of baseline/follow-up student and teacher surveys, implementation data | National evaluator | Fall/Winter 2011 |
| SoI “Lessons Learned” meeting and implementation focus groups | NASA & national evaluator | Fall 2011 |
| Expert panel review meeting | NASA & national evaluator | Fall 2011 |
| National Report #1 | National evaluator | Fall 2011 |
| Post school-year PI focus groups | National evaluator | Spring 2012 * |
| Data analysis of post-school-year student and teacher surveys, implementation data | National evaluator | Summer 2012 |
| National Report #2 | National evaluator | Summer 2012 |

* Data collection activity will be included in a subsequent clearance package that will be submitted before school-year activities begin as emergency clearance likely will not cover the school-year data collection efforts.


The national evaluator will conduct analyses on survey and implementation data to assess changes in student and teacher outcomes over time, how awardees implemented their activities, and how the two might be related. Survey data will be analyzed separately for NASA Centers and awardees (planning and implementation data will only be collected from national awardees).


Analysis of Survey Data

Below, the analysis plan for the survey data is summarized. It is discussed in fuller detail in Supporting Statement B.


Descriptive Cross-Sectional Analyses

The evaluation team will calculate representative, cross-sectional proportions and averages of student outcomes at the student level across all awardees/Centers and at the awardee/Center level, adjusting them for the sampling design by applying a calculation algorithm described in Supporting Statement B. Using the overall awardee/Center weight will allow for statements like “the percent of students that ...,” such that the figure corresponds to the percent of students out of all SoI students in the country, not just the students who happen to be in the sample. Using the same calculation algorithm, but adjusting the weight to reflect all students at a particular awardee/Center, will allow for the calculation of statistics that are representative of all students at a particular awardee/Center (i.e., “the percent of students at Awardee A that ...”).
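
For illustration only, the sketch below shows how such weighted cross-sectional estimates could be computed; the file and column names (student_baseline.csv, interested_in_science, weight, site_weight, site_id) are hypothetical, and the weights themselves would come from the algorithm described in Supporting Statement B, not from this sketch.

```python
# Illustrative sketch only, not the evaluation's production code.
# Assumes a respondent-level file with hypothetical columns:
#   site_id                awardee/Center identifier
#   interested_in_science  1 = yes, 0 = no (example outcome)
#   weight                 overall awardee/Center weight (per Supporting Statement B)
#   site_weight            weight rescaled to represent all students at one site
import pandas as pd

def weighted_proportion(df: pd.DataFrame, outcome: str, weight: str) -> float:
    """Weighted proportion of respondents with outcome == 1."""
    return (df[outcome] * df[weight]).sum() / df[weight].sum()

surveys = pd.read_csv("student_baseline.csv")  # hypothetical analysis file

# "The percent of all SoI students in the country that ..."
overall = weighted_proportion(surveys, "interested_in_science", "weight")

# "The percent of students at Awardee A that ..." (one estimate per site)
by_site = surveys.groupby("site_id").apply(
    lambda g: weighted_proportion(g, "interested_in_science", "site_weight")
)

print(f"Overall weighted proportion: {overall:.1%}")
print(by_site)
```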


Because the entire universe of teachers will be surveyed, the descriptive statistics for a single point in time do not need to be adjusted for sampling design. Means and standard deviations will be used to describe central tendency and variation for survey items using continuous scales. Frequency distributions and percentages will be used to summarize answers given on ordinal scales. Descriptive analyses about all awardees will be conducted on all teacher respondents, while descriptive analyses about teachers within a particular awardee will be restricted to respondents from that awardee.


Descriptive Change Over Time Analyses

The evaluation team will examine the student and teacher survey data to provide simple descriptions of change in a variable over time. For the student surveys, we will test whether the difference in proportions and means between two time points is zero using z-tests and t-tests that take into account that the samples are overlapping. Namely, the standard errors, the precision of the estimate, will be computed taking into consideration that the samples are not independent (i.e., the same students take the surveys at different points in time) and that the estimation of variance must include covariance. For the teacher surveys, we will test whether the difference in proportions and means between two time points is zero using a McNemar test or paired t-test, depending on the distribution of the outcome variables. Both the student and teacher statistical tests are distinct from a model where the relationship between some predictor variable(s) and the change in the outcome variable over time is assessed.
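
As an illustrative sketch (not the evaluator's actual code), the paired comparisons described above could be run as follows; the data files and variable names are hypothetical, and the production analysis of the student data would additionally incorporate the sampling weights and design adjustments described in Supporting Statement B.

```python
# Illustrative sketch only: paired tests of change between baseline and follow-up
# for the same respondents (overlapping samples). File and column names are hypothetical.
import pandas as pd
from scipy.stats import ttest_rel
from statsmodels.stats.contingency_tables import mcnemar

baseline = pd.read_csv("teacher_baseline.csv")
followup = pd.read_csv("teacher_followup.csv")

# Merge the two waves on a respondent identifier so each row holds both measurements.
merged = pd.merge(baseline, followup, on="respondent_id", suffixes=("_t1", "_t2"))

# Continuous outcome (e.g., a comfort-in-teaching scale score): paired t-test,
# which accounts for the covariance between the two waves.
t_stat, p_value = ttest_rel(merged["comfort_scale_t1"], merged["comfort_scale_t2"])
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")

# Binary outcome (e.g., used NASA resources, 1/0): McNemar test on the 2x2
# table of baseline-by-follow-up responses.
table = pd.crosstab(merged["used_resources_t1"], merged["used_resources_t2"])
result = mcnemar(table, exact=True)
print(f"McNemar test: p = {result.pvalue:.3f}")
```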


Analysis of Implementation Data

Analysis of the implementation forms will be descriptive, using counts, ranges, frequencies, means, and standard deviations. Notes from the focus groups will be coded using NVivo, a qualitative analysis software program that facilitates tagging and retrieval of data associated with selected themes, and content analyzed. The implementation data will allow us to explore how summer activities were implemented and how strategies were similar or different between awardees. Further, implementation data will be used to explore associations with survey outcomes and to generate hypotheses.


A.17 Display of Expiration Date for OMB Approval

NASA is not requesting a waiver for the display of the OMB approval number and expiration date on the data collection instruments.


A.18 Exceptions to Certification Statement

This submission does not require an exception to the Certificate for Paperwork Reduction Act (5 CFR 1320.9).

References

Barnette, J.J. (2000). Effects of stem and Likert response option reversals on survey internal consistency: If you feel the need, there is a better alternative to using those negatively worded stems. Educational and Psychological Measurement, 60, 361-370.

Capaldi, D.M., & Rothbart, M.K. (1992). Development and validation of an early adolescent temperament measure. Journal of Early Adolescence, 12, 153-173.

Conrad, F.G., Couper, M.P., Tourangeau, R., & Peytchev, A. (2006). Use and non-use of clarification features in web surveys. Journal of Official Statistics, 22, 245-269.

Couper, M.P. (2008). Designing Effective Web Surveys. Cambridge, England: Cambridge University Press.

Fraser, B.J. (1981). TOSRA: Test of Science Related Attitudes handbook. Hawthorn, Victoria, Australia: Australian Council for Educational Research.

Galesic, M., Tourangeau, R., Couper, M.P., & Conrad, F. (2008). Eye-tracking data: New insights on response order effects and other cognitive shortcuts in survey responding. Public Opinion Quarterly, 72, 892-913.

Heerwegh, D., & Loosveldt, G. (2003). An evaluation of the semiautomatic login procedure to control web survey access. Social Science Computer Review, 21, 223-234.

Peytchev, A., Couper, M.P., McCabe, S., & Crawford, S. (2006). Web survey design: Paging versus scrolling. Public Opinion Quarterly, 70, 596-607.

Redline, C., & Dillman, D. (2002). The influence of alternative visual designs on respondents’ performance with branching questions in self-administered questionnaires. In R.M. Groves, D.A. Dillman, J.A. Eltinge, & R.J.A. Little (Eds.), Survey Nonresponse (pp. 179-193). New York: Wiley.

Sengstock, M.C., & Hwalek, M. (1998). Issues to be considered in evaluating programs for children and youth. Paper presented at the American Sociological Association, San Francisco, CA.

Singh, K., Chang, M., & Dika, S. (2006). Affective and motivational factors in engagement and achievement in science. International Journal of Learning, 12(6), 1447-9540.

U.S. Department of Health and Human Services (HHS). (2006). Research-Based Web Design & Usability Guidelines. Washington, D.C.: Government Printing Office.


1 Please note: this package is the second of three for SoI FY2011. It focuses on the data collection efforts scheduled to begin in June 2011 and conclude by November 30, 2011. The first package, submitted on March 25, 2011, requested clearance to collect parent consent forms/surveys and awardee planning information. The third package will include materials for activities occurring between December 2011 and March 2012, which will take place outside the six month period provided by an emergency clearance.


2 The emergency clearance would not include survey data collected in the third wave as it falls outside of the cleared time frame. We include it in the description to articulate our vision for the national evaluation. A subsequent OMB package will be prepared to obtain clearance for the third wave of survey data.

3 The emergency clearance would not include survey data collected in the third wave as it falls outside of the cleared time frame. We include it in the description to articulate our vision for the national evaluation. A subsequent OMB package will be prepared to obtain clearance for the third wave of survey data.

4 Sengstock, M.C. & Hwalek, M. (1998). Issues to be considered in evaluating programs for children and youth. Paper presented at the American Sociological Association, San Francisco, CA.

5 Capaldi, D.M. & Rothbart, M. K. (1992). Development and validation of an early adolescent temperament measure. Journal of Early Adolescence 12, 153-173.

6 Singh, K., Chang, M., & Dika, S. (2006). Affective and motivational factors in engagement and achievement in science. International Journal of Learning 12(6), 1447-9540.

7 Fraser, B. J. (1981). TOSRA test of science related attitudes handbook. Hawthorn, Victoria, Australia: Australia Council for Educational Research.

8 Note: small differences in sums due to rounding.
