
NASA Summer of Innovation

OMB: 2700-0142







SUPPORTING STATEMENT

FOR OMB CLEARANCE

PART A



NASA Summer of Innovation Pilot


SURVEY DATA COLLECTION





National Aeronautics and Space Administration




July 2, 2010













Part A: Justification

A.1 Explanation of Circumstances That Make Collection of Data Necessary

The National Aeronautics and Space Administration (NASA) Office of Education requests that the Office of Management and Budget (OMB) approve, under the Paperwork Reduction Act of 1995, an emergency clearance for NASA to conduct data collection efforts related to the evaluation of NASA's Summer of Innovation (SOI) program. The speed with which NASA has decided to move forward with the program has made emergency clearance a necessity to allow for the evaluation of the program, which begins implementation at many sites this month.


The NASA Office of Education is initiating a project called Summer of Innovation, which targets middle school students who are underperforming, underrepresented, and underserved in science, technology, engineering, and math (STEM) fields. The Summer of Innovation pilot project includes three categories of organizations: Space Grant Consortiums, NASA Center Partnerships, and a Sub-award. The Space Grant Consortium and Sub-award awards will be made through competitive cooperative agreements with states and with partnerships of universities, companies, and nonprofits. Across these SOI types, NASA will use its STEM assets, including the agency's scientists and engineers, to create multi-week summer learning programs. Space Grant Consortiums and the Sub-award implementing SOI will hire external local evaluators to document implementation and collect impact data for analysis. In addition, Abt Associates has been contracted to serve as the external national evaluator. It will work with the NASA Center Partnerships and the local evaluators hired by the Space Grant Consortiums and the Sub-award to provide technical assistance for the evaluation, to analyze impact data from across the project's sites, and to codify key lessons learned (best practices) from grantees' implementation experiences. In sum, the evaluation of the Summer of Innovation will be implemented across 11 sites, with around 21,095 students participating in summer 2010; the breakdown by SOI type is outlined below.


  • NASA Center Partnerships engaging ~10,000 students (the evaluation will include ~1,000 students at 6 NASA Center Partnerships)

  • 1 Sub-award site engaging ~5,000 students

  • 4 Space Grant Consortiums engaging ~673 educators and ~6,095 students


This clearance request pertains to the administration of the baseline and post-program teacher surveys (Appendices A and B) and the baseline and post-program student surveys (Appendices C and D). As described in more detail below, the instruments will be used to gather information on change, or growth, as it relates to SOI-supported activities. In particular, the national evaluator will examine students' STEM awareness, motivation, and career pathways across collaborative activities; it will also assess teachers' beliefs about their ability to teach science and to affect student learning (self-efficacy). The OMB clearance package provides a question-by-question justification for each item on the teacher and student survey forms, aligning the survey questions to the research questions of the evaluation (Appendix E). These outcome data are not available elsewhere and must be collected via surveys of students and teachers. The evaluation is an important opportunity to examine the extent to which SOI-supported activities collectively meet the objective of improving teacher and student outcomes, and to identify the collaborative activities with the greatest promise for generating results.



A.2 How the Information Will Be Collected, by Whom, and For What Purpose

How Information Will Be Collected and by Whom

This information collection request is for survey instruments (one for students and one for teachers); the student survey will be administered in four waves over a three-year period, and the teacher survey in two waves. The emergency clearance would cover the baseline and the first post-program follow-up, in early summer 2010 and late summer 2010, respectively. The first survey will be administered at baseline, prior to the start of SOI activities in early summer 2010. The second survey will be administered as a post-test immediately following SOI activities in late summer 2010, approximately one month after baseline. Local evaluators will implement and administer the survey at their grantee sites. The national evaluator, Abt Associates, Inc., will collect survey data at NASA Center Partnership sites selected by NASA; these selected sites will engage a total of approximately 10,000 students, of whom ~1,000 will be surveyed. The surveys will be consistent across the SOI sites because the national evaluator will develop and provide survey modules covering key student and teacher domains, or content, for inclusion in the sites' surveys, and will provide technical assistance on how to administer them.


For What Purpose

A key goal of the Summer of Innovation is to provide hands-on, intensive activities during the summer that motivate teachers and students toward academic success. Data gathered through the survey instruments will be used to assess whether this goal is attained: the data are key to determining whether the students and teachers attending Summer of Innovation activities made gains in STEM motivation, awareness, and efficacy. The evaluation goal, therefore, is to conduct an efficacy study determining whether participation in SOI is related to future student and teacher outcomes. A long-term evaluation goal is to scale up to an effectiveness study in the future.


The research questions for the national evaluation of Summer of Innovation include:

  1. To what extent and in what domains did student STEM motivation, awareness, and efficacy increase immediately after the Summer of Innovation collaborative activities?

  2. How did student outcomes vary by characteristics of the students and SOI collaborative activities?

  3. To what extent and in what domains did teacher efficacy, expectancy, and the use of inquiry-based pedagogy increase immediately after the Summer of Innovation collaborative activities?

  4. How did teacher outcomes vary by characteristics of the teacher and collaborative activities?


The table below outlines the research questions for the SOI efficacy evaluation, the outcomes to be measured, the data collection procedures, and the analytic methods.


Exhibit 1: Research Questions, Data, and Methods

Students

  Research question: To what extent and in what domains did student STEM motivation, awareness, and efficacy increase immediately after the Summer of Innovation collaborative activities?
  Data: Student outcomes survey
  Sample: 11 SOI sites, ~12,095 students
  Measures/Domains: Self-confidence in science, self-confidence in math, career interest in STEM, leisure interest in STEM
  Analytic approach: Descriptive statistics, t-test, multiple regression

  Research question: How did student outcomes vary by characteristics of the students and SOI collaborative activities?
  Data: Student outcomes survey, implementation data
  Sample: 11 SOI sites, ~12,095 students
  Measures/Domains: Student demographics, program attendance, program duration, self-confidence in science, self-confidence in math, career interest in STEM, leisure interest in STEM
  Analytic approach: Correlations, chi-squares, multiple regression

Teachers

  Research question: To what extent and in what domains did teacher efficacy, expectancy, and the use of inquiry-based pedagogy increase immediately after the Summer of Innovation collaborative activities?
  Data: Teacher outcomes survey
  Sample: 11 SOI sites, ~1,000 teachers
  Measures/Domains: Personal science teaching efficacy, science teaching outcome expectancy, use of traditional teaching practices, use of strategies to develop students’ abilities to communicate STEM ideas, use of laboratory activities
  Analytic approach: Descriptive statistics, t-test, multiple regression

  Research question: How did teacher outcomes vary by characteristics of the teachers and SOI collaborative activities?
  Data: Teacher outcomes survey, implementation data
  Sample: 11 SOI sites, ~1,000 teachers
  Measures/Domains: Teacher demographics, program attendance, program duration, personal science teaching efficacy, science teaching outcome expectancy, use of traditional teaching practices, use of strategies to develop students’ abilities to communicate STEM ideas, use of laboratory activities
  Analytic approach: Correlations, chi-squares, multiple regression
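
As illustration of the analytic approaches in Exhibit 1, the following is a minimal sketch, in Python, of how the pre/post comparison and the variation analysis might be run once survey responses are compiled into a single analysis file. The file name and all column names (sci_conf_pre, attendance_days, site_id, and so on) are hypothetical placeholders, not fields from the actual SOI instruments.

```python
# Sketch of the Exhibit 1 analyses; all file and column names are
# hypothetical placeholders, not actual SOI survey fields.
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

students = pd.read_csv("soi_student_prepost.csv")

# Pre/post change in one domain: paired t-test of baseline vs. follow-up.
t, p = stats.ttest_rel(students["sci_conf_post"], students["sci_conf_pre"])
print(f"Self-confidence in science: t = {t:.2f}, p = {p:.4f}")

# Variation by student and program characteristics: multiple regression
# of the gain score on implementation measures, with site fixed effects.
students["gain_sci_conf"] = students["sci_conf_post"] - students["sci_conf_pre"]
model = smf.ols(
    "gain_sci_conf ~ attendance_days + program_duration + C(site_id)",
    data=students,
).fit()
print(model.summary())
```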


The cross-site analyses conducted by the national evaluator will enable NASA to describe SOI's implementation and the associated outcomes. Data collection will include student and teacher outcomes collected by the local and national evaluators, and program and implementation data collected at the SOI sites. The Space Grant Consortiums and the External Sub-award, each funded through a grant, have hired local evaluators for their projects. The local evaluators are paid by grantees operating independently of the government in their evaluations; therefore, we cannot provide details about the work that will be conducted or the specific research questions they will answer. However, the local evaluators will be asked to assist in the collection of student and teacher outcome data and implementation data, and in the analysis and reporting for their specific sites. The NASA Center Partnerships do not have local evaluators; therefore, the national evaluator will collect the outcome data and the SOI sites will report implementation data. All 11 sites will provide their data to the national evaluator so that it can conduct a cross-site analysis of the efficacy of Summer of Innovation.


NASA will use the information from this study to determine the efficacy of an intense summer program such as SOI on students' and teachers' STEM motivation and awareness, and to explore how summer activities can affect longer-term student outcomes such as course-taking patterns and interest in STEM post-secondary education and careers. The data will also be useful for state and local policymakers, districts, schools, and other organizations in their efforts to improve academic outcomes for middle school students. An important goal of the evaluation is to produce findings that will trigger wide adoption of effective summer activities that improve STEM motivation and awareness among middle school students, particularly those who are underachieving and underrepresented. In addition, the data will be a resource to support additional research on STEM teaching and learning, as it relates to informal and K-12 education, by academic researchers and others.


To collect this information, the national evaluator will develop a series of reporting requirements that will ask each site to describe its project and to document what went well and what barriers were encountered during implementation. Sites will be asked to report how they were able to overcome hurdles successfully, as well as to outline how they would improve implementation in the future. A standardized template will be provided to all sites to facilitate the collection of this implementation data. The national evaluator will compile the information from across the sites, extracting key lessons learned and providing recommendations for subsequent implementations.


The SOI evaluation includes a sample of ~12,095 students and ~1,000 teachers across the Space Grant Consortium, External Sub-award, and NASA Center Partnership sites. The sample of ~12,095 students will be surveyed every year for three years. While our goal is to obtain an 80% response rate per wave of data collection, surveys will be distributed to the full sample each year. Exhibit 2 displays the timing of the data collection waves, the expected number of respondents, and a description of the survey administration.


Exhibit 2: Summary of Proposed Data Collection Activities

Survey | # of Respondents | Data Collection | Survey Administration | Timing
Student Survey | 12,095 | Baseline/Pre-test | Self-administered, paper and pencil | Early Summer 2010
Student Survey | 9,676 | Post-test/Follow-up 1 | Self-administered, paper and pencil | Late Summer 2010
Teacher Survey | 1,023 | Baseline/Pre-test | Self-administered, paper and pencil | Summer 2010
Teacher Survey | 818 | Post-test/Follow-up 1 | Self-administered, paper and pencil | Fall 2010

Note: Assumes an 80% response rate from the previous wave of data collection.
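
The projected counts in Exhibit 2 follow directly from the 80% retention assumption in the note above; the short sketch below reproduces the arithmetic.

```python
# Sketch of the response-rate arithmetic behind Exhibit 2: each wave
# assumes 80% of the previous wave's respondents.
def expected_respondents(baseline: int, rate: float, waves: int) -> list[int]:
    counts = [baseline]
    for _ in range(waves - 1):
        counts.append(round(counts[-1] * rate))  # retain `rate` of prior wave
    return counts

print(expected_respondents(12_095, 0.80, 2))  # students: [12095, 9676]
print(expected_respondents(1_023, 0.80, 2))   # teachers: [1023, 818]
```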



A.3 Use of Improved Information Technology to Reduce Burden

The data collection plan reflects sensitivity to issues of efficiency, accuracy, and respondent burden. Therefore, the surveys were designed to ask only questions whose answers are not available elsewhere. We will draw on other data sources (e.g., NASA Office of Education Performance Measures, state-level student data, district-level data) for student demographic data and other characteristics. The national evaluator will provide technical assistance to all 11 sites, as well as produce a report on data requirements, to assist sites in obtaining systematic and consistent data from other sources. Specifically, in states where it is feasible, student math and science state test scores will be obtained through the state department of education and, in some cases, through the local evaluator and/or the school district. The math and science test scores will be used as additional student outcomes to assess whether there is a relationship between SOI attendance and an increase in test scores. Other long-term outcomes, such as students' course-taking behavior in high school, will be collected in a similar manner.
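
As a hedged illustration only, linking extant state test records to the survey data might look like the sketch below; the file names, the state_student_id key, and the math_score column are illustrative assumptions, since each state's extant data layout will differ.

```python
# Illustrative record linkage; file names, the ID field, and the score
# column are assumptions, not actual state or SOI data layouts.
import pandas as pd

surveys = pd.read_csv("soi_student_prepost.csv")        # hypothetical
state_tests = pd.read_csv("state_math_sci_scores.csv")  # hypothetical

# Left join keeps every surveyed student, attaching test scores where a
# state-assigned student ID could be matched.
merged = surveys.merge(state_tests, on="state_student_id", how="left")
print(merged["math_score"].notna().mean())  # share of students matched
```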



Finally, the national evaluator’s electronic mail address and toll-free telephone number will be included on the front of the surveys for respondents who have questions. Taken together, these procedures are all designed to minimize the burden on respondents.


A.4 Efforts to Identify and Avoid Duplication

This effort will yield data to evaluate students' STEM motivation and awareness and teachers' science self-efficacy as they relate to SOI activities; no similar evaluation is being conducted, and there is no alternative source for the information to be collected. The national evaluator is providing the national data collection instruments to the local evaluators so that the sites' unique local instruments can be designed or revised to avoid collecting the same information. The national evaluator will also pre-populate implementation reporting forms using proposals and other materials, so that sites need only review the reports for accuracy and fill in any remaining blanks. As mentioned earlier, the evaluation will also use extant data from states, districts, and NASA performance measures to obtain demographic information.


Moreover, the national evaluator will provide technical assistance across all local evaluators and sites to ensure that there is no duplication of effort in collecting student and teacher data. The national evaluator will accomplish this by identifying the outcome and implementation data and the party responsible for collecting each, and by developing protocols that standardize each technical assistance provider's guidance to sites. In addition, weekly meetings convening all technical assistance providers will be held to discuss the questions they are fielding. Topics for the national evaluator's technical assistance will include survey and other outcome data collection techniques, implementation data points to collect, and reporting requirements.


A.5 Efforts to Minimize Burden on Small Business or Other Entities

The primary respondents for the study are students and teachers. Burden is minimized for all respondents by requesting only the minimum information required to meet study objectives. All primary data collection will be coordinated by the local evaluators and site administrators, and supported by the national evaluator, to reduce the burden on the Space Grant Consortiums, the Sub-award, and the NASA Center Partnerships. No small businesses will be involved as respondents.


A.6 Consequences of Less-Frequent Data Collection

If the proposed data were not collected, NASA would not fulfill its objective of measuring change, or growth, as it relates to SOI-supported activities. Federal resources would then be allocated and program decisions made in the absence of valid evidence of the effectiveness of SOI.


A.7 Special Circumstances Requiring Collection of Information in a Manner Inconsistent with Section 1320.5(d)(2) of the Code of Federal Regulations

There are no special circumstances associated with this data collection.


A.8 Federal Register Comments and Persons Consulted Outside the Agency

In accordance with the Paperwork Reduction Act of 1995, NASA published a notice in the Federal Register announcing the agency's intention to request an OMB review of data collection activities. The notice was published on June 9, 2010, in volume 75, number 110, page 32817, and provided a 30-day period for public comments. To date, no comments have been received.


The survey instruments were developed by the national evaluation team, comprising staff from Abt Associates, Inc. and the Education Development Center (EDC): Alina Martinez, Principal Investigator; Ryoko Yamaguchi; Hilary Rhodes, Project Director; Kristen Neishi, Deputy Project Director; and Jackie DeLisi, Linda Hirsch, and Daphne Minner of EDC. Feedback on the surveys was solicited from staff at NASA's Office of Education.

A.9 Payments to Respondents

There will be no payments to respondents.


A.10 Assurance of Confidentiality

Every effort will be made to maintain the privacy of respondents, using several procedural and control measures to protect the data from unauthorized use. Abt Associates, the national evaluator, will follow procedures for ensuring and maintaining confidentiality, consistent with the Family Educational Rights and Privacy Act of 1974 (20 USC 1232g), the Privacy Act of 1974 (P.L. 93-579, 5 USC 552a), and the Federal common rule or Department final regulations on the protection of human research subjects. Data will not be released with individual student or teacher identifiers, and data will be presented only in aggregated form. A statement to this effect will be included in a letter accompanying each survey and will be read to students before the survey is administered. Respondents will be assured that all information identifying them will be kept confidential, in compliance with the legislation (P.L. 103-382).


The procedures to protect data during information collection, data processing, and analysis activities are as follows:


  • All respondents included in the study sample will be informed that the information they provide will be used only for the purpose of this research. Individuals will not be cited as sources of information in prepared reports.

  • To ensure data security, all individuals hired by the contractor are required to adhere to strict standards and sign an oath of confidentiality as a condition of employment.

  • Hard-copy data collection forms will be delivered to a locked area at the contractor’s office for receipt and processing. The contractor will maintain restricted access to all data preparation areas (i.e., receipt, coding, and data entry). All data files on multi-user systems will be under the control of a database manager, with access limited to project staff on a “need-to-know” basis only.

  • Individual identifying information will be maintained separately from completed data collection forms and from computerized data files used for analysis. No respondent identifiers will be contained in public use files made available from the study, and no data will be released in a form that identifies individuals.


The national evaluator, Abt Associates, will also have all data collection protocols and surveys reviewed by its Institutional Review Board (IRB). Prior to their use, Abt's IRB will approve all data collection instruments, including the parental and teacher consent forms and the student assent forms. The IRB will ensure that the data collection protocols and procedures, including consent forms, abide by strict confidentiality procedures. See Appendices F and G for draft versions of the consent and assent forms.


A.11 Questions of a Sensitive Nature

The questions included on the data collection instruments for this study do not involve sensitive topics.


A.12 Estimates of Respondent Burden

Exhibit 3 presents estimates of the annualized reporting burden for the surveys. The student surveys will be administered four times over a three-year period. To provide a conservative estimate of burden, we assume a response rate of 100 percent, so that 12,095 students are surveyed annually. The ~1,000 teachers will be surveyed twice, in one year only. Estimates of the hour burden are based on time estimates provided by the developers of the original surveys that were adapted for the current evaluation, and on similar surveys conducted in comparable evaluations.


Exhibit 3: Estimates of 2010 Annualized Burden Hours and Cost

Data Collection Source | Number of Respondents a, b | Frequency of Response | Total Minutes per Respondent | Response Burden in Hours | Estimated Cost per Hour c | Cost per Respondent | Total Burden (Cost)
Student Surveys | 12,095 | 4 | 80 | 16,127 | $7.25 | $9.67 | $116,959
Total Annualized Burden (students only) | 12,095 | 1.33 | 27 | 5,376 | $7.25 | $3.26 | $39,460
Teacher Surveys (both waves in 2010) | 1,000 | 2 | 40 | 667 | $20.21 | $13.47 | $13,480

Notes:

a Number of respondents is based on the estimated baseline sample size, according to NASA estimates from SOI proposals.

b Assumes a 100% response rate from the previous wave of data collection.

c Estimated cost per hour for students is based on the federal minimum wage of $7.25 per hour, effective July 24, 2009. Estimated cost per hour for teachers is based on the 2010 median income of middle school teachers, $42,033 per year or $20.21 per hour (http://www.payscale.com/research/US/All_K-12_Teachers/Salary). As such, it represents a conservative estimate of the cost to respondents.
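
The hour and cost figures in Exhibit 3 follow from respondents × minutes × hourly rate; the sketch below reproduces the arithmetic. Note that the exhibit's student total is computed from the rounded per-respondent cost of $9.67 (12,095 × $9.67 ≈ $116,959), hence the small difference from hours × rate.

```python
# Reproduces the Exhibit 3 burden arithmetic.
def burden(respondents: int, minutes_per_resp: int, hourly_rate: float):
    hours = round(respondents * minutes_per_resp / 60)  # response burden in hours
    return hours, round(hours * hourly_rate)            # (hours, total cost)

print(burden(12_095, 80, 7.25))   # students: (16127, 116921); per-respondent basis gives ~$116,959
print(burden(1_000, 40, 20.21))   # teachers: (667, 13480)
```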



A.13 Estimates of the Cost Burden to Respondents

There are no annualized capital/startup or ongoing operation and maintenance costs associated with collecting the data. Other than the time to complete the surveys, which is estimated in Exhibit 3, there are no direct monetary costs to respondents.


A.14 Estimates of Annualized Government Costs

This information collection activity has been developed in the performance of the Contract Number: GS-10F-0086K (Task Order NNH10CC70D). Under this contract, approximately $82,727 is budgeted for the survey data collection instruments, which includes instrument development. The local evaluators and site administrators will collect the survey data.


A.15 Changes in Hour Burden

This is a new collection of information.


A.16 Time Schedule, Publication, and Analysis Plan

The schedule in Exhibit 4 displays the sequence of activities required to conduct the information collection and includes key dates for data collection, analysis, and reporting.


Exhibit 4: Time Schedule

Activities and Deliverables | Responsible Party | Date
Identify and Develop Teacher and Student Survey Modules | National evaluator | May – June 2010
Survey Data Collection | Local evaluators & site administrators | June – August 2010
Data Analysis of Pre/Post Survey | Local evaluators & national evaluator | August – September 2010
Cross-site Data Analysis | National evaluator | September – October 2010
Cross-site Reporting | National evaluator | October 2010

The Space Grant Consortiums and the Sub-award will each have a local evaluator conducting the local evaluation, which includes collecting data on participating students and teachers and analyzing and reporting on the local program. For the baseline and first post-test at the NASA Center Partnerships, site administrators will collect the data and complete reports, which they will provide to the national evaluator. For subsequent waves, the national evaluator will track the students and administer the surveys.


The national evaluator will provide cross-site analyses to explore whether Summer of Innovation as a whole had an effect on students' and teachers' short-term outcomes. The surveys will be used to measure changes in student awareness, motivation, and engagement in STEM, and in teachers' self-efficacy for teaching science. The main analyses will use descriptive statistics and gain scores (the difference between the baseline and the follow-up) to examine change over time on short-term outcomes. A cross-site analytic evaluation report will be prepared yearly based on findings from the surveys; the first of the main reports is the short-term outcomes report due in October 2010 (the cross-site reporting deliverable in Exhibit 4).
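
A minimal sketch of the gain-score computation described above, with a hypothetical data file and column names standing in for the compiled cross-site data:

```python
# Gain-score sketch; the file name and columns are hypothetical
# stand-ins for the compiled cross-site analysis file.
import pandas as pd

df = pd.read_csv("soi_crosssite_outcomes.csv")
df["gain"] = df["stem_motivation_post"] - df["stem_motivation_pre"]

# Descriptive statistics on the gain score, overall and by site.
print(df["gain"].describe())
print(df.groupby("site_id")["gain"].agg(["mean", "std", "count"]))
```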


A.17 Display of Expiration Date for OMB Approval

NASA is not requesting a waiver for the display of the OMB approval number and expiration date on the data collection instruments.


A.18 Exceptions to Certification Statement

This submission does not require an exception to the Certificate for Paperwork Reduction Act (5 CFR 1320.9).


