Supt. Stmt. Part A - NASA ED Performance Measurement and Evaluation Testing (4-27-18)


Generic Clearance for the NASA Office of Education Performance Measurement and Evaluation (Testing)

OMB: 2700-0159

CONTENTS

A. JUSTIFICATION
1. Necessity for Information Collection
2. Uses of Information
3. Considerations of Using Improved Technology
4. Efforts to Identify Duplication
5. Efforts to Minimize Burden on Small Business
6. Consequences of Less Frequent Data Collection
7. Special Circumstances
8. Federal Register Announcement and Consultation Outside the Agency
9. Payment or Gifts to Respondents
10. Assurance of Confidentiality
11. Justification for Sensitive Questions
12. Estimate of Respondent Burden
13. Cost Burden to Respondents
14. Cost Burden to Federal Government
15. Reason for Change in Burden
16. Schedule for Information Collection and Publication
17. Display of OMB Expiration Date
18. Exception to the Certificate Statement
References
APPENDIX A: NASA Education Goals
APPENDIX B: NASA Center Education Offices
APPENDIX C: Data Instrument Collection Testing Participation Generic Consent Form
APPENDIX D: Descriptions of Methodological Testing Techniques
APPENDIX E: Privacy Policies and Procedures
APPENDIX F: Overview: NASA Education Data Collection Instrument Development Process
    From Outputs to Science, Technology, Engineering, and Mathematics (STEM) Education Outcomes Measurement: Data Collection Instrument Development Process
APPENDIX G: Explanatory Content for Information Collections for Testing Purposes
List of Tables

GENERIC CLEARANCE FOR THE NASA OFFICE OF EDUCATION PERFORMANCE MEASUREMENT AND EVALUATION (TESTING) SUPPORTING STATEMENT

A. JUSTIFICATION

1. NECESSITY FOR INFORMATION COLLECTION:
Explain the circumstances that make the collection of information necessary. Identify any legal
or administrative requirements that necessitate the collection. Attach a copy of the appropriate
section of each statute and regulation mandating or authorizing the collection of information.
The National Aeronautics and Space Administration inspires the world through its exploration of new frontiers, its discovery of new knowledge, and its development of new technology, in support of the vision to discover and expand knowledge for the benefit of humanity.
The NASA Office of Education (NASA Education) supports that mission by deploying
programs to advance the next generation’s educational endeavors and expand partnerships with
academic communities (see Appendix A).
NASA has a long history of engaging the public and students in its mission through educational
and outreach activities and programs. NASA’s endeavors in education and public outreach
began early on, driven by the language in Section 203 (a) (3) of the Space Act, “to provide for
the widest practicable and appropriate dissemination of information concerning its activities and
the results thereof, and to enhance public understanding of, and participation in, the Nation’s
space program in accordance with the NASA Strategic Plan.” NASA’s education and outreach functions aim to inspire and engage the public and students; each plays a critical role in increasing public knowledge of NASA’s work, fostering an understanding and appreciation of the value of STEM, and enhancing opportunities to teach and learn. By augmenting NASA’s public engagement and communicating NASA’s work and value, the Agency contributes to the Nation’s science literacy. NASA is committed to inspiring an informed society; enabling the public to embrace and understand NASA’s work and value, today and tomorrow; engaging the public in science, technology, discovery, and exploration; equipping its employees to serve as ambassadors to the public; and providing unique STEM opportunities for diverse stakeholders.
The Office of Education Performance Assessment and Evaluation Information Management
(PAEIM) Team supports performance assessment and evaluation of NASA’s education
investments executed through headquarters and across the ten Center Education Offices (see
Appendix B). The PAEIM Team became the lead for performance measurement and program
evaluation activities within the Office of Education on October 1, 2017. Responsibilities include
recommending and implementing agency-wide strategy for performance measurement and
evaluation; ensuring the collection of high-quality data; process documentation of NASA
Education projects; formative and outcome evaluations; training and technical assistance on
performance measurement and evaluation. The PAEIM Team’s goal is to provide support that
improves education policy and decision-making, provides better education services, increases
evaluation rigor and accountability, and ensures more effective administration of investments.
The Office of Education IT (OEIT) Systems Team supports the NASA Education community in
the areas of information technology, dissemination and Web services, and communications and
operations support. These two teams in collaboration support the overall performance
assessment of NASA education investments across the agency.
The purpose of this request is to renew the clearance for methodological testing in order to continue to enhance the quality of the Office of Education’s data collection instruments and overall data management through interdisciplinary scientific research, utilizing best practices in educational, psychological, and statistical measurement. NASA Education is committed to producing the most accurate and complete data within the highest quality assurance guidelines for reporting purposes by NASA Education leadership and by authority of the Government Performance and Results Modernization Act (GPRMA) of 2010, which requires quarterly performance assessment of Government programs for the purposes of assessing agency performance and improvement. It is with this mission in mind that this clearance package is submitted.1
Under the current clearance (OMB Control Number 2700-0159) for the NASA Office of Education Performance Measurement and Evaluation (Methodological Testing), the following information collections were approved for pilot testing:
• Office of Education Performance Assessment, Evaluation, and Information Management Data Collection Screens: One Stop Shopping Initiative (OSSI) Student-level Data
• Office of Education Performance Measurement (OEPM) Program-level Data Collection
• NASA Office of Education Undergraduate Internship Impact Surveys - Retrospective and Traditional Development Surveys, Student Baseline Instruments No. 1 and Follow-up Instruments #1
• NASA Education STEM Challenges Impact Surveys: Student Baseline Instruments, Student Follow-up Instruments, and Educator Retrospective Instruments
• NASA Education Internship Data Collection Screens: NASA Internship Application Management System (NIAMS) Student-level Data

The PAEIM Team conducted an internal assessment of the NASA Education information
collections above to determine the outcome and results of the methodological testing.
Available documentation and testing technical reports provided the following example summary of results for the NASA Education STEM Challenges Impact Surveys (Student Baseline and Follow-Up Instruments, and Educator Retrospective).

1 The entire GPRMA of 2010 can be accessed at http://www.gpo.gov/fdsys/pkg/BILLS-111hr2142enr/pdf/BILLS-111hr2142enr.pdf.


NASA Education STEM Challenges Impact Surveys Methodological Testing
Methodological testing was conducted with educator and student respondents in the 21st Century Community Learning Centers (21stCCLC)/NASA Phase 3 Collaboration. In conducting the methodological testing analysis of our instruments, we included several survey items to address the amount of time required to complete the surveys, whether the survey questions were understandable, the clarity of the survey instructions, and whether respondents had any feedback on the surveys.
Type of Validity and Reliability Assessment
We measured validity and reliability of the instruments. Instrument validity occurs
when the answers correspond to what they are intended to measure. There are four
types of validity:
1. Content – domain covered in its entirety;
2. Face – general appearance, design or layout;
3. Criterion – how effective the questions are in measuring what the instrument purports to measure;
4. Construct – how the questions are structured to form a relationship or
association (Bell, 2007).
Reliable instruments are assessments that produce consistent results in comparable settings. For example, reliability is increased when there are consistent scores across more than one organization that serves populations in a rural setting (Bell, 2007).
We examined the instrument items and their subscales. As such, we calculated conventional measures of reliability for each scale, including Cronbach’s α, which can be interpreted as the average correlation (or loading, usually denoted by λ) between the latent dimension and the items measuring that dimension. The squared multiple correlation (SMC), sometimes referred to as Guttman’s λ6, represents the proportion of the variance in the true score explained by the items. For each item, we also calculated the SMC and examined each item’s contribution to α by recomputing α with that item deleted.
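For illustration only, the following minimal sketch (Python, using simulated Likert-style responses rather than NASA data; all names are hypothetical) shows one way the statistics named above can be computed: Cronbach’s α, Guttman’s λ6 from each item’s squared multiple correlation, and α with each item deleted in turn.

    import numpy as np

    def cronbach_alpha(items):
        # items: (n_respondents, n_items) array of scored responses.
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    def guttman_lambda6(items):
        # Lambda-6 from each item's squared multiple correlation (SMC)
        # with the remaining items, computed on the correlation matrix.
        R = np.corrcoef(np.asarray(items, dtype=float), rowvar=False)
        smc = 1 - 1 / np.diag(np.linalg.inv(R))
        return 1 - (1 - smc).sum() / R.sum()

    def alpha_if_deleted(items):
        # Alpha recomputed with each item removed in turn.
        items = np.asarray(items, dtype=float)
        return np.array([cronbach_alpha(np.delete(items, j, axis=1))
                         for j in range(items.shape[1])])

    # Simulated 5-item scale, 200 respondents (illustrative only).
    rng = np.random.default_rng(0)
    trait = rng.normal(size=(200, 1))
    responses = np.clip(np.rint(3 + trait + rng.normal(scale=0.8, size=(200, 5))), 1, 5)
    print(cronbach_alpha(responses), guttman_lambda6(responses), alpha_if_deleted(responses))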
Construct validity was used to identify questions that assessed students’ skills, attitudes
and behaviors toward STEM. The multi-scale measures described below are from the
PEAR Institute Common Instrument Suite Survey 3.0 (PEAR Institute, 2016). The Common Instrument Suite survey has been administered over 30,000 times to students enrolled in informal science programs across the U.S., and it has shown strong reliability in previous work (α > 0.85) (https://www.thepearinstitute.org/common-instrument-suite; Allen et al., 2016).
Respondent Characteristics
Our sample consisted of 70 EDC sites chosen at random and all 12 GLOBE SRC pilot
sites. Together these 82 evaluation sites provided all the data (e.g., implementation
information collected from participation logs, educator feedback forms, and in-depth
interviews) for this evaluation.
From these sites we collected a total of 992 surveys from EDC students and 151 surveys
from GLOBE SRC students at pre-test. During the post-test, 671 EDC students and 81 GLOBE SRC students provided responses. This represents a retention rate of 68 percent for EDC and 54 percent for GLOBE SRC. High attrition rates are common in OST programs; previous research has found that between 31 and 41 percent of those who start such programs go on to finish them (Apsler, 2009; Weisman and Gottfredson, 2001).
All 992 EDC participants contributed to our analysis, but we retained only 151 of the 159 participants from GLOBE SRC because one school dropped out of the study prior to post-test. Of the 992 EDC pre-test participants, 671 (or 68%) participated at post-test, while 321 were lost to attrition. An additional 183 participants provided data only at post-test; however, these participants likely had only partial exposure to the EDC program. As a result, we excluded them from our analysis. Considering comparable numbers for GLOBE SRC, of the 151 pre-test participants, 81 (or 54%) participated at post-test and 70 were lost to attrition.
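As a quick arithmetic check, the retention and attrition figures above can be reproduced directly from the counts reported in this paragraph (a minimal sketch; the counts come from the text above):

    # Pre- and post-test counts reported above.
    cohorts = {"EDC": (992, 671), "GLOBE SRC": (151, 81)}
    for name, (pre, post) in cohorts.items():
        print(f"{name}: retention {post / pre:.0%}, lost to attrition {pre - post}")
    # EDC: retention 68%, lost to attrition 321
    # GLOBE SRC: retention 54%, lost to attrition 70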
Findings
Key findings from the performance assessment of the student and educator surveys and
analysis are as follows:
1. EDC and GLOBE SRC students required more than the projected average 10 minutes
to complete the pre- or post-test surveys;
2. EDC and GLOBE SRC educators required more than the projected average 15
minutes to complete the post-test (retrospective) surveys;
3. Students responded that the pre- and post-test survey items were understandable and
that the instructions were clear;
4. Of those students who provided suggestions for improvement of the EDC and GLOBE
SRC pre- and post-test surveys, the most common suggestion was to add more
response options, followed by provide additional/more interesting questions;
5. Among educators, four responses/suggestions for improving the EDC and GLOBE SRC educator surveys were: provide greater clarity in the questions; reduce the use of reverse coding; note that the retrospective reporting may have proved challenging for some respondents; and note that more time was spent on open-ended responses;
6. Survey items and scales for each of the EDC and GLOBE SRC (pre- and post-test)
surveys, as well as the EDC and GLOBE SRC educator surveys (retrospective)
performed as expected and yielded acceptable reliability readings.
Recommendations
Based on the findings from the survey item and subscale analysis, and the methodological
testing survey item analysis, the contract evaluator made the following recommendations:
1. Create a shorter (fewer questions) and simpler (language) version of the student
surveys to achieve a 10-minute survey experience for students, especially if the plan in
the future is to survey younger elementary school aged children (e.g., 4th grade);
2. Create a shorter (fewer questions) version of the educator surveys to achieve a 15-minute survey experience for educators;


3. Consider modifying the student and educator instruments to be applicable for older
student populations (e.g., 9th and 10th grades) and include 9th and 10th grade
students in future evaluations to examine effects of 21stCCLC on older students;
4. Maintain separate EDC and GLOBE SRC student instruments (do not combine the
two instruments);
5. Conduct a comparative analysis with other available data on STEM attitudes and
beliefs;
6. Continue scaling the EDC and GLOBE SRC programs and use revised survey
instruments to collect student pre- and post-test data and educator post-test data;
7. Continue to collect and analyze student and educator data and contribute to the
research literature regarding successes and challenges of 21stCCLC programs
teaching engineering and science skills.
Another example summary of results for the NASA Office of Education Undergraduate
Internship Impact Surveys (Retrospective and Traditional Development Surveys, Student
Baseline Instruments No. 1 and Follow-up Instruments #1) is as follows.
NASA Internship Expectations Post-Survey and Development Retrospective
Methodological Testing NASA International Internships (I^2)
• Deployed Spring 2016
• N=20
• STEM-related Outcomes Constructs of interest:
o Internship Expectations
o Development Outcomes: a dependent variable for student learning as well
as an additional construct to understand students’ intention to complete
their degrees and satisfaction with their programs. (Retrospective)
Success Story


Comprehensibility & Response Rate Results:
Table 1. Calculation chart to determine statistically relevant number of respondents

Table 2. Response rates for the NASA Internship Expectations Pre-Survey Summer 2016


Grit Results: Principal Component Analysis & Deployment June 2017
Rotation of the factor structure yielded three factors: most variables loaded highly on the first two factors, while two variables stood apart in the rotated solution and presented as a third factor.
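The snippet below is a generic, self-contained illustration of the technique named above (principal component extraction followed by varimax rotation of the loadings). It uses simulated responses, not the Grit scale data, and all names are hypothetical.

    import numpy as np

    def varimax(loadings, max_iter=100, tol=1e-6):
        # Varimax rotation of an items-by-factors loadings matrix.
        L = loadings.copy()
        p, k = L.shape
        R = np.eye(k)
        var_old = 0.0
        for _ in range(max_iter):
            Lr = L @ R
            u, s, vt = np.linalg.svd(
                L.T @ (Lr ** 3 - Lr @ np.diag((Lr ** 2).sum(axis=0)) / p))
            R = u @ vt
            var_new = s.sum()
            if var_new - var_old < tol:
                break
            var_old = var_new
        return L @ R

    def pca_loadings(data, n_components=3):
        # Principal component loadings from the item correlation matrix.
        Rmat = np.corrcoef(np.asarray(data, dtype=float), rowvar=False)
        eigvals, eigvecs = np.linalg.eigh(Rmat)
        order = np.argsort(eigvals)[::-1][:n_components]
        return eigvecs[:, order] * np.sqrt(eigvals[order])

    # Simulated 8-item instrument with a three-factor structure (illustrative only).
    rng = np.random.default_rng(1)
    scores = rng.normal(size=(300, 3))
    weights = np.zeros((8, 3))
    weights[:3, 0], weights[3:6, 1], weights[6:, 2] = 0.8, 0.8, 0.8
    items = scores @ weights.T + rng.normal(scale=0.5, size=(300, 8))
    print(np.round(varimax(pca_loadings(items, n_components=3)), 2))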
What we learned and how it can be used:
• Statistical means to ensure population representation in testing and routine
administration
• Certainty about the auto-reminder/auto-send frequency pattern needed to ensure high, representative response rates
• Retrospective survey format for attitude & behavior scales yields statistically relevant
STEM-related outcomes data
• Comprehensibility questions (2) will aid in OMB clearance reporting
To monitor the performance of its education activities, NASA Education will use rigorously developed and tested instruments administered and accessed through the Office of Education Performance Measurement system.2 Each data collection form type presents unique challenges, which can be related to respondent characteristics, survey content, or form of administration. In the absence of meticulous methods, such issues impede the effectiveness of instruments and decrease the value of the data gathered through them for both NASA Education and the Agency.
The central purpose of measurement is to provide a rational and consistent way to summarize
the responses people make to express achievement, attitudes, or opinions through instruments
such as achievement tests or questionnaires (Wilson, 2005, p. 5). In this particular instance, our
interest lies in attitude and behavior scales, surveys, and psychological scales related to the
goals of NASA STEM education activities. Yet, since NASA Education captures participant administrative data from activity application forms and program managers submit administrative data, the PAEIM Team extends the definition of instruments to include electronic data collection screens, project activity survey instruments, and program application forms as well.3 Research-based quality control methods and techniques are integral to obtaining accurate, robust, high-quality data to assist leaders in policy decisions.
The following research techniques and methods may be used in these studies:
• Usability testing: Pertinent are the aspects of the web user interface (UI) that impact the user’s experience and the accuracy and reliability of the information users submit (Kota, n.d.; Jääskeläinen, 2010).

2 The Office of Education Performance Measurement System (OEPM) is the project-level data component of NASA Education’s data collection suite. It is an automated system for collecting, managing, and securing data, and uses web-interfaced, on-line data collection screens with a back-end database. As an automated information technology system that is the centralized collection point for NASA Education performance measurement data, OEPM reduces respondent burden by: 1) bringing clarity to the exact nature of data required of program managers; 2) consolidating disparate NASA Education systems in use throughout the NASA Centers of Education; 3) providing a means to monitor project performance data for the purposes of determining education-related outputs and outcomes; 4) improving the quality of performance measurement data (i.e., a monitoring mechanism for missing data points); and 5) refining reporting consistency through automated reminder functionality.
3 If constituted as a form and once approved by OMB, forms will be submitted to NASA Forms Management according to NASA Policy Directive (NPD) 1420. Thus, forms used under this clearance will have both an OMB control number and an NPD 1420 control number that also restricts access to NASA internal users only. Instruments not constituted as forms will display an OMB control number only.


• Think-aloud protocols: This data elicitation method is also called ‘concurrent verbalization’, meaning subjects are asked to perform a task and to verbalize whatever comes to mind during task performance. The written transcripts of the verbalizations are referred to as think-aloud protocols (TAPs) (Jääskeläinen, 2010, p. 371) and constitute the data on the cognitive processes involved in a task (Ericsson & Simon, 1984/1993).

• Focus group discussion: With groups of nine or fewer per instrument, this qualitative approach to data collection comprises the basis for brainstorming to creatively solve remaining problems identified after early usability testing of data collection screen and program application form instruments (Colton & Covert, 2007, p. 37).
• Comprehensibility testing: Comprehensibility testing of program activity survey instrumentation will determine whether items and instructions make sense, are free of ambiguity, and are understandable by those who will complete them (Colton & Covert, 2007, p. 129).

• Pilot testing: Testing with a random sample of at least 200 respondents to yield preliminary validity and reliability data (Haladyna, 2004; Komrey and Bacon, 1992; Reckase, 2000; Wilson, 2005).

• Large-scale statistical testing: Instrument testing conducted with a statistically representative sample of responses from a population of interest. In the case of developing scales, large-scale statistical testing provides sufficient data points for exploratory factor analysis, a “large-sample” procedure (Costello & Osborne, 2005, p. 5).

• Item response approach to constructing measures: Foundations for multiple-choice testing that address the importance of item development for validity purposes, address item content to align with the cognitive processes of instrument respondents, and acknowledge guidelines for proper instrument development will be utilized in a systematic and rigorous process (DeMars, 2010).

• Split-half method: This method is an efficient alternative to parallel-forms or test/retest methods because it does not require developing alternate forms of a survey and it reduces burden on respondents, requiring participation in only a single test rather than two tests to acquire sufficient data for reliability coefficients (a brief illustration follows this list).
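For illustration, the sketch below (Python, simulated responses, hypothetical names) shows a split-half reliability computation with the Spearman-Brown step-up commonly used with this method.

    import numpy as np

    def split_half_reliability(items, seed=0):
        # Randomly split the items into two halves, correlate the half scores,
        # and apply the Spearman-Brown correction to estimate full-length reliability.
        items = np.asarray(items, dtype=float)
        rng = np.random.default_rng(seed)
        order = rng.permutation(items.shape[1])
        half_a = items[:, order[::2]].sum(axis=1)
        half_b = items[:, order[1::2]].sum(axis=1)
        r_half = np.corrcoef(half_a, half_b)[0, 1]
        return 2 * r_half / (1 + r_half)

    # Simulated 10-item scale, 250 respondents (illustrative only).
    rng = np.random.default_rng(2)
    trait = rng.normal(size=(250, 1))
    responses = trait + rng.normal(scale=0.7, size=(250, 10))
    print(round(split_half_reliability(responses), 3))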

The PAEIM Team’s goal and purpose for data collection through methodological testing is to
provide support that improves education policy and decision-making, provides better education
services, increases accountability, and ensures more effective administration within the NASA
Office of Education. More in-depth descriptions of these techniques and methods can be found in Appendix D.

2. USES OF INFORMATION
Indicate how, by whom, and for what purpose the information is to be used. Except for a
new collection, indicate the actual use the agency has made of the information received from
the current collection.
The purpose of this data collection by the PAEIM Team is to ultimately improve our Federal
data collection processes through scientific research. Theories and methods of cognitive science,
in combination with qualitative and statistical analyses, provide essential tools for the
development of effective, valid, and reliable data collection instrumentation.
The PAEIM Team’s methodological testing is expected to 1) improve the data collection
instruments employed by NASA Office of Education, 2) increase the accuracy of the data
produced by execution of NASA Education project activities upon which policy decisions are
based, 3) increase the ease of administering data collection instruments for both respondents
and those responsible for administering or providing access to respondents, 4) increase response
rates as a result of reduced respondent burden, 5) increase the ease of use of the data collection
screens within the Office of Education Performance Management system, and 6) enhance
NASA Education’s confidence in and respect for the data collection instrumentation utilized by
the NASA Education community.
The application of cognitive science, psychological theories, and statistical methods to data
collection is widespread and well established. Neglecting accepted research practices and
relying on trial and error negatively impact data quality and unfairly burden respondents and
administrators of data collection instruments. For example, without knowledge of what
respondents can be expected to remember about a past activity and how to ask questions that
effectively aid in the retrieval of the appropriate information, researchers cannot ensure that
respondents will not take shortcuts to avoid careful thought in answering the questions, or be
subject to undue burden. Similarly, without investigating potential respondents’ roles and
abilities in navigating electronic data collection screens, researchers cannot ensure that
respondents will read questions correctly with ease and fluency, navigate electronic data screens
properly or efficiently, or record requested information correctly and consistently. Hence,
consequences of failing to scientifically investigate the data collection process should and can be
avoided.
In light of the Administration’s call for increased sharing of federal STEM education resources
through interagency collaborations, NASA Education may make available results of
methodological testing to other federal STEM agencies in the form of peer-reviewed methods
reports or white papers describing best practices and lessons learned. For instance, from its inception NASA has supported the Federal Coordination in STEM (FC-STEM) Graduate and Undergraduate STEM Education interagency working groups’ efforts to determine cross-agency, common metrics and to share effective program evaluations. Coordination Objective 2, “Build and use evidence-based approaches,” calls for agencies to:
Conduct rigorous STEM education research and evaluation to build evidence about
promising practices and program effectiveness, use across agencies, and share with
the public to improve the impact of the Federal STEM education investment.
(National Science and Technology Council, 2013, p. 45)

The methods to be employed in developing and testing data collection instruments will be
methodologically sound, rigorously obtained, and will thus constitute evidence worthy of
dissemination through appropriate vehicles. Data collection instruments are appropriate for a participant in a postsecondary NASA Education research experience and are specific to the category of participant: undergraduate student, graduate student, or mentor participant. One survey instrument explores a participant’s preparation for a research experience, while its complement explores a participant’s attitudes and behaviors pre- and post-experience (undergraduate or graduate student) (Crede & Borrego, 2013). Two non-cognitive competency scales explore a participant’s developmental levels of affect (grit and mathematics self-identity & self-efficacy) as related to participation in a NASA Education research experience (Duckworth, Peterson, Matthews, & Kelly, 2007; National Center for Education Statistics, 2009). Lastly, the mentor survey explores a mentor’s attitudes and behaviors associated with participation as a mentor of a NASA Education research experience (Crede & Borrego, 2013). Additional information collections will be submitted
separately under this clearance with justification information and evidence-based
methodology for methodological testing. Appendix G shows the explanatory content that
will accompany each information collection for methodological testing purposes.

3. CONSIDERATIONS OF USING IMPROVED TECHNOLOGY
Describe whether, and to what extent, the collection of information involves the use of
automated, electronic, mechanical, or other technological collection techniques or other
forms of information technology, e.g., permitting electronic submission of responses, and the
basis for the decision for adopting this means of collection. Also describe any consideration of
using information technology to reduce burden.
The Performance Assessment and Evaluation Information Management (PAEIM) Team in
collaboration with the Office of Education IT (OEIT) System Team and Center Education
Offices will plan, conduct, and interpret field and laboratory research that contributes to the
design of electronic data collection screens, project activity survey instruments, and program
application forms used within the context of the NASA Education community spread across
ten Center Education Offices. These efforts are supported in two ways: by the use of information technology applications and by strategic efforts to improve the overall information technology data collection systems used by NASA Education.
Use of Information Technology (IT) Applications
IT applications will be used to bridge the distance between the PAEIM Team of researchers
mostly based at NASA Glenn Research Center in Cleveland, OH, and the OEIT Systems Team
at NASA Headquarters in Washington, DC along with the Center Education Offices. Multiple
modes of technology may be used to bring the laboratory environment to study participants at
various Center locales. In addition, data management and analyses applications have been
made available to study leads to optimize data collection and analyses.
Different laboratory methods may be used in different studies depending on the aspects of the
data collection process being studied. Computer technology will be used when appropriate to
aid the respondents and interviewers, and to minimize burden. For instance, the PAEIM Team
and/or contractor support may use Adobe Connect, VidyoDesktop, or VidyoWeb to conduct
focus groups and cognitive interviews if indeed there is inadequate representation of participant
populations at area NASA research centers.4,5 Adobe Connect and VidyoDesktop platforms are
used throughout the NASA research centers and have the potential to facilitate instrument
development by providing access to appropriate study participants. The PAEIM Team has direct access to, and is also trained in using, other IT applications that facilitate this work, as described below.
• Adobe Connect: Adobe Systems Incorporated describes Adobe Connect as “a web conferencing platform for web meetings, eLearning, and webinars [that] powers mission critical web conferencing solutions end-to-end, on virtually any device, and enables organizations […] to fundamentally improve productivity.”

• VidyoDesktop: Key features include Ultra HD 4K support to display rich content and multiple full HD participants; multiple user-selectable layouts for continuous presence, active speaker, and shared content; support in Windows and Mac environments; in-conference public and private text chat, with the ability to switch between multiple streams of shared content; and far-end camera control of Vidyo. Benefits include conference hosting in your own virtual conference room with simple click-to-connect access for both administered users and guests, and operation on existing computers and laptops with no need for expensive dedicated appliances. The VidyoWeb browser plug-in provides guest participants a comparable in-conference experience to VidyoDesktop, but without user account or special software requirements.

• SurveyMonkey: This application may be used to collect non-sensitive, non-confidential qualitative responses to determine preliminary validity. This online survey software provides an electronic environment for distributing survey questionnaires.6 For the purpose of NASA Education, SurveyMonkey is a means by which feedback can be collected from a variety of participants, such as subject matter experts, in the early stages of instrument development, when operationalizing a construct is vital to the process. Operationalization provides a tangible means to measure a construct, since a construct cannot be observed directly (Colton & Covert, 2007, p. 66). The qualitative feedback of subject matter experts, in addition to the research literature, provides the factors or variables associated with constructs of interest. SurveyMonkey will facilitate the gathering of such information and will interface with NVivo 10 for Windows qualitative software for analyses and consensus towards developing valid items and instruments.

4 More information on Adobe applications is available at http://www.adobe.com/products/adobeconnect.html
5 More information on Vidyo applications is available at http://info.vidyo.com/schedule-live-vidyo-demo.html?utm_source=bing&utm_medium=cpc&utm_campaign=Brand+-+Vidyo(US)&utm_adgroup=Brand-Vidyo&utm_term={keyword&_kk=vidyo}
6 More information on SurveyMonkey can be found at https://www.surveymonkey.com/mp/take-a-tour/?ut_source=header. This application has been approved by the OCIO for uses not requiring a high level of security. In that regard, NASA Office of Education has a license to this application.

• NASA Google G-Suite (Google Form): This application may be used to collect non-sensitive, non-confidential qualitative responses to determine preliminary validity. This online survey application provides an electronic environment for distributing survey questionnaires. For the purpose of NASA Education, Google Form is a means by which feedback can be collected from a variety of participants, such as subject matter experts, in the early stages of instrument development, when operationalizing a construct is vital to the process. The NASA Google G-Suite also provides a file storage and synchronization service that allows users to store files on its servers, synchronize files across devices, and share files with NASA and non-NASA credentialed users.
• NVivo 10 for Windows: This software is a platform for analyzing multiple forms of unstructured data. The software provides powerful search, query, and visualization tools. A few features pertinent to instrument development include pattern-based auto-coding to code large volumes of text quickly, functionality to create and code transcripts from imported audio files, and the convenience of importing survey responses directly from SurveyMonkey.7

7 More information is available at http://www.qsrinternational.com/products_nvivo.aspx


• STATA SE v14: This data analysis and statistical software features advanced statistical functionality with programming that accommodates analysis, testing, and modeling of large data sets with the following characteristics: maximum number of variables, 32,767; maximum number of right-hand variables, 10,998; and unlimited observations. These software technical specifications allow for the statistical calculations needed to determine and monitor over time the item functioning and psychometric properties of NASA Office of Education data collection instrumentation.8

Strategic Planning and Designing Improved Information Technology Data Collection Systems
The NASA PAEIM Team has invested much time and effort in developing secure information technology applications that will be leveraged on behalf of instrument piloting and for the purposes of routine deployment, enabling large-scale statistical testing of data collection instruments. New information technology applications, the Composite Survey Builder and Survey Launcher, are in development with the NEACC. The Survey Launcher application will allow the PAEIM Team to reach several hundred NASA project activity participants via email, whereas the Composite Survey Builder will allow the PAEIM Team to administer data collection instruments approved by the Office of Management and Budget (OMB) Office of Information and Regulatory Affairs via emailed web survey links. The PAEIM Team will leverage this same technology to maximize response rates for piloting and routine data collection instrument deployment.
Most recently, NASA Office of Education has acquired a full-time SME specifically tasked
with strategizing approaches to enhance the Office’s IT systems and applications to be more
responsive to Federal mandates as well as to the needs of the Education community. This
person’s work is intended to lay the foundation for fiscally responsible IT development now and
in the future.
Recall that participants in focus groups and cognitive interviews must mirror, in as many characteristics as possible, the sample of participants upon which the instrument will eventually be tested and then administered. Using technology to employ qualitative and quantitative methods is a means to establish validity from the outset, prior to field testing, and to apply quantitative measures to determine instrument reliability and validity, while monitoring and minimizing burden on study participants. Having the proper IT foundations in place for this work is a NASA Office of Education priority.


8 More information is available at http://www.stata.com/products/which-stata-is-right-for-me/#SE

4. EFFORTS TO IDENTIFY DUPLICATION
Describe efforts to identify duplication. Show specifically why any similar information
already available cannot be used or modified for use for the purposes described in Item 2
above.
Because developing new valid and reliable data collection instrumentation is still a relatively
new procedure for NASA Education, many participants within our community have yet to
participate in this kind of procedure. Participation in instrument development or testing is not
mandatory.
Further, to reduce burden, any participant within our community recruited to participate in
instrument development will only be solicited to contribute effort towards a single instrument,
unless he or she volunteers for other opportunities. The PAEIM Team will attempt to reduce
some of the testing burden by identifying appropriate valid and reliable instruments/scales
through Federal resources or the educational measurement research literature.

5. EFFORTS TO MINIMIZE BURDEN ON SMALL BUSINESS
If the collection of information impacts small businesses or other small entities (Item 5 of OMB
Form 83-I), describe any methods used to minimize burden.
Not applicable. NASA Office of Education does not collect information from any small
business or other small entities.

6. CONSEQUENCES OF LESS FREQUENT DATA COLLECTION
Describe the consequence to Federal program or policy activities if the collection is not
conducted or is conducted less frequently, as well as any technical or legal obstacles to
reducing burden.
This planned collection of data will allow PAEIM Team the opportunity to design appropriate
valid and reliable data collection instrumentation, and the prerogative to modify and alter
instruments in an on-going manner in response to changes in respondent demographics and the
NASA Office of Education portfolio of activities. Because this collection is expected to be an
on-going effort, it has the potential to have immediate impact on all data collection
instrumentation within NASA Education. Any delay would sacrifice potential gains in
development of and modification to data collection instrumentation as a whole.

7. SPECIAL CIRCUMSTANCES
Explain any special circumstances that would cause an information collection to be conducted
in a manner: requiring respondents to report information to the agency more often than
quarterly; requiring respondents to prepare a written response to a collection of information in
fewer than 30 days after receipt of it; requiring respondents to submit more than an original
and two copies of any document; requiring respondents to retain records, other than health,
medical, government contract, grant-in-aid, or tax records, for more than three years; in
connection with a statistical survey, that is not designed to produce valid and reliable results
that can be generalized to the universe of study; requiring the use of a statistical data
classification that has not been reviewed and approved by OMB; that includes a pledge of
confidentiality that is not supported by authority established in statute or regulation, that is not
supported by disclosure and data security policies that are consistent with the pledge, or which
unnecessarily impedes sharing of data with other agencies for compatible confidential use; or
requiring respondents to submit proprietary trade secrets, or other confidential information
unless the agency can demonstrate that it has instituted procedures to protect the information's
confidentiality to the extent permitted by law.
Not applicable. This data collection does not involve any of the special circumstances listed above.

8. FEDERAL REGISTER ANNOUNCEMENT AND CONSULTATION OUTSIDE THE AGENCY
If applicable, provide a copy and identify the date and page number of publication in the
Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on
the information collection prior to submission to OMB. Summarize public comments received in
response to that notice and describe actions taken by the agency in response to these comments.
Specifically address comments received on cost and hour burden.
• The 60-day Federal Register Notice, Volume 83, pages 399-400, was published on 1/3/2018. No comments were received from the public.

• The 30-day Federal Register Notice, Volume 83, pages 9870-9871, was published on 3/8/2018. No comments were received from the public.

Describe efforts to consult with persons outside the agency to obtain their views on the
availability of data, frequency of collection, the clarity of instructions and recordkeeping,
disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed,
or reported.
The NASA Office of Education (OE) will continue to leverage its civil servant and contractor workforce to develop strategies, design programs, sustain operations, implement new applications and capabilities, develop business processes and training guidance, and provide support to stakeholders and end users. Key to an effective portfolio of programs is having a more rigorous approach to planning and implementation of activities through the use of evidence-based effective practices for STEM education and evaluation. An important component of these performance assessment and evaluation activities is the review and input by a panel of nationally recognized experts in STEM. For this reason, NASA OE will also consult
with individuals with relevant expertise from outside of the agency through a Performance Assessment and Evaluation Expert Review Panel (ERP) to obtain views and feedback on performance measurement activities including, but not limited to, internal and external performance measures and recommended data collection sources, processes, and tools, as well as NASA evidence-based decision making. The ERP will act as a technical review working group providing expertise and feedback in the following areas: program structure and evaluation, K-12/higher education and diversity, building technical research capacity at higher education institutions, information technology systems/social media and emerging technologies, science literacy, and large-scale public engagement campaigns.

9. PAYMENT OR GIFTS TO RESPONDENTS
Explain any decision to provide any payment or gift to respondents, other than remuneration
of contractors or grantees.
Not applicable. NASA Office of Education does not offer payment or gifts to respondents.

10. ASSURANCE OF CONFIDENTIALITY
Describe any assurance of confidentiality provided to respondents and the basis for
the assurance in statute, regulation, or agency policy.
NASA Education is committed to protecting the confidentiality of all individual respondents who participate in data collection instrumentation testing. Any information collected under the purview of this clearance will be maintained in accordance with the Privacy Act of 1974, the E-Government Act of 2002, the Federal Records Act, and, as applicable, the Freedom of Information Act in order to protect respondents’ privacy and the confidentiality of the data collected (see Appendix E).
The data collected from respondents will be tabulated and analyzed only for the purpose of
evaluating the research in question. Laboratory respondents will be asked to read and sign a
Consent form, a personal copy of which they are provided to retain. The Consent form explains
the voluntary nature of the studies and the use of the information, describes the parameters of
the interview (taped or observed), and provides assurance of confidentiality as described in
NASA Procedural Requirements (NPR) 7100.1.9
The consent form administered will be edited as appropriate to reflect the specific testing
situation for which the participant is being recruited (See Appendix C). The confidentiality
statement, edited per data collection source, will be posted on all data collection screens and
instruments, and will be provided to participants in methodological testing activities per
NPR 7100.1 (See Appendix E.)


9 The entire NPR 7100.1, Protection of Human Research Subjects (Revalidated 6/26/14), may be found at: http://nodis3.gsfc.nasa.gov/displayDir.cfm?Internal_ID=N_PR_7100_0001_&page_name=main

11. JUSTIFICATION FOR SENSITIVE QUESTIONS
Provide additional justification for any questions of a sensitive nature, such as sexual
behavior and attitudes, religious beliefs, and other matters that are commonly considered
private. This justification should include the reasons why the agency considers the questions
necessary, the specific uses to be made of the information, the explanation to be given to
persons from whom the information is requested, and any steps to be taken to obtain their
consent.
Assuring that students participating in NASA education projects are representative of the
diversity of the Nation requires NASA Education to capture the race, ethnicity, and disability
statuses of its participants. Therefore, to assure the reliability and validity of its data collection
instruments, PAEIM Team in collaboration with the OEIT Systems Team and Center
Education Offices, will need to ascertain that study participants are representative of students
participating in NASA education projects. Race and ethnicity information is collected
according to Office of Management and Budget (1997) guidelines in “Revisions to the
Standards for the Classification of Federal Data on Race and Ethnicity.”10 Although disclosure of race and ethnicity is not required to be considered for opportunities at NASA, respondents are strongly encouraged to submit this information. The explanation given to respondents for acquiring this information is as follows:
In order to determine the degree to which members of each ethnic and racial group are reached
by this internship/fellowship program, NASA requests that the student select the appropriate
responses below. While providing this information is optional, you must select decline to answer
if you do not want to provide it. Mentors will not be able to view this information when
considering students for opportunities. For more information, please visit
http://www.nasa.gov/about/highlights/HP_Privacy.html.

Information regarding disabilities is collected according to guidelines reflected in the “Self-Identification of Disability” form SF-256 published by the Office of Personnel Management (Revised July 2010) and is preceded by the following statement:

An individual with a disability: A person who (1) has a physical impairment or mental impairment (psychiatric disability) that substantially limits one or more of such person's major life activities; (2) has a record of such impairment; or (3) is regarded as having such an impairment. This definition is provided by the Rehabilitation Act of 1973, as amended (29 U.S.C. 701 et seq.)11

The regulations safeguarding this information are provided to study participants on the informed consent form, as governed by NPR 7100.1.


10 http://www.whitehouse.gov/omb/fedreg_1997standards
11 http://www.opm.gov/forms/pdf_fill/sf256.pdf

12. ESTIMATE OF RESPONDENT BURDEN
Provide estimates of the hour burden of the collection of information. The statement
should: Indicate the number of respondents, frequency of response, annual hour burden,
and an explanation of how the burden was estimated. Unless directed to do so, agencies
should not conduct special surveys to obtain information on which to base hour burden
estimates.
Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour
burden on respondents is expected to vary widely because of differences in activity, size, or
complexity, show the range of estimated hour burden, and explain the reasons for the variance.
The estimate of respondent burden for methodological testing is as follows (See Table 1):
Table 1: Estimate of Respondent Burden for Methodological Testing

Data Collection Sources | Respondent Category | Statistically Adjusted Number of Respondents | Frequency of Response | Total Minutes per Response | Total Response Burden in Hours
Office of Education Performance Measurement System | Undergraduate and graduate student profiles | 629 | 2 | 20 | 420
Office of Education Performance Measurement System | Educator participant surveys | 639 | 2 | 15 | 319
Office of Education Performance Measurement System | External program manager data collection screens | 264 | 2 | 60 | 528
One Stop Shopping Initiative | Pre-College surveys | 517 | 2 | 10 | 172
One Stop Shopping Initiative | Undergraduate surveys | 618 | 2 | 20 | 412
One Stop Shopping Initiative | Graduate surveys | 444 | 2 | 20 | 296
One Stop Shopping Initiative | Post-Graduate surveys | 247 | 2 | 20 | 165
Total Burden for Methodological Testing | | 3,358 | | | 2,312

Generally, estimates should not include burden hours for customary and usual business practices. If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I.
Not applicable.
Provide estimates of annualized cost to respondents for the hour burdens for collections of
information, identifying and using appropriate wage rate categories. The cost of contracting out
or paying outside parties for information collection activities should not be included here.
Instead, this cost should be included in Item 13.
The estimate of annualized cost to respondents for methodological testing is as follows (See
Table 2). Annualized Cost to Respondents is calculated by multiplying Total Response Burden
in Hours by Wage specific to Respondent Category (Bureau of Labor Statistics, 2014).
Table 2: Estimate of Annualized Cost to Statistically Adjusted Number of Respondents Required for Methodological Testing

Data Collection Sources | Respondent Category | Total Response Burden in Hours | Wage ($/hour) | Annualized Cost to Respondents
Office of Education Performance Measurement System | Undergraduate and graduate student profile | 420 | 7.25 | $3,042.52
Office of Education Performance Measurement System | Educator participant surveys | 319 | 25.09 | $8,015.32
Office of Education Performance Measurement System | External program manager data collection screens | 528 | 25.09 | $13,243.60
One Stop Shopping Initiative | Pre-College surveys | 172 | 7.25 | $1,249.98
One Stop Shopping Initiative | Undergraduate surveys | 412 | 7.25 | $2,985.26
One Stop Shopping Initiative | Graduate surveys | 296 | 7.25 | $2,146.71
One Stop Shopping Initiative | Post-Graduate surveys | 165 | 7.25 | $1,192.98
Total Burden for Methodological Testing | | 2,312 | | $31,876.37
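For reference, the arithmetic behind Tables 1 and 2 can be reproduced as follows (a minimal sketch using three of the rows above; the published figures reflect the source's own rounding, so re-derived values may differ slightly in the last digits):

    # burden hours = respondents x responses per respondent x minutes per response / 60
    # annualized cost = burden hours x hourly wage for the respondent category
    rows = [
        # (respondent category, respondents, responses, minutes, hourly wage)
        ("Undergraduate and graduate student profiles", 629, 2, 20, 7.25),
        ("Educator participant surveys", 639, 2, 15, 25.09),
        ("External program manager data collection screens", 264, 2, 60, 25.09),
    ]
    for name, n, freq, minutes, wage in rows:
        hours = n * freq * minutes / 60
        print(f"{name}: {hours:.1f} hours, about ${hours * wage:,.2f}")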

13. COST BURDEN TO RESPONDENTS
Provide an estimate for the total annual cost burden to respondents or record keepers resulting
from the collection of information. (Do not include the cost of any hour burden shown in Items
12 and 14). The cost estimate should be split into two components: (a) a total capital and startup cost component (annualized over its expected useful life) and (b) a total operation and
maintenance and purchase of services component. The estimates should take into account costs
associated with generating, maintaining, and disclosing or providing the information. Include
descriptions of methods used to estimate major cost factors including system and technology
acquisition, expected useful life of capital equipment, the discount rate(s), and the time period
over which costs will be incurred. Capital and start-up costs include, among other items,
preparations for collecting information such as purchasing computers and software;
monitoring, sampling, drilling and testing equipment; and record storage facilities. If cost
estimates are expected to vary widely, agencies should present ranges of cost burdens and
explain the reasons for the variance. The cost of purchasing or contracting out information
collections services should be a part of this cost burden estimate. In developing cost burden
estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day
pre-OMB submission public comment process and use associated with the rulemaking
containing the information collection, as appropriate. Generally, estimates should not include
purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2)
to achieve regulatory compliance with requirements not associated with the information
collection, (3) for reasons other than to provide information or keep records for the
government, or (4) as part of customary and usual business or private practices.
Not applicable. Participation in testing does not require respondents to purchase equipment,
software, or contract out services. The instruments used will be available in electronic format
only. NASA Office of Education’s expectation is that all targeted respondents can access the NASA OEPM System/forms/instruments electronically for the purposes of testing, as they have in the past when applying to NASA opportunities.

14. COST BURDEN TO FEDERAL GOVERNMENT
Provide estimates of annualized costs to the Federal government. Also, provide a description of
the method used to estimate cost, which should include quantification of hours, operational
expenses (such as equipment, overhead, printing, and support staff), and any other expense that
would not have been incurred without this collection of information. Agencies may also
aggregate cost estimates from Items 12, 13, and 14 in a single table.
The total annualized cost estimate for this information collection is $0.7 million, based on existing contract expenses that include contract staffing; staff training for data collection; data cleaning, validation, and management; and reporting related to contract staffing for online systems, including but not limited to OEPM and OSSI, that compose the OEIT Systems Team data collection suite. Note that the two online systems will be assessed for continuous improvement opportunities in alignment with the performance assessment and evaluation strategic framework.

15. REASON FOR CHANGE IN BURDEN
Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the
OMB Form 83-I.
Not applicable. This is a renewal application for methodological testing of data collection instrumentation within the NASA Office of Education by the PAEIM Team.

16. SCHEDULE FOR INFORMATION COLLECTION AND PUBLICATION
For collections of information whose results will be published, outline plans for tabulation and
publication. Address any complex analytical techniques that will be used. Provide the time
schedule for the entire project, including beginning and ending dates of the collection of
information, completion of report, publication dates, and other actions.
NASA Education may make results of methodological testing available to other federal STEM agencies in the form of peer-reviewed methods reports or white papers describing best practices and lessons learned, on an as-appropriate basis determined by NASA Education leadership. Although there is no intent to publish in academic journals, drafts will be held to peer-reviewed, publication-level standards of quality.

17. DISPLAY OF OMB EXPIRATION DATE
If seeking approval to not display the expiration date for OMB approval of the information
collection, explain the reasons that display would be inappropriate.
The OMB Expiration Date will be displayed on every data collection instrument, once approval
is obtained.

18. EXCEPTION TO THE CERTIFICATE STATEMENT
Explain each exception to the certification statement identified in Item 19, "Certification for
Paperwork Reduction Act Submissions," of OMB Form 83-I.
NASA does not take exception to the certification statements below:
The proposed collection of information –
(a) is necessary for the proper performance of the functions of NASA, including that the information to be
collected will have practical utility;
(b) is not unnecessarily duplicative of information that is reasonably accessible to the agency;
(c) reduces to the extent practicable and appropriate the burden on persons who shall provide information
to or for the agency, including with respect to small entities, as defined in the Regulatory Flexibility Act (5
U.S.C. 601(6)), the use of such techniques as:
(1) establishing differing compliance or reporting requirements or timelines that take into account
the resources available to those who are to respond;
(2) the clarification, consolidation, or simplification of compliance and reporting requirements; or
(3) an exemption from coverage of the collection of information, or any part thereof;
(d) is written using plain, coherent, and unambiguous terminology and is understandable to those who are
targeted to respond;


(e) indicates for each recordkeeping requirement the length of time persons are required to maintain the
records specified;
(f) has been developed by an office that has planned and allocated resources for the efficient and effective
management and use of the information to be collected, including the processing of the information in a
manner which shall enhance, where appropriate, the utility of the information to agencies and the public;
(g) when applicable, uses effective and efficient statistical survey methodology appropriate to the purpose
for which the information is to be collected; and
(h) to the maximum extent practicable, uses appropriate information technology to reduce burden and
improve data quality, agency efficiency and responsiveness to the public; and
(i) will display the required PRA statement with the active OMB control number, as validated on
www.reginfo.gov

Name, title, and organization of NASA Information Collection Sponsor certifying statements
above:
NAME: Richard L. Gilmore Jr., M.Ed.
TITLE: Educational Programs Specialist/Evaluation Manager
ORG: Office of Education Performance Assessment and Evaluation Information Management
(PAEIM) Team


References
Bureau of Labor Statistics. (2014). Retrieved from http://www.bls.gov/home.htm.
Colton, D., & Covert, R. W. (2007). Designing and constructing instruments for social research and evaluation. San Francisco: John Wiley and Sons, Inc.
Costello, A. B., & Osborne, J. W. (2005). Best practices in exploratory factor analysis: Four
recommendations for getting the most from your analysis. Practical Assessment,
Research & Evaluation, 10(7), 1-9.
Crede, E., & Borrego, M. (2013). From ethnography to items: A mixed methods approach to
developing a survey to examine graduate engineering student retention. Journal of Mixed
Methods Research, 7(1), 62-80.
Davidshofer, K. R., & Murphy, C. O. (2005). Psychological testing: Principles and applications.
(6th ed.). Upper Saddle River, NJ: Pearson/Prentice Hall.
DeMars, C. (2010). Item response theory. New York: Oxford University Press.
Duckworth, A. L., Peterson, C., Matthews, M. D., & Kelly, D. R. (2007). Grit: Perseverance and passion for long-term goals. Journal of Personality and Social Psychology, 92(6), 1087-1101.
Fabrigar, L. R., & Wegener, D. T. (2011). Exploratory factor analysis. New York, NY: Oxford
University Press.
Haladyna, T. M. (2004). Developing and validating multiple-choice test items (3rd ed.).
Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Jääskeläinen, R. (2010). Think-aloud protocol. In Y. Gambier & L. Van Doorslaer (Eds.),
Handbook of translation studies (pp. 371-373). Philadelphia, PA: John Benjamins.
Komrey, J. D., & Bacon, T. P. (1992). Item analysis of achievement tests based on small
numbers of examinees. Paper presented at the annual meeting of the American
Educational Research Association. San Francisco.
Kota, K. (n.d.). Testing your web application: A quick 10-step guide. Retrieved from
http://www.adminstrack.com/articles/testing_web_apps.pdf.


National Center for Education Statistics. (2009). High School Longitudinal Study of 2009, First Follow-up. OMB No. 1850-0852.
National Science and Technology Council. (2013). Federal science, technology, engineering,
and mathematics (STEM) education 5 year strategic plan. Retrieved from
http://www.whitehouse.gov/sites/default/files/microsites/ostp/stem_stratplan_2013.pdf.
Reckase, M. D. (2000). The minimum sample size needed to calibrate items using the three-parameter logistic model. Paper presented at the annual meeting of the American Educational Research Association. New Orleans.
Wilson, M. (2005). Constructing measures: An item response modeling approach. New York:
Psychology Press.


APPENDIX A: NASA Education Goals
Education is a fundamental part of NASA's work to execute its vision to discover and expand
knowledge for the benefit of humanity. NASA will continue to pursue three major education
goals:
• Strengthening NASA and the Nation's future workforce
• Attracting and retaining students in science, technology, engineering and mathematics, or STEM, disciplines
• Engaging Americans in NASA's mission

NASA's education program strives to "inspire and motivate students to pursue careers in science,
technology, engineering, and mathematics" by supporting education in the Nation's schools and
to "engage the public in shaping and sharing the experience of exploration and discovery" by
supporting informal education and public outreach efforts. NASA's commitment to education
places special emphasis on these goals by increasing elementary and secondary education
participation in NASA projects; enhancing higher education capability in STEM disciplines;
increasing participation by underrepresented and underserved communities; expanding e-Education; and expanding NASA's participation with the informal education community.
The Office of Education will continue to support NASA's strong historical role in education at all
levels, with linkages to NASA research as a central part of our focus. The majority of NASA
support to higher education is delivered through the NASA Mission Directorates.
The Office of Education supports the work of the Mission Directorates by coordinating projects
for students, faculty, and institutions that broaden the base of those who compete for NASA
research awards. These efforts will help create and sustain the scientific and engineering
workforce of the future. In addition, the Office of Education will continue to emphasize sharing
the results of NASA missions and research programs with wider audiences by using science
discoveries and research applications as vehicles to improve teaching and learning at all levels.


APPENDIX B: NASA Center Education Offices
Strategic management of the NASA education portfolio requires the participation of the Office
of Education (headquarters), the four Mission Directorates and all ten NASA Centers. This
extensive participation provides broad education engagement with NASA content, people and
facilities. Close and effective consultation, coordination and cognizance among all entities are
critical to the optimal fulfillment of NASA's objectives relative to its education investment.
The Office of Education provides integration and evaluation support to the Education
Coordinating Committee (ECC). As such, the Office of Education IT (OEIT) Systems Team
maintains a centralized database of all NASA education activities and investments, and supports
coordination of evaluation and assessment of the Agency education portfolio. The Performance
Assessment and Evaluation Information Management (PAEIM) Team works closely with the
Office of the Chief Information Officer (OCIO) to develop Paperwork Reduction Act (PRA)
guidance and training resources for Center Education Offices. To improve compliance of the Center Education Offices, all Centers will submit data collection instruments for development and clearance through the PAEIM Team first, and then for approval by the NASA OMB liaison, prior to submission to OMB. This process will reduce burden on the Education community while optimizing data collection.
Center Education Offices are responsible for implementing NASA education programs, projects
and activities for the Mission Directorates and the Office of Education, as well as planning and
implementing education projects that are unique to and funded by their Centers. Centers are
responsible for execution of programs and projects and for institutional assets. The Center
Education Offices provide expertise in state standards and requirements in their area of
geographic responsibility for K-12 education, and provide valuable field-based input into
education program planning.
Locations of NASA Center Education Offices
Ames Research Center
Ames specializes in research geared towards creating new knowledge and new technologies that
span the spectrum of NASA interests.

Armstrong Flight Research Center
As the lead for flight research, Armstrong continues to innovate in aeronautics and space
technology. The newest, fastest, the highest -- all have made their debut in the vast, clear desert
skies over Armstrong.


Glenn Research Center
Glenn Research Center develops and transfers critical technologies that address national priorities
through research, technology development, and systems development for safe and reliable
aeronautics, aerospace, and space applications.

Goddard Space Flight Center
The mission of the Goddard Space Flight Center is to expand knowledge of the Earth and its
environment, the solar system, and the universe through observations from space.

Jet Propulsion Laboratory
The Jet Propulsion Laboratory, managed by the California Institute of Technology, is NASA's lead
center for robotic exploration of the Solar System.

Johnson Space Center
From the early Gemini, Apollo, and Skylab projects to today's Space Shuttle and International
Space Station programs, Johnson Space Center continues to lead NASA's effort in Human Space
Exploration.

Kennedy Space Center
Kennedy Space Center is America's Gateway to the Universe -- leading the world in preparing and
launching missions around the Earth and beyond.

Langley Research Center
Langley continues to forge new frontiers in aviation and space research for aerospace,
atmospheric sciences, and technology commercialization to improve the way the world lives.

Marshall Space Flight Center
Bringing people to space; bringing space to people. Marshall Space Flight Center is a world leader
in the access to space and use of space for research and development to benefit humanity.

Stennis Space Center
Stennis is responsible for NASA's rocket propulsion testing and for partnering with industry to
develop and implement remote sensing technology.


APPENDIX C: Data Instrument Collection Testing Participation Generic
Consent Form12
In accordance with the Privacy Act of 1974, as amended (5 U.S.C. 552a), you are hereby notified that this
study is sponsored by the National Aeronautics and Space Administration (NASA) Office of Education
Performance Assessment and Evaluation Information Management (PAEIM) Team, under authority of the
Government Performance and Results Modernization Act (GPRMA) of 2010 that requires quarterly
performance assessment of Government programs for purposes of assessing agency performance and
improvement. Your participation is important to the success of this study. The information we collect will
help us improve the nature of NASA education project activities and the accuracy with which NASA Office
of Education can report to the stakeholders about the project activities offered. The NASA PAEIM Team will
use the information provided for statistical purposes related to data collection instrument development only
and will hold the information in confidence to the full extent permitted by law. Information will be secured
and removed from this server and location in accordance with guidelines set out in the NASA Records Retention Schedule
1392, 68-69. Although the following efforts will be taken to ensure confidentiality, there remains a remote
risk of personal data becoming identifiable. A non-identifying code number will be assigned to participants’
data records, which will be stored in accordance with federal regulatory procedures and accessible only to the
investigator. Any use of individual data to illustrate specific assessment results will be labeled in a manner to
preserve the participants’ anonymity. In no way does refusing participation in this instrument development
study preclude you from eligibility for NASA education project activities now or in the future.

Introduction
This research seeks to support the mission of the NASA Office of Education by asking you to take
part in a (focus group/cognitive interview/instrument development testing) pertaining to our
interest in the ways in which NASA project activities impact outcomes for participants.13 The
information we collect will help us to improve the nature of the project activity and the accuracy
with which NASA Office of Education can report to the community about the project activities it
offers.
Purpose of the Study
To determine the degree to which this data collection instrument accurately captures the participant outcomes it is intended to measure.
Description of Study Procedures
Participants will be asked to complete XXX.
There are no foreseeable risks to participants electing to participate in this study.
Estimation of Time Required
We estimate it will take you an average of [enter #] minutes to participate in this research (ranging from
[enter #] minutes to [enter #] minutes).
Securing Your Responses
Under no circumstances will the results of your surveys be shared with anyone without your explicit permission. The results of this research may be presented at meetings or in publications; however, your identity will not be disclosed. Presentations and manuscripts typically contain participants’ quotes, but participants are never identified by name. Your involvement in the development of this instrument is entirely voluntary and you have the right to discontinue participation at any time.

12 Once approved by OMB, this form will be submitted to NASA Forms Management according to NASA Policy Directive (NPD) 1420. Thus, this form, and all others used under this clearance, will have both an OMB control number and an NPD 1420 control number that also restricts access to NASA internal users only.
13 This clearance package is to obtain permission to develop instruments to be used in testing that will be approved by OMB first for inclusion under this clearance prior to testing.
Contact Persons
If you have any additional questions concerning the research, this informed consent, or
confidentiality of responses, please contact Richard L. Gilmore Jr., Evaluation Manager, at
[email protected] or call (216)433-5493.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

I have read and understand the contents of this study information and informed consent form and
have been encouraged to ask questions. I have received answers to the questions I have asked. I
give my consent to participate freely in this research. I have signed and retained a copy of the
information and consent form for my records and future reference. I have signed and submitted
this information and consent form for the researcher’s records.

Participant's signature

Date

Participant's printed name

Researcher's signature

OMB Control Number: XXXX-XXXX
Expiration Date: [enter expiration date]
HQ-Form-XXXX MM/YYYY

PREVIOUS EDITIONS ARE OBSOLETE

APPENDIX D: Descriptions of Methodological Testing Techniques
• Usability testing: Pertinent are the aspects of the web user interface (UI) that affect the User’s experience and the accuracy and reliability of the information Users submit. The ease with which Users navigate the data collection screens and the ease with which they access the actions and functionality available during data input are equally important. User experience is also affected by the look and feel of the web UI and the consistency of aesthetics from page to page, including font type, size, color scheme, and the ways in which screen real estate is used (Kota, n.d.). The foundation for usability testing will be a think-aloud protocol analysis, as described by Jääskeläinen (2010), that exposes distractions to accurate data input, while a short Likert-scale survey with qualitative questions will determine the extent and nature of the distractions that impede accurate data input.
• Think-aloud protocols (commonly referred to as cognitive interviewing): This data elicitation method is also called ‘concurrent verbalization’, meaning subjects are asked to perform a task and to verbalize whatever comes to mind during task performance. The written transcripts of the verbalizations are referred to as think-aloud protocols (TAPs) (Jääskeläinen, 2010, p. 371) and constitute the data on the cognitive processes involved in a task (Ericsson & Simon, 1984/1993). When elicited with proper care and instruction, think-aloud does not alter the course or structure of thought processes, except for a slight slowing of the process. Although high cognitive load can hinder verbalization by occupying all available cognitive resources, that property is of no concern for the tasks under analysis, which are restricted to information actively processed in working memory (Jääskeläinen, 2010, p. 371). For the purposes of NASA Education, think-aloud protocols will be especially useful for improving existing data collection screens and developing new ones, which differ in purpose from online applications. Whereas an online application is an electronic collection of fields that one either scrolls through or submits, completed page by completed page, data collection screens represent hierarchical layers of interconnected information for which user training is required. Since user training is required for proper navigation, think-aloud protocols capture the user experience so it can be incorporated into a more user-friendly design and implementation of this kind of technology. Lastly, data from think-aloud protocols are used to ensure that user experiences are reliable and consistent, supporting the collection of robust data.
• Focus group interviews: With groups of nine or fewer per instrument, this qualitative approach to data collection uses brainstorming to creatively solve remaining problems identified after early usability testing of data collection screen and program application form instruments (Colton & Covert, 2007, p. 37). Data from this type of research will include audiotapes obtained with participant consent, meeting minutes taken
by a subject matter expert in administration assistance, and reflective comments
submitted by participants after conclusion of the focus group. Focus group interviews
may be used to refine items that failed initial reliability testing for the purposes of
retesting. Lastly, focus group interviews may be used with participants as a basis for a
grounded theory approach to instrument development or for refining an already existing
instrument to be appropriate to a specific audience.
• Comprehensibility testing: Comprehensibility testing of program activity survey instrumentation will determine whether items and instructions make sense, are unambiguous, and are understandable by those who will complete them. For example, comprehensibility testing will determine whether items are complex, wordy, or incorporate discipline- or culturally inappropriate language (Colton & Covert, 2007, p. 129).
• Pilot testing: After program activity survey instruments have performed satisfactorily in readability and comprehensibility testing, the next phase is pilot testing with a sample of the target population large enough to yield statistically meaningful results, a random sample of at least 200 respondents (Komrey and Bacon, 1992; Reckase, 2000). The goal of pilot testing is to yield preliminary validity and reliability data to determine whether items and the instrument are functioning properly (Haladyna, 2004; Wilson, 2005). Data gleaned from pilot testing will be used to fine-tune items and the instrument in preparation for more complex statistical analysis during large-scale statistical testing.
• Large-scale statistical testing: Instrument testing conducted with a statistically representative sample of responses from a population of interest. In the case of developing scales, large-scale statistical testing provides sufficient data points for exploratory factor analysis (EFA), a multivariate statistical method used to uncover the underlying structure of a relatively large set of variables; EFA is commonly used when developing a scale, a collection of questions used to measure a particular research topic (Fabrigar & Wegener, 2011). EFA is a “large-sample” procedure for which generalizable and/or replicable results are a desired outcome (Costello & Osborne, 2005, p. 5). This technique is particularly relevant to examining relationships between participant traits and the desired outcomes of NASA Education project activities.
• Item response approach to constructing measures: Foundations for testing that address the importance of item development for validity purposes, align item content with the cognitive processes of instrument respondents, and acknowledge guidelines for proper instrument development will be utilized in a systematic and rigorous process. Validity will be treated as arising from item development, from statistical study of item responses, and from exploring item response patterns via methods prescribed by Haladyna (2004) and Wilson (2005).
• Split-half method: This method for determining test reliability is an efficient alternative to parallel-forms or test/retest methods. The split-half method does not require developing alternate forms of a survey, and it places a reduced burden on respondents in comparison to other methods, requiring participation in a single test session rather than retesting at a later date. This method involves administering a test to a group of
individuals, dividing the test in half along odd and even item numbers, and then correlating scores on one half of the test with scores on the other half (Davidshofer & Murphy, 2005); a minimal illustration of this procedure follows below.
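For illustration only, the following Python sketch shows how the split-half procedure described above could be computed for a scored respondents-by-items matrix. The Spearman-Brown correction, the simulated data, and all variable names are assumptions introduced for demonstration; they are not drawn from the NASA Education process or the cited sources.

    # Illustrative sketch of the split-half reliability check described above.
    # Assumes item responses are already scored numerically (e.g., 0/1 or Likert
    # values) in a respondents-by-items array; the data below are simulated.
    import numpy as np

    def split_half_reliability(scores):
        """Correlate odd-item and even-item half scores, then apply the
        Spearman-Brown correction to estimate full-test reliability."""
        odd_half = scores[:, 0::2].sum(axis=1)    # items 1, 3, 5, ...
        even_half = scores[:, 1::2].sum(axis=1)   # items 2, 4, 6, ...
        r_halves = np.corrcoef(odd_half, even_half)[0, 1]
        return (2 * r_halves) / (1 + r_halves)    # Spearman-Brown correction

    # Simulated responses from 200 participants to a 12-item instrument
    rng = np.random.default_rng(0)
    ability = rng.normal(size=(200, 1))
    responses = (ability + rng.normal(size=(200, 12)) > 0).astype(float)
    print(f"Estimated split-half reliability: {split_half_reliability(responses):.2f}")

Because each respondent completes the instrument only once, such a check adds no respondent burden beyond the single administration already described.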


APPENDIX E: Privacy Policies and Procedures
• Information collected under the purview of this clearance will be maintained in accordance with the Privacy Act of 1974, the E-Government Act of 2002, the Federal Records Act, NPR 7100.1, and, as applicable, the Freedom of Information Act in order to protect respondents’ privacy and the confidentiality of the data collected.14
• Data is maintained on secure NASA servers and protected in accordance with NASA
regulations at 14 CFR 1212.605.
• Approved security plans are in place for the Office of Education Performance
Measurement (OEPM) system in accordance with the Federal Information Security
Management Act of 2002 and Office of Management and Budget, Circular A-130,
Management of Federal Information Resources.
• Only authorized personnel requiring information in the official discharge of their duties
are authorized access to records from workstations within the NASA Intranet or via a
secure Virtual Private Network (VPN) connection that requires two-factor hardware
token authentication.
• OEPM resides in a certified NASA data center and has met strict requirements relating to
application security, network security, and backup/recovery of the NASA Office of the
Chief Information Officer’s security plan.
• Data will be secured and removed from this server and location in accordance with guidelines set out in NRRS/1392, 68-69. Specific guidelines relevant to the OEPM system include the following (a minimal retention-check sketch follows this list):
o Project management records documenting basic information about projects and/or
opportunities, including basic project descriptions, funding amounts and sources,
project managers, and NASA Centers, will be destroyed when 10 years old or
when no longer needed, whichever is longer.
o Records of participants (in any format), maintained either as individual files
identified by individual name or number, or in aggregated files of multiple
participants identified by name or number, including but not limited to application
forms, personal information supplied by the individuals, will be destroyed 5 years
after the last activity with the file.
o Survey responses and other feedback (in any format) from project participants and
the general public concerning NASA educational programs, including interest
area preferences, participant feedback, and reports of experiences in projects, will
be destroyed when 10 years old or when no longer needed, whichever is longer.
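For illustration only, the following sketch translates the retention guidelines above into a simple programmatic check. The record-type labels, function name, and date fields are hypothetical and do not describe the actual OEPM implementation.

    # Hypothetical check of the NRRS/1392 retention rules quoted above; the
    # record-type labels and function signature are illustrative assumptions.
    from datetime import date

    def destruction_eligible(record_type, reference_date, still_needed=False, today=None):
        """Return True when a record may be destroyed under the quoted rules:
        project and survey records when 10 years old and no longer needed;
        participant records 5 years after the last activity with the file."""
        today = today or date.today()
        age_years = (today - reference_date).days / 365.25
        if record_type == "participant":
            return age_years >= 5
        if record_type in ("project_management", "survey_feedback"):
            return age_years >= 10 and not still_needed
        raise ValueError(f"Unknown record type: {record_type}")

    # Example: a project management record from 2008 that is no longer needed
    print(destruction_eligible("project_management", date(2008, 1, 15)))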

14 http://www.nasa.gov/privacy/nasa_sorn_10EDUA.html

The following Confidentiality Statement and Paperwork Reduction Act (PRA) statement, edited
per data collection source, will be posted on all data collection screens and instruments, and will
be provided to participants in methodological testing activities per NPR 7100.1:
Privacy Act Statement: In accordance with the Privacy Act of 1974, as amended (5 U.S.C. 552a), you are
hereby notified that this study is sponsored by the National Aeronautics and Space Administration (NASA)
Office of Education, under authority of the Government Performance and Results Modernization Act
(GPRMA) of 2010 that requires quarterly performance assessment of Government programs for purposes
of assessing agency performance and improvement. Your participation is important to the success of this
study. The information we collect will help us improve the nature of NASA education project activities and
the accuracy with which NASA Office of Education can report to the stakeholders about the project
activities offered.
Paperwork Reduction Act Statement: This information collection meets the requirements of 44 U.S.C.
§3507, as amended by section 2 of the Paperwork Reduction Act of 1995. You do not need to answer these
questions unless we display a valid Office of Management and Budget (OMB) control number. The OMB
control number for this collection is 2700-0159 and expires 04/30/2018. Send comments to:
[email protected].


APPENDIX F: Overview: NASA Education Data Collection Instrument
Development Process

FROM OUTPUTS TO SCIENCE, TECHNOLOGY, ENGINEERING, AND MATHEMATICS
(STEM) EDUCATION OUTCOMES MEASUREMENT: DATA COLLECTION
INSTRUMENT DEVELOPMENT PROCESS
WORKING WITH THE PROJECT MANAGERS AND PROGRAM DIRECTORS
I. Develop a logic model
a. Information & training sessions to provide guidance
b. Facilitation of logic modeling upon request
c. Review and recommendations to ensure incorporation of evidence-based practice
II. Identify outputs and short-term outcomes from logic models for performance
indicators
a. Identify outputs and outcomes across lines of business and projects aligned with
CAP goals and FC-STEM investment priority areas
b. Convert outputs and outcomes into performance indicators and outcome measures,
identifying required data elements and data collection methods
UNDERSTANDING THE IMPACT OF STEM EDUCATION PROJECT ACTIVITIES ON PARTICIPANTS
III. Develop survey instruments based on NASA Education project performance
indicators and outcome measures
a. Conduct a scholarly STEM education and measurement literature review (assures
that the evidence base is rigorous and current)
b. Connect outcomes from literature review with identified outcome measures, given
constraints of inputs and within the context of activities
c. Search the STEM education research and measurement literature for instrument
candidates for adaptation (previous literature review augments this step)15
d. Create a draft instrument targeting a specific project activity to explore specific
outcomes impacted by the quality of outputs (e.g., non-cognitive competencies
associated with STEM degree attainment in the NASA Internships and
Fellowships)16

15 Provides opportunity to add to the research literature while using an instrument already determined to be reliable and valid for a particular respondent population.
16 For example, reporting on STEM undergraduate attainment is much less meaningful without understanding what kinds of experiences contributed to degree attainment and the quality of their NASA experience.

i. Draft should be lengthy and exhaustive to allow editing down in the testing
process
ii. Draft should include multiple questions that ask the same thing in different ways, to allow editing down
iii. Draft should demonstrate multiple items per construct as convergence is
important
e. Obtain stakeholder feedback & edit instrument draft
i. Editing question type
ii. Adding new constructs and items
f. Conduct cognitive interviews with a small number (less than 10) of appropriate
respondents & edit accordingly17
i. Editing question language
ii. Editing question type
DEVELOPING VALID AND RELIABLE DATA COLLECTION INSTRUMENTS
IV. Conduct field test of an instrument draft
a. Provide draft to OMB to approve for testing under the NASA OE methodological
testing generic clearance (no official timeline associated with this informal process)
b. Small scale field testing18
i. Statistical analysis of responses
ii. Remove items with low p-values
c. Large scale field testing
i. Determine population/universe size for respondent audience
ii. Implement steps to enhance response rate
iii. Remove items with low p-values (a minimal screening sketch follows this section)
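For illustration only, the sketch below shows one way this screening step could be carried out, reading “p-value” in the classical item-analysis sense (the proportion of respondents answering or endorsing an item). The 0.20 cut-off, the simulated data, and the function name are assumptions introduced for demonstration, not part of the approved process.

    # Illustrative item screening for field-test data; assumes dichotomously
    # scored responses (1/0) in a respondents-by-items array.
    import numpy as np

    def flag_low_p_value_items(responses, cutoff=0.20):
        """Return indices of items whose proportion of positive responses
        (the classical item p-value) falls below the cut-off."""
        p_values = responses.mean(axis=0)        # one difficulty value per item
        return [i for i, p in enumerate(p_values) if p < cutoff]

    # Simulated small-scale field test: 150 respondents, 4 items
    rng = np.random.default_rng(1)
    field_test = rng.binomial(1, p=[0.85, 0.60, 0.10, 0.45], size=(150, 4))
    print(flag_low_p_value_items(field_test))    # expected to flag item index 2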
OBTAINING AND MAINTAINING OMB-APPROVED DATA COLLECTION INSTRUMENTS
V. Obtain clearance from OMB for tested data collection instruments
a. Update OMB-approved drafts according to results obtained from large scale field
testing
b. Submit tested data collection instrument for review by OMB, in accordance with
the terms of clearance set upon approval of the plan as stipulated in the generic
clearance.19

17 Involves qualitative research skills and analysis using software NASA Ed has provided for this purpose.
18 Involves statistical analysis skills and analysis using software NASA Ed has procured.
19 PRA_Gen_ICRs_5-28-2010.pdf, accessed at https://www.whitehouse.gov/sites/default/files/omb/assets/inforeg/PRA_Gen_ICRs_5-28-2010.pdf

VI. Reevaluate instrument function
a. Maintain first universe of collected responses as baseline data
b. On an annual basis, pool recently collected instrument responses with the current data set and rerun statistical analyses (a pooled reanalysis sketch follows this outline)
c. Take barely passing items back through process starting at III.f.
d. Integrate refreshed items into instrument and forward draft to OMB for approval
under the NASA OE methodological testing generic clearance
VII. Reevaluate alignment of data collection instruments
a. Maintain alignment with portfolio as updated
b. Maintain alignment with line of business logic model as updated
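For illustration only, the following sketch shows how step VI.b could be carried out by pooling newly collected responses with the baseline set and recomputing an internal-consistency estimate. Cronbach’s alpha is used here as one common choice, and the simulated data and names are assumptions; the outline does not specify which statistics are rerun.

    # Illustrative pooled reanalysis: combine baseline and newly collected
    # responses, then recompute a reliability estimate on the pooled data.
    import numpy as np

    def cronbach_alpha(scores):
        """Internal-consistency estimate for a respondents-by-items matrix."""
        k = scores.shape[1]
        item_var_sum = scores.var(axis=0, ddof=1).sum()
        total_var = scores.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_var_sum / total_var)

    rng = np.random.default_rng(2)

    def simulate(n_respondents, n_items=10):
        # Simulated stand-in for stored instrument responses
        trait = rng.normal(size=(n_respondents, 1))
        return (trait + rng.normal(size=(n_respondents, n_items)) > 0).astype(float)

    baseline = simulate(300)                  # first universe of responses (VI.a)
    new_wave = simulate(120)                  # most recent collection cycle
    pooled = np.vstack([baseline, new_wave])  # VI.b: pool and rerun the analysis
    print(f"Pooled Cronbach's alpha: {cronbach_alpha(pooled):.2f}")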


APPENDIX G: Explanatory Content for Information Collections for
Testing Purposes
Every information collection for the purposes of methodological testing will be prefaced by a
version of the information categories, edited to be appropriate for that particular instrument
and audience. Below is a sample that demonstrates the type of information and content that
reflects the following: 1) Source of adaptation (if applicable); 2) Constructs of interest; 3) Bibliographic sources that support the particular adaptation or instrument draft; 4) Privacy
statement; 5) Instrument introduction; 6) Purpose of the study; 7) Description of study
procedures; 8) Estimate of time to complete the instrument; 9) Assurance of confidentiality; 10)
Contact person’s information; 11) Office of Management and Budget control information; and
12) NASA headquarters form information.


List of Tables
Table 1: Estimate of Respondent Burden for Methodological Testing ........................................ 12
Table 2: Estimate of Annualized Cost to Statistically Adjusted Number of Respondents Required
for Methodological Testing........................................................................................................... 13
