NASA Internships Retrospective Survey


REQUEST FOR APPROVAL under the Generic Clearance for NASA STEM Engagement Performance Measurement and Evaluation, OMB Control Number 2700-0159, expiration 09/30/2024.

_____________________________________________________________________________________


  1. TITLE OF INFORMATION COLLECTION:

NASA Internships Retrospective Survey & Cognitive Interview Protocol


  2. TYPE OF COLLECTION:

☒ Attitude/Behavior Scale
☐ Baseline Survey
☒ Cognitive Interview Protocol
☐ Consent Form
☐ Focus Group Protocol
☐ Follow-up Survey
☒ Instructions
☐ Satisfaction Survey
☐ Usability Protocol


GENERAL OVERVIEW: NASA Science, Technology, Engineering, and Mathematics (STEM) Engagement comprises a broad and diverse set of programs, projects, activities, and products developed and implemented by HQ functional Offices, Mission Directorates, and Centers. These investments are designed to attract, engage, and educate students, and to support educators and educational institutions. NASA’s Office of STEM Engagement (OSTEM) delivers participatory, experiential learning and STEM challenge activities for young Americans and educators to learn and succeed. NASA STEM Engagement seeks to:


  • Create unique opportunities for students and the public to contribute to NASA’s work in exploration and discovery.

  • Build a diverse future STEM workforce by engaging students in authentic learning experiences with NASA people, content, and facilities.

  • Strengthen public understanding by enabling powerful connections to NASA’s mission and work.


To achieve these goals, NASA STEM Engagement strives to increase intern engagement in NASA projects, enhance higher education, support underrepresented communities, strengthen online education, and boost NASA's contribution to informal education. The intended outcome is a generation prepared to code, calculate, design, and discover its way to a new era of American innovation. The NASA Internships Program leverages NASA’s unique missions and programs to enhance and increase the capability, diversity, and size of the nation’s future STEM workforce. Internships are available from the high school to the graduate level and provide students with the opportunity to participate in research or other experiential learning under the guidance of a mentor at NASA.


The purpose of this pilot study is to develop, design, pilot, and validate a survey instrument that captures measurable (quantitative and qualitative) data on students’ outcomes one year after participating in a NASA STEM Internship, and to assess how and in what ways NASA Internships contribute to students’ career placements, trajectories, and/or planned educational pursuits. The Internship Retrospective Pilot Study will utilize survey data and analysis methods to validate the Retrospective Survey Instrument. Based on the pilot findings, the evaluation team will conduct cognitive interviews with select respondents to ensure that cultural validity, as well as the clarity and completeness of questions, is considered in the piloting process. Interview questions have also been created to gather feedback on the structure and administration of the NASA Internships Retrospective Survey.

  3. INTRODUCTION AND PURPOSE: NASA STEM has implemented several internship studies since 2019. Most relevant to this pilot retrospective study are the FY21 Internships Outcome Evaluation, the FY22 Internships Longitudinal Evaluation, and the FY22 Follow-On Internships Outcome Assessment. The findings from these studies demonstrated high levels of program satisfaction, and interns were found to have high gains in science and research-related outcomes. Furthermore, these studies found that the internship experience was effective at maintaining and growing perceptions and aspirations regarding future STEM interests and intentions.


The previous, multi-year evaluation studies primarily used three modes of data collection: 1) participant surveys or questionnaires, including quantitative (Likert-type scale) and qualitative (open-ended response) data; 2) phone interviews; and 3) a 21st Century Skills Assessment. Survey research relies on self-reporting to obtain information about such variables as attitudes, opinions, behaviors, and demographic characteristics. These prior-year studies implemented a cross-sectional survey method; that is, survey data were collected from selected individuals at a single point in time. Cross-sectional designs effectively provide a snapshot of a population’s current behaviors, attitudes, and beliefs. This design also provides data relatively quickly but is limited in understanding trends or development over time (Gay et al., 2012). While the findings are valuable and significant in understanding the immediate outcomes of participating in an internship, they provide limited insight into participants’ outcomes once the internship has been completed.


Despite the positive findings from the FY21 and FY22 NASA STEM Internships outcome evaluation reports, knowledge about how this vital experience affects the future education and career trajectories of past interns remains limited. To help fill this knowledge gap, the overarching purpose of the Internships Retrospective Study is to understand the impact of the internship experience on future education and career plans one year after participation in a NASA STEM Internship.


The current NASA Internships Retrospective Pilot Study will follow up with interns who participated in a NASA STEM Internship during FY22 (Spring, Summer, and/or Fall) to learn where they are with their education and career goals approximately one year after their NASA STEM Internship experience. We are limiting our participant group to FY22 for two reasons: 1) to capture a snapshot in time, approximately one year after completion of a NASA STEM Internship; and 2) to leverage the rich contextual data available for this pool of participants in the NASA Gateway system. Specifically, this retrospective study aims to pilot and validate a survey that examines how participation in the NASA STEM Internship program impacts interns’ education and careers one year after completion of the internship.


This information collection request includes one retrospective survey instrument and a cognitive interview protocol that will be used to collect data from student participants of the FY22 NASA Internship Program.


  4. RESEARCH DESIGN OVERVIEW: Research has demonstrated that internships and work-based learning experiences are positively associated with student outcomes such as STEM concept knowledge and STEM persistence (National Academies of Sciences, Engineering, and Medicine, 2017). Thus, participation in such experiences has been viewed as an important evidence-based practice for addressing current STEM workforce needs. Although there is an extant literature documenting the outcomes of such experiences for students, there is much less research documenting the contributions of such experiences to the STEM field.


The proposed pilot retrospective survey will be used in ongoing program evaluations by NASA. The evaluation will answer one question (see Figure 1). The evaluation question aims to provide information about the impact that the NASA STEM Internship experience had on the education and career plans of interns one year after the completion of their internship. This question will allow OSTEM to assess the impact of the internship experience and to support NASA STEM with evidence-based decision making and continual improvement. The evaluation team will use this evaluation question as a guide for creating and validating the pilot retrospective survey. The evaluation team will first examine whether the survey data are valid overall, and then whether the data are sufficient to answer the evaluation question. Finally, the evaluation team will conduct cognitive interviews after the survey has closed to gather feedback on the structure and administration of the survey as a final step in the validation of the NASA Internships Retrospective Survey.



Figure 1. Evaluation Question


The developed instrument will be placed into SurveyMonkey online software, and a survey link will be distributed through email to ~2,500 NASA interns from FY22. Note that this sample size is nominal and will depend on the number of participants who agree to participate in the study. Quantitative and qualitative methods will be used to analyze survey data. Quantitative data will be summarized using descriptive statistics such as numbers of respondents, frequencies and proportions of responses, average response when response categories are assigned numeric values on a Likert scale (e.g., 1 = “Never Used” to 4 = “Used Every Day”), and standard deviations. Emergent coding will be used for the qualitative data to identify the most common themes in responses.
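
As an illustration of the descriptive summaries described above, the brief sketch below shows how response frequencies, proportions, means, and standard deviations could be computed for a single Likert-type item. The column name, the intermediate response labels between “Never Used” and “Used Every Day,” and the use of the pandas library are illustrative assumptions, not details of the approved instrument or analysis plan.

```python
# Illustrative sketch only: the item name, intermediate Likert labels, and use of
# pandas are assumptions, not part of the approved survey or analysis plan.
import pandas as pd

# Likert response options mapped to numeric codes
# (1 = "Never Used" ... 4 = "Used Every Day"; labels for 2 and 3 are assumed).
LIKERT_CODES = {"Never Used": 1, "Rarely Used": 2, "Sometimes Used": 3, "Used Every Day": 4}

def summarize_likert_item(responses: pd.Series) -> dict:
    """Return respondent count, frequencies, proportions, mean, and SD for one item."""
    coded = responses.map(LIKERT_CODES).dropna()   # numeric codes for computing mean/SD
    counts = responses.value_counts()              # frequencies of each response label
    return {
        "n_respondents": int(coded.size),
        "frequencies": counts.to_dict(),
        "proportions": (counts / counts.sum()).round(3).to_dict(),
        "mean": round(coded.mean(), 2),
        "std_dev": round(coded.std(), 2),
    }

# Example with mock responses (not actual survey data).
mock = pd.Series(["Never Used", "Used Every Day", "Sometimes Used", "Used Every Day"])
print(summarize_likert_item(mock))
```

Open-ended responses would be handled separately through emergent coding, as noted above.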


Survey construction and item analysis. Step 1 includes, but is not limited to, assessing and reviewing the prior NASA STEM Internships studies, not only for what data were captured, but also for what data were not captured. The broad question guiding this pilot survey focused on interns’ prior and current education goals/enrollment and their past/current career goals. The survey consists of a combination of multiple choice, Likert scale, and open response questions (Davies, 2019). The evaluation team reviewed and commented on the initial draft. After the review period concluded, all evaluation team member feedback was considered and reviewed for its fit to the provided statement of work (SOW). The final survey included all suggested alterations made by the evaluation team, along with a detailed survey description. The survey description was constructed to ensure informed consent by all awardees who complete the survey as intended. The survey description includes a brief purpose statement, the overall number of questions that the survey contains, an estimated length of time for survey completion, the risks and benefits that accompany completing the survey, and finally a confidentiality statement.


The respondent options for the education questions were gathered from the National Center for Education Statistics (NCES) College Navigator official website, and career responses were gathered from the Bureau of Labor Statistics website. The Likert scale denotations were derived based upon the overall goal of the survey, which is to gauge the overall impact(s) that the NASA STEM Internship had on the awardee’s education and career goals.


Cognitive Interview Protocol.  Interview questions have also been created to gather feedback on the structure and administration of the NASA Internships Retrospective Survey.


  5. TIMELINE: Pilot testing of the NASA Internships Retrospective Survey will take place June 2023 – May 2024, with Internship Program student participants from the FY22 cohort.


  6. SAMPLING STRATEGY: The universe of FY22 NASA Internship Program student participants is 2,500 or fewer. The NASA Internships Retrospective Survey items will be placed into SurveyMonkey online software, and a survey link will be distributed through email to ~2,500 NASA Internship Program student participants.


Table 1. Calculation chart to determine statistically relevant number of respondents

Data Collection Source | (N) Population Estimate | (A) Sampling Error +/- 5% (.05) | (Z) Confidence Level 95% / Alpha 0.05 | (P) Variability (based on consistency of intervention administration) 50% | Base Sample Size | Response Rate | (n) Number of Respondents
FY22 NASA Internship Student Participants | 2,500 | N/A | N/A | N/A | 2,500 | N/A | 2,500
TOTAL | | | | | | | 2,500
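
Because the full universe of FY22 interns will be invited to respond (a census), the sampling error, confidence level, and variability columns are marked N/A and the base sample size equals the population estimate. For reference only, the quantities named in the (N), (A), (Z), and (P) columns would otherwise feed the conventional sample-size calculation with finite population correction sketched below; this formula is a standard illustration, not part of the approved sampling plan.

\[
n_0 = \frac{Z^2\,P(1-P)}{A^2} = \frac{(1.96)^2(0.5)(0.5)}{(0.05)^2} \approx 384,
\qquad
n = \frac{n_0}{1 + \dfrac{n_0 - 1}{N}} = \frac{384}{1 + \dfrac{383}{2500}} \approx 333
\]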



  7. BURDEN HOURS: Burden calculation is based on a respondent pool of individuals as follows:


Data Collection Source | Number of Respondents | Frequency of Response | Total Minutes per Response | Total Response Burden in Hours
FY22 NASA Internship Student Participants | 2,500 | 1 | 10 | 416.66
TOTAL | | | | 416.66
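
Burden calculation: 2,500 respondents × 1 response × 10 minutes = 25,000 minutes, or approximately 416.66 hours.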



  8. DATA CONFIDENTIALITY MEASURES: Any information collected under the purview of this clearance will be maintained in accordance with the Privacy Act of 1974, the e-Government Act of 2002, the Federal Records Act, and as applicable, the Freedom of Information Act in order to protect respondents’ privacy and the confidentiality of the data collected.


  9. PERSONALLY IDENTIFIABLE INFORMATION:

    a. Is personally identifiable information (PII) collected? ☐ Yes  ☒ No

    b. If yes, will any information that is collected be included in records that are subject to the Privacy Act of 1974? ☐ Yes  ☐ No  Not Applicable

    c. If yes, has an up-to-date System of Records Notice (SORN) been published? ☒ Yes  ☐ No

Published March 17, 2015, the Applicable System of Records Notice is NASA 10EDUA, NASA STEM Engagement Program Evaluation System - http://www.nasa.gov/privacy/nasa_sorn_10EDUA.html.



APPLICABLE RECORDS:


    a. Applicable System of Records Notice (SORN): NASA 10EDUA, NASA STEM Engagement Program Evaluation System - http://www.nasa.gov/privacy/nasa_sorn_10EDUA.html


    b. Completed surveys will be retained in accordance with NASA Records Retention Schedule 1, Item 68D. Records will be destroyed or deleted when ten years old, or no longer needed, whichever is longer.


  10. PARTICIPANT SELECTION APPROACH:

    a. Does NASA STEM Engagement have a respondent sampling plan? ☒ Yes  ☐ No


If yes, please define the universe of potential respondents. If a sampling plan exists, please describe it. The universe of FY22 NASA Internship Program student participants is 2,500 or fewer. The NASA Internships Retrospective Survey items will be placed into SurveyMonkey online software, and a survey link will be distributed through email to ~2,500 NASA Internship Program student participants.


If no, how will NASA STEM Engagement identify the potential group of respondents and how will they be selected? Not applicable.


  11. INSTRUMENT ADMINISTRATION STRATEGY

Describe the type of Consent: ☐ Active  ☒ Passive

    a. How will the information be collected:

☒ Web-based or other forms of Social Media
☐ Telephone
☐ In-person
☐ Mail
☐ Other


If multiple approaches are used for a single instrument, state the projected percent of responses per approach.


    b. Will interviewers or facilitators be used? ☒ Yes  ☐ No



  12. DOCUMENTS/INSTRUMENTS ACCOMPANYING THIS REQUEST:

☐ Consent form
☒ Instrument (attitude & behavior scales, and surveys)
☒ Protocol script (Specify type: Script)
☒ Instructions (NOTE: Instructions are included in the instrument)
☐ Other (Specify ________________)


  13. GIFTS OR PAYMENT: ☐ Yes  ☒ No  If you answer yes to this question, please describe and provide a justification for the amount.


ANNUAL FEDERAL COST: The estimated annual cost to the Federal government is $5,925. The cost is based on an annualized effort of 75 person-hours at the evaluator’s rate of $79/hour for administering the survey instrument, collecting and analyzing responses, and editing the survey instrument for ultimate approval through the methodological testing generic clearance with OMB Control Number 2700-0159, exp. 09/30/2024.
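
Cost calculation: 75 person-hours × $79/hour = $5,925.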


  14. CERTIFICATION STATEMENT:

I certify the following to be true:

  1. The collection is voluntary.

  2. The collection is low burden for respondents and low cost for the Federal Government.

  3. The collection is non-controversial and does not raise issues of concern to other federal agencies.

  4. The results will be made available to other federal agencies upon request, while maintaining confidentiality of the respondents.

  5. The collection is targeted to the solicitation of information from respondents who have experience with the program or may have experience with the program in the future.


Name of Sponsor: Richard Gilmore

Title: Performance Assessment and Evaluation Program Manager, NASA

Office of STEM Engagement (OSTEM)

Email address or Phone number: [email protected]

Date: 12/14/2023





References


Davies, J. (2019). Think you're sending too many surveys? How to avoid survey fatigue.

Gay, L. R., Mills, G. E., & Airasian, P. W. (2012). Educational research: Competencies for analysis and application (10th ed.). Pearson.

National Academies of Sciences, Engineering, and Medicine. (2017). Undergraduate research experiences for STEM students: Successes, challenges, and opportunities. National Academies Press.


