Supporting Statement: Retrospective Survey, Grades 3-8


Generic Clearance for the NASA Office of Education Performance Measurement and Evaluation (Testing)


OMB: 2700-0159


REQUEST FOR APPROVAL under the Generic Clearance for NASA STEM Engagement Performance Measurement and Evaluation, OMB Control Number 2700-0159, expiration 06/30/2021

_____________________________________________________________________________________


  1. TITLE OF INFORMATION COLLECTION:

NASA Office of STEM Engagement Engineering Design Challenge Impact Surveys: Student Retrospective Survey


  2. TYPE OF COLLECTION:

Attitude/Behavior Scale

Baseline Survey

Cognitive Interview Protocol

Consent Form

Focus Group Protocol

Follow-up Survey

Instructions

Satisfaction Survey

Usability Protocol


GENERAL OVERVIEW: NASA Science, Technology, Engineering, and Mathematics (STEM) Engagement comprises a broad and diverse set of programs, projects, activities, and products developed and implemented by HQ functional Offices, Mission Directorates, and Centers. NASA’s Office of STEM Engagement (OSTEM) delivers participatory, experiential learning and STEM challenge activities for young Americans and educators to learn and succeed. NASA STEM Engagement seeks to:


  • Create unique opportunities for students and the public to contribute to NASA’s work in exploration and discovery.

  • Build a diverse future STEM workforce by engaging students in authentic learning experiences with NASA people, content, and facilities.

  • Strengthen public understanding by enabling powerful connections to NASA’s mission and work.


To achieve these goals, NASA STEM Engagement strives to increase K-12 involvement in NASA projects, enhance higher education, support underrepresented communities, strengthen online education, and boost NASA's contribution to informal education. The intended outcome is a generation prepared to code, calculate, design, and discover its way to a new era of American innovation.


The retrospective survey for this information collection is specific to determining the impact of the engineering design challenge program and activities on students (grades 3 through 8). STEM-related skills development, application of the Engineering Design Process, and quality engagement of students are also measures of interest.


  3. INTRODUCTION AND PURPOSE: Engineering Design Challenge activities are based on best practices in motivation, engagement, and learning for students and educators in formal and informal settings (e.g., Farland-Smith, 2012; Gasiewski, Eagan, Garcia, Hurtado, & Chang, 2012; Kim et al., 2015; Leblebicioglu, Metin, Yardimci, & Cetin, 2011; Maltese & Tai, 2011). This retrospective survey was developed as a modified version of the valid and reliable Student Attitudes toward STEM (S-STEM) Surveys (Unfried, Faber, Stanhope, & Wiebe, 2015), which assess student attitudes toward science, mathematics, engineering and technology, 21st century skills, and student interest in STEM careers. In a NASA Engineering Design Challenge (EDC) activity, the focus is a design task in which students must meet certain criteria through a series of steps that engineers follow to arrive at a solution to a problem. The engineering problem is set within the context of NASA-unique content and is supported by NASA subject matter experts.


Our interest is in understanding why, how, and in what ways students are impacted in the short, intermediate, and long term by participation in Engineering Design Challenge activities with an engineering design process focus. Thus, the purpose of pilot testing is to develop a valid instrument that reliably explains the ways in which participants’ attitudes and behaviors are affected by participation in these activities. Guided by current STEM education and measurement methodologies, the goal of this rigorous instrument development and testing procedure is to provide information that feeds the iterative assessment and feedback process for the NASA STEM Engagement Engineering Design Challenge activities.


Hence, the goals of this cycle of pilot testing are as follows:

  • Determine the clarity, comprehensibility, and preliminary psychometric properties (e.g., validity, reliability) of these instruments; explore individual item functioning; and make any necessary adjustments in preparation for large-scale testing, which will serve as the basis for more sophisticated statistical testing (see the reliability sketch after this list).

  • Determine an accurate response burden for these instruments.
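
As a concrete illustration of the reliability screening named in the first goal above, the following is a minimal sketch in Python with made-up pilot data and a hypothetical four-item scale; it is not the actual analysis plan for this collection.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal consistency (Cronbach's alpha) for a respondents-by-items matrix."""
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the scale totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical pilot responses: 6 students x 4 items, each coded 1-4.
pilot = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 3],
    [1, 2, 1, 2],
    [3, 3, 4, 4],
    [2, 3, 2, 2],
])
print(f"Cronbach's alpha = {cronbach_alpha(pilot):.2f}")
```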


Overall rationale for the survey’s 4-point Likert scale: the retrospective survey will ask students to consider their perceptions and feelings before and after the project simultaneously. Our goal is to ensure that students are not overwhelmed with answer choices that may confuse them or distort their ability to respond accurately. The advantages of using the 4-point Likert scale include:

  • Forces students to choose

  • Students may be more discriminating and more thoughtful

  • Eliminates possible misinterpretation of the mid-point




  4. RESEARCH DESIGN OVERVIEW: NASA STEM Engagement is using a quasi-experimental design. Responses will be used to validate the retrospective survey for clarity and comprehensibility and to determine its psychometric properties with the respondent pool.


Following this pilot phase of testing, NASA STEM Engagement has tentative research questions and hypotheses to test regarding the impact of STEM Challenge activities on all participants—students and teachers alike. Thus, this work is integral to the iterative assessment and feedback process for NASA STEM Engagement Engineering Design Challenge (EDC) activities.


NASA STEM Engagement is pilot testing a retrospective survey. Despite the absence of a control group, this design can still support strong causal inference when effort is made to satisfy the requirements of quasi-experimentation, such as identifying and reducing the plausibility of alternative explanations for the intervention-as-treatment effect (Shadish, Cook, & Campbell, 2002), identifying conceivable threats to internal validity, and statistically probing the likelihood of treatment-outcome covariation (Mark & Reichardt, 2009).


According to Norman (2003), “[r]esponse shift theory presumes that [participants’] prior state is adjusted in retrospective judgment on the basis of new information acquired in the interim, so that the retrospective judgment is more valid” (p. 243). The statistical manifestation of rating oneself on a different dimension or metric at post-test is a mismatch between pre- and post-test scores known as response shift bias (Goedhart & Hoogstraten, 1992). The retrospective pretest is considered a valid assessment tool when respondents cannot be expected to know what they do not know at the onset of an intervention (Pelfrey & Pelfrey, 2009). Such may be the case with respondents who are participating in a NASA activity and/or are completing an attitude and behavior or knowledge survey for the very first time. According to Verhoeven et al. (2008), “retrospective surveys are relatively easy, cheap and reliable means of collecting data on lifecycle events” (p. 9).
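
To make the retrospective pre/post comparison concrete, here is a minimal sketch with entirely hypothetical ratings of the paired analysis such data support. The Wilcoxon signed-rank test is one conventional choice for paired 4-point (ordinal) ratings; it is offered as an illustration, not a procedure specified by this collection.

```python
import numpy as np
from scipy import stats

# Hypothetical retrospective data: each student rates "before the
# challenge" and "now" on the same 4-point item in a single
# post-activity sitting.
retro_pre = np.array([2, 1, 3, 2, 2, 1, 3, 2, 2, 3])
post      = np.array([3, 2, 4, 3, 3, 2, 4, 2, 3, 4])

# Paired, ordinal data: a Wilcoxon signed-rank test probes whether
# "now" ratings systematically exceed the retrospective "before" ratings.
res = stats.wilcoxon(post, retro_pre, alternative="greater")
print(f"W = {res.statistic:.1f}, p = {res.pvalue:.4f}")
print(f"mean shift = {np.mean(post - retro_pre):.2f} scale points")
```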


Following this pilot phase of testing and the subsequent determination of instrument psychometric properties, NASA STEM Engagement has tentative research questions and hypotheses to test regarding the impact of challenge activity training on NASA STEM Challenge educator participants. Thus, this work is integral to the iterative assessment and feedback process for the NASA STEM Engagement Engineering Design Challenge activities.


  5. TIMELINE: Pilot testing of surveys will take place approximately January 2020 through January 2021, coordinated with the implementation periods of the STEM Challenge activities.


  6. SAMPLING STRATEGY: Because the universe of EDC student participants is 3,600 or fewer, NASA STEM Engagement will administer surveys for testing to the census of EDC student participants and will randomly select student participants across sites and grade levels who meet the inclusion criteria and match the population demographics, according to the strategy described below.


Once the participating Engineering Design Challenge sites have been categorized by location, the second step will be to stratify the participants at those sites by grade and other demographic information (e.g., race, gender). Participants who meet the inclusion criterion (attendance/participation in at least 80% of the activities) will then be randomly selected within each stratum to match the population demographics. This approach enables the collection of data from every participant, from which a representative sample of participants across sites and grade levels will be identified. For example, if girls represent 55% of the participants, 55% of the sample will be girls.
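
A minimal sketch of this selection logic, assuming a participant roster with hypothetical column names (site, grade, gender, attendance) and an illustrative sampling fraction; the 80% inclusion criterion and the proportional, stratum-by-stratum allocation follow the strategy described above.

```python
import pandas as pd

# Hypothetical roster: one row per EDC participant.
roster = pd.DataFrame({
    "site":       ["A", "A", "B", "B", "B", "C", "C", "C", "C", "C"],
    "grade":      [3, 4, 5, 5, 6, 7, 7, 8, 8, 8],
    "gender":     ["F", "M", "F", "F", "M", "F", "M", "F", "M", "F"],
    "attendance": [0.9, 0.7, 1.0, 0.85, 0.95, 0.8, 0.6, 0.9, 0.82, 1.0],
})

# Inclusion criterion: attendance/participation in at least 80% of activities.
eligible = roster[roster["attendance"] >= 0.80]

# Proportional allocation: sample the same fraction from every
# site x grade x gender stratum so the sample mirrors the population.
FRACTION = 0.5  # illustrative sampling fraction
sample = eligible.groupby(["site", "grade", "gender"]).sample(
    frac=FRACTION, random_state=42
)
print(sample)
```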


Table 1. Calculation chart to determine the statistically relevant number of respondents

Data Collection Source | (N) Population Estimate | (A) Sampling Error +/- 5% (.05) | (Z) Confidence Level 95% / Alpha 0.05 | (P) Variability* 50% | Base Sample Size | Response Rate | (n) Number of Respondents
EDC Students | 3,600 | N/A | N/A | N/A | 3,600 | N/A | 3,600
TOTAL | | | | | | | 3,600

*Variability is based on consistency of intervention administration.
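
Because the collection surveys the full census of 3,600 students, the sampling-error, confidence-level, and variability columns are N/A. For reference only, here is a sketch of the calculation those column headers describe (Cochran’s formula with a finite-population correction), using the parameters stated in the table.

```python
import math

def required_sample(N: int, e: float = 0.05, z: float = 1.96, p: float = 0.50) -> int:
    """Base sample size via Cochran's formula, corrected for a finite population."""
    n0 = (z ** 2) * p * (1 - p) / e ** 2  # infinite-population estimate (~384)
    n = n0 / (1 + (n0 - 1) / N)           # finite-population correction
    return math.ceil(n)

# With N = 3,600, +/-5% sampling error, 95% confidence, 50% variability:
print(required_sample(3600))  # 348; the collection instead surveys all 3,600
```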



  7. BURDEN HOURS: Burden calculation is based on a respondent pool of individuals as follows:


Data Collection Source | Number of Respondents | Frequency of Response | Total Minutes per Response | Total Response Burden in Hours
EDC Students | 3,600 | 1 | 15 | 900
EDC Educators | 300 | 1 | 5 | 25
TOTAL | | | | 925

*Burden for Educators, in this instance, is calculated to determine the amount of time spent reading instructions to student survey respondents.
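
Burden in hours is the number of respondents × minutes per response ÷ 60: 3,600 × 15 ÷ 60 = 900 hours for students, and 300 × 5 ÷ 60 = 25 hours for educators, giving the 925-hour total.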


  8. DATA CONFIDENTIALITY MEASURES: Any information collected under the purview of this clearance will be maintained in accordance with the Privacy Act of 1974, the E-Government Act of 2002, the Federal Records Act, and, as applicable, the Freedom of Information Act in order to protect respondents’ privacy and the confidentiality of the data collected.


  9. PERSONALLY IDENTIFIABLE INFORMATION:

    a. Is personally identifiable information (PII) collected? Yes No

NOTE: First and last names are not collected, but demographic information is collected (i.e., birthdate, grade level, ethnicity, race, and gender)


    b. If yes, will any information that is collected be included in records that are subject to the Privacy Act of 1974? Yes No


    c. If yes, has an up-to-date System of Records Notice (SORN) been published?

Yes No

Published March 17, 2015, the Applicable System of Records Notice is NASA 10EDUA, NASA STEM Engagement Program Evaluation System - http://www.nasa.gov/privacy/nasa_sorn_10EDUA.html.


APPLICABLE RECORDS:


    a. Applicable System of Records Notice: NASA 10EDUA, NASA STEM Engagement Program Evaluation System - http://www.nasa.gov/privacy/nasa_sorn_10EDUA.html


    b. Completed surveys will be retained in accordance with NASA Records Retention Schedule 1, Item 68D. Records will be destroyed or deleted when ten years old or when no longer needed, whichever is longer.


  10. PARTICIPANT SELECTION APPROACH:


    a. Does NASA STEM Engagement have a respondent sampling plan? Yes No


If yes, please define the universe of potential respondents; if a sampling plan exists, please describe it. The universe of EDC student participants is 3,600 or fewer. NASA STEM Engagement will administer surveys for testing to the census of EDC student participants and will randomly select student participants across sites and grade levels who meet the inclusion criteria and match the population demographics, according to the strategy described below.


Once the participating Engineering Design Challenge sites have been categorized by location, the second step will be to stratify the participants at those sites by grade and other demographic information (e.g., race, gender). Participants who meet the inclusion criterion (attendance/participation in at least 80% of the activities) will then be randomly selected within each stratum to match the population demographics (see the selection sketch under SAMPLING STRATEGY above). This approach enables the collection of data from every participant, from which a representative sample of participants across sites and grade levels will be identified. For example, if girls represent 55% of the participants, 55% of the sample will be girls.


If no, how will NASA STEM Engagement identify the potential group of respondents and how will they be selected? Not applicable.


  11. INSTRUMENT ADMINISTRATION STRATEGY

Describe the type of Consent: Active Passive

    a. How will the information be collected:

Web-based or other forms of Social Media (95%)

Telephone

In-person

Mail

Other (5%)


If multiple approaches are used for a single instrument, state the projected percent of responses per approach. The retrospective survey will be administered via the web. Because it is preferable that all retrospective surveys be administered at the end of the Engineering Design Challenge activity, hard copy surveys will be made available to collect survey responses in the event web access is temporarily unavailable. In the past, no more than 5% of respondents were asked to complete hard copy surveys due to internet or computer difficulties.


    b. Will interviewers or facilitators be used? Yes No



Note: “Facilitators” refers to Educators who will read and explain student retrospective survey instructions.


  12. DOCUMENTS/INSTRUMENTS ACCOMPANYING THIS REQUEST:

Consent form

Instrument (attitude & behavior scales, and surveys)

Protocol script (Specify type: Script)

Instructions NOTE: Instructions are included in the instrument

Other (Specify ________________)


  13. GIFTS OR PAYMENT: Yes No  If you answer yes to this question, please describe and provide a justification for the amount.


  14. ANNUAL FEDERAL COST: The estimated annual cost to the Federal government is $1,120. The cost is based on an annualized effort of 20 person-hours at the evaluator’s rate of $56/hour for developing and administering the survey instrument, collecting and analyzing responses, and editing the survey instrument for ultimate approval through the methodological testing generic clearance with OMB Control Number 2700-0159, expiration 06/30/2021.




  15. CERTIFICATION STATEMENT:

I certify the following to be true:

  1. The collection is voluntary.

  2. The collection is low burden for respondents and low cost for the Federal Government.

  3. The collection is non-controversial and does not raise issues of concern to other federal agencies.

  4. The results will be made available to other federal agencies upon request, while maintaining confidentiality of the respondents.

  5. The collection is targeted to the solicitation of information from respondents who have experience with the program or may have experience with the program in the future.


Name of Sponsor: Richard Gilmore

Title: Educational Programs Specialist/Evaluation Manager, NASA GRC

Office of STEM Engagement

Email address or Phone number: [email protected]

Date: 1/14/2021





Bibliography

Farland-Smith, D. (2012). Personal and Social Interactions Between Young Girls and Scientists: Examining Critical Aspects for Identity Construction. Journal of Science Teacher Education, 23(1), 1-18.

Gasiewski, J. A., Eagan, M. K., Garcia, G. A., Hurtado, S., & Chang, M. J. (2012). From gatekeeping to engagement: A multicontextual, mixed method study of student academic engagement in introductory STEM courses. Research in Higher Education, 53(2), 229-261.

Goedhart, H., & Hoogstraten, J. (1992). The retrospective pretest and the role of pretest information in evaluative studies. Psychological Reports, 70(3), 699-704.

Kim, C., Kim, D., Yuan, J., Hill, R. B., Doshi, P., & Thai, C. N. (2015). Robotics to promote elementary education pre-service teachers' STEM engagement, learning, and teaching. Computers & Education, 91, 14-31.

Leblebicioglu, G., Metin, D., Yardimci, E., & Cetin, P. S. (2011). The Effect of Informal and Formal Interaction between Scientists and Children at a Science Camp on Their Images of Scientists. Science Education International, 22(3), 158-174.

Maltese, A. V., & Tai, R. H. (2011). Pipeline persistence: Examining the association of educational experiences with earned degrees in STEM among US students. Science Education, 95(5), 877-907.

Mark, M. M., & Reichardt, C. S. (2009). Quasi-experimentation. In L. Bickman, & D. J. Rog (Eds.), The SAGE handbook of applied social research methods (2nd ed., pp. 182-214). Thousand Oaks, CA: SAGE Publications, Inc.

Norman, G. (2003). Hi! How are you? Response shift, implicit theories and differing epistemologies. Quality of Life Research, 12, 239-249.

Pelfrey, Sr., W. V., & Pelfrey, Jr., W. V. (2009). Curriculum evaluation and revision in a nascent field: The utility of the retrospective pretest-posttest model in a Homeland Security program of study. Evaluation Review, 33(1), 54-82.

Unfried, A., Faber, M., Stanhope, D. S., & Wiebe, E. (2015). The Development and Validation of a Measure of Student Attitudes Toward Science, Technology, Engineering, and Math (S-STEM). Journal of Psychoeducational Assessment, 33(7), 622-639.

Verhoeven, M., Arentze, T., Timmermans, H., & van der Waerden, P. (2008). Retrospective surveys: Some experiences in the context of measuring lifecycle events. Paper presented at the 87th Annual Meeting of the Transportation Research Board, Washington, DC.



