Supporting Statement: Engineering Design Challenge Impact Educator Feedback Survey

Generic Clearance for the NASA Office of Education Performance Measurement and Evaluation (Testing)

OMB: 2700-0159

REQUEST FOR APPROVAL under the Generic Clearance for NASA STEM Engagement Performance Measurement and Evaluation, OMB Control Number 2700-0159, expiration 06/30/2021

_____________________________________________________________________________________


  1. TITLE OF INFORMATION COLLECTION:

NASA Office of STEM Engagement Engineering Design Challenge Impact Surveys: Educator Feedback Survey


  2. TYPE OF COLLECTION:

Attitude/Behavior Scale

Baseline Survey

Cognitive Interview Protocol

Consent Form

Focus Group Protocol

Follow-up Survey

Instructions

Satisfaction Survey

Usability Protocol


GENERAL OVERVIEW: NASA Science, Technology, Engineering, and Mathematics (STEM) Engagement comprises a broad and diverse set of programs, projects, activities, and products developed and implemented by HQ functional Offices, Mission Directorates, and Centers. NASA’s Office of STEM Engagement (OSTEM) delivers participatory, experiential learning and STEM challenge activities for young Americans and educators to learn and succeed. NASA STEM Engagement seeks to:


  • Create unique opportunities for students and the public to contribute to NASA’s work in exploration and discovery.

  • Build a diverse future STEM workforce by engaging students in authentic learning experiences with NASA people, content, and facilities.

  • Strengthen public understanding by enabling powerful connections to NASA’s mission and work.


To achieve these goals, NASA STEM Engagement strives to increase K-12 involvement in NASA projects, enhance higher education, support underrepresented communities, strengthen online education, and boost NASA's contribution to informal education. The intended outcome is a generation prepared to code, calculate, design, and discover its way to a new era of American innovation.


The educator feedback survey for this information collection is specific to determining the impact of the engineering design challenge program and activities on educators and their students (grades 3 through 8). STEM-related skills development, application of the Engineering Design Process, and quality engagement of students are also measures of interest.


  3. INTRODUCTION AND PURPOSE: Engineering Design Challenge activities are based on best practices in motivation, engagement, and learning for students and educators in formal and informal settings (e.g., Farland-Smith, 2012; Gasiewski, Eagan, Garcia, Hurtado, & Chang, 2012; Kim et al., 2015; Leblebicioglu, Metin, Yardimci, & Cetin, 2011; Maltese & Tai, 2011). This educator feedback survey includes items focused on past participation in Engineering Design Challenges, feedback on the face-to-face professional development training, knowledge of STEM topics, comfort teaching STEM topics, STEM teaching efficacy, and quality of technical assistance and supports. In a NASA Engineering Design Challenge (EDC) activity, the focus is a design task in which students must meet certain criteria through a series of steps that engineers follow to arrive at a solution to a problem. The engineering problem is set in the context of NASA-unique content and is supported by NASA subject matter experts.


We are proposing to collect data from each of the educators who (1) participate in the NASA 21st CCLC professional development training and (2) provide instruction on the content guide modules. Educators will be asked to self-identify as either a formal educator (i.e., a certified/licensed teacher) or an informal educator (e.g., a volunteer or other professional). The location at which the course content is delivered will be identified as either school-based, meaning instruction occurs on school grounds, or community-based, meaning instruction takes place at the location of a community agency. Our interest is in understanding how educators are impacted in the short, intermediate, and long term by participation in Engineering Design Challenge activities. Thus, the purpose of pilot testing is to develop a valid instrument that reliably explains the ways in which participants’ attitudes and behaviors are impacted by participation in these activities. Guided by current STEM education and measurement methodologies, the goal of this rigorous instrument development and testing procedure is to provide information that becomes part of the iterative assessment and feedback process for NASA STEM Engagement Engineering Design Challenge activities.


Hence, the goals of this cycle of pilot testing are as follows:

  • Determine the clarity, comprehensibility, and preliminary psychometric properties (e.g., validity, reliability) of these instruments; explore individual item functioning; and make any necessary adjustments in preparation for large-scale testing, which will serve as the basis for more sophisticated statistical testing (an illustrative sketch of these checks follows this list).

  • Determine an accurate response burden for these instruments.
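
As context for the first goal, the sketch below illustrates, in Python, how internal-consistency reliability and item-level functioning might be screened once pilot responses are in hand. It is a minimal sketch rather than part of this collection: the file name edc_educator_pilot.csv and the assumption that every column is a single Likert-scale item are hypothetical.

```python
# Minimal illustrative sketch: screen pilot survey items for reliability.
# Hypothetical input: each row is one educator, each column one Likert item.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: internal-consistency reliability of a scale."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

responses = pd.read_csv("edc_educator_pilot.csv")  # hypothetical file name
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")

# Corrected item-total correlations flag items that do not function well
# and are candidates for revision before large-scale testing.
for item in responses.columns:
    rest = responses.drop(columns=item).sum(axis=1)
    print(f"{item}: r = {responses[item].corr(rest):.2f}")
```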


  4. RESEARCH DESIGN OVERVIEW: NASA STEM Engagement is using a quasi-experimental design. Responses will be used to validate the educator feedback survey for clarity and comprehensibility and to determine psychometric properties with the respondent pool.


Following this pilot phase of testing, NASA STEM Engagement has tentative research questions and hypotheses to test regarding the impact of STEM Challenge activities on all participants—students and teachers alike. Thus, this work is integral to the iterative assessment and feedback process for NASA STEM Engagement Engineering Design Challenge (EDC) activities.


NASA STEM Engagement is pilot testing an educator feedback survey. Despite the absence of a control group, this design can still support strong causal inferences when effort is made to satisfy the requirements of quasi-experimentation, such as identifying and reducing the plausibility of alternative explanations for the intervention-as-treatment effect (Shadish, Cook, & Campbell, 2002), identifying conceivable threats to internal validity, and statistically probing the likelihood of treatment-outcome covariation (Mark & Reichardt, 2009).


According to Norman (2003), “[r]esponse shift theory presumes that [participants’] prior state is adjusted in retrospective judgment on the basis of new information acquired in the interim, so that the retrospective judgment is more valid” (p. 243). The statistical manifestation of rating oneself on a different dimension or metric at post-test results in a mismatch between pre- and post-test scores known as response shift bias (Goedhart & Hoogstraten, 1992). The retrospective pretest is considered to be a valid assessment tool when respondents cannot be expected to know what they do not know at the onset of an intervention (Pelfrey & Pelfrey, 2009). Such may be the case with respondents who are participating in a NASA activity and/or are completing an attitude and behavior or knowledge survey for the very first time. According to Verhoeven (2008), “retrospective surveys are relatively easy, cheap and reliable means of collecting data on lifecycle events” (p. 9).
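
To make the retrospective-pretest logic concrete, the following minimal sketch shows one way pilot data could be probed for response shift; the column names pre (traditional pretest), then (retrospective pretest collected at post-test), and post are hypothetical, as is the file name.

```python
# Minimal illustrative sketch: probe pilot data for response shift bias.
# Hypothetical columns: "pre" (traditional pretest), "then" (retrospective
# pretest collected at post-test), "post" (post-test rating).
import pandas as pd
from scipy import stats

df = pd.read_csv("edc_educator_pilot.csv")  # hypothetical file name

# A reliable pre-vs-then difference suggests respondents recalibrated their
# self-ratings after the activity (response shift); the then-to-post change
# is then the less biased estimate of program effect.
shift = stats.ttest_rel(df["pre"], df["then"])
print(f"response shift: t = {shift.statistic:.2f}, p = {shift.pvalue:.3f}")
print(f"then-to-post change: {(df['post'] - df['then']).mean():.2f}")
```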


  5. TIMELINE: Pilot testing of surveys will take place approximately February 2020 through January 2021, coordinated with the implementation periods of the STEM Challenge activities.


  6. SAMPLING STRATEGY: Because the universe of EDC educator participants is 400 or fewer, NASA STEM Engagement will administer surveys for testing to the entire universe of EDC educator participants (see the illustrative calculation following Table 1).


Table 1. Calculation chart to determine statistically relevant number of respondents

| Data Collection Source | (N) Population Estimate | (A) Sampling Error +/- 5% (.05) | (Z) Confidence Level 95% / Alpha 0.05 | (P) *Variability (based on consistency of intervention administration) 50% | Base Sample Size | Response Rate | (n) Number of Respondents |
|---|---|---|---|---|---|---|---|
| EDC Educators | 400 | N/A | N/A | N/A | 400 | N/A | 400 |
| TOTAL | | | | | | | 400 |
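
For reference, the sketch below reproduces the standard base-sample-size calculation that the table’s column headings correspond to (Cochran’s formula with a finite population correction). Because the entire universe of 400 educators is surveyed, those fields are marked N/A in Table 1; the code is illustrative only.

```python
# Illustrative only: the sample size the table's parameters would imply
# if a sample, rather than the full universe of 400 educators, were drawn.
N = 400    # (N) population estimate
z = 1.96   # (Z) 95% confidence level / alpha 0.05
p = 0.50   # (P) variability
e = 0.05   # (A) sampling error +/- 5%

n0 = (z**2 * p * (1 - p)) / e**2   # Cochran's base sample size, ~384
n = n0 / (1 + (n0 - 1) / N)        # finite population correction, ~196
print(round(n0), round(n))
```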



  7. BURDEN HOURS: Burden calculation is based on a respondent pool of individuals as follows:


| Data Collection Source | Number of Respondents | Frequency of Response | Total Minutes per Response | Total Response Burden in Hours |
|---|---|---|---|---|
| EDC Educators | 400 | 1 | 20 | 133 |
| TOTAL | | | | 133 |
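
The total burden follows directly from the table: 400 respondents × 1 response × 20 minutes = 8,000 minutes, or approximately 133 hours.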


  8. DATA CONFIDENTIALITY MEASURES: Any information collected under the purview of this clearance will be maintained in accordance with the Privacy Act of 1974, the E-Government Act of 2002, the Federal Records Act, and, as applicable, the Freedom of Information Act in order to protect respondents’ privacy and the confidentiality of the data collected.


  9. PERSONALLY IDENTIFIABLE INFORMATION:

    a. Is personally identifiable information (PII) collected? Yes No

NOTE: First and last name are not collected, but location and demographic information are collected (i.e., the name of the state, the name of the site where the EDC was implemented, ethnicity, and racial category).


    b. If yes, will any information that is collected be included in records that are subject to the Privacy Act of 1974? Yes No


    c. If yes, has an up-to-date System of Records Notice (SORN) been published?

Yes No

Published March 17, 2015, the Applicable System of Records Notice is NASA 10EDUA, NASA STEM Engagement Program Evaluation System - http://www.nasa.gov/privacy/nasa_sorn_10EDUA.html.


APPLICABLE RECORDS:


    a. Applicable System of Records Notice (SORN): NASA 10EDUA, NASA STEM Engagement Program Evaluation System - http://www.nasa.gov/privacy/nasa_sorn_10EDUA.html


    b. Completed surveys will be retained in accordance with NASA Records Retention Schedule 1, Item 68D. Records will be destroyed or deleted when ten years old, or when no longer needed, whichever is longer.


  10. PARTICIPANT SELECTION APPROACH:


    a. Does NASA STEM Engagement have a respondent sampling plan? Yes No


If yes, please define the universe of potential respondents. If a sampling plan exists, please describe. Because the universe of EDC educator participants is 400 or fewer, NASA STEM Engagement will administer surveys for testing to the entire universe of EDC educator participants.


If no, how will NASA STEM Engagement identify the potential group of respondents and how will they be selected? Not applicable.


  11. INSTRUMENT ADMINISTRATION STRATEGY

Describe the type of Consent: Active Passive

    a. How will the information be collected:

Web-based or other forms of Social Media (95%)

Telephone

In-person

Mail

Other (5%)


If multiple approaches are used for a single instrument, state the projected percent of responses per approach. The educator feedback survey will be administered via the web. Because it is preferable that all educator feedback surveys be administered at the end of the Engineering Design Challenge activity, hard copy surveys will be made available to collect survey responses in the event web access is temporarily unavailable. In the past, no more than 5% of respondents were asked to complete hard copy surveys due to internet or computer difficulties.


    b. Will interviewers or facilitators be used? Yes No


  12. DOCUMENTS/INSTRUMENTS ACCOMPANYING THIS REQUEST:

Consent form

Instrument (attitude & behavior scales, and surveys)

Protocol script (Specify type: Script)

Instructions NOTE: Instructions are included in the instrument

Other (Specify ________________)


  13. GIFTS OR PAYMENT: Yes No  If you answer yes to this question, please describe and provide a justification for the amount.


  14. ANNUAL FEDERAL COST: The estimated annual cost to the Federal government is $1,120. The cost is based on an annualized effort of 20 person-hours at the evaluator’s rate of $56/hour for developing and administering the survey instrument, collecting and analyzing responses, and editing the survey instrument for ultimate approval through the methodological testing generic clearance with OMB Control Number 2700-0159, exp. 06/30/2021.
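
(For reference, the cost arithmetic is 20 person-hours × $56/hour = $1,120.)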




  15. CERTIFICATION STATEMENT:

I certify the following to be true:

  1. The collection is voluntary.

  2. The collection is low burden for respondents and low cost for the Federal Government.

  3. The collection is non-controversial and does not raise issues of concern to other federal agencies.

  4. The results will be made available to other federal agencies upon request, while maintaining confidentiality of the respondents.

  5. The collection is targeted to the solicitation of information from respondents who have experience with the program or may have experience with the program in the future.


Name of Sponsor: Richard Gilmore

Title: Educational Programs Specialist/Evaluation Manager, NASA GRC

Office of STEM Engagement

Email address or Phone number: [email protected]

Date: 1/14/2021





Bibliography

Farland-Smith, D. (2012). Personal and social interactions between young girls and scientists: Examining critical aspects for identity construction. Journal of Science Teacher Education, 23(1), 1-18.

Gasiewski, J. A., Eagan, M. K., Garcia, G. A., Hurtado, S., & Chang, M. J. (2012). From gatekeeping to engagement: A multicontextual, mixed method study of student academic engagement in introductory STEM courses. Research in Higher Education, 53(2), 229-261.

Goedhart, H., & Hoogstraten, J. (1992). The retrospective pretest and the role of pretest information in evaluative studies. Psychological Reports, 70(3), 699-704.

Kim, C., Kim, D., Yuan, J., Hill, R. B., Doshi, P., & Thai, C. N. (2015). Robotics to promote elementary education pre-service teachers' STEM engagement, learning, and teaching. Computers & Education, 91, 14-31.

Leblebicioglu, G., Metin, D., Yardimci, E., & Cetin, P. S. (2011). The effect of informal and formal interaction between scientists and children at a science camp on their images of scientists. Science Education International, 22(3), 158-174.

Maltese, A. V., & Tai, R. H. (2011). Pipeline persistence: Examining the association of educational experiences with earned degrees in STEM among US students. Science Education, 95(5), 877-907.

Mark, M. M., & Reichardt, C. S. (2009). Quasi-experimentation. In L. Bickman, & D. J. Rog (Eds.), The SAGE handbook of applied social research methods (2nd ed., pp. 182-214). Thousand Oaks, CA: SAGE Publications, Inc.

Norman, G. (2003). Hi! How are you? Response shift, implicit theories and differing epistemologies. Quality of Life Research, 12, 239-249.

Pelfrey, Sr., W. V., & Pelfrey, Jr., W. V. (2009). Curriculum evaluation and revision in a nascent field: The utility of the retrospective pretest-posttest model in a Homeland Security program of study. Evaluation Review, 33(1), 54-82.

Verhoeven, M., Arentze, T., Timmermans, H., & van der Waerden, P. (2008). Retrospective surveys: Some experiences in the context of measuring lifecycle events. Paper presented at the 87th Annual Meeting of the Transportation Research Board, Washington, DC.



