
2700-0150

SUPPORTING STATEMENT

FOR OMB CLEARANCE

PART B



NASA STEM Challenges


EVALUATION DATA COLLECTION





National Aeronautics and Space Administration



July 14, 2015





Part B: Collection of Information Employing Statistical Methods


Introduction

The NASA STEM Challenges activity is the result of previous OMB guidance to redesign the Summer of Innovation (SoI) pilot as a sustainable model for STEM engagement across the Federal STEM agencies and to offer SoI as a model through the work of the Committee on STEM (CoSTEM). NASA applied its previous design work and evaluation findings to the design of a STEM Challenges pilot collaboration with the U.S. Department of Education (ED) in 2013-14.1 This pilot paired the extensive reach and infrastructure of ED with NASA's experience in training community partners in STEM engagement activities through Summer of Innovation, as well as its access to world-class subject matter experts and content, in support of the shared CoSTEM goal of increasing and supporting the public engagement of youth (National Science and Technology Council, 2013). During the 2013-14 school year, NASA collaborated with three states to provide dynamic and engaging STEM Design Challenges to students in 21st Century Community Learning Center (21CCLC) afterschool programs. During the 2014-15 school year, NASA and ED expanded the activity to additional states and 21CCLC sites, reaching ten states with high-quality NASA STEM Design Challenges.


The STEM Challenges activity focuses on STEM Design Challenges for middle school students, designed by NASA to meet the content needs of out-of-school-time sites (e.g., 21CCLC, 4-H). NASA facilitates the Challenges by providing a blended professional development strategy to support instructional staff in their implementation, including a minimum of one in-person training session in each participating state. NASA also provides regular opportunities for 21CCLC sites and students to engage with NASA scientists and engineers through a range of technology-based experiences (e.g., Skype) during a minimum of 20 hours of implementation across an 8-week implementation cycle scheduled during the school year. Following the success of the 21CCLC pilot, NASA continues to offer STEM Design Challenges in collaboration with the Department of Education and to seek other partnerships through which this activity could be offered.


This clearance package modifies the SoI evaluation activities previously approved under OMB control number 2700-0150 to align with the new circumstances of this information collection. This request includes the following instruments that collect standardized data from 10 or more respondents:


  • Baseline youth survey (Appendix 1; item by item justification provided in Appendix 2)

  • Follow-up youth survey (Appendix 3; item by item justification provided in Appendix 4)


The data to be collected are not available from any other source. The youth instruments will be used to gather data prior to and following the STEM Challenge activities in order to assess change in the key short-term outcome of youth attitudes toward STEM. Information about implementation will be gathered from numerous sources, including review of student work products and activity observations. These data will allow NASA to assess fidelity of implementation and to gather formative data to inform continuous program improvement.


Part A of this clearance package describes the information collection activities that require Paperwork Reduction Act clearance, specifically the baseline and follow-up student surveys. The student surveys will collect the data needed to respond to the key research questions associated with the outcome evaluation.


B.1 DESCRIBE THE POTENTIAL RESPONDENT UNIVERSE AND ANY SAMPLING OR OTHER RESPONDENT SELECTION METHOD TO BE USED.


This section addresses the potential respondent universe and outlines the selection criteria that define the universe. It also provides justification for the decision not to utilize a sampling strategy for the surveys. The numerical estimate for the respondent universe and the anticipated response rates are provided in tabular form in Exhibit 1. The projected unconditional response rate for the youth surveys is 42%. Actual response rates from the 2015 evaluation study were not available at the time this package was prepared.


Exhibit 1. Data Collection to Be Analyzed Using Statistical Methods

  Instrument                Timing of Data Collection                            Respondent Universe   Estimated Response Rate
  Student Data
  Youth Baseline Survey     Prior to start of STEM challenge (est. February)     810                   80%
  Youth Follow-up Survey    After the conclusion of STEM challenge (est. May)    810                   52%
  Total Respondents                                                              810



Criteria to Define Respondent Universe. The potential respondent universe for the STEM Challenges evaluation study is all students participating in STEM Challenge activities administered by NASA in collaboration with its national partners, particularly the Department of Education. According to the most recent evaluation study, conducted for the FY2015 NASA STEM Challenges activity, 810 students participated in STEM Challenges at a total of 67 after-school sites across ten states; this count was used as the basis for the burden estimate for this clearance package. All sites were financially supported by the Department of Education's 21st Century Community Learning Centers program. The U.S. Department of Education selected participating states and sites based on interest and availability for training. All of the 21CCLC sites participating in this collaboration served students in grades five through nine in public schools with a low socio-economic designation.


In future years, NASA intends to define the respondent universe for STEM Challenges evaluation studies as all students in participating sites associated with a Challenge. These sites will be recruited using the same criteria as in past years, including participation of students in grades five through nine in out-of-school-time programs affiliated with public schools with a low socio-economic designation. There are minimum criteria associated with participation in STEM Challenges:


  • Instructor participation in NASA-facilitated training on how to facilitate the STEM Challenges;

  • Implementation of at least one STEM Challenge; and

  • Use of NASA resources, including interaction with NASA subject matter experts (SMEs).

Given the small size of the universe of STEM Challenge participating students and sites, a sampling strategy will not be employed.


Response Rates. In determining the sample size for the study, we incorporated assumptions about response rates and attrition at different stages of data collection. Specifically, our calculations assumed a youth survey response rate of 80% at baseline and an attrition rate of 48% (i.e., a follow-up response rate of 52%) between the baseline and follow-up youth surveys, for an unconditional response rate of 42%. We relied on response data from previous evaluation studies of out-of-school STEM programs serving middle school students to conservatively estimate a follow-up youth attrition rate of 48%;2 this assumed attrition rate is higher than that supported by existing research evidence on response rates for follow-up youth surveys.3 However, given the revised administration of instruments, the revised consent process, the dedicated external evaluation team member, the camp-level evaluation point of contact, and the inclusion of all grade-eligible students at a camp, we may achieve a higher response rate than in the previous evaluation study.
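
As an illustrative check of this arithmetic, a minimal Python calculation using the figures above is shown below; the derived count of matched baseline/follow-up cases is approximate and is provided for illustration only.

    # Illustrative arithmetic based on the assumptions stated in this section.
    baseline_rate = 0.80        # assumed baseline youth survey response rate
    followup_rate = 0.52        # assumed follow-up rate among baseline respondents
    universe = 810              # respondent universe from Exhibit 1

    unconditional_rate = baseline_rate * followup_rate      # 0.416, reported as 42%
    expected_matched_cases = universe * unconditional_rate  # roughly 337 matched cases

    print(f"Unconditional response rate: {unconditional_rate:.1%}")
    print(f"Expected matched baseline/follow-up surveys: {expected_matched_cases:.0f} of {universe}")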


Response rates will be further addressed in section B.3.



B.2. DESCRIBE THE PROCEDURES FOR THE COLLECTION OF INFORMATION


The data collection procedures and instruments were designed to investigate youth-related outcomes. Exhibit 1 in section B.1 outlines the data collection schedule to be implemented in the 2016 STEM Challenges evaluation.

This data collection will not use statistical methodology for stratification, estimation, or sample selection because the youth surveys will be administered to all participants. A summary of the procedures for the collection of information follows.


Surveys: Procedures for Data Collection. Prior to the start of the program, the external evaluator will obtain Institutional Review Board (IRB) approval for the modifications to the FY2016 study design and instruments. NASA has consulted with sites and the external evaluator and made decisions collaboratively regarding the most appropriate data collection strategies. Specifically, NASA discussed the following:


  1. Timing of administration of the baseline student survey. In past SoI evaluations, the baseline student survey was administered during the first morning of the summer camp. While this approach has generally produced a strong response rate, provided the paper surveys were delivered to the awardee/center in sufficient time for administration, it can also produce greater variability in response because site instructors are responsible for administering the survey. OMB had recommended administering the baseline survey at the time of registration, with the survey (or survey link) given to the parent/caregiver. However, that approach potentially widens the time period of administration from one week to several months and places burden on camp administrators and instructors to follow up with individual students who may not have responded on the first day of camp. NASA met with the external evaluator and the participating SoI awardees to fully discuss the pros and cons of the baseline student survey timing. Following this consultation, NASA decided to administer the youth surveys on the first day of camp. The primary reason for this choice is the stronger level of control camp administrators have to ensure that youth complete the surveys. NASA will continue to work closely with STEM Challenges sites to ensure that response rates on the baseline student survey are high.


  2. Mode of administration for the youth surveys. In the most recent evaluation cycle, a small number of sites requested a paper-based survey since access to a computer lab was not possible. Although there is a chance that mode effects may influence survey responses, NASA and its external evaluator believe that providing awardees with a survey administration mode (online vs. paper) that works within their setting will improve the response rates.


Following these decisions, the external evaluator, working under the oversight of the NASA Headquarters Evaluation Manager, will provide training to site points of contact4 to ensure rigorous and systematic data collection procedures. Throughout the program, the external evaluator will support the awardees and camps in their data collection efforts. Evaluation guidance will be provided by the external evaluator to awardees in the form of a comprehensive evaluation manual available online and in hardcopy.


Baseline and Follow-Up Youth Surveys


As discussed earlier, the baseline youth survey form will be available in paper and online formats. The survey will be administered by site instructors to all youth meeting grade level and participation requirements in participating sites during the first day of the STEM challenge. An evaluation point of contact at each site will hold responsibility for ensuring administration and collection of the baseline and follow-up youth surveys. Paper survey forms will be returned to the external evaluator for safe-keeping and data entry.


The external evaluator will conduct analyses to provide descriptive statistics (e.g., proportions and averages) of student data across all sites. In addition, the external evaluator will describe change in a variable over time by comparing baseline with follow-up survey results. Statistical tests will be run to assess whether the difference in proportions and/or means between the two time points is zero; the evaluator will use a McNemar test or a paired t-test, depending on the distribution of the outcome variable. Further, survey data will be integrated with the quantitative implementation data to conduct correlational analyses examining relationships among site characteristics, program quality, and student attitudes and behaviors.
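
For illustration only, the paired comparisons described above could be carried out along the lines of the following minimal Python sketch. The file name, column names, and the use of the scipy and statsmodels libraries are assumptions made for this example and do not represent a specification of the external evaluator's actual tooling.

    import pandas as pd
    from scipy.stats import ttest_rel
    from statsmodels.stats.contingency_tables import mcnemar

    # Hypothetical file of matched baseline/follow-up records, one row per student.
    # Assumed columns: continuous attitude scale scores ('attitude_pre', 'attitude_post')
    # and a binary yes/no item measured at both time points
    # ('stem_interest_pre', 'stem_interest_post').
    matched = pd.read_csv("matched_youth_surveys.csv")

    # Paired t-test for a continuous outcome measured at baseline and follow-up.
    t_stat, p_value = ttest_rel(matched["attitude_post"], matched["attitude_pre"])
    print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")

    # McNemar test for a binary outcome measured at baseline and follow-up.
    table = pd.crosstab(matched["stem_interest_pre"], matched["stem_interest_post"])
    result = mcnemar(table, exact=True)
    print(f"McNemar test: statistic = {result.statistic:.0f}, p = {result.pvalue:.3f}")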


There will not be any use of periodic data collection cycles to reduce burden since a one-time administration of all forms is anticipated.




B.3 DESCRIBE METHODS TO MAXIMIZE RESPONSE RATES AND TO DEAL WITH ISSUES OF NON-RESPONSE.


As reported in B.1, response rates for these student surveys have been low during past administrations. Multiple factors have prevented the return of the materials, including the brief amount of time available for planning between the award announcement and program implementation, the late delivery of data collection instruments to awardees and Centers, delayed access to SoI funding, and a lack of clarity and prescriptiveness regarding evaluation responsibilities and requirements.


The current evaluation design has been informed by lessons from previous SoI evaluation efforts. Importantly, changes were made beginning in FY2013 to address some of the hurdles to data collection encountered in previous efforts and reflected in lower-than-expected response rates. Some of the key strategies now in place to address low response rates are as follows:



  1. consulting with participating sites about locally appropriate data collection strategies prior to finalizing the data collection plan, including providing paper or online versions of surveys if identified as appropriate by the site;

  2. providing adequate time between activity start-up and administration of surveys;

  3. making participation in evaluation activities a requirement for sites to participate in the STEM Challenge activities;

  4. revising activities and timeline for PRA package and OMB approval so that approval is received prior to when sites begin recruitment (in progress);

  5. identifying an evaluation point of contact at each site;

  6. assigning a designated external evaluation team member/help desk point of contact for each site;

  7. conducting a webinar for site evaluation points of contact when the evaluation materials are distributed to review data collection processes, reiterate participation requirements regarding the evaluation, and emphasize the importance of collecting the baseline survey before the start of STEM Challenge programming;

  8. developing an evaluation manual/guidebook that outlines the study and data collection responsibilities and processes for sites;

  9. providing self-addressed stamped envelopes for the return of completed paper survey forms to the external evaluator's office for processing;

  10. tracking the completion of online surveys and the return of paper surveys by site in order to conduct appropriate follow-up to encourage survey completion;

  11. actively collaborating with the site evaluation point of contact leading up to and during the administration and return of student baseline and follow-up surveys; and

  12. surveying all students participating in a STEM Challenge at a site, instead of selecting a sample.



Non-Response Bias


With the strategies outlined above to maximize response rates, NASA expects to achieve a response rate of 80% or higher for the baseline surveys. It is difficult to estimate the response rate to the youth follow-up survey, since the last administration of this survey, in FY2014, was mailed to youth approximately six months following the SoI experience. The response rate for the FY2014 follow-up administration was 49%. Similarly, the FY2012 administration of the mail follow-up survey of SoI youth yielded a 52% response rate. Since the follow-up survey will now be administered on-site immediately following the activity, we anticipate a stronger response rate than in the FY2014 and FY2012 administrations but have estimated conservatively at 52%.


Given this projected response rate, non-response may pose a problem in our analyses if it introduces bias into our population estimates. Bias occurs if the youth who refuse to participate or who leave the study have different characteristics, or would have given systematically different responses to the survey had they responded to it, than the youth who complete the surveys. Poor response rates do not guarantee a biased estimate, as the decision not to participate or to leave the study could be completely unrelated to survey answers. The external evaluator will conduct a non-response bias analysis on any administration for which the response rate drops below 80%.


Student Non-Response


In FY2014, NASA expanded its plan for addressing potential non-response bias. While poor response rates alone do not guarantee a biased estimate, as the decision not to participate or to leave the study could be completely unrelated to survey answers, NASA will examine bias in the estimates due to non-response to either youth survey by following the two steps described below.

1. Examination of Response Rates. The first step will be to monitor the overall response rate and the response rates by site and state. High response rates (over 85 percent) for the entire sample might indicate no need for further analysis of bias due to non-response. Large differences in the response rates by site or state serve as indicators that potential biases may exist. For example, if the response rate from one state is very low, then any difference in the characteristic of interest between this state and other states would result in a bias in the estimates. From the survey results, we will examine whether there are differences in characteristics within states, especially in a state where the response rate is low.


In order to conduct this comparison, the external evaluator will compare returned surveys to site registration lists provided by the sites. Sharing of registration lists will be a condition of sites’ participation in the evaluation.
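
A minimal sketch of this monitoring step is provided below for illustration; the registration-list and returned-survey file structures are assumptions for this example, not a specification of the evaluator's procedures.

    import pandas as pd

    # Hypothetical inputs: a registration list with one row per registered student
    # (columns: student_id, site, state) and a file listing the student_id of each
    # returned baseline survey.
    registration = pd.read_csv("site_registration_lists.csv")
    returned = pd.read_csv("returned_baseline_surveys.csv")

    registration["responded"] = registration["student_id"].isin(returned["student_id"])

    # Overall response rate and response rates by state and by site.
    print(f"Overall response rate: {registration['responded'].mean():.1%}")
    print(registration.groupby("state")["responded"].mean().sort_values())
    print(registration.groupby("site")["responded"].mean().sort_values())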


2. Non-Response Propensity Model. Should the response rate fall below 85 percent, we will construct a propensity model to estimate the probability of a student responding to the survey, for both responding and non-responding students; this probability is called a propensity score. The estimated propensity scores come from a logistic regression model. The model will be based on variables that are available for both non-responding and responding students. Students will be grouped using the estimated propensity scores, and within each group we will compare the frame characteristics of responding and non-responding students (e.g., grade level). In addition to assessing bias, this grouping will also provide a method of forming weighting classes for adjusting the weights of responding students to reduce bias due to non-response.
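
For illustration only, the propensity model and grouping described above could be implemented along the lines of the sketch below; the frame file, its variable names, and the use of the scikit-learn library are assumptions for this example.

    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    # Hypothetical frame file: one row per registered student, with variables known
    # for responders and non-responders alike (e.g., grade, gender, state) and a
    # 0/1 indicator of whether the student responded to the survey.
    frame = pd.read_csv("student_frame.csv")

    X = pd.get_dummies(frame[["grade", "gender", "state"]], drop_first=True)
    y = frame["responded"]

    # Estimate each student's response propensity with a logistic regression model.
    model = LogisticRegression(max_iter=1000).fit(X, y)
    frame["propensity"] = model.predict_proba(X)[:, 1]

    # Group students into quintiles of estimated propensity and compare the frame
    # characteristics of responding and non-responding students within each group.
    frame["propensity_quintile"] = pd.qcut(frame["propensity"], q=5, labels=False)
    comparison = frame.groupby(["propensity_quintile", "responded"])["grade"].agg(["mean", "count"])
    print(comparison)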



To statistically adjust for student non-response, the external evaluator will alter the models to include weights that compensate for the missing data from non-responders. These weights will be derived from estimates of propensity scores, defined here as the probability of being a complete case (i.e., a responder to both survey waves) given a responder's demographic characteristics. The estimates will be derived from a logistic regression predicting whether or not a student responds to both the first and the second survey based on his or her demographic variable values. For students who responded to the first but not the second survey wave (i.e., partial responders), estimated probabilities can be obtained from the logistic regression, and multiplying these estimated probabilities by one minus the proportion of non-responders gives estimates of the propensity scores. Weights derived from these propensity score estimates can be used to prevent biased data analyses if (i) data from non-responders are missing "completely at random," (ii) non-response on a single survey only is missing "at random" with respect to the demographic variables, and (iii) the logistic regression model is correct. While, in practice, it is unlikely that these assumptions strictly hold, if the non-response rate is relatively low, then they are sufficiently plausible that weights based on them will have some value in limiting bias due to non-response.


Under these assumptions, weights equal to the reciprocals of the estimated propensity scores can be used in complete case data analyses to produce approximately unbiased results; e.g., performing weighted t-tests on continuous outcomes. However, the presence of observations with large weights (i.e., reciprocals of very small propensity scores) may result in estimates with high variability. It is therefore often useful to “trade off” some bias for a lessening of variance by developing weighting classes based on the estimated propensity scores of complete cases and of students who only responded to one survey. All of these students are sorted by their estimated propensity scores, and the sorted list is partitioned into quintiles. Each quintile constitutes a weighting class, and all students in a weighting class are assigned the same weight, namely, the reciprocal of the proportion of complete cases in the weighting class.
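
Continuing the illustration, the weighting-class weights and a weighted comparison could be formed as in the sketch below; the analysis file, its columns, and the use of the statsmodels library are assumptions for this worked example rather than the evaluator's specified implementation.

    import pandas as pd
    from statsmodels.stats.weightstats import DescrStatsW

    # Hypothetical analysis file: one row per registered student with an estimated
    # propensity score, a 0/1 'complete_case' flag (responded to both survey waves),
    # and, for complete cases, baseline and follow-up attitude scale scores.
    frame = pd.read_csv("student_frame_with_propensities.csv")

    # Form weighting classes from quintiles of the estimated propensity scores.
    frame["quintile"] = pd.qcut(frame["propensity"], q=5, labels=False)

    # Weight for every student in a quintile: the reciprocal of the proportion of
    # complete cases in that weighting class.
    complete_share = frame.groupby("quintile")["complete_case"].mean()
    frame["weight"] = frame["quintile"].map(1.0 / complete_share)

    # Weighted test that the mean change in a continuous outcome is zero, using
    # complete cases only.
    complete = frame[frame["complete_case"] == 1]
    change = complete["attitude_post"] - complete["attitude_pre"]
    weighted = DescrStatsW(change, weights=complete["weight"])
    t_stat, p_value, dof = weighted.ttest_mean(0)
    print(f"Weighted mean change = {weighted.mean:.2f}, t = {t_stat:.2f}, p = {p_value:.3f}")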


B.4 DESCRIBE ANY TESTS OF PROCEDURES OR METHODS TO BE UNDERTAKEN.


Survey development and procedures were tested and refined as follows. The 2010 pilot surveys were fielded in summer 2010, revised in fall 2010, and updated in winter 2010 to measure outcomes of interest in FY2011. The survey instruments, including the STEM attitudinal scale, were modified and administered in FY2013 and FY2015. Because NASA has administered the modified instruments in two prior program years, no further field testing was conducted.


For the survey instruments included in this clearance package, existing question items or validated item scales were selected after an extensive literature review and consultation with experts. Given that the youth surveys were developed using validated question items and scales that have been utilized in other national studies, no cognitive testing was completed. The youth surveys were tested in FY2013 with 6 middle school students to assess any adapted question items for comprehensibility and to estimate time for completion. Estimated times for completion are based on these tests.



B.5 PROVIDE THE NAME AND TELEPHONE NUMBER OF INDIVIDUALS CONSULTED ON STATISTICAL ASPECTS OF THE DESIGN AND THE NAME OF THE AGENCY UNIT, CONTRACTOR(S), GRANTEE(S), OR OTHER PERSON(S) WHO WILL ACTUALLY COLLECT AND/OR ANALYZE THE INFORMATION FOR THE AGENCY.


The plans for statistical analyses for this study were primarily developed by NASA staff. Paragon TEC with the Pacific Institute for Research and Evaluation provided information for the sub-sections on response rates and the non-response bias analysis.


Patricia Moore Shaffer, Ph.D., NASA Office of Education, 202-358-5230

William Scarbrough, Ph.D., Pacific Institute for Research and Evaluation, 502-238-7326


The Office of Education Infrastructure Services (OEIS) was responsible for developing this clearance package. OEIS will provide oversight of the evaluation study. The following individual is the primary statistical lead for this collection:


Lisa Wills, Ph.D., Valador, Inc., Support Contractor to the NASA Office of Education, 202-358-1474




REFERENCES


Bulunuz, M., & Jarrett, O. (2009). Developing an interest in science: Background experiences of preservice elementary teachers. International Journal of Environmental and Science Education, 5(1), 65-84.


Cabrera, A. F., & La Nasa, S. M. (2000). Understanding the college-choice process. New Directions for Institutional Research, 5-22.


Choy, S. P. (2002). Access & persistence: Findings from 10 years of longitudinal research on students. Washington, DC: American Council on Education, Center for Policy Analysis.


Cohen, J. (1969). Statistical power analysis for the behavioral sciences. Hillsdale, NJ: Lawrence Erlbaum Associates.

Dillman, D. A. (2000). Mail and Internet surveys: The tailored design method (2nd ed.). New York: John Wiley.


Ferry, N. (2006). Factors influencing career choices of adolescents and young adults in rural Pennsylvania. Journal of Extension 44, 3 (June 2006). Retrieved January 3, 2013, from: http://www.joe.org/joe/2006june/rb7.php.


Hossler, D., Schmidt, J. & Vesper, N. (1999). Going to college: How social, economic, and educational factors influence the decisions students make. Baltimore: The Johns Hopkins University Press.


Jarrett, O.S., & Burnley, P.C. (2007). The role of fun, playfulness, and creativity in science: Lessons from geoscientists. In D. Sluss & O. Jarrett (Eds.) Investigating play in the 21st Century: Play and Culture Studies, Vol. 7 (pp. 188-202). Lanham, MD: University Press of America.


Keegan, R. T. (1989). How Charles Darwin became a psychologist. In D. B. Wallace & H. E. Gruber (Eds.), Creative people at work: Twelve cognitive case studies (pp. 107-126). New York: Oxford University Press.


L’Engle, K.L., Pardun, C., & Brown, J. (2004). Accessing adolescents: A school-recruited, home-based approach to conducting media and health research. Journal of Early Adolescence, 24, 2, 144-158. Retrieved January 2, 2013, from: http://www.unc.edu/depts/jomc/teenmedia/pdf/Accessing.pdf.


McDonough, P. M. (1997). Choosing colleges: How social class and schools structure opportunity. Albany: State University of New York Press.


Rowsey, R.E. (1997). The effects of teachers and schooling on the vocational choice of university research scientists. School Science and Mathematics, 97(1), 20-27.


Shepard, R. (1988). The imagination of the scientist. In K. Egan & D. Nadaner (Eds.), Imagination across the curriculum (pp.153-185). New York: Teachers College Press.


Swail, W. S. & Hosford, S. (2007). Missouri students and the pathway to college. Virginia Beach, VA: Educational Policy Institute.


Trice, A. D., Hughes, M. A., Odom, C., Woods, K., & McClellan, N. C. (1995). The origins of children's career aspirations: IV. Testing hypotheses from four theories. The Career Development Quarterly, 43(4), 307-322.


Tripney, J., Newman, M., Bangpan, M., Niza, C., MacKintosh, M., & Sinclair, J. (2010). Factors influencing young people (aged 14-19) in education about STEM subject choices: A systematic review of the UK literature. London: EPPI-Centre, University of London.

1 The STEM Challenges pilot with NASA is part of the Department of Education’s multi-year initiative to expand high-quality STEM programming in 21CCLC. This initiative created a technical assistance working group of researchers, evaluators, practitioners, and other Federal agencies to support the development of a strategy and series of tools that would assist both state education agencies and sub-grantee sites in the implementation of high-quality STEM efforts. Through this effort ED developed a support strategy to collaborate with other federal agencies to achieve this goal.



2 The FY2014 Summer of Innovation study had comparable response rates, with an 81% response rate for baseline surveys and a 49% response rate for follow-up surveys administered by mail approximately three months following the conclusion of the STEM activity.

3 Although there is a sizeable literature on conducting mail surveys of adults (see, for example, Dillman, 2000), few previous studies have attempted to gather data from adolescents through the mail (L'Engle, Pardun & Brown, 2004). In one of the few studies conducted on this topic, the L'Engle, Pardun and Brown study (2004), the initial mailed survey generated a response rate of 40%. Additional contact similar to what is proposed for this activity raised the final response rate to 65% (i.e., 35% attrition).

4 Evaluation points of contact are site staff designated with responsibility for coordinating the NASA data collection requirements and administering the student surveys.


