
2700-0150

SUPPORTING STATEMENT

FOR OMB CLEARANCE

PART A



NASA STEM Challenges


EVALUATION DATA COLLECTION





National Aeronautics and Space Administration



July 14, 2015











TYPE OF INFORMATION COLLECTION:

Revision of a currently approved information collection

Part A: Justification

The National Aeronautics and Space Administration (NASA) Office of Education requests that the Office of Management and Budget (OMB) approve, under the Paperwork Reduction Act of 1995, a clearance for NASA to collect youth survey data as part of an outcome evaluation study of NASA STEM Challenge activities, an outgrowth of the Summer of Innovation pilot that concluded in FY2014.


A.1 EXPLAIN THE CIRCUMSTANCES THAT MAKE THE COLLECTION OF INFORMATION NECESSARY.

Background

The NASA STEM Challenges activity is the result of previous OMB guidance to redesign the Summer of Innovation (SoI) pilot as a sustainable model for STEM engagement across the Federal STEM agencies and to offer SoI as a model through the work of the Committee on STEM (CoSTEM). NASA applied its previous design work and evaluation findings to the design of a STEM Challenges pilot collaboration with the U.S. Department of Education (ED) in 2013-14.1 This pilot paired the extensive reach and infrastructure of ED with NASA’s experience, gained through Summer of Innovation, in training community partners to deliver STEM engagement activities, as well as NASA’s access to world-class subject matter experts and content, in support of the shared CoSTEM goal of increasing and supporting the public engagement of youth (National Science and Technology Council, 2013). During the 2013-14 school year, NASA collaborated with three states to provide dynamic and engaging STEM Design Challenges to students in 21st Century Community Learning Center (21CCLC) afterschool programs. During the 2014-15 school year, NASA and ED expanded the activity to more states and 21CCLC sites, reaching 10 states with high-quality NASA STEM Design Challenges.


The STEM Challenges activity focuses on STEM Design Challenges for middle school students, designed by NASA to meet the content needs of out-of-school time sites (e.g., 21CCLC, 4-H). NASA facilitates the Challenges through a blended professional development strategy that supports instructional staff in their implementation, including a minimum of one in-person training session in each participating state. NASA also provides regular opportunities for 21CCLC sites and students to engage with NASA scientists and engineers through a range of technology-based experiences (e.g., Skype) during a minimum of 20 hours of implementation across an 8-week cycle scheduled during the school year. Following the success of the 21CCLC pilot, NASA continues to offer STEM Design Challenges in collaboration with the Department of Education and to seek other partnerships through which this activity could be offered.


This clearance package modifies the SoI evaluation activities previously approved under OMB control number 2700-0150 to align with the new circumstances of this information collection. This request includes the following instruments that collect standardized data from 10 or more respondents:


  • Baseline youth survey (Appendix 1; item-by-item justification provided in Appendix 2)

  • Follow-up youth survey (Appendix 3; item-by-item justification provided in Appendix 4)


The data to be collected are not available from any other source. The youth instruments will be used to gather data prior to and following the STEM Challenge activities in order to assess change in the key short-term outcome of youth attitudes toward STEM. Information about implementation will be gathered from numerous sources, including review of student work products and activity observations; the implementation data collection is not subject to the Paperwork Reduction Act.


A.2 INDICATE HOW, BY WHOM, AND FOR WHAT PURPOSE THE INFORMATION IS TO BE USED.

How Information Will Be Collected

Data will be collected through surveys; see Exhibit 1 for an overview of the data collection associated with the STEM Challenges summative evaluation, including a crosswalk between the research questions and the data collection strategies. Youth outcome data will be collected through survey instruments. Data collected through the OMB-cleared forms will be complemented by information collected through other strategies (e.g., activity observations, analysis of student work products) that are not required to be cleared through the regular Paperwork Reduction Act clearance process due to the small number of participants or the nature of the information collection.


Consent Process

Site coordinators selected for participation in this evaluation study will be asked to administer a parent consent notice that describes the evaluation components (a youth baseline survey and a youth follow-up survey) and asks parents to give consent for their children’s inclusion in the evaluation (see Appendix 5). NASA intends to use a passive parental consent notice,2 which will include information about the study, time required, risks, benefits, confidentiality, and voluntary participation, as well as contact information for further questions. Parents will be informed that if they do not wish for their child(ren) to participate in the evaluation, they can opt them out by calling or emailing the designated evaluation staff person. Parents will be assured that students will be neither punished nor rewarded on the basis of parental consent. Students without consent for the evaluation will still be allowed to participate in the activity.


As recommended for the protection of human subjects,3 a statement that participation in individual surveys is voluntary is included on each survey form. That is, survey participation remains voluntary and will not affect student participation in a STEM Challenge. Students can choose to skip any questions that make them uncomfortable. The consent strategy also will be reviewed internally as part of the IRB clearance process.


Youth Surveys

The youths in grades 5-8 taking part in STEM Challenges will be asked to complete baseline and follow-up surveys (see Appendices 1 and 3). The baseline survey form provides information regarding the purpose of the data collection and collects data on motivation for signing up for a STEM Challenge, personal interest in STEM, and participation in STEM activities. The follow-up survey repeats questions on interest and participation in STEM to track differences in these areas and also asks questions that require youth to recall the STEM Challenge and its impact on them. Crosswalks that describe how the survey items link to the research questions, their purpose, and their sources are included in Appendices 2 and 4.


The baseline youth survey form will be made available in paper format due to anticipated constraints on access to technology at sites. For example, NASA found that some 21CCLC sites did not have adequate technology to survey students using an online form; paper survey forms were provided to sites upon request. The survey will be administered to all participating youth by the site coordinators after they have signed up for a STEM Challenge but prior to participation in any Challenge-related activities. Paper survey forms will be returned to the external evaluator using a self-addressed stamped envelope for safekeeping and data entry.


Follow-up surveys will be administered by the site coordinators following the conclusion of the STEM Challenge, approximately 9 weeks after the administration of the baseline survey. Because student transience is expected throughout the school year, we anticipate some attrition as youth move out of their out-of-school time program or leave the STEM Challenge. For students who complete the STEM Challenge, we anticipate a high response rate, since they will remain in the same out-of-school time setting. Follow-up efforts with the site coordinators to increase response rates include up to three phone calls to encourage non-responsive sites to complete their surveys and the mailing of additional sets of surveys if paper survey forms were used.


While dates may vary in ensuing years, we anticipate that baseline survey administration will begin in early February 2016, with follow-up survey administration beginning in early May 2016. The broad survey administration period is required because, historically, sites have initiated the activity at different times. Data collection procedures are also discussed in Part B.


Who Will Collect the Information


In early communications with selected sites, NASA will provide an overview of the evaluation study and articulate the role of site coordinators/instructors in supporting the parent consent and survey administration process. An evaluation point of contact (POC) will be requested for each site. This individual will be responsible for ensuring that parent consent notices and surveys are administered by instructors, collected, and submitted to the external evaluator. The site POCs will be accountable to NASA for ensuring that all data collection occurs in a timely manner. The NASA Headquarters Evaluation Manager is responsible for providing oversight of the evaluation, while the evaluation contractor will coordinate evaluation activities.


Prior to administration, mandatory webinar trainings will be provided to the evaluation points of contact to prepare them to collect parent consent and administer the surveys, ensuring that data are collected consistently across sites. Additional evaluation guidance will be provided in the form of a comprehensive guide to the evaluation activities, available in electronic format.


For What Purpose


The purpose of this data collection effort is to support the evaluation of the STEM Challenges activity. The goal of this evaluation is twofold: to collect information on implementation to inform NASA’s continued improvement of the program model, and to collect outcome data to assess the activity’s effectiveness.


Exhibit 1 below outlines the key summative research questions for the STEM Challenges evaluation, the data collection instruments and sources, and the analytical approach. As explained in more detail in Part B, anticipated annual participation includes 80 sites within 10 states participating in at least one of the NASA STEM Challenges. Data collections for which we are seeking PRA clearance are highlighted in bold in the Data Source/Instrument column.


Exhibit 1. STEM Challenge Summative Evaluation Questions

Summative Evaluation

1. Do students completing a NASA Challenge demonstrate increased positive attitudes toward STEM?

  • Data Source/Instrument: Pre- and post-Challenge student attitudes toward STEM (surveys)

  • Data Analysis Approach: Descriptive statistical analysis

2. To what extent is there an association between participation in the NASA Challenges and a change in students’ attitudes toward STEM?

  • Data Source/Instrument: Pre- and post-Challenge student attitudes toward STEM (surveys)

  • Data Analysis Approach: Descriptive statistical analysis, including multi-level models such as HLM

3. Do students describe a quality engineering design process in their final product (i.e., video)?

  • Data Source/Instrument: Observation of the final student products using a scoring rubric; rating of engineering design process and skills

  • Data Analysis Approach: Descriptive statistical analysis, including multi-level models such as HLM


Summative research questions 1-3 will be answered in part using the baseline and follow-up surveys. Analysis of survey data will allow the external evaluator to explore changes in youth interest and participation in STEM. In the survey, NASA focuses primarily on youths’ interest in science. While science will be addressed by all STEM Challenges, technology and engineering are also addressed through the adoption of attitudinal scales in the youth surveys that incorporate statements about engineering and technology. Example attitudinal statements addressing engineering and technology are as follows:

  • I like to take things apart to learn more about them.

  • I like to be part of a team that designs and builds a hands-on project.

  • I like to design a solution to a problem.

  • I’m curious to learn how to program a computer game.

  • I like to design and build something mechanical that works.


As mentioned earlier, while measuring outcomes at multiple points in time can provide evidence of whether the outcomes of interest change, it does not allow us to rule out the possibility that something other than the program is driving that change. However, it will support investigation into associations between implementation and the outcomes of interest to inform future program strategy, as well as inform a future decision about whether to invest in a more rigorous impact evaluation.


A.3 DESCRIBE WHETHER, AND TO WHAT EXTENT, THE COLLECTION OF INFORMATION INVOLVES THE USE OF AUTOMATED, ELECTRONIC, MECHANICAL, OR OTHER TECHNOLOGICAL COLLECTION TECHNIQUES OR OTHER FORMS OF INFORMATION TECHNOLOGY, E.G. PERMITTING ELECTRONIC SUBMISSION OF RESPONSES, AND THE BASIS FOR THE DECISION FOR ADOPTING THIS MEANS OF COLLECTION. ALSO DESCRIBE ANY CONSIDERATION OF USING INFORMATION TECHNOLOGY TO REDUCE BURDEN.


The surveys will be administered using an online survey administration tool as well as on paper. Online administration will be used when students have adequate access to an onsite computer lab for survey-taking. Paper administration will be conducted at the request of host sites; this request is typically made when access to a computer lab is difficult. The evaluator will provide self-addressed stamped envelopes for the return of completed paper survey forms to its office for processing. The evaluator will track the completion of online surveys and the return of paper surveys by site in order to conduct appropriate follow-up to ensure a strong response rate.


A.4 DESCRIBE EFFORTS TO IDENTIFY DUPLICATION.


This effort will yield data to assess STEM Challenge implementation and measures of participant outcomes; as such, there is no similar evaluation being conducted and there is no alternative source for collecting the information. NASA has identified a contractor who will be responsible for coordinating the requests for information from the STEM Challenge evaluation team to ensure that duplicative questions are not asked.


A.5 IF THE COLLECTION OF INFORMATION IMPACTS SMALL BUSINESSES OR OTHER SMALL ENTITIES (ITEM 5 OF THE OMB FORM 83-1), DESCRIBE THE METHODS USED TO MINIMIZE BURDEN.


No small businesses will be involved as respondents. The primary respondents for the data collection efforts described in this package are youths at 21CCLC sites; educators employed by those sites will administer the surveys. Burden is minimized for all respondents by requesting only the minimum information needed to meet study objectives.

A.6 DESCRIBE THE CONSEQUENCE TO FEDERAL PROGRAM OR POLICY ACTIVITIES IF THE COLLECTION IS NOT CONDUCTED OR IS CONDUCTED LESS FREQUENTLY, AS WELL AS ANY TECHNICAL OR LEGAL OBSTACLES TO REDUCING BURDEN.


Each form is used once annually in this evaluation; therefore, frequency of use of individual forms is not an issue. None of these forms has been used in its present state with this target population in 21CCLC sites.


If the proposed youth survey data were not collected, NASA would not fulfill its objectives in investigating youth outcomes that may be associated with participation in STEM Challenges. Without these survey data, Federal resources would be allocated and program decisions made in the absence of information about the actual outcomes achieved in 21CCLC sites.

A.7 EXPLAIN ANY SPECIAL CIRCUMSTANCES THAT WOULD CAUSE AN INFORMATION COLLECTION TO BE CONDUCTED IN A MANNER:


- REQUIRING RESPONDENTS TO REPORT INFORMATION TO THE AGENCY MORE OFTEN THAN QUARTERLY;

- REQUIRING RESPONDENTS TO PREPARE A WRITTEN RESPONSE TO A COLLECTION OF INFORMATION IN FEWER THAN 30 DAYS AFTER RECEIPT OF IT;

- REQUIRING RESPONDENTS TO SUBMIT MORE THAN AN ORIGINAL AND TWO COPIES OF ANY DOCUMENT;

- REQUIRING RESPONDENTS TO RETAIN RECORDS, OTHER THAN HEALTH, MEDICAL, GOVERNMENT CONTRACT, GRANT-IN-AID, OR TAX RECORDS, FOR MORE THAN 3 YEARS;

- IN CONNECTION WITH A STATISTICAL SURVEY, THAT IS NOT DESIGNED TO PRODUCE VALID AND RELIABLE RESULTS THAT CAN BE GENERALIZED TO THE UNIVERSE OF STUDY;

- REQUIRING THE USE OF A STATISTICAL DATA CLASSIFICATION THAT HAS NOT BEEN REVIEWED AND APPROVED BY OMB;

- THAT INCLUDES A PLEDGE OF CONFIDENTIALITY THAT IS NOT SUPPORTED BY AUTHORITY ESTABLISHED IN STATUTE OR REGULATION, THAT IS NOT SUPPORTED BY DISCLOSURE AND DATA SECURITY POLICIES THAT ARE CONSISTENT WITH THE PLEDGE, OR WHICH UNNECESSARILY IMPEDES SHARING OF DATA WITH OTHER AGENCIES FOR COMPATIBLE CONFIDENTIAL USE; OR

- REQUIRING RESPONDENTS TO SUBMIT PROPRIETARY, TRADE SECRET, OR OTHER CONFIDENTIAL INFORMATION UNLESS THE AGENCY CAN DEMONSTRATE THAT IT HAS INSTITUTED PROCEDURES TO PROTECT THE INFORMATION'S CONFIDENTIALITY TO THE EXTENT PERMITTED BY LAW.



There are no special circumstances associated with this data collection.


A.8 IF APPLICABLE, PROVIDE A COPY AND IDENTIFY THE DATE AND PAGE NUMBER OF PUBLICATION IN THE FEDERAL REGISTER OF THE AGENCY'S NOTICE, REQUIRED BY 5 CFR 1320.8(d), SOLICITING COMMENTS ON THE INFORMATION COLLECTION PRIOR TO SUBMISSION TO OMB.


In accordance with the Paperwork Reduction Act of 1995, NASA published the 60-day notice associated with this information collection in the Federal Register on January 20, 2015 (Vol. 80, No. 12, pp. 2745-2746). No comments were received.


DESCRIBE EFFORTS TO CONSULT WITH PERSONS OUTSIDE THE AGENCY TO OBTAIN THEIR VIEWS ON THE AVAILABILITY OF DATA, FREQUENCY OF COLLECTION, THE CLARITY OF INSTRUCTIONS AND RECORDKEEPING, DISCLOSURE, OR REPORTING FORMAT (IF ANY), AND ON THE DATA ELEMENTS TO BE RECORDED, DISCLOSED, OR REPORTED.



The youth surveys and guidance on their administration were developed by NASA staff in consultation with several external experts, including Laura LoGerfo, the former Project Officer for the High School Longitudinal Study of 2009 at the U.S. Department of Education National Center for Education Statistics; Gil Noam, Founder and Director of the Program in Education, Afterschool & Resiliency (PEAR), Harvard University; and Sara Spiegel, Director of Administration at the Noyce Foundation. Several experts also advised on the evaluation design, including Henry Frierson, University of Florida; Anita Krishnamurthi, Afterschool Alliance; Carol Stoel, National Science Foundation; Robert Tai, University of Virginia; and Diego Zapata-Rivera, Educational Testing Service. Copies of the instruments were also distributed to representatives of Summer of Innovation awardees, who were responsible for administration of the baseline youth survey. The awardees provided feedback on individual question items and survey administration.


The surveys are based on the theory of change depicted in the SoI logic model and the agency’s STEM Engagement logic model and informed by the evaluators’ knowledge of the programmatic activities. Survey question items were selected and/or adapted from previously field-tested and validated instruments, eliminating the need for cognitive testing. The source instruments for the survey question items are as follows:


  • Student Baseline Survey and Parent Baseline Survey, High School Longitudinal Study (HSLS) of 2009, IES/Department of Education

  • Assessing Women and Men In Engineering (AWE), Middle School Students Pre-Activity Surveys and Immediate Post-Activity Surveys for Middle School-Aged Participants – Science and Engineering (2009)

  • 4-H Science Youth Survey (2012)

  • Summer of Innovation Parent Survey and Baseline Student Survey (2011)

  • Excited, Engaged and Interested Science Learner Survey (2011), Noyce Foundation


An entire scale of question items on youth interest in science (“Enthusiasm for Science”) was adopted from the Excited, Engaged and Interested Science Learner Survey developed by the Program in Education, Afterschool & Resiliency (PEAR) for the Noyce Foundation and recently validated with a middle school audience as part of the national Youth Engagement, Attitudes, and Knowledge study of the 4-H Science Initiative. This scale incorporates question items from NAEP - Science (2005, 2009), allowing comparison of SoI survey data to nationally representative NAEP results available from the Department of Education. An adapted version of this scale was released as the Common Instrument in 2013 following the release of a validation study by Harvard University.


The use of validated question items and scales enabled NASA to reduce cost by forgoing cognitive testing of the instruments. However, the survey instruments were tested for comprehensibility with six youths in grades 5 through 8; burden estimates were also obtained through this process. As a result of the testing, minor modifications were made to question items that were not part of a scale.


A.9 EXPLAIN ANY DECISION TO PROVIDE ANY PAYMENT OR GIFT TO RESPONDENTS, OTHER THAN REMUNERATION OF CONTRACTORS OR GRANTEES.


No payment or gifts will be provided to respondents.


A.10 DESCRIBE ANY ASSURANCE OF CONFIDENTIALITY PROVIDED TO RESPONDENTS AND THE BASIS FOR THE ASSURANCE IN STATUTE, REGULATION, OR AGENCY POLICY.


Every effort will be made to maintain the privacy of respondents to the extent provided by law, including the use of several procedural and control measures to protect the data from unauthorized use. Collected data will not be released with personally identifiable information, and results will be presented only in aggregated form. A statement to this effect will be included on all instruments. Respondents will be assured that all information identifying them will be kept private.


The procedures to protect data during information collection, data processing, and analysis activities are as follows:


  • All respondents included in the study sample will be informed that the information they provide will be used only for the purpose of this research. Individuals will not be cited as sources of information in prepared reports.

  • Hard-copy data collection forms will be delivered to a locked area at the external evaluator’s office for receipt and processing. The contractor will maintain restricted access to all data preparation areas (i.e., receipt, coding, and data entry). All data files on multi-user systems will be under the control of a database manager, with access limited to project staff on a “need-to-know” basis only.

  • The external evaluator’s security measures comply with NASA's privacy and security requirements.

  • Individual identifying information will be maintained separately from completed data collection forms and from computerized data files used for analysis.


A.11 PROVIDE ADDITIONAL JUSTIFICATION FOR ANY QUESTIONS OF A SENSITIVE NATURE, SUCH AS SEXUAL BEHAVIOR AND ATTITUDES, RELIGIOUS BELIEFS, AND OTHER MATTERS THAT ARE COMMONLY CONSIDERED PRIVATE.


Questions of a sensitive nature are not asked on these survey instruments.


A.12 PROVIDE ESTIMATES OF THE HOUR BURDEN OF THE COLLECTION OF INFORMATION.


Exhibit 2 presents estimates of the reporting burden for the youth surveys. NASA estimates that the annualized response burden for the entire evaluation is 162 hours. The estimate of the number of respondents is based on an estimate of the universe of STEM Challenge participants across the participating ten states. This estimate is based on actual Challenge participation numbers in FY2014.


For the youth surveys, this estimate assumes that it will take youths about 6 minutes to read each survey’s introduction and answer the questions. Estimates for the youth burden are based on timed administration of the survey instruments to six youth within the targeted grade range.


PROVIDE ESTIMATES OF ANNUALIZED COST TO RESPONDENTS FOR THE HOUR BURDENS FOR COLLECTIONS OF INFORMATION, IDENTIFYING AND USING APPROPRIATE WAGE RATE CATEGORIES.


An estimate of respondents’ time to complete the surveys is provided in Exhibit 2. We estimate that the annualized cost burden for youths’ time to complete the baseline and follow-up surveys is $1,174.50. The cost burden associated with the surveys was estimated using the federal minimum wage of $7.25 per hour.
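
For clarity, the burden and cost figures reported in Exhibit 2 follow directly from the estimates above:

810 respondents × 2 responses × 6 minutes per response = 9,720 minutes = 162 hours
162 hours × $7.25 per hour = $1,174.50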


Exhibit 2. Estimates of Annualized Burden Hours and Cost for Data Collection; Data Collection Format

Category of Respondent | Number of Respondents | Frequency of Response | Minutes per Response | Total Response Burden in Hours | Estimated Cost per Hour | Total Cost Burden

Individuals (Pre-Post Student Surveys) | 810 a | 2 | 6 | 162 | $7.25 b | $1,174.50

Total Annual Burden | 810 | | | 162 | | $1,174.50

a Number of respondents based on estimated total universe.

b Estimated cost per hour for youths is calculated based on the federal minimum wage of $7.25 per hour, effective July 24, 2009.

Data Collection Format

Information Collection Instrument Title (and form number when applicable) | A1 Youth Survey (Baseline) | A3 Youth Survey (Follow-up)

Instrument format (paper or electronic) | Electronic; paper upon request | Electronic; paper upon request

If electronic, is the instrument fillable? | Yes | Yes

If electronic, is the instrument savable electronically? | No | No

Can the instrument be filed electronically? | Yes | Yes


A.13 PROVIDE AN ESTIMATE OF THE TOTAL ANNUAL COST BURDEN TO RESPONDENTS OR RECORDKEEPERS RESULTING FROM THE COLLECTION OF INFORMATION.

Other than youths’ time to complete the surveys, which is estimated in Exhibit 2, there are no direct monetary costs to respondents. That is, there are no capital or start-up costs, nor any operation, maintenance, or purchase-of-services costs.


A.14 PROVIDE ESTIMATES OF ANNUALIZED COST TO THE FEDERAL GOVERNMENT.


The total annualized cost of the STEM Challenges evaluation study is $200,000, estimated based on past contracts awarded by NASA. The government estimate for this contract award was developed using the actual costs of past contracts for this type of evaluation work and the actual labor rates charged by this contractor.


A.15 EXPLAIN THE REASON FOR ANY PROGRAM CHANGES OR ADJUSTMENTS REPORTED IN ITEMS 13 OR 14 OF THE OMB FORM 83-1.


This data collection significantly reduces the respondent burden by narrowing the focus of the evaluation to 5th through 8th grade youth participating in STEM Challenges and eliminating information collections from other respondent groups, including parents and teachers.


Exhibit 3. Program Change

REG. NO. | REASON | PREVIOUS BURDEN | NEW BURDEN | DIFFERENCE | TYPE OF CHANGE

N/A | Change in evaluation design; reduction in respondent types | 822.7 burden hours | 162 burden hours | 660.7 burden hours | Program change


A.16 FOR COLLECTIONS OF INFORMATION WHOSE RESULTS WILL BE PUBLISHED, OUTLINE PLANS FOR TABULATION, AND PUBLICATION.


The schedule shown in Exhibit 4 displays the sequence of activities required to conduct the information collection and includes key dates for data collection, analysis, and reporting. An evaluation report based on findings from the surveys and other evaluation data will be prepared; the outcome evaluation study report is anticipated by July 2016 and annually thereafter. Please note that this is a notional schedule that may be revised as programmatic details are confirmed.


Exhibit 4. FY 2016 Sample Evaluation Schedule

Activities and Deliverables | Responsible Party | Date

Kick-off meeting with evaluation contractor | NASA | October 2015

Preparation of evaluation plan by contractor | NASA, ED, evaluation contractor | November – December 2015

Recruitment of external evaluation experts | Evaluation contractor | November 2015 – January 2016

Finalization of evaluation plan | Evaluation contractor | December 2015 – January 2016

Orientation of sites; dissemination of evaluation guidance | Evaluation contractor | December 2015 – January 2016

Baseline youth survey collection | Evaluation contractor & site coordinators/instructors | February 2016

Follow-up youth survey collection | Evaluation contractor | May 2016

Data analysis of baseline/follow-up youth surveys, including non-response bias analysis | Evaluation contractor | May – June 2016

Evaluation report (deliverable) | Evaluation contractor | July 2016

Review of evaluation report by evaluation experts | NASA | August 2016

Program recommendations for NASA portfolio, based on outcome evaluation findings, by evaluation experts in collaboration with NASA staff | NASA | August 2016


Analysis of Survey Data

The analysis plan for the survey data is summarized below; it is discussed in fuller detail in Supporting Statement B.


Descriptive Cross-Sectional Analyses

Because the entire universe of youth will be surveyed rather than a sample, descriptive statistics for a single point in time do not need to be adjusted for sampling design. Means and standard deviations will be used to describe central tendency and variation for survey items using continuous scales. Frequency distributions and percentages will be used to summarize answers given on ordinal scales.
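
As an illustration only, a minimal sketch of these descriptive computations is shown below, assuming a hypothetical flat file of survey records; the file name and column names are placeholders, not part of the approved instruments.

```python
# Illustrative sketch only: descriptive statistics of the kind described
# above. The file name and column names are hypothetical placeholders.
import pandas as pd

# Hypothetical flat file: one row per completed baseline survey.
df = pd.read_csv("baseline_youth_survey.csv")

# Mean and standard deviation for a continuous scale score
# (e.g., a scale score for interest in science).
print(df["enthusiasm_score"].agg(["mean", "std"]))

# Frequency distribution and percentages for an ordinal item.
counts = df["science_interest"].value_counts().sort_index()
summary = pd.DataFrame({"n": counts, "percent": 100 * counts / counts.sum()})
print(summary)
```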


Descriptive Change Over Time Analyses

NASA will examine the youth survey data to provide simple descriptions of change in a variable over time. We will test whether the difference in proportions or means between two time points is zero using a McNemar test or a paired t-test, respectively, depending on the distribution of the outcome variables.
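
A minimal sketch of these change-over-time tests is shown below, assuming matched baseline/follow-up records; the file name and column names are hypothetical placeholders, not part of the approved instruments.

```python
# Illustrative sketch only: the paired t-test and McNemar test described
# above, run on matched baseline/follow-up records. File and column names
# are hypothetical placeholders.
import pandas as pd
from scipy import stats

# Hypothetical matched file: one row per youth with both survey waves.
df = pd.read_csv("matched_youth_surveys.csv")

# Paired t-test for change in a continuous scale score.
t_stat, p_val = stats.ttest_rel(df["enthusiasm_post"], df["enthusiasm_pre"])
print(f"Paired t-test: t = {t_stat:.3f}, p = {p_val:.4f}")

# McNemar test for change in a binary item (e.g., STEM activity: yes/no),
# using the continuity-corrected chi-square statistic on discordant pairs.
pre = df["stem_activity_pre"].astype(bool)
post = df["stem_activity_post"].astype(bool)
b = int((pre & ~post).sum())   # yes at baseline, no at follow-up
c = int((~pre & post).sum())   # no at baseline, yes at follow-up
chi2 = (abs(b - c) - 1) ** 2 / (b + c)  # assumes at least one discordant pair
p_mcnemar = stats.chi2.sf(chi2, df=1)
print(f"McNemar test: chi2 = {chi2:.3f}, p = {p_mcnemar:.4f}")
```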

A.17 IF SEEKING APPROVAL TO NOT DISPLAY THE EXPIRATION DATE FOR OMB APPROVAL OF THE INFORMATION COLLECTION, EXPLAIN THE REASONS THAT DISPLAY WOULD BE INAPPROPRIATE.


The agency will display the expiration date and burden estimate on the information collection instruments, within the PRA Statement.

A.18 EXPLAIN EACH EXCEPTION TO THE CERTIFICATION STATEMENT IDENTIFIED IN ITEM 19, "CERTIFICATION FOR PAPERWORK REDUCTION ACT SUBMISSIONS," OF OMB FORM 83-1.

The NASA office conducting or sponsoring this information collection certifies compliance with all provisions listed above.

The proposed collection of information –

(a) is necessary for the proper performance of the functions of NASA, including that the information to be collected will have practical utility;

(b) is not unnecessarily duplicative of information that is reasonably accessible to the agency;

(c) reduces to the extent practicable and appropriate the burden on persons who shall provide information to or for the agency, including with respect to small entities, as defined in the Regulatory Flexibility Act (5 U.S.C. 601(6)), the use of such techniques as:

(1) establishing differing compliance or reporting requirements or timelines that take into account the resources available to those who are to respond;

(2) the clarification, consolidation, or simplification of compliance and reporting requirements; or

(3) an exemption from coverage of the collection of information, or any part thereof;

(d) is written using plain, coherent, and unambiguous terminology and is understandable to those who are targeted to respond;

(e) indicates for each recordkeeping requirement the length of time persons are required to maintain the records specified;

(f) has been developed by an office that has planned and allocated resources for the efficient and effective management and use of the information to be collected, including the processing of the information in a manner which shall enhance, where appropriate, the utility of the information to agencies and the public;

(g) when applicable, uses effective and efficient statistical survey methodology appropriate to the purpose for which the information is to be collected; and

(h) to the maximum extent practicable, uses appropriate information technology to reduce burden and improve data quality, agency efficiency and responsiveness to the public; and

(i) will display the required PRA statement with the active OMB control number, as validated on www.reginfo.gov.




Name: Carolyn Knowles, NASA Education Infrastructure Division

References


Assessing Women and Men in Engineering. (Undated). STEM assessment tools. Retrieved October 20, 2012, from: http://www.engr.psu.edu/awe/.


Berry, S., Pevar, J., & Zander-Cotugno, M. (2008). Use of incentives in surveys supported by Federal grants: Paper presented at Council of Professional Associations on Federal Statistics seminar titled “Survey Respondent Incentives: Research and Practice.” March 10, 2008. Retrieved October 22, 2012, from: http://www.rand.org/content/dam/rand/pubs/working_papers/2008/RAND_WR590.pdf.

Church, A. (1993). Incentives in Mail Surveys: A Meta-Analysis. Public Opinion Quarterly 57(1):62-79.

Conrad, F.G., Couper, M.P., Tourangeau, R., and Peytchev, A. (2006). Use and non-use of clarification features in web surveys. Journal of Official Statistics 22, 245-269.


Couper, M. P. (2008). Designing Effective Web Surveys. Cambridge, England: Cambridge University Press.


Dillman, D. A., Smyth, J.D., & Christian, L.M. (2009). Internet, mail, and mixed-mode surveys: The tailored design method (3rd ed.). Hoboken, NJ: John Wiley & Sons.


Dillman, D. A. (undated), Token financial incentives and the reduction on nonresponse error in mail surveys. Retrieved October 22, 2012, from: http://sesrc.wsu.edu/pap_tok.htm.


Galesic, M., Tourangeau, R., Couper, M.P., and Conrad, F. (2008). Eye-tracking data: New insights on response order effects and other cognitive shortcuts in survey responding. Public Opinion Quarterly 72, 892-913.


Heerwegh, D. and Loosveldt, G. (2003). An evaluation of the semiautomatic login procedure to control web survey access. Social Science Computer Review 21, 223-234.


Hunt-White, T. (2007). The influence of selected factors on student survey participation and mode of completion. Retrieved December 19, 2012, from: http://www.fcsm.gov/07papers/Hunt-White.III-C.pdf.


James, J.M., & Bolstein, R. (1992). Large monetary incentives and their effect on mail survey response rates. Public Opinion Quarterly, 56, 442-453.


Kerachsky, S. J., & Mallar, C.D. (1981). The effects of monetary payments on survey responses: Experimental evidence from a longitudinal study of economically disadvantaged youth. Proceedings of the Section on Survey Research Methods, pp. 258-263. Alexandria, VA: American Statistical Association.


McGrath, J. (2006). An Incentives Experiment in the U.S. Consumer Expenditure Quarterly Survey. Retrieved December 19, 2012, from: http://www.bls.gov/ore/pdf/st060030.pdf.


National Center for Education Statistics. (Undated). National Assessment of Educational Progress: More about NAEP science. Retrieved October 25, 2012, from: http://nces.ed.gov/nationsreportcard/science/moreabout.asp.


National Center for Education Statistics. (Undated). High School Longitudinal Study of 2009 (HSLS: 2009): Questionnaires. Retrieved October 20, 2012, from: http://nces.ed.gov/surveys/hsls09/questionnaires.asp.


Peytchev, A., Couper, M.P., McCabe, S., & Crawford, S. (2006). Web survey design: Paging versus scrolling. Public Opinion Quarterly 70, 596-607.


Policy Studies Associates, Inc. (2012). 4-H Science Initiative: Youth engagement, attitudes, and knowledge study. Prepared for the National 4-H Council. Retrieved October 13, 2012, from: http://www.pearweb.org/atis/tools/62.


Redline, C. & Dillman, D. (2002). The Influence of alternative visual designs on respondents’ performance with branching questions in self-administered questionnaires. In R.M. Groves, D.A. Dillman, J.A. Eltinge, and R.J.A. Little (Eds.), Survey Nonresponse. New York: Wiley, 179-193.


Rosoff, P., Werner, C., Clipp, E.C., Buill, A.B., Bonner, M., & Demark-Wahnefried, W. (2005). Response rates to a mailed survey of childhood cancer survivors: A comparison of conditional versus unconditional incentives. Cancer Epidemiology, Biomarkers & Prevention, May 2005, 14.


Singer, E. (2002). The use of incentives to reduce nonresponse in household surveys. In R.M. Groves, D.A. Dillman, J.A. Eltinge, and R.J.A. Little (Eds.), Survey Nonresponse (Chapter 11). New York: John Wiley and Sons.


Singer, E., & Kulka, R.A. (2004). Paying respondents for survey participation. Retrieved December 19, 2012, from: http://aspe.hhs.gov/hsp/welf-res-data-issues02/pdf/04.pdf.


U.S. Department of Health and Human Services (HHS). (2006). Research-Based Web Design & Usability Guidelines. Washington D.C.: Government Printing Office.


1 The STEM Challenges pilot with NASA is part of the Department of Education’s multi-year initiative to expand high-quality STEM programming in 21CCLC. This initiative created a technical assistance working group of researchers, evaluators, practitioners, and other Federal agencies to support the development of a strategy and series of tools that would assist both state education agencies and sub-grantee sites in the implementation of high-quality STEM efforts. Through this effort ED developed a support strategy to collaborate with other federal agencies to achieve this goal.


2 Research on the use of active versus passive consent in middle school settings has found that a passive consent strategy is a cost-effective, viable alternative to active consent when supplemented by appropriate back-up and privacy safeguard measures. See P. Ellickson & J. Hawes (1989), An assessment of active versus passive methods for obtaining parental consent, Santa Monica, CA: RAND Corporation. Retrieved December 14, 2014, from: http://www.rand.org/content/dam/rand/pubs/notes/2005/N2935.pdf.

3 See, for instance, Subpart A, Section 46.116 (General Requirements for Informed Consent) of the Code of Federal Regulations, Title 45, Public Welfare, Department of Health and Human Services, Part 46, Protection of Human Subjects. Retrieved January 22, 2013, from: http://www.hhs.gov/ohrp/humansubjects/guidance/45cfr46.html#46.116.


