




2700-0150

SUPPORTING STATEMENT

FOR OMB CLEARANCE

PART A



NASA Summer of Innovation


FY2013 EVALUATION DATA COLLECTION





National Aeronautics and Space Administration



October 30, 2012

(Revised January 24, 2013)











Part A: Justification

The National Aeronautics and Space Administration (NASA) Office of Education requests that the Office of Management and Budget (OMB) approve, under the Paperwork Reduction Act of 1995, a clearance for NASA to collect parent survey, youth survey, and teacher focus group data as part of an implementation/outcome evaluation study of NASA’s Summer of Innovation (SoI) Project in FY2013. The Summer of Innovation engages and supports external partners in delivering evidence-based summer STEM engagement opportunities to youth from underserved/underrepresented populations. Its intent is to increase interest and participation in STEM and thereby contribute to the national-level goal of increasing the number of high school graduates who pursue STEM majors and careers.


A.1 EXPLAIN THE CIRCUMSTANCES THAT MAKE THE COLLECTION OF INFORMATION NECESSARY. IDENTIFY ANY LEGAL OR ADMINISTRATIVE REQUIREMENTS THAT NECESSITATE THE COLLECTION.


This clearance request pertains to information collection that will occur between February 15, 2013 and August 15, 2013. This information collection supports an implementation/outcome evaluation of SoI. The implementation component of this evaluation is intended to provide information that will inform the continued improvement of this STEM education investment. The outcome evaluation is intended to assess the primary evaluation question of how the SoI experience affects youth engagement with STEM, which includes interest in STEM and participation in STEM activities.


This clearance package revises the SoI evaluation activities previously approved under two OMB control numbers, 2700-0150 and 2700-0151. NASA was asked by OSTP and OMB to propose a new evaluation design for SoI following two consecutive years of low response rates on surveys associated with the previous evaluation design implemented in FY2010 and FY2011. The new design for this evaluation was discussed and agreed upon by the NASA Office of Education and OMB/OIRA, with final consensus on the evaluation design and high-level plan reached on October 10, 2012.

Data under this clearance will be collected during the third year of SoI implementation from the 2011 cohort of SoI national awardees and NASA Centers (collectively referred to as “SoI awardees”) who offer stand-alone SoI camps with a minimum dosage of 30 hours of SoI content1 and who are expected to meet the minimum success criterion of more than 75% of enrolled students attending 51% or more of camp time. This request includes the following instruments that collect standardized data from 10 or more respondents:

  • Parent survey (Appendix 1; item-by-item justification provided in Appendix 2)

  • Baseline youth survey (Appendix 3; item-by-item justification provided in Appendix 4)

  • Follow-up youth survey (Appendix 5; item-by-item justification provided in Appendix 6)

  • Teacher focus group protocol (Appendix 7; consent script presented in Appendix 8)

The data to be collected are not available from any other source. The youth instruments will be used to gather data prior to and three months following the summer activities in order to assess change in SoI’s key short-term outcome of youth engagement with STEM. Information about implementation will be gathered from numerous sources, including teacher focus group discussions. These data will provide NASA with fidelity-of-implementation and formative information to inform continuous program improvement.


A.2 INDICATE HOW, BY WHOM, AND FOR WHAT PURPOSE THE INFORMATION IS TO BE USED. EXCEPT FOR A NEW COLLECTION, INDICATE THE ACTUAL USE THE AGENCY HAS MADE OF THE INFORMATION RECEIVED FROM THE CURRENT COLLECTION.



How Information Will Be Collected

Data will be collected using several methods; see Exhibit 1 for an overview of the data collection associated with the SoI evaluation, including a crosswalk between the research questions and the data collection strategies. Background and demographic information will be collected through parent surveys that will be part of the registration process. Youth outcome data will be collected through survey instruments, while implementation data for summer activities will be collected via teacher focus group discussions. Data collected through the OMB-cleared forms will be complemented by other information collected through different strategies (e.g., camp observations, PI interviews) that are not required to be cleared by the Paperwork Reduction Act due to the small number of participants or the nature of the information collection.


Consent Process

As part of the registration materials, awardees will include a parent consent form that describes the evaluation components (a parent survey, a student baseline survey, and a student follow-up survey) and asks parents to give consent for their and their children’s inclusion in the evaluation (see Appendix 10). Study inclusion means that registration contact information will be made available to the study team and that parents and students will be contacted in the fall about the student follow-up survey. However, as recommended for the protection of human subjects,2 a statement that participation in individual surveys is voluntary is included on each survey form. That is, survey participation remains voluntary and will not affect student participation in SoI. The signed parent consent form will be a required component of the SoI registration process. Parents who do not sign and submit the consent form allowing for inclusion in the study will not be eligible to register their children for SoI.


Careful attention will be paid to the ordering of the parent consent form and the parent survey (discussed below) in registration packages. NASA will work with awardees to ensure that the parent consent form precedes the parent survey in paper-based packages. The parent consent form will be made available in online format for those selected awardees who use an online registration process. NASA will work with awardees to ensure that when the consent form is made available in an online registration package, it will precede the link to the parent survey.


Parent Surveys

Participating SoI awardees will include the parent survey (Appendices 1 and 2) in the camp registration materials that parents/caregivers will be asked to return with the rest of their registration materials. The brief survey form provides information regarding the purpose of the data collection and collects data on youth and parent demographic and personal characteristics as well as the reasons for enrolling the child in the program. While NASA has projected an 85% response rate, it is NASA’s intent to work closely with participating awardees to seek a 100% response rate, since the data collected through this form will enable the agency to conduct non-response bias analysis on the youth surveys.3 An item-by-item justification and crosswalk describing how the items are used to address research questions is included in Appendix 2.


As part of the registration process, awardees will distribute the parent survey (Appendix 1) in either paper or online format. Offering multiple survey modes will ease the burden on the awardees that collect the information, allowing them to offer parents the most convenient mode. Awardees, especially those with online registration procedures, may opt to offer the parent survey online by providing all parents with the survey URL and a site-specific PIN in their registration materials (see Appendix 9 for an example of the PIN and agreement-to-participate screens).

Each access to the survey will create a new record, and existing records will not be accessible to the respondent. Respondents accessing the online survey will go through the survey vendor’s website, where they are protected by the vendor’s strict data security system. Only those given the PIN can enter the survey. Upon entering the PIN, the respondent must complete and submit the survey in one sitting: an incomplete survey cannot be saved, a respondent who stops partway must begin again, and a submitted survey cannot be reopened. The PIN will give respondents access to a single version of the survey only; respondents will not have access to any other respondents’ surveys. The data collected through the online survey will be automatically maintained on the survey vendor’s secure server and then safely transferred to the external evaluator; these data will not be accessible to awardee, Center, or camp administrative staff. A link to the online survey via the survey vendor’s website will also be available on the NASA Summer of Innovation website. Awardees that choose to administer the paper version will return completed forms to the external evaluator for safekeeping and data entry. We anticipate that the registration process will begin by mid-February 2013.
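To make the access-control description above concrete, the following is a minimal illustrative sketch of PIN-gated, single-submission survey intake, written as a generic Python/Flask handler. It is not the survey vendor’s implementation, which is proprietary; the route, the PIN set, and the record store are all hypothetical names.

```python
# Illustrative sketch only; the survey vendor's actual system is proprietary.
# VALID_PINS, submissions, and the /survey route are hypothetical names.
from flask import Flask, abort, request

app = Flask(__name__)

VALID_PINS = {"1234567"}  # site-specific PINs issued with registration materials
submissions = []          # every access creates a new record; none can be reopened

@app.route("/survey", methods=["POST"])
def submit_survey():
    if request.form.get("pin") not in VALID_PINS:
        abort(403)  # only respondents given the PIN may enter the survey
    # A new record is created on each submission; partial responses are not
    # saved, and a respondent cannot return to or edit a submitted survey.
    submissions.append({k: v for k, v in request.form.items() if k != "pin"})
    return "Thank you. Your survey has been submitted."

if __name__ == "__main__":
    app.run()
```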


Youth Surveys

The youths in grades 6-8 taking part in stand-alone SoI camps across the awardees will be asked to complete baseline and follow-up surveys (see Appendices 3 and 5). The baseline survey form provides information regarding the purpose of the data collection and collects data on motivation for registering for SoI, personal interest in STEM, and participation in STEM activities. The follow-up survey repeats questions on interest and participation in STEM to track differences in these areas and also asks questions that require youth to recall SoI and its impact on them. Crosswalks that describe how the survey items link to the research questions, their purpose, and their sources are included in Appendices 4 and 6.


The baseline youth survey form will be made available in paper format. The survey will be administered to all enrolled youth during the first day of the camp experience. Paper survey forms will be returned to the external evaluator for safe-keeping and data entry.


Follow-up surveys will be administered by mail in October 2013, approximately three months after the completion of a camp. Because family mobility is expected through the start of the school year, the evaluation team will send a pre-notification letter via first-class mail in September to each parent’s home address on file, reminding them that a survey for their child will arrive in October. Given the short interval between the reminder letter and survey administration, only one reminder will be sent. In addition to the survey reminder, the first-class letter will contain a pre-paid postcard addressed to the external evaluator requesting any updated contact information. Beyond any postcards returned to the external evaluator, updated addresses will be obtained from letters returned by the U.S. Postal Service with forwarding addresses. Further, the external evaluator will use the Lexis-Nexis database, which provides access to public records, to verify and update home addresses. Follow-up student surveys will be mailed to the home address on file in October with a return postage-paid envelope. Efforts to increase response rates include up to three phone calls encouraging non-responders to complete their surveys and mailing another copy of the survey to non-responders.


We anticipate that baseline survey administration will begin as early as May 2013 and that follow-up survey administration will occur in October 2013. Data collection procedures are also discussed in Part B.


Teacher Focus Group Discussions

Approximately 50 teachers will be asked to participate in focus group discussions held in conjunction with site visits conducted at each participating awardee’s summer program. See Appendices 7 and 8 for the focus group protocol, including the consent script. The protocol asks questions about camp-level implementation intended to ascertain the supports and challenges awardees faced in implementing SoI curricula; the staff, materials, and NASA resources necessary for successful SoI activities; and the timing of plans and preparation for successful program implementation. These discussions will allow the evaluators to collect qualitative descriptions of the SoI programs as implemented that will complement information collected through the PI interviews and the quantitative performance data collected through reporting forms.


The external evaluator will work closely with the participating awardees and camp staff to issue an invitation to all lead camp teachers to participate in one focus group discussion per awardee. The emailed invitation will include in part the consent script included in Appendix 8. The focus group discussion will be facilitated on-site as part of a regularly scheduled evaluation site visit. The awardee will be responsible for coordinating the logistics of the focus group discussion, including providing an appropriate location for convening it. Representatives from the awardee administration and NASA will not be permitted to observe the discussion. The external evaluator, with the permission of the participating teachers, will audio-record the focus group discussion and produce a transcript for analysis. Content analysis will be supported using qualitative data analysis software (NVivo), with themes and sub-themes identified using the original questions as guiding categories.


Who Will Collect the Information


As part of the solicitation, awardees have been notified that they will be required to identify an evaluation coordinator among their staff who will assist in the evaluation’s data collection. These coordinators will be responsible for overseeing the registration process, including the overall administration of the parent survey form and the baseline youth surveys as well as supporting the external evaluator in the recruitment of teachers for the focus group discussions. An evaluation point of contact will also be identified at each camp. This individual will hold responsibility for ensuring that baseline youth surveys are administered by camp instructors and collected and submitted to the external evaluator. The awardee- and camp-level evaluation points of contact will also be responsible for ensuring that student contact information is updated at the conclusion of each camp.4 The awardee PIs/POCs will be accountable to NASA for ensuring that all data collection occurs in a timely manner. The external evaluator will be responsible for administering the follow-up survey and facilitating the focus group discussions. The NASA Headquarters Evaluation Manager is responsible for providing oversight of the evaluation, while the SoI Evaluation Team, which includes SoI project staff, external evaluator representatives, and the Evaluation Manager, will coordinate evaluation activities.


NASA will negotiate the awardees’ statements of collaboration and agreements to explicitly include specific roles and responsibilities for the collection of data, including evaluation data. During the kick-off meeting held in November 2012, the NASA evaluation manager presented the purpose of the evaluation to the principal investigators (PIs) and their evaluation coordinators and outlined their responsibilities; individual conversations were also held with center POCs explaining the evaluation redesign. Prior to administration, mandatory webinar trainings will be provided to evaluation points of contact and PIs to prepare them to collect parent consent and administer the surveys, ensuring that the data are collected consistently across sites. Additional evaluation guidance will be provided to awardees in FY2013 in the form of a comprehensive guide to the evaluation activities, available online and in hardcopy.


For What Purpose


The purpose of this data collection effort is to support the national evaluation of the SoI project. The goal of this evaluation is twofold: to collect information on implementation to inform NASA’s continued improvement of the program model, and to collect outcome data to assess the project’s effectiveness. As such, the evaluation will focus on describing SoI’s implementation and associated outcomes, but will not determine whether there is a causal link between the program and outcomes.


NASA has collected evaluation and performance data on the SoI project since FY2010. While past evaluation studies have not yielded high-quality findings, NASA has utilized performance data to improve the project. For instance, NASA recently used performance data, including data on the number of students served as well as qualitative data collected through site visits, to identify under-performing awardees. This identification was followed by targeted assistance to improve performance and increased accountability.


Exhibit 1 below outlines the research questions for the SoI national evaluation, the data collection instruments and sources (including, but not limited to, data collections that are part of this clearance package), and the constructs. As is explained in more detail in Part B, the participating sample includes five awardees whose programming meets the evaluation requirement of a minimum of 30 hours of SoI content during a one-week, stand-alone SoI camp for rising 6th through 8th grade youth. Data collections for which we are seeking PRA clearance are marked with an asterisk in the Instrument/Source entries of Exhibit 1.


Exhibit 1: National Evaluation Research Questions

1. To what extent do SoI awardees meet new project requirements?
Indicators: % of SoI camps, by awardee and across the project, that include: engagement of parents/caregivers in family events; a student-to-teacher ratio no greater than 15:1; inquiry-based NASA-related STEM experiences.
Instrument/Source: SoI project activity reporting form and back-up documentation (program flyers) providing evidence for audit purposes.
Timing: Monthly.

2. To what extent do SoI camps meet the minimum expectation of 30 hours of SoI content in a one-week camp?
Indicators: % of SoI camps including a minimum of 30 hours of SoI content during a one-week period.
Instrument/Source: SoI project activity reporting form and back-up documentation (program flyers, attendance logs, camp curriculum, etc.) providing evidence of the minimum 30 hours of SoI content for auditing purposes.
Timing: Monthly.

3. To what extent do SoI camps meet the minimum success criterion of >75% of enrolled students in actual attendance 51% or more of camp time?
Indicators: Percentage of students reported as attending an SoI camp at least 51% of camp time.
Instrument/Source: SoI project activity reporting form and back-up documentation (i.e., attendance logs) providing evidence for auditing purposes.
Timing: Monthly.

4. What are the characteristics of SoI camps and their participants?
Indicators: NASA curricula used in camp; % of camp educators who are certified teachers; average # of hours spent by camp educators in PD; average student-to-teacher ratio in SoI camps; # of youth by gender, race, ethnicity, and grade; # of youth from underserved/underrepresented populations participating in SoI camps; average parent educational expectations for child; average youth’s educational expectations for self; median household income of zip code areas in which camp participants reside; average parent educational achievement level; parent and youth motivation for enrollment; % of youth with previous SoI experience; % of parents with academic degrees in STEM; % of parents in STEM careers; # of SoI contact hours for students; # of unique parents/caregivers attending SoI events5.
Instrument/Source: SoI project activity reporting form and back-up documentation on NASA curricula used in camp (monthly); parent survey* and baseline and follow-up youth surveys* (once per summer).

5. To what extent do SoI camps meet program quality expectations?
Indicators: % of observed SoI camps providing reasonable or compelling evidence of effective OST STEM practices for the learning environment, activity engagement, STEM knowledge and practices, and youth development in STEM.
Instrument/Source: Site visit with camp observations using the Dimensions of Success observation tool (PEAR, 2012).
Timing: Once per summer.

6. What supports and challenges do awardees face in implementing SoI curricula? How do they handle these challenges?
Indicators: Qualitative description of supports and challenges.

7. What staff, materials, and NASA resources are necessary for successful SoI activities?
Indicators: Qualitative description of necessary staff, materials, and NASA resources.

8. How early and to what extent must plans and preparation begin for successful program implementation?
Indicators: Qualitative description of the necessary timeline of activities for SoI camp implementation.

Instrument/Source for questions 6-8: Teacher focus group protocol* and PI interview protocol (once per summer); review of quarterly reports (quarterly).

9. What processes and materials do camps use to register students for SoI?
Indicators: Qualitative description of the camp registration process, timeline, requirements, and materials, including forms.
Instrument/Source: PI interview protocol.
Timing: Once per summer.

10. How does the SoI experience affect youth engagement with STEM?
Indicators: % of youth whose interest in STEM changed significantly between the baseline and follow-up surveys, by awardee and camp, for the total population and the underserved/underrepresented subgroup; % of youth whose participation in STEM changed significantly in in-school, extracurricular, or out-of-school activities, for the total population and the underserved/underrepresented subgroup.
Instrument/Source: Baseline and follow-up youth surveys* with validated scales for assessing student interest and participation in STEM (baseline administered prior to camp activities; follow-up administered 3 months following the end of camp); parent survey* (once per summer).


Implementation data collected through the project activity reporting form, focus group discussions, activity observations, and interviews answer questions 1-9. These data will be vital for monitoring implementation to ensure accountability and for providing critical information to inform continuous project improvement.


The last research question will be answered using the baseline and follow-up youth survey forms. Analysis of the survey data will allow the external evaluator to explore changes in youth interest and participation in STEM. In the youth survey, NASA focuses primarily on youths’ interest in science: NASA is certain that science will be addressed by all SoI programs, whereas technology and engineering appear in the SoI curriculum to a significantly lesser degree, and mathematics is rarely addressed. Because engineering and technology are nonetheless part of the SoI curriculum, NASA adopted attitudinal scales for the youth surveys that incorporate statements about engineering and technology. Example attitudinal statements addressing engineering and technology are as follows:

  • I like to take things apart to learn more about them.

  • I like to be part of a team that designs and builds a hands-on project.

  • I like to design a solution to a problem.

  • I’m curious to learn how to program a computer game.

  • I like to design and build something mechanical that works.


As mentioned earlier, while measuring outcomes at multiple points in time can provide evidence of whether the outcomes of interest change, it will not allow us to rule out the possibility that something other than the program is affecting this change. However, it will support investigation into associations between implementation and outcomes of interest to inform future program strategy, as well as inform the future decision about whether a more rigorous impact evaluation should be undertaken.



A.3 DESCRIBE WHETHER, AND TO WHAT EXTENT, THE COLLECTION OF INFORMATION INVOLVES THE USE OF AUTOMATED, ELECTRONIC, MECHANICAL, OR OTHER TECHNOLOGICAL COLLECTION TECHNIQUES OR OTHER FORMS OF INFORMATION TECHNOLOGY, E.G. PERMITTING ELECTRONIC SUBMISSION OF RESPONSES, AND THE BASIS FOR THE DECISION FOR ADOPTING THIS MEANS OF COLLECTION. ALSO DESCRIBE ANY CONSIDERATION OF USING INFORMATION TECHNOLOGY TO REDUCE BURDEN.


Parent surveys will be available online and on paper. These dual modes of survey data collection will allow awardees to choose the approach that aligns most closely with their own approach to camp registration (online or paper). NASA anticipates that approximately 20% of the parent surveys will be administered electronically. The external evaluator’s electronic mail address and toll-free telephone number will be included on the first page of the survey instruments for participants who have questions.


A.4 DESCRIBE EFFORTS TO IDENTIFY DUPLICATION. SHOW SPECIFICALLY WHY ANY SIMILAR INFORMATION ALREADY AVAILABLE CANNOT BE USED OR MODIFIED FOR USE FOR THE PURPOSE(S) DESCRIBED IN ITEM 2 ABOVE.


This effort will yield data to assess SoI implementation and measures of participant outcomes; no similar evaluation is being conducted, and there is no alternative source for this information. NASA has identified technical representatives who will be responsible for coordinating the requests for information from the SoI project team and contractors to ensure that duplicative questions are not asked.


A.5 IF THE COLLECTION OF INFORMATION IMPACTS SMALL BUSINESSES OR OTHER SMALL ENTITIES (ITEM 5 OF THE OMB FORM 83-1), DESCRIBE THE METHODS USED TO MINIMIZE BURDEN.


No small businesses will be involved as respondents. The primary respondents for the data collection efforts described in this package are parents, youths, teachers, and awardees. Burden is minimized for all respondents by requesting only the minimum information needed to meet study objectives. All primary data collection will be coordinated by the awardee PIs and center POCs, with strong support provided by dedicated staff from the external evaluator, so as to reduce the burden on the SoI awardees.

A.6 DESCRIBE THE CONSEQUENCE TO FEDERAL PROGRAM OR POLICY ACTIVITIES IF THE COLLECTION IS NOT CONDUCTED OR IS CONDUCTED LESS FREQUENTLY, AS WELL AS ANY TECHNICAL OR LEGAL OBSTACLES TO REDUCING BURDEN.


Each form is used once in this evaluation; therefore, frequency of use of individual forms is not an issue. None of these forms in their present state has been used before.


If the proposed parent survey data were not collected, NASA would not fulfill its compliance need to ascertain the demographic characteristics of SoI participants. If the proposed youth survey data were not collected, NASA would not fulfill its objective of investigating youth outcomes that may be associated with participation in SoI. Without the implementation data collected from teachers, NASA would not understand the supports and challenges awardees face in implementing SoI curricula or how they handle those challenges. NASA also would not fully understand the staff, materials, and NASA resources necessary for successful SoI implementation, nor how early and to what extent plans and preparation must begin for successful implementation. In addition, NASA would not know what would be required to replicate the models, should they be associated with promising outcomes. Thus, without survey and implementation data, Federal resources would be allocated and program decisions would be made in the absence of information about the actual activities provided by the SoI awardees and the lessons learned.


A.7 EXPLAIN ANY SPECIAL CIRCUMSTANCES THAT WOULD CAUSE AN INFORMATION COLLECTION TO BE CONDUCTED IN A MANNER:


- REQUIRING RESPONDENTS TO REPORT INFORMATION TO THE AGENCY MORE OFTEN THAN QUARTERLY;


- REQUIRING RESPONDENTS TO PREPARE A WRITTEN RESPONSE TO A COLLECTION OF INFORMATION IN FEWER THAN 30 DAYS AFTER RECEIPT OF IT;


- REQUIRING RESPONDENTS TO SUBMIT MORE THAN AN ORIGINAL AND TWO COPIES OF ANY DOCUMENT;


- REQUIRING RESPONDENTS TO RETAIN RECORDS, OTHER THAN HEALTH, MEDICAL, GOVERNMENT CONTRACT, GRANT-IN-AID, OR TAX RECORDS FOR MORE THAN 3 YEARS;


- IN CONNECTION WITH A STATISTICAL SURVEY, THAT IS NOT DESIGNED TO PRODUCE VALID AND RELIABLE RESULTS THAT CAN BE GENERALIZED TO THE UNIVERSE OF STUDY;


- REQUIRING THE USE OF A STATISTICAL DATA CLASSIFICATION THAT HAS NOT BEEN REVIEWED AND APPROVED BY OMB;


- THAT INCLUDES A PLEDGE OF CONFIDENTIALITY THAT IS NOT SUPPORTED BY AUTHORITY ESTABLISHED IN STATUTE OR REGULATION, THAT IS NOT SUPPORTED BY DISCLOSURE AND DATA SECURITY POLICIES THAT ARE CONSISTENT WITH THE PLEDGE, OR WHICH UNNECESSARILY IMPEDES SHARING OF DATA WITH OTHER AGENCIES FOR COMPATIBLE CONFIDENTIAL USE; OR


- REQUIRING RESPONDENTS TO SUBMIT PROPRIETARY, TRADE SECRET, OR OTHER CONFIDENTIAL INFORMATION UNLESS THE AGENCY CAN DEMONSTRATE THAT IT HAS INSTITUTED PROCEDURES TO PROTECT THE INFORMATION'S CONFIDENTIALITY TO THE EXTENT PERMITTED BY LAW.



There are no special circumstances associated with this data collection.


A.8 IF APPLICABLE, PROVIDE A COPY AND IDENTIFY THE DATE AND PAGE NUMBER OF PUBLICATION IN THE FEDERAL REGISTER OF THE AGENCY'S NOTICE, REQUIRED BY 5 CFR 1320.8(d), SOLICITING COMMENTS ON THE INFORMATION COLLECTION PRIOR TO SUBMISSION TO OMB. SUMMARIZE PUBLIC COMMENTS RECEIVED IN RESPONSE TO THAT NOTICE AND DESCRIBE ACTIONS TAKEN BY THE AGENCY IN RESPONSE TO THESE COMMENTS. SPECIFICALLY ADDRESS COMMENTS RECEIVED ON COST AND HOUR BURDEN.


In accordance with the Paperwork Reduction Act of 1995, NASA published a notice in the Federal Register announcing the agency’s intention to request an OMB review of data collection activities. The notice was published on June 30, 2011 (FRN 11-056) for a 60-day review period. The Agency issued a second notice on November 14, 2012 (FRN 12-097) for a 30-day review period. No comments have been received.



DESCRIBE EFFORTS TO CONSULT WITH PERSONS OUTSIDE THE AGENCY TO OBTAIN THEIR VIEWS ON THE AVAILABILITY OF DATA, FREQUENCY OF COLLECTION, THE CLARITY OF INSTRUCTIONS AND RECORDKEEPING, DISCLOSURE, OR REPORTING FORMAT (IF ANY), AND ON THE DATA ELEMENTS TO BE RECORDED, DISCLOSED, OR REPORTED.


CONSULTATION WITH REPRESENTATIVES OF THOSE FROM WHOM INFORMATION IS TO BE OBTAINED OR THOSE WHO MUST COMPILE RECORDS SHOULD OCCUR AT LEAST ONCE EVERY 3 YEARS -- EVEN IF THE COLLECTION OF INFORMATION ACTIVITY IS THE SAME AS IN PRIOR PERIODS. THERE MAY BE CIRCUMSTANCES THAT MAY PRECLUDE CONSULTATION IN A SPECIFIC SITUATION. THESE CIRCUMSTANCES SHOULD BE EXPLAINED.


The parent and youth surveys, the teacher focus group protocol, and this PRA clearance package were developed by NASA staff in consultation with several external experts, including Laura LoGerfo, the Project Officer for the High School Longitudinal Study of 2009 at the U.S. Department of Education National Center for Education Statistics; Gil Noam, Founder and Director of the Program in Education, Afterschool & Resiliency (PEAR), Harvard University; and Sara Spiegel, Director of Administration at the Noyce Foundation. Several experts also advised on the evaluation design, including Henry Frierson, University of Florida; Anita Krishnamurthi, Afterschool Alliance; Carol Stoel, National Science Foundation; Robert Tai, University of Virginia; and Diego Zapata-Rivera, Educational Testing Service. Copies of the instruments were also distributed to representatives of Summer of Innovation awardees, who are responsible for administration of two of the instruments (the parent survey and the baseline youth survey). The awardees provided some feedback on individual question items and survey administration.


The surveys are based on the theory of change depicted in the SoI logic model (revised in August 2012) and informed by the evaluators’ knowledge of the program. Survey question items were selected and/or adapted from previously field-tested and validated instruments, eliminating the need for cognitive testing. The source instruments for the survey question items are as follows:


  • Student Baseline Survey and Parent Baseline Survey, High School Longitudinal Study (HSLS) of 2009, IES/Department of Education

  • Assessing Women and Men In Engineering (AWE), Middle School Students Pre-Activity Surveys and Immediate Post-Activity Surveys for Middle School-Aged Participants – Science and Engineering (2009)

  • 4-H Science Youth Survey (2012)

  • Summer of Innovation Parent Survey and Baseline Student Survey (2011)

  • Excited, Engaged and Interested Science Learner Survey (2011), Noyce Foundation


In the case of the youth surveys, an entire scale of question items on youth interest in science (“Enthusiasm for Science”) was adopted from the Excited, Engaged and Interested Science Learner Survey developed by the Program in Education, Afterschool & Resiliency (PEAR) for the Noyce Foundation and recently validated with a middle school audience as part of the national Youth Engagement, Attitudes, and Knowledge study of the 4-H Science Initiative. This scale incorporates question items from NAEP - Science (2005, 2009), allowing comparison of SoI survey data to nationally representative NAEP results available from the Department of Education. This survey will be released as the Common Instrument in 2013 following the release of a validation study by Harvard University.


A.9 EXPLAIN ANY DECISION TO PROVIDE ANY PAYMENT OR GIFT TO RESPONDENTS, OTHER THAN REMUNERATION OF CONTRACTORS OR GRANTEES.


No payment or gifts will be provided to respondents.


A.10 DESCRIBE ANY ASSURANCE OF CONFIDENTIALITY PROVIDED TO RESPONDENTS AND THE BASIS FOR THE ASSURANCE IN STATUTE, REGULATION, OR AGENCY POLICY.


Every effort will be made to maintain the privacy of respondents to the extent provided by law, including the use of several procedural and control measures to protect the data from unauthorized use. Collected data will not be released with personally identifiable information, and results will be presented only in aggregated form. A statement to this effect will be included on all instruments and will be read to teachers prior to participating in focus group discussions. Respondents will be assured that all information identifying them will be kept private.


The procedures to protect data during information collection, data processing, and analysis activities are as follows:


  • All respondents included in the study sample will be informed that the information they provide will be used only for the purpose of this research. Individuals will not be cited as sources of information in prepared reports.

  • Hard-copy data collection forms will be delivered to a locked area at the external evaluator’s office for receipt and processing. The contractor will maintain restricted access to all data preparation areas (i.e., receipt, coding, and data entry). All data files on multi-user systems will be under the control of a database manager, with access limited to project staff on a “need-to-know” basis only.

  • Respondents accessing the online surveys will go through the external evaluator’s website, where they are protected by the external evaluator’s strict data security system. Only those given the seven-digit personal identification number (PIN) can enter the survey. Upon entering the PIN, the respondent moves to a non-public directory inaccessible through the Internet. When entered, the data interface with a script located on a second non-public directory accessible only to the external evaluator’s system administrator. Sample screen shots of the introductory screens that respondents will see are included in Appendix 9. The final introductory screens of the online surveys will comply with NASA’s privacy and security requirements.

  • The external evaluator takes every precaution to ensure that data collected on the Internet remain both secure and confidential. All external evaluator data collection servers are housed in a facility that has redundant power, expandable bandwidth, and a high level of physical security. All data collection and data storage servers are built on a Storage Area Network (SAN). Database servers are mirrored in an active/passive configuration; passive servers become active 30 seconds after a hardware failure. The external evaluator maintains enough excess hardware capacity to withstand a hardware failure on any single device. The facility is monitored 24 hours a day, 7 days a week by on-site professional security guards and over continuous closed-circuit video surveillance from a command center via both stationary and 360° cameras located outside and inside the facility. The external evaluator’s security measures comply with NASA’s privacy and security requirements.

  • Individual identifying information will be maintained separately from completed data collection forms and from computerized data files used for analysis.


A.11 PROVIDE ADDITIONAL JUSTIFICATION FOR ANY QUESTIONS OF A SENSITIVE NATURE, SUCH AS SEXUAL BEHAVIOR AND ATTITUDES, RELIGIOUS BELIEFS, AND OTHER MATTERS THAT ARE COMMONLY CONSIDERED PRIVATE. THIS JUSTIFICATION SHOULD INCLUDE THE REASONS WHY THE AGENCY CONSIDERS THE QUESTIONS NECESSARY, THE SPECIFIC USES TO BE MADE OF THE INFORMATION, THE EXPLANATION TO BE GIVEN TO PERSONS FROM WHOM THE INFORMATION IS REQUESTED, AND ANY STEPS TO BE TAKEN TO OBTAIN THEIR CONSENT.


Questions about race/ethnicity and gender are included on the parent survey. Data collected through these questions will be used to generate the percentage of participating students from underserved/underrepresented groups. Respondents may skip question items if they so wish.


A.12 PROVIDE ESTIMATES OF THE HOUR BURDEN OF THE COLLECTION OF INFORMATION.


THE STATEMENT SHOULD:


- INDICATE THE NUMBER OF RESPONDENTS, FREQUENCY OF RESPONSE, ANNUAL HOUR BURDEN, AND AN EXPLANATION OF HOW THE BURDEN WAS ESTIMATED. UNLESS DIRECTED TO DO SO, AGENCIES SHOULD NOT CONDUCT SPECIAL SURVEYS TO OBTAIN INFORMATION ON WHICH TO BASE HOUR BURDEN ESTIMATES. CONSULTATION WITH A SAMPLE (FEWER THAN 10) OF POTENTIAL RESPONDENTS IS DESIRABLE. IF THE HOUR BURDEN ON RESPONDENTS IS EXPECTED TO VARY WIDELY BECAUSE OF DIFFERENCE IN ACTIVITY, SIZE, OR COMPLEXITY, SHOW THE RANGE OF ESTIMATED HOUR BURDEN, AND EXPLAIN THE REASONS FOR THE VARIANCE. GENERALLY, ESTIMATES SHOULD NOT INCLUDE BURDEN HOURS FOR CUSTOMARY AND USUAL BUSINESS PRACTICES.


- IF THIS REQUEST FOR APPROVAL COVERS MORE THAN ONE FORM, PROVIDE SEPARATE HOUR BURDEN ESTIMATES FOR EACH FORM AND AGGREGATE THE HOUR BURDENS IN ITEM 13 OF OMB FORM 83-I.


Exhibit 2 presents estimates of the reporting burden for the parent survey, youth surveys, and the teacher focus group discussions: NASA estimates that the annualized response burden for the entire evaluation is 468.6 hours for youth for the baseline and follow-up surveys, 41.7 hours for teachers to participate in focus group discussions, and 312.4 hours for parents for the survey. The total burden associated with this evaluation is 822.7 hours.6


The estimate of the number of respondents is based on actual SoI enrollment numbers of the camps run by awardees in FY2012 that met the study and sampling criteria (e.g., stand-alone model, minimum 30 hours of SoI content, targeted to rising 6th through 8th grade students).


For the parent survey, this estimate assumes that it will take about 8 minutes for parents to respond to the survey questions. As the form will be included in registration materials and administered by the awardees as a mandatory form for completion, we assume that all parents registering youths will return the survey. Burden estimates are derived from similar surveys conducted on comparable evaluations and from timed administration of this survey instrument to six adults.


For the youth surveys, this estimate assumes that it will take youths about 6 minutes to read each survey’s introduction and answer the questions. Estimates for the youth burden are based on timed administration of the survey instruments to six youth within the targeted grade range.


Qualitative implementation data will also be collected through focus group discussions with teachers. Teachers will be recruited to participate in 50-minute discussions scheduled during site visits held during the summer 2013 implementation. Assuming that all 50 teachers attend, the total burden of these discussions is 41.7 hours.


Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories.


Estimates of respondents’ time to complete the surveys and forms, as well as the time to participate in interviews and focus groups, are provided in Exhibit 2. We estimate that the annualized cost burden for their time is $3,397.35 for youth (the baseline and follow-up surveys), $1,040.92 for teachers (focus group discussions), and $7,516.34 for parents (survey). The total annualized cost to respondents is estimated at $11,954.61.


The cost burden associated with the surveys is estimated as follows: for youths, we used the federal minimum wage; for teachers, we used the median income of middle school teachers (as reported by the BLS, April 6, 2012); and for parents, we used the 2011 national median household income.
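As a cross-check on the figures discussed above and tabulated in Exhibit 2, the short Python sketch below recomputes the burden hours and costs from the respondent counts, response frequencies, per-response minutes, and the stated hourly rates; small differences from the published totals reflect rounding (see note 6).

```python
# Recomputes the Exhibit 2 burden and cost figures; hourly rates are those
# stated in the notes to Exhibit 2. Totals may differ from the published
# figures by a few cents due to rounding.
rows = {
    # name: (respondents, responses each, minutes per response, $/hour)
    "Parent Surveys":       (2343, 1,  8, 24.06),  # 2011 median household income
    "Youth Surveys":        (2343, 2,  6,  7.25),  # federal minimum wage
    "Teacher Focus Groups": (  50, 1, 50, 24.98),  # middle school teacher median
}

total_hours = total_cost = 0.0
for name, (n, freq, minutes, rate) in rows.items():
    hours = n * freq * minutes / 60  # e.g., 2,343 x 1 x 8 / 60 = 312.4 hours
    cost = hours * rate              # e.g., 312.4 x $24.06 = $7,516.34
    total_hours += hours
    total_cost += cost
    print(f"{name}: {hours:.1f} hours, ${cost:,.2f}")

print(f"Total: {total_hours:.1f} hours, ${total_cost:,.2f}")
```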




Exhibit 2. Estimates of Annualized Burden Hours and Cost for Data Collection

Parent Surveys: 2,343 respondents; 1 response each; 8 minutes per response; 312.4 total burden hours; 78.1 average burden hours per awardee; $24.06 estimated cost per hour; $7,516.34 total cost burden.

Youth Surveys: 2,343 respondents; 2 responses each; 6 minutes per response; 468.6 total burden hours; 117.2 average burden hours per awardee; $7.25 estimated cost per hour; $3,397.35 total cost burden.

Teacher Focus Groups: 50 respondents; 1 response each; 50 minutes per response; 41.7 total burden hours; 10.4 average burden hours per awardee; $24.98 estimated cost per hour; $1,040.92 total cost burden.

Total Burden for Evaluation: 4,736 respondents; 822.7 total burden hours; 205.7 average burden hours per awardee; $11,954.61 total cost burden.

Notes:

a Estimated cost per hour for parents is calculated based on the 2011 national median household income of $50,054 (approximately $24.06 per hour, assuming a 40-hour work week) according to the Current Population Survey (http://www.census.gov/prod/2012pubs/p60-243.pdf, retrieved October 22, 2012).

b Number of respondents based on estimated total universe.

c Estimated cost per hour for youths is calculated based on federal minimum wage of $7.25 per hour effective July 24, 2009.

d Estimated cost per hour for teachers is calculated by the 2010 median income of middle school teachers of $51,960 (as of April 6, 2012), or $24.98 per hour (http://www.bls.gov/ooh/education-training-and-library/middle-school-teachers.htm).


A.13 PROVIDE AN ESTIMATE OF THE TOTAL ANNUAL COST BURDEN TO RESPONDENTS OR RECORDKEEPERS RESULTING FROM THE COLLECTION OF INFORMATION. (DO NOT INCLUDE THE COST OF ANY HOUR BURDEN SHOWN IN ITEMS 12 AND 14).


- THE COST ESTIMATE SHOULD BE SPLIT INTO TWO COMPONENTS: (a) A TOTAL CAPITAL AND START-UP COST COMPONENT (ANNUALIZED OVER ITS EXPECTED USEFUL LIFE); AND (b) A TOTAL OPERATION AND MAINTENANCE AND PURCHASE OF SERVICES COMPONENT. THE ESTIMATES SHOULD TAKE INTO ACCOUNT COSTS ASSOCIATED WITH GENERATING, MAINTAINING, AND DISCLOSING OR PROVIDING THE INFORMATION. INCLUDE DESCRIPTIONS OF METHODS USED TO ESTIMATE MAJOR COST FACTORS INCLUDING SYSTEM AND TECHNOLOGY ACQUISITION, EXPECTED USEFUL LIFE OF CAPITAL EQUIPMENT, THE DISCOUNT RATE(S), AND THE TIME PERIOD OVER WHICH COSTS WILL BE INCURRED. CAPITAL AND START-UP COSTS INCLUDE, AMONG OTHER ITEMS, PREPARATIONS FOR COLLECTING INFORMATION SUCH AS PURCHASING COMPUTERS AND SOFTWARE; MONITORING, SAMPLING, DRILLING AND TESTING EQUIPMENT; AND RECORD STORAGE FACILITIES.


- IF COST ESTIMATES ARE EXPECTED TO VARY WIDELY, AGENCIES SHOULD PRESENT RANGES OF COST BURDENS AND EXPLAIN THE REASONS FOR THE VARIANCE. THE COST OF PURCHASING OR CONTRACTING OUT INFORMATION COLLECTION SERVICES SHOULD BE A PART OF THIS COST BURDEN ESTIMATE. IN DEVELOPING COST BURDEN ESTIMATES, AGENCIES MAY CONSULT WITH A SAMPLE OF RESPONDENTS (FEWER THAN 10), UTILIZE THE 60-DAY PRE-OMB SUBMISSION PUBLIC COMMENT PROCESS AND USE EXISTING ECONOMIC OR REGULATORY IMPACT ANALYSIS ASSOCIATED WITH THE RULEMAKING CONTAINING THE INFORMATION COLLECTION, AS APPROPRIATE.


- GENERALLY, ESTIMATES SHOULD NOT INCLUDE PURCHASES OF EQUIPMENT OR SERVICES, OR PORTIONS THEREOF, MADE: (1) PRIOR TO OCTOBER 1, 1995, (2) TO ACHIEVE REGULATORY COMPLIANCE WITH REQUIREMENTS NOT ASSOCIATED WITH THE INFORMATION COLLECTION, (3) FOR REASONS OTHER THAN TO PROVIDE INFORMATION OR KEEPING RECORDS FOR THE GOVERNMENT, OR (4) AS PART OF CUSTOMARY AND USUAL BUSINESS OR PRIVATE PRACTICES.


Other than the time to complete the surveys and forms and to participate in interviews and focus groups, which is estimated in Exhibit 2, there are no direct monetary costs to respondents. That is, there are no capital and start-up costs, nor are there operation and maintenance or purchase-of-services costs.


A.14 PROVIDE ESTIMATES OF ANNUALIZED COST TO THE FEDERAL GOVERNMENT. ALSO, PROVIDE A DESCRIPTION OF THE METHOD USED TO ESTIMATE COST, WHICH SHOULD INCLUDE QUANTIFICATION OF HOURS, OPERATION EXPENSES (SUCH AS EQUIPMENT, OVERHEAD, PRINTING, AND SUPPORT STAFF), AND ANY OTHER EXPENSE THAT WOULD NOT HAVE BEEN INCURRED WITHOUT THIS COLLECTION OF INFORMATION. AGENCIES ALSO MAY AGGREGATE COST ESTIMATES FROM ITEMS 12, 13, AND 14 IN A SINGLE TABLE.


The total annualized cost of the SoI evaluation study is $380,127 based on a Government cost estimate developed by NASA staff. This estimate was developed using actual costs of past contracts for this type of evaluation work performed by the same contractor. Of this total cost, the estimated costs to the Federal Government for the data collection activities detailed in this PRA clearance package are $185,760. This estimate is based on the cost of administering, analyzing, and reporting on the parent and baseline/follow-up youth surveys; and scheduling, facilitating, transcribing, analyzing, and reporting the teacher focus group discussions.


A.15 EXPLAIN THE REASON FOR ANY PROGRAM CHANGES OR ADJUSTMENTS REPORTED IN ITEMS 13 OR 14 OF THE OMB FORM 83-1.


This data collection significantly reduces the respondent burden by narrowing the focus of the evaluation to 6th through 8th grade youth participating in stand-alone SoI camps with a minimum of 30 hours SoI content in a one-week camp. This data collection represents the third year of data collection related to the activities of the Summer of Innovation.


Exhibit 3. Program Change

Reason: Program change (change in frequency and method of collection; change in evaluation study focus and sampling strategy).
Previous Burden: 7,034 burden hours.
New Burden: 822.7 burden hours.
Difference: Reduction of 6,211.3 burden hours.



A.16 FOR COLLECTIONS OF INFORMATION WHOSE RESULTS WILL BE PUBLISHED, OUTLINE PLANS FOR TABULATION, AND PUBLICATION. ADDRESS ANY COMPLEX ANALYTICAL TECHNIQUES THAT WILL BE USED. PROVIDE THE TIME SCHEDULE FOR THE ENTIRE PROJECT, INCLUDING BEGINNING AND ENDING DATES OF THE COLLECTION OF INFORMATION, COMPLETION OF REPORT, PUBLICATION DATES, AND OTHER ACTIONS.


The schedule shown in Exhibit 4 displays the sequence of activities required to conduct the information collection and includes key dates for activities related to data collection, analysis, and reporting. Two evaluation reports based on findings from the surveys and implementation data will be prepared: the implementation study report will be completed following the completion of summer activities (by November 2013), and the outcome evaluation study report will follow analysis of the follow-up youth survey (by March 2014).


Exhibit 4. SoI Schedule

Each entry lists the activity or deliverable, the responsible party, the date, and the status as of December 19, 2012.

  • Kick-off meeting and consultation with SoI awardees about FY2013 data collection: NASA; November 2012; completed.

  • Modification of awardees’ statements of collaboration/agreements to ensure accountability for data collection: NASA; November - December 2012; in progress.

  • Assistance to awardees in securing local IRB approval for data collection: External evaluator; November 2012 - January 2013; in progress.

  • Finalization of evaluation plan: External evaluator; December 2012 - January 2013; in progress.

  • Review of detailed evaluation plan by evaluation experts: NASA; January 2013; not yet started.

  • Parent survey collection: External evaluator and site administrators; February - June 2013; not yet started.

  • Baseline youth survey collection: External evaluator and site administrators; May - August 2013; not yet started.

  • Follow-up youth survey collection: External evaluator; October 2013; not yet started.

  • Data analysis of baseline/follow-up youth and parent survey collections, including non-response bias analysis: External evaluator; December 2013 - May 2014; not yet started.

  • Implementation study report (deliverable): External evaluator; October - November 2013; not yet started.

  • Review of implementation study report by evaluation experts: NASA; November 2013; not yet started.

  • Revision of SoI program model based on implementation study findings: NASA; December 2013; not yet started.

  • Outcome evaluation report (deliverable): External evaluator; July 2014; not yet started.

  • Review of outcome evaluation report by evaluation experts: NASA; July 2014; not yet started.

  • Program recommendations for the NASA portfolio based on outcome evaluation findings, by evaluation experts in collaboration with NASA staff: NASA; August 2014; not yet started.


Analysis of Survey Data

The analysis plan for the survey data is summarized below; it is discussed in fuller detail in Supporting Statement B.


Descriptive Cross-Sectional Analyses

Because the entire universe of youth and parents will be surveyed, descriptive statistics for a single point in time do not need to be adjusted for sampling design. Means and standard deviations will be used to describe central tendency and variation for survey items using continuous scales. Frequency distributions and percentages will be used to summarize answers given on ordinal scales. Descriptive analyses across all awardees will be conducted on all youth and parent respondents, while descriptive analyses of youth and parents within a particular awardee will be restricted to respondents from that awardee.
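For illustration only, a minimal pandas sketch of these descriptive analyses is shown below; the column names and values are hypothetical toy data, not SoI records.

```python
# Hypothetical toy data standing in for survey records.
import pandas as pd

df = pd.DataFrame({
    "awardee": ["A", "A", "B", "B", "B"],
    "stem_interest_scale": [2.5, 3.0, 3.5, 4.0, 3.0],  # continuous scale item
    "science_interest_level": [1, 2, 2, 3, 3],         # ordinal item
})

# Means and standard deviations for continuous-scale items
print(df["stem_interest_scale"].agg(["mean", "std"]))

# Frequency distributions and percentages for ordinal items
counts = df["science_interest_level"].value_counts().sort_index()
print(pd.concat({"n": counts, "%": 100 * counts / counts.sum()}, axis=1))

# Awardee-level descriptives restricted to that awardee's own respondents
print(df.groupby("awardee")["stem_interest_scale"].agg(["mean", "std"]))
```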


Descriptive Change Over Time Analyses

The evaluation team will examine the youth survey data to provide simple descriptions of change in a variable over time. For the youth surveys, we will test whether the difference between the two time points is zero, using a McNemar test for proportions or a paired t-test for means, depending on the distribution of the outcome variable.
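A minimal sketch of these paired tests, using scipy and statsmodels on hypothetical toy records, is shown below; in practice, the baseline and follow-up survey files would be merged on a student identifier, and all variable names here are invented for illustration.

```python
import pandas as pd
from scipy.stats import ttest_rel
from statsmodels.stats.contingency_tables import mcnemar

# Hypothetical matched baseline (t0) and follow-up (t1) records.
df = pd.DataFrame({
    "interest_t0":     [2.0, 3.5, 3.0, 4.0, 2.5, 3.0],  # continuous scale score
    "interest_t1":     [3.0, 3.5, 3.5, 4.5, 3.0, 3.5],
    "participates_t0": [0, 1, 0, 1, 0, 0],               # binary: STEM activity
    "participates_t1": [1, 1, 0, 1, 1, 0],
})

# Paired t-test: is the mean change in the interest scale zero?
t, p = ttest_rel(df["interest_t1"], df["interest_t0"])
print(f"paired t = {t:.2f}, p = {p:.3f}")

# McNemar test: did the proportion participating in STEM activities change?
table = pd.crosstab(df["participates_t0"], df["participates_t1"])
print(mcnemar(table, exact=True))  # exact binomial test on discordant pairs
```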


Analysis of Focus Group Data

Analysis of the focus group data will be qualitative in nature. Transcripts from the focus group discussions will be coded using NVivo, a qualitative analysis software program that facilitates tagging and retrieval of data associated with selected themes, and content analyzed. The focus group data will allow us to address the implementation evaluation questions.


A.17 IF SEEKING APPROVAL TO NOT DISPLAY THE EXPIRATION DATE FOR OMB APPROVAL OF THE INFORMATION COLLECTION, EXPLAIN THE REASONS THAT DISPLAY WOULD BE INAPPROPRIATE.


The agency plans to display the expiration date for OMB approval of the information collection on all instruments.

A.18 EXPLAIN EACH EXCEPTION TO THE CERTIFICATION STATEMENT IDENTIFIED IN ITEM 19, "CERTIFICATION FOR PAPERWORK REDUCTION ACT SUBMISSIONS," OF OMB FORM 83-1.


The agency is able to certify compliance with all provisions under Item 19 of OMB Form 83-I.

References


Assessing Women and Men in Engineering. (Undated). STEM assessment tools. Retrieved October 20, 2012, from: http://www.engr.psu.edu/awe/.


Berry, S., Pevar, J., & Zander-Cotugno, M. (2008). Use of incentives in surveys supported by Federal grants: Paper presented at Council of Professional Associations on Federal Statistics seminar titled “Survey Respondent Incentives: Research and Practice.” March 10, 2008. Retrieved October 22, 2012, from: http://www.rand.org/content/dam/rand/pubs/working_papers/2008/RAND_WR590.pdf.

Church, A. (1993). Incentives in Mail Surveys: A Meta-Analysis. Public Opinion Quarterly 57(1):62-79.

Conrad, F.G., Couper, M.P., Tourangeau, R., and Peytchev, A. (2006). Use and non-use of clarification features in web surveys. Journal of Official Statistics 22, 245-269.


Couper, M. P. (2008). Designing Effective Web Surveys. Cambridge, England: Cambridge University Press.


Dillman, D. A., Smyth, J. D., & Christian, L. M. (2009). Internet, mail, and mixed-mode surveys: The tailored design method (3rd ed.). Hoboken, NJ: John Wiley & Sons.


Dillman, D. A. (undated), Token financial incentives and the reduction on nonresponse error in mail surveys. Retrieved October 22, 2012, from: http://sesrc.wsu.edu/pap_tok.htm.


Galesic, M., Tourangeau, R., Couper, M.P., and Conrad, F. (2008). Eye-tracking data: New insights on response order effects and other cognitive shortcuts in survey responding. Public Opinion Quarterly 72, 892-913.


Heerwegh, D. and Loosveldt, G. (2003). An evaluation of the semiautomatic login procedure to control web survey access. Social Science Computer Review 21, 223-234.


Hunt-White, T. (2007). The influence of selected factors on student survey participation and mode of completion. Retrieved December 19, 2012, from: http://www.fcsm.gov/07papers/Hunt-White.III-C.pdf.


James, J. M., & Bolstein, R. (1992). Large monetary incentives and their effect on mail survey response rates. Public Opinion Quarterly, 56, 442-453.


Kerachsky, S. J., & Mallar, C.D. (1981). The effects of monetary payments on survey responses: Experimental evidence from a longitudinal study of economically disadvantaged youth. Proceedings of the Section on Survey Research Methods, pp. 258-263. Alexandria, VA: American Statistical Association.


McGrath, J. (2006). An Incentives Experiment in the U.S. Consumer Expenditure Quarterly Survey. Retrieved December 19, 2012, from: http://www.bls.gov/ore/pdf/st060030.pdf.


National Center for Education Statistics. (Undated). National Assessment of Educational Progress: More about NAEP science. Retrieved October 25, 2012, from: http://nces.ed.gov/nationsreportcard/science/moreabout.asp.


National Center for Education Statistics. (Undated). High School Longitudinal Study of 2009 (HSLS: 2009): Questionnaires. Retrieved October 20, 2012, from: http://nces.ed.gov/surveys/hsls09/questionnaires.asp.


Peytchev, A., Couper, M.P., McCabe, S., & Crawford, S. (2006). Web survey design: Paging versus scrolling. Public Opinion Quarterly 70, 596-607.


Policy Studies Associates, Inc. (2012). 4-H Science Initiative: Youth engagement, attitudes, and knowledge study. Prepared for the National 4-H Council. Retrieved October 13, 2012, from: http://www.pearweb.org/atis/tools/62.


Redline, C. & Dillman, D. (2002). The Influence of alternative visual designs on respondents’ performance with branching questions in self-administered questionnaires. In R.M. Groves, D.A. Dillman, J.A. Eltinge, and R.J.A. Little (Eds.), Survey Nonresponse. New York: Wiley, 179-193.


Rosoff, P., Werner, C., Clipp, E.C., Buill, A.B., Bonner, M., & Demark-Wahnefried, W. (2005). Response rates to a mailed survey of childhood cancer survivors: A comparison of conditional versus unconditional incentives. Cancer Epidemiology, Biomarkers & Prevention, May 2005, 14.


Singer, E. (2002). The use of incentives to reduce nonresponse in household surveys. In R. M. Groves, D. A. Dillman, J. A. Eltinge, & R. J. A. Little (Eds.), Survey Nonresponse. New York: Wiley.


Singer, E., & Kulka, R.A. (2004). Paying respondents for survey participation. Retrieved December 19, 2012, from: http://aspe.hhs.gov/hsp/welf-res-data-issues02/pdf/04.pdf.


U.S. Department of Health and Human Services (HHS). (2006). Research-Based Web Design & Usability Guidelines. Washington D.C.: Government Printing Office.


1 A minimum dosage of 40 hours had been tentatively recommended by experts participating in the SoI Program Design Forum, convened by the NASA Office of Education on June 18-20, 2012. However, Forum participant and RAND researcher Dr. Jennifer McCombs pointed out that while research shows a link between dosage and achievement outcomes, it does not clearly specify the appropriate duration for summer programs. In NASA’s final recommendations on SoI program design submitted to OMB on August 31, a dosage of 30 hours over a one-week period was proposed, since 40 hours of content, or an average of 8 hours of instruction per day, is too much for the average middle school student, who is accustomed during the academic year to an instructional day averaging 6.8 hours, or 34 hours per week. NASA also recognizes that summer programs typically include other program content, including physical exercise. Source: The Center for Public Education (2006), Making time: Q&A. Retrieved October 28, 2012, from: http://www.centerforpubliceducation.org/Main-Menu/Organizing-a-school/Copy-of-Making-time-At-a-glance/Making-time-QA-.html.

2 See, for instance, Subpart A, Section 46.116 (General Requirements for Informed Consent) of the Code of Federal Regulations, Title 45, Public Welfare, Department of Health and Human Services, Part 46, Protection of Human Subjects. Retrieved January 22, 2013, from: http://www.hhs.gov/ohrp/humansubjects/guidance/45cfr46.html#46.116.

3 As described in greater depth in Part B, the parent survey collects information that will be used to create an enriched sampling frame for the non-response bias analysis including, but not limited to, demographic data.

4 Different approaches are used by awardees to register students for camps, so the evaluation design must be sensitive to local context. Some awardees, such as the Puerto Rico Institute of Robotics, utilize a centralized registration system, while camps administered by other awardees, particularly centers such as NASA Langley Research Center, coordinate their own registration apart from the awardee.

5 The count of unique parents/caregivers attending SoI events is a proxy indicator for engagement of parents/caregivers in SoI activities. The engagement of parents/caregivers will be introduced as a new project requirement for awardees in FY2013. This new requirement is based on research evidence that parent/caregiver involvement helps to support a student’s pursuit of a STEM career and formation of a STEM identity, thereby likely increasing the long-term impact of the intervention on the student with respect to STEM career aspirations. This assumption draws on research showing that parental encouragement and involvement in a student’s academic life is one of the most reliable predictors of whether a child will attend college and of sustained motivation and academic achievement. NASA’s observations of SoI awardees confirm that the higher-performing awardees have engaged parents/caregivers through planned events. Sources: Gibbs, K. D., & Dou, R. (2012). Evidence-based framework for the design & evaluation of Federal STEM engagement interventions. Arlington, VA: Commissioned Paper, National Science Foundation; Cabrera, A. F., & La Nasa, S. M. (2000). Understanding the college-choice process. New Directions for Institutional Research: 5-22; Choy, S. P. (2002). Access & persistence: Findings from 10 years of longitudinal research on students. Washington, DC: American Council on Education Center for Policy Analysis; Hossler, D., Schmidt, J., & Vesper, N. (1999). Going to college: How social, economic, and educational factors influence the decisions students make. Baltimore: The Johns Hopkins University Press; Swail, W. S., & Hosford, S. (2007). Missouri students and the pathway to college. Virginia Beach, VA: Educational Policy Institute; McDonough, P. M. (1997). Choosing colleges: How social class and schools structure opportunity. Albany: State University of New York Press.


6 Note: small differences in sums due to rounding.


