NASA 2013 Summer of Innovation Program

OMB: 2700-0150


SUPPORTING STATEMENT

FOR OMB CLEARANCE

PART B



NASA Summer of Innovation FY2012


2012 PROGRAM DATA COLLECTION





National Aeronautics and Space Administration




October 30, 2012

(Revised January 24, 2013)




Part B: Collection of Information Employing Statistical Methods


Introduction

The National Aeronautics and Space Administration (NASA) Office of Education requests that the Office of Management and Budget (OMB) approve, under the Paperwork Reduction Act of 1995, a clearance for NASA to collect parent survey, youth survey, and teacher focus group data as part of an implementation/outcome evaluation study of NASA’s Summer of Innovation (SoI) Project FY2013. The Summer of Innovation engages and supports external partners in delivering evidence-based summer STEM engagement opportunities to youth from underserved and underrepresented populations. The project is intended to increase youth interest and participation in STEM and to contribute to the national-level goal of increasing the number of high school graduates who pursue STEM majors and careers.


This clearance request pertains to information collection that will occur between February 15, 2013 and August 15, 2013. This information collection supports an implementation/outcome evaluation of NASA’s preferred SoI stand-alone camp program model.1 The implementation component of this evaluation is intended to provide information that will inform the continued improvement of this STEM education investment. The outcome evaluation is intended to address the primary evaluation question of how the SoI stand-alone camp experience affects youth engagement with STEM, through assessment of change in self-reported interest and participation data. While measuring outcomes at multiple points in time can provide evidence of whether the outcomes of interest change, it will not allow us to rule out the possibility that something other than the program is affecting this change. As such, we emphasize that this evaluation utilizes a non-experimental design and that its findings therefore cannot be causally linked to Summer of Innovation. However, initial findings from this work should suggest whether the preferred NASA SoI model engages student interest in STEM.


Part A of this clearance package describes the information collection activities for the FY2013 SoI evaluation that require Paperwork Reduction Act clearance, including the parent survey, the baseline and follow-up student surveys, and the teacher focus group protocol. The consent process has been revised to require active consent for inclusion in the evaluation as part of camp registration. Parents who do not sign and submit the consent form allowing inclusion in the study will not be eligible to register their children for SoI. The consent process has been reviewed by and is acceptable to the IRB for Abt Associates, which serves as the external evaluator.


Parent survey responses will also be solicited via the camp registration process from all parents of 6th through 8th grade youth participating in the purposive sample of SoI stand-alone camps. The parent survey provides demographic and other background information that will inform analyses and provide the data necessary to conduct non-response bias analysis. Baseline and follow-up surveys will be collected from the census of eligible youth (i.e., rising 6th through 8th graders) enrolled in the sample of awardee camps. The youth surveys will collect the data needed to answer the central research question of the outcome evaluation: How does the SoI experience in stand-alone camps affect youth engagement with STEM? Teacher focus group discussions will be facilitated by the evaluator during visits to participating awardee sites; information collected through this means will inform the implementation evaluation of SoI. All three sources of information will provide insights into how SoI is implemented.


The Summer of Innovation (SoI) information collection for FY2013 will involve a purposive sample of camps from a minimum of four national awardees (FY2011 cohort) and/or NASA centers (subject to funding review in February 2013) that meet the minimum program criteria of offering stand-alone SoI camps to rising 6th through 8th grade students with a minimum dosage of 30 hours of SoI content during a one-week period.2 The selection criteria for the respondent universe and the rationale for not using a sampling approach are outlined in B.1.


B.1 DESCRIBE (INCLUDING A NUMERICAL ESTIMATE) THE POTENTIAL RESPONDENT UNIVERSE AND ANY SAMPLING OR OTHER RESPONDENT SELECTION METHOD TO BE USED. DATA ON THE NUMBER OF ENTITIES (E.G., ESTABLISHMENTS, STATE AND LOCAL GOVERNMENT UNITS, HOUSEHOLDS, OR PERSONS) IN THE UNIVERSE COVERED BY THE COLLECTION AND IN THE CORRESPONDING SAMPLE ARE TO BE PROVIDED IN TABULAR FORM FOR THE UNIVERSE AS A WHOLE AND FOR EACH OF THE STRATA IN THE PROPOSED SAMPLE. INDICATE EXPECTED RESPONSE RATES FOR THE COLLECTION AS A WHOLE. IF THE COLLECTION HAD BEEN CONDUCTED PREVIOUSLY, INCLUDE THE ACTUAL RESPONSE RATE ACHIEVED DURING THE LAST COLLECTION.


This section addresses the potential respondent universe and outlines the selection criteria that define that universe. It also provides justification for the decision not to utilize a sampling strategy for the surveys. The numerical estimate for the respondent universe and the anticipated response rates are provided in tabular form in Exhibit 1. The projected unconditional response rate for the youth surveys is 42%. The actual response rates from the previous evaluations conducted in FY2010 and FY2011 are provided in the narrative.


Exhibit 1. FY2013 Data Collection to Be Analyzed Using Statistical Methods

| Instrument | Timing of Data Collection | Respondent Universe | Estimated Response Rate |
|---|---|---|---|
| Student Data | | | |
| Parent Survey | At time of registration (February – June 2013) | 2,343 | 85% |
| Youth Baseline Survey | Prior to start of SoI camp (May – August 2013) | 2,343 | 80% |
| Youth Follow-up Survey | Three months following the completion of camp (October 2013) | 2,343 | 52% |
| Teacher Data | | | |
| Teacher Focus Group Discussions | During camp implementation (May – August 2013) | 50 | 80% |
| Total Respondents | | 4,736 | |



Criteria to Define Respondent Universe. The potential respondent universe for an evaluation of Summer of Innovation consists of the students, parents of students, and teachers participating in Summer of Innovation camp experiences administered by the eight national awardees, the eight NASA centers, and the NASA Jet Propulsion Laboratory. According to the last evaluation study, which covered the FY2011 SoI project, the eight national awardees conducted over 50 summer camps, serving around 430 classrooms of middle school students between June and August 2011; overall, national awardees directly engaged 8,901 students. The centers and the Jet Propulsion Laboratory were not responsible for direct implementation of camps, but via agreements with 137 organizations they indirectly engaged 17,434 students during the same time period.


Criteria to Define Purposive Sample of Stand-alone Camps. Over the past year, NASA has invested in the development of themed camp guides that promote the stand-alone model. The week-long camp guides offer approximately 30-35 hours of SoI curriculum content featuring hands-on, problem-based activities in an appropriate learning progression. NASA is interested in evaluating the effectiveness of this stand-alone model, which holds promise as a summer engagement program model for middle school students that is replicable across the Federal government.


NASA intends to identify for the FY2013 evaluation a purposive sample of SoI stand-alone camps administered by awardees or NASA centers. The selected camps, all of which were previously funded in FY2012, currently implement the SoI stand-alone program model.3 The selection of the camps will be based on specific programmatic criteria. In order to participate in this study, these camps must:


  • Offer stand-alone SoI camp experiences, typically one week in length, which utilize NASA SoI curricula for a minimum of 30 hours during the camp; and

  • Target 6th through 8th grade students exclusively.


In selecting the purposive sample of camps, ensuring demographic and geographic diversity of the sample is a key criterion. In addition, because of the low response rates in the previous two SoI evaluations, conducted in FY2010 and FY2011, selected camps will require parent consent at the point of camp registration. Within the selected camps, all youth will be included in the evaluation. Requiring active parent consent will allow for selection of a diverse group of camps for the evaluation.


There are also logistical considerations in the final selection of camps. Since the evaluation includes site visits with camp observations and the teacher focus group discussion, cost-effectiveness in scheduling will be a consideration in camp selection. Recruitment for the study will be concentrated on camps administered by awardees with a large number of eligible camps, as it will be easier to obtain the full sample of youth within a limited number of awardee projects. By focusing efforts on camps in a limited number of awardee projects that have a significant number of camps, while still ensuring demographic and geographic diversity, NASA will also reduce travel costs associated with project management, data collection, and other activities.


Finally, the number of camps will be selected based on the number of youth required as determined by the power analysis (see discussion below).


Stand-Alone Camp Recruitment. The pool of potential camps was identified following an analysis of camp data from FY2012 and a series of discussions with awardee PIs and center POCs. Through these discussions, we supplemented our data on enrollment, student demographics, and previous survey response rates in our records with additional information, including the mode of registration (online versus paper), camp start and end dates, past challenges with survey administration, and camp organizational structure. Based on the selection criteria discussed above, NASA identified a sample of stand-alone camps administered by four awardees and centers, including:

  • NASA Glenn Research Center

  • NASA Langley Research Center

  • Puerto Rico Institute of Robotics Inc.

  • Rio Grande Valley Science Association


NASA Johnson Space Center and Chester County Intermediate School District were identified as alternatives.


Initial conversations have been held with each of the selected awardees as well as with the alternatives. Once the FY2013 camps have been identified by the awardees, NASA will contact each of the camp coordinators directly. We will request, in close collaboration with the awardee PIs/POCs, that camp coordinators provide us with a letter of intent to participate in the evaluation, which must be signed by the appropriate administrator (such as the superintendent or research director). An evaluation POC will be identified in each camp.


Sample Size. Data on the estimated number of respondents in the universe covered by the collection are provided in tabular form in Exhibit 1. Expected response rates for the collection are also provided. The success of the upcoming study requires that response rates for the study exceed those of previous SoI evaluation efforts, and the proposed design involves features intended to address issues of non-response that were faced by previous SoI evaluation efforts.


The evaluation proposes a sample size of 2,343 students for the evaluation of the SoI stand-alone program model. Power was calculated by first identifying the desired Minimum Detectable Effect (MDE). Although the design for FY13 will differ from the FY11 study, data from the FY11 evaluation provided reasonable expected pre- to post-SoI differences in outcomes of interest. While the pre-post difference for various outcomes was negligible (about zero), the effect size for career interest in science, a key outcome of SoI, was 0.36. Using this effect size as a starting point, and reasoning that the effect size may be somewhat smaller because the FY13 follow-up will be administered later than the FY11 follow-up,4 the study has been designed with the power to detect an MDE of 0.2, which represents a small to medium sized effect (Cohen, 1969).


Specifications and assumptions for the power calculation:

  • Design – Students within subgroups, blocked by camps

  • Significance level (alpha) = 0.05 (two-tailed)

  • The variance of effect size of the outcome across camps is zero (this assumption is consistent with a fixed effects model for the treatment variable).  

  • The proportion of variation in outcomes explained by camp-level covariates (reported symbolically as B) is approximately 0.05 (see footnote 5), and the proportion of variation explained by individual-level covariates (R-squared) is 0.2 (see footnote 6).

  • MDE = 0.2

  • Assume 60 students per camp.

Note that the study is powered to detect significant differences among subgroups, as the design takes into account student subgroups of interest (e.g., under-represented students), blocked within camps.7 The study is also designed to anticipate potential non-response. Thus, although an analytic sample of 900 has 80% power to detect an effect size of 0.2, the target sample size would allow the study to detect an effect of 0.2 despite non-response. Specifically, calculations assumed a response rate of 80% at baseline and an attrition rate of 48% between baseline and follow-up (corresponding to a follow-up response rate of 52%), for an unconditional response rate of 42%. These adjustments increased the projected sample size to 2,143. Further explanation for the response rate projection is provided later in this section.


Data from FY12 and projected counts for FY13 indicate that there will likely be wide variation in the number of students recruited to individual camp sites. In particular, the projected number of rising 6th to 8th grade students at sites that might reasonably be selected for the FY13 study ranges from 56 to 248. To account for this large variation, and the possibility that the camps where surveys are administered may be on the lower end of this spectrum, the target sample size was further increased by an additional 200 students, resulting in a final target sample size of 2,343 students.
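The arithmetic behind the target can be verified with a short calculation. The Python sketch below is illustrative only (it is not the evaluator's power-analysis code, and the variable names are ours); it uses the figures stated above: a 900-student analytic sample, the assumed 80% baseline and 52% follow-up response rates (rounded to a 42% unconditional rate), and the 200-student allowance for camp-size variation.

```python
# Illustrative check of the sample-size inflation described above; figures are
# taken from the text, and this is not the evaluator's actual power-analysis code.
import math

analytic_sample = 900        # completed baseline AND follow-up surveys needed for 80% power at MDE = 0.2
baseline_rate = 0.80         # assumed baseline response rate
followup_rate = 0.52         # assumed follow-up response rate (48% attrition)
unconditional_rate = 0.42    # rounded product used in the text (0.80 * 0.52, rounded to 42%)
buffer = 200                 # allowance for variation in camp enrollment

inflated = math.ceil(analytic_sample / unconditional_rate)  # 900 / 0.42 -> 2,143
target = inflated + buffer                                  # 2,143 + 200 -> 2,343
print(inflated, target)
```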


Response Rates. In determining the sample size for the study, we incorporated assumptions about response rates and attrition at different stages of data collection, and calculated a projected sample size that would allow us to detect an effect of 0.2 despite non-response. Specifically, our calculations assumed a youth survey response rate of 80% at baseline and an attrition rate of 48% (i.e., a follow-up response rate of 52%) between the baseline and follow-up youth surveys, for an unconditional response rate of 42%. To estimate the follow-up attrition rate of 48%, we relied on the FY2011 evaluation’s response rate to the second follow-up SoI youth survey, which was mailed to youth in March 2012, six months following the SoI experience. However, given the revised administration of instruments, the revised consent process, the dedicated external evaluation team member, the camp-level evaluation point of contact, and the inclusion of all grade-eligible students at a camp, we may achieve a higher response rate than in the previous evaluation. This assumed attrition rate is higher than that suggested by existing research evidence on response rates for mailed youth surveys.8


Since neither the parent survey nor the teacher focus group discussion was administered during previous SoI evaluations, no historical response rate information is available. However, we conservatively project an 85% response rate for the parent survey, largely due to its inclusion as a mandatory form in the camp registration package, but also because of the early administration of instruments and the incorporation of the survey into the camp registration process. The 80% response rate for the focus group discussion is based on early notification of awardees of this requirement, which will enable them to include participation in the focus group discussion in their teacher contracts.


Response rates will be further addressed in section B.3.



B.2. DESCRIBE THE PROCEDURES FOR THE COLLECTION OF INFORMATION INCLUDING:


- STATISTICAL METHODOLOGY FOR STRATIFICATION AND SAMPLE SELECTION;


- ESTIMATION PROCEDURE;


- DEGREE OF ACCURACY NEEDED FOR THE PURPOSE DESCRIBED IN THE JUSTIFICATION;


- UNUSUAL PROBLEMS REQUIRING SPECIALIZED SAMPLING PROCEDURES, AND


- ANY USE OF PERIODIC (LESS FREQUENT THAN ANNUAL) DATA COLLECTION CYCLES TO REDUCE BURDEN.



The data collection procedures and instruments were designed to capture information on the implementation of SoI at awardees and to investigate youth-related outcomes. Exhibit 1 in section B.1 outlines the data collection schedule to be implemented in the 2013 national evaluation.

This data collection will not use any statistical methodology for stratification, estimation, or sample selection because the youth and parent surveys will be administered to all participants meeting the grade-level criteria within the purposive sample of camps. Similarly, NASA anticipates inviting all lead teachers for participating camps administered by a single awardee to participate in the focus group discussions, eliminating the need for sampling. It should also be noted that, since this evaluation was last conducted in FY2011, NASA saved the public considerable burden by canceling last summer’s evaluation and instead using that time to plan a more effective evaluation. What follows is a summary of the procedures for the collection of information.


Surveys: Procedures for Data Collection. Prior to the start of the summer program, the external evaluator will obtain Institutional Review Board (IRB) approval for the modifications to the FY2013 study design and instruments. NASA has consulted with awardees and centers and made decisions collaboratively regarding the most appropriate data collection strategies. Specifically, NASA discussed the following data collection strategies:


  1. Timing of administration of the baseline student survey. In past SoI evaluations, the baseline student survey was administered during the first morning of the summer camp. While this approach has generally produced a strong response rate, provided the paper surveys were delivered to the awardee/center in sufficient time for administration, it can produce greater variability in response because camp instructors are responsible for administering the survey. OMB had recommended administering the baseline survey at the time of registration, with the survey (or survey link) given to the parent/caregiver. However, that approach also potentially widens the administration period from one week to several months and places a burden on camp administrators and instructors to follow up with individual students who may not have responded on the first day of camp. NASA met with the external evaluator and the participating awardees to fully discuss the pros and cons of the baseline student survey timing. Following this consultation, NASA made the decision to administer the youth surveys on the first day of camp. The primary reason for this choice is the stronger level of control camp administrators have to ensure that youth complete the surveys. NASA will continue to work closely with awardees and centers to ensure that response rates on the baseline student survey are high.


  2. Mode of administration for the parent survey and baseline youth survey. The parent survey will be administered during the registration process. Since awardees handle the registration process in different ways, NASA consulted with participating awardees and centers about the most appropriate mode for administration of these surveys. For at least one awardee using a centralized online registration system, an online parent survey is the most appropriate strategy for collecting survey data. However, the majority of awardees and centers requested a paper-based survey that could be included in their paper-based registration packets. Although there is a chance that mode effects may influence survey responses, NASA and its external evaluator believe that providing each awardee with the survey administration mode (online vs. paper) that is most closely aligned with its own registration process will improve response rates. Awardees and centers unanimously requested that the student survey be administered on paper, given the limited access to computers at many camp sites.


Following these decisions, the external evaluator, working under the oversight of the NASA Headquarters Evaluation Manager, will provide training to awardees’ PIs and evaluation points of contact9 to ensure rigorous and systematic data collection procedures. Throughout the program, the external evaluator will support the awardees and camps in their data collection efforts. In FY2013, the external evaluator will provide evaluation guidance to awardees in the form of a comprehensive evaluation manual available online and in hard copy.


Parent Survey


As part of the registration process, awardees will obtain active parent consent for study inclusion. Also during the registration process, parents will be asked to complete a short parent survey (Appendix 1). Completion of the parent survey will be a voluntary, but strongly encouraged, component of camp registration. The data provided by the parent survey will be used for non-response bias analysis and are therefore critical to receive. NASA will work closely with awardees to fully integrate the parent survey into local SoI registration packages.


Awardees will administer the parent survey (Appendix 1) to all parents as part of the registration process using either a paper or online format. Offering multiple survey modes will ease the burden on the awardees that collect the information, allowing each awardee to offer parents the most convenient mode. Awardees, especially those with online registration procedures, may opt to offer the parent survey online by providing registering parents with the survey URL and a site-specific PIN to gain access to the survey (see Appendix 9 for an example of the PIN and agreement-to-participate screens).

Each access to the online survey will create a new record, and existing records will not be accessible to the survey respondent. Respondents accessing the online survey will go through the survey vendor’s website, where they are protected by the vendor’s strict data security system. Only those given the PIN can enter the survey. Upon entering the PIN, the respondent will need to fill in and submit the survey; respondents will not be able to return to a submitted survey or save an incomplete survey. If they cannot complete the survey in one sitting, they will need to begin again. The PIN will only give respondents access to a single version of the survey; respondents will not have access to any other respondents’ surveys. The data collected through the online surveys will be automatically maintained on the survey vendor’s secure server and then safely transferred to the external evaluator. These data will not be accessible by awardee, center, or camp administrative staff. A link to the online survey via the survey vendor’s website will also be available on the NASA Summer of Innovation website. Data from the online survey will be collected, stored, and transferred in accordance with NASA’s privacy and security requirements.

The evaluation point of contact at the awardee will hold primary responsibility for ensuring administration and collection of the parent surveys; if registration is handled exclusively at the camp level, then the evaluation point of contact at the camp level will hold this responsibility as part of the registration process. Awardees that choose to administer the paper version will return the completed forms to the external evaluator for safe-keeping and data entry.


Given the descriptive nature of the information to be collected from parents, the use of simple descriptive statistics, such as counts, ranges, and frequency, in conjunction with content analytic methods, is most appropriate for these data sources in this evaluation.


Baseline and Follow-Up Youth Surveys


As discussed earlier, the baseline youth survey form will be available in paper format only. The survey will be administered by camp instructors to all enrolled youth meeting the grade level requirement in the sample of camps during the first day of the camp experience. An evaluation point of contact at the camp level will hold responsibility for ensuring administration and collection of the baseline youth surveys. Paper survey forms will be returned to the external evaluator for safe-keeping and data entry.


Follow-up surveys will be administered by mail in October 2013, approximately three months following the completion of camp. Because family mobility is expected through the start of the school year, in September the evaluation team will send, via first-class mail, a pre-notification letter to parents at the home addresses on file, reminding them that a survey for their child will arrive in October. Given the short time period between the follow-up survey reminder letter and the follow-up survey administration, only one reminder will be sent. In addition to the survey reminder, the first-class letter will contain a pre-paid postcard addressed to the external evaluator requesting any updated contact information. In addition to any postcards returned to the external evaluator, updated addresses will be obtained through letters returned by the U.S. Postal Service with forwarding addresses. Further, the external evaluator will use the Lexis-Nexis database, which provides access to public records to verify information, to update home addresses. Follow-up student surveys will be mailed to the home address on file in October, with a postage-paid return envelope. Follow-up efforts to increase response rates include up to three phone calls to encourage non-responders to complete their surveys and mailing another copy of the survey to non-responders.


For all returned youth surveys, the external evaluator will enter the student survey data into an electronic datafile. Open-ended responses will first be entered verbatim and will then be coded into categories that describe the type of response that was provided.


The external evaluator will conduct analyses to provide descriptive statistics (e.g., proportions and averages) of student data across all camps. In addition, the external evaluator will describe change in each variable over time by comparing baseline with follow-up survey results. Statistical tests will be run to assess whether the difference in proportions and/or means between the two time points is zero. To do so, the evaluator will use a McNemar test or a paired t-test, depending on the distribution of the outcome variable. Further, survey data will also be integrated with the quantitative implementation data to conduct correlational analyses of camp characteristics and program quality against student attitudes and behaviors.
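The sketch below illustrates, under stated assumptions, how the paired t-test and McNemar test described above could be run in Python. It assumes a pandas DataFrame df with one row per student who completed both surveys; the column names are illustrative and are not the actual survey variable names.

```python
# A minimal sketch of the pre/post comparisons described above (illustrative only).
import pandas as pd
from scipy import stats
from statsmodels.stats.contingency_tables import mcnemar

def paired_change_tests(df: pd.DataFrame) -> dict:
    # Continuous outcome (e.g., a science-interest scale score):
    # paired t-test of whether the mean baseline-to-follow-up change is zero.
    t_res = stats.ttest_rel(df["science_interest_post"], df["science_interest_pre"])

    # Binary outcome (e.g., participated in a STEM activity, coded 0/1):
    # McNemar's test on the 2x2 table of baseline vs. follow-up responses.
    table = pd.crosstab(df["stem_activity_pre"], df["stem_activity_post"])
    m_res = mcnemar(table.values, exact=True)

    return {
        "paired_t": (t_res.statistic, t_res.pvalue),
        "mcnemar": (m_res.statistic, m_res.pvalue),
    }
```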


Teacher Focus Group Discussion: Procedures for Data Collection


The external evaluator will work closely with the participating awardees to issue an invitation to participate in a focus group discussion to all lead camp teachers from participating camps administered by each participating awardee. The emailed invitation will include, in part, the consent script included in Appendix 8. The number of lead teachers has been estimated at 10 per awardee. As noted in Exhibit 1, we anticipate a response rate of 80% for this data collection. Teachers will not be directly compensated by NASA for their participation. However, support for NASA data collection will be part of awardees’ teacher contracts. Awardees have already been notified of this data collection requirement.


The focus group discussion will be facilitated on-site as part of a regularly scheduled evaluation site visit. The awardee will be responsible for coordinating the logistics of the focus group discussion, including providing an appropriate location. Representatives from the awardee administration and NASA will not be permitted to observe the discussion. The external evaluator, with the permission of the participating teachers, will audio-record the focus group discussion and produce a transcript for analysis. Content analysis will be supported using qualitative data analysis software (NVivo), with themes and sub-themes identified using the original questions as guiding categories.


There will not be any use of periodic data collection cycles to reduce burden since a one-time administration of all forms is anticipated.



B.3 DESCRIBE METHODS TO MAXIMIZE RESPONSE RATES AND TO DEAL WITH ISSUES OF NON-RESPONSE. THE ACCURACY AND RELIABILITY OF INFORMATION COLLECTED MUST BE SHOWN TO BE ADEQUATE FOR INTENDED USES. FOR COLLECTIONS BASED ON SAMPLING, A SPECIAL JUSTIFICATION MUST BE PROVIDED FOR ANY COLLECTION THAT WILL NOT YIELD "RELIABLE" DATA THAT CAN BE GENERALIZED TO THE UNIVERSE STUDIED.


As reported in B.1, response rates for parent consent and student surveys were all low for the national evaluation of summer 2011. Multiple factors prevented the return of materials, including the brief amount of time available for planning between the award announcement and program implementation, the late delivery of data collection instruments to awardees and centers, delayed access to SoI funding, and a lack of clarity and prescriptiveness regarding evaluation responsibilities and requirements.


The current evaluation design has been informed by lessons from previous SoI evaluation efforts. Importantly, changes have been made for FY2013 to address some of the hurdles to data collection that were encountered in previous efforts and reflected in lower than expected response rates. Some of the key changes include:



  1. Consulting with participating awardees about locally appropriate data collection strategies prior to finalizing the data collection plan, including providing paper or online versions of the parent survey if identified as appropriate by NASA and the awardee/center (already completed);

  2. providing funding to awardees no later than February 2013 (in progress);

  3. revising the awardees’ statements of collaboration and center agreements to require a compliance strategy supporting NASA performance reporting requirements and evaluation (in progress);

  4. distributing the parent consent forms and parent surveys to awardees/centers by mid-February so that they can include them in registration materials;

  5. revising activities and timeline for PRA package and OMB approval so that approval is received prior to when camps begin recruitment (in progress);

  6. identifying an evaluation point of contact at each awardee and camp;

  7. assigning a designated external evaluation team member for each awardee and camp;

  8. conducting a webinar for awardee/center and camp evaluation points of contact when the evaluation materials are distributed to review data collection processes, reiterate grant requirements regarding the evaluation, and emphasize the importance of collecting the baseline survey before the start of SoI programming;

  9. stipulating that active parent consent for study inclusion is mandatory for all SoI camp registrants, making camp participation contingent on agreement to be included in the study;

  10. developing an evaluation manual that will outline the study and data collection responsibilities and processes for awardee/Centers and camps;

  11. close monitoring of evaluation activities and progress toward data collection goals at monthly meetings between NASA and participating awardee PIs (note that meeting frequency will increase to bi-weekly during active camp season);

  12. actively collaborating with the camp evaluation point of contact leading up to and during the administration and return of student baseline surveys; and

  13. sending a pre-notification mailing and making up to three follow-up calls to encourage parents to have youth complete the follow-up survey;

  14. surveying all 6th to 8th grade students in a camp, instead of a sample;

  15. updating the SoI website to include comprehensive evaluation information and materials for increased accessibility;

  16. providing a toll-free number that participants can call to ask questions and verify the legitimacy of the evaluation.

The projected improvement in response rates for the parent surveys is largely due to the early administration of data collection instruments and the inclusion of the surveys as part of the registration process. The estimated responses are projected based on experience with SoI as well as a similar evaluation that Abt Associates completed of NASA’s Science, Engineering, Mathematics and Aerospace Academy (SEMAA) program. SEMAA is a science enrichment program that targets students in kindergarten through 12th grade. SEMAA activities were designed to be implemented in three-hour sessions on Saturday mornings for five to eight consecutive weeks. The SEMAA evaluation used a similar process of including the parent surveys in the application process; that study achieved 91 percent consent and return of baseline parent surveys.10


The projected improvement for the student baseline surveys is a result of several of these features, namely the early administration of data collection instruments, the revised consent process, the dedicated evaluation team member per camp, the camp-level evaluation point of contact, and the inclusion of all grade-eligible students at a camp. First, administration of the baseline student survey is not conditional on having a completed parent survey, and all parents will provide consent for student inclusion in the study at the point of registration. Thus, on the first day of camp, surveys can be administered to all youth in the targeted study grades. Second, limiting the responsibilities of each external evaluation team member to a single awardee/center and its selected camps will allow for more constant communication between the awardees/centers, their selected camps, and the evaluation team, and will likely result in more productive and collaborative relationships; in the past, one or two evaluation team members served as the primary contacts for all awardees/centers and their camps. Third, a designated camp-level evaluation point of contact will have been trained on the baseline survey administration procedure, and this person will be responsible for returning the completed surveys to the evaluation team; in the past, this responsibility fell to the PI, who was not necessarily on site. Fourth, in previous evaluation efforts, classrooms within camps were sampled for inclusion in the survey administration, which sometimes led to confusion about which students should be surveyed and resulted in some targeted students not being surveyed. This year, all grade-eligible students will be surveyed at selected camps.


Non-Response Bias


With these revised operations, and given that parents will have the opportunity to complete the parent surveys prior to camp start, NASA expects to achieve a response rate of 80% or higher for the baseline youth surveys and the parent surveys. It is very difficult to estimate the response rate to the youth follow-up survey, since few previous studies have attempted to gather data from adolescents through the mail. As such, we relied on the FY2011 evaluation’s response rate to the second follow-up SoI youth survey, which was mailed to youth in March 2012, six months following the SoI experience. This resulted in an estimated follow-up youth survey response rate of 52%.


As explained more thoroughly in Section B.1, NASA’s most recent mail follow-up survey of SoI youth, conducted in March 2012 six months following the SoI experience, yielded a 52% response rate for the survey, and an unconditional response rate of 24%.


Given this projected response rate, non-response may pose a problem in our analyses because it can introduce bias into our population estimates. Bias occurs if the youth who refuse to participate or leave the study have different characteristics and/or would give systematically different responses to the survey (had they responded) than the youth who complete the surveys. Low response rates do not guarantee a biased estimate, as the decision not to participate or to leave the study could be completely unrelated to survey answers. Since it is anticipated that non-response will be an issue with the follow-up survey, the external evaluator will conduct a non-response bias analysis on that administration regardless of the response rate.


Student Non-Response


NASA has expanded its plan for addressing potential non-response bias to include a third step that involves a non-response bias study. While low response rates alone do not guarantee a biased estimate, as the decision not to participate or to leave the study could be completely unrelated to survey answers, NASA will examine the bias in estimates due to non-response to either youth survey by following the three steps described below.

1. Examination of Response Rates. The first step will be to monitor the overall response rate and the response rate by relevant subgroups (e.g., by grade level or camp). High response rates (over 85 percent) for the entire sample as well as for subgroups might indicate no need for further analysis of bias due to non-response. Large differences in the response rates by strata and for subgroups serve as indicators that potential biases may exist. For example, if the response rate for an important subgroup is very low, then any difference in the characteristic of interest between this subgroup and other subgroups would result in a bias in the estimates. From the survey results we will examine whether there are differences in characteristics across subgroups, especially in strata where the response rate is low.


In order to conduct this comparison, the external evaluator will compare returned surveys to the camp registration lists provided by the awardees. All awardees have agreed to share registration list information as a condition of their participation in the evaluation.
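As a sketch of how this monitoring might be implemented (assuming the registration lists and returned surveys are available as pandas DataFrames sharing a student identifier; column and identifier names are illustrative, not the evaluator's actual data structures):

```python
# Illustrative response-rate monitoring for Step 1; not the evaluator's actual code.
import pandas as pd

def response_rates(registration: pd.DataFrame, returned: pd.DataFrame,
                   id_col: str = "student_id",
                   groups: tuple = ("camp", "grade_level")) -> tuple:
    reg = registration.copy()
    # Flag each registered student as a responder if a returned survey matches.
    reg["responded"] = reg[id_col].isin(returned[id_col])

    overall = reg["responded"].mean()                                    # overall response rate
    by_group = {g: reg.groupby(g)["responded"].mean() for g in groups}   # subgroup rates
    return overall, by_group
```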


2. Non-Response Propensity Model. Should the response rate fall below 85 percent, we will construct a propensity model to estimate, for both responding and non-responding students, the probability of responding to the survey; this probability is called a propensity score. The estimated propensity scores come from a logistic regression model based on variables that are available for both non-responding and responding students. Students will be grouped using the estimated propensity scores, and within each group we will compare the frame characteristics of responding and non-responding students. In addition to assessing the bias, this grouping will also provide a method of forming weighting classes for adjusting the weights of responding students to reduce the bias due to non-response.


NASA intends to use an enriched sampling frame for this analysis. In addition to using basic demographic data on youth available through the parent survey data and camp-level registration lists (i.e., grade level, gender, race, and ethnicity), questions have been included on the parent survey that collect information on variables identified in the research literature as predictors of youth interest in science and/or STEM. The following questions on the parent survey will be used as variables in the enriched sampling frame:


  1. Do you have a degree in a science, technology, engineering, or mathematics field?  

Yes No I don’t know

  2. Do you work in a science, technology, engineering, or mathematics-related occupation?

Yes No I don’t know


  3. During the last 12 months, has your child participated in any of the following activities outside of school? Check all that apply.

 Science club

 Science competition

 Science camp

 Science study groups or a program where your child was tutored in science

 Visiting a science museum, planetarium, or environmental center

 Reading science books and magazines

 Accessing web sites for computer technology information

 Playing games or using kits or materials to do experiments or build things at home

 Watching programs on TV about nature and discoveries

 None of these


The first two questions collect information on whether the parent has an academic degree or works in a STEM field. Research has shown that parental encouragement and involvement in a student’s academic life is one of the most reliable predictors of whether a child will attend college (Cabrera & La Nasa, 2000; Hossler, Schmidt & Vesper, 1999; Swail & Hosford, 2007) and of sustained motivation and academic achievement (McDonough, 1997). Other research has demonstrated that identification with a parent’s occupation is particularly strong among children (Trice, Hughes, et al., 1995) and that parents provide their children with valuable learning experiences about careers through their own role models and through teaching career-related skills that give youth a broader understanding of their own aptitudes, contributing to career choice (Ferry, 2006). Finally, there is some evidence that parent occupation is directly correlated with student interest in science (Tripney, Newman, et al., 2010). Therefore, these two questions can provide information that may predict a student’s interest in science.


The third question collects information about the youth’s participation in science activities outside of school. There is growing research evidence that involvement in non-school science activities is what best differentiates students with low and high interest in science. In a 2010 study by Bulunuz and Jarrett, the most frequently mentioned activities were visits to science museums, nature centers, zoos, and aquaria. Also mentioned frequently were home-related activities such as playing with science kits, making science collections, taking things apart, and watching science programs on TV. Autobiographical studies of eminent scientists (Kegan, 1989; Shepard, 1988) and research on university science professors (Jarrett & Burnley, 2007; Rowsey, 1997) indicate that out-of-school science activities have a strong influence on selecting science as a career. These out-of-school science experiences are likely to be highly dependent on parental support and encouragement. The degree to which a student is involved in other science-related activities can therefore serve as a useful predictor of their attitude toward science.


To statistically adjust for student non-response, the external evaluator will alter the models to include weights that compensate for the missing data from non-responders. These weights will be derived from estimates of propensity scores, defined here as the probability of being a complete case (i.e., a responder to multiple survey waves) given a respondent’s demographic characteristics. The estimates will be derived from a logistic regression predicting whether or not a student responds to both the first and the second survey based on his/her demographic variable values. For students who responded to the first but not the second survey wave (i.e., partial responders), estimated probabilities can be obtained from the logistic regression, and multiplying these estimated probabilities by one minus the proportion of non-responders gives estimates of the propensity scores. Weights derived from these propensity score estimates can be used to prevent biased data analyses if (i) data from non-responders are missing “completely at random,” (ii) non-response on a single survey only is missing “at random” with respect to the demographic variables, and (iii) the logistic regression model is correct. While, in practice, it is unlikely that these assumptions hold strictly, if the non-response rate is relatively low, they are sufficiently plausible that weights based on them will have some value in limiting bias due to non-response.


Under these assumptions, weights equal to the reciprocals of the estimated propensity scores can be used in complete-case data analyses to produce approximately unbiased results (e.g., performing weighted t-tests on continuous outcomes). However, the presence of observations with large weights (i.e., reciprocals of very small propensity scores) may result in estimates with high variability. It is therefore often useful to “trade off” some bias for a reduction in variance by developing weighting classes based on the estimated propensity scores of complete cases and of students who responded to only one survey. All of these students are sorted by their estimated propensity scores, and the sorted list is partitioned into quintiles. Each quintile constitutes a weighting class, and all students in a weighting class are assigned the same weight, namely, the reciprocal of the proportion of complete cases in the weighting class.
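The following Python sketch illustrates the weighting-class approach described above. It assumes a DataFrame with one row per baseline respondent, illustrative demographic covariate names, and a 0/1 indicator for whether the student is a complete case (responded to both waves); it is a sketch under these assumptions, not the evaluator's implementation.

```python
# Illustrative propensity-score weighting classes; not the evaluator's actual code.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def weighting_class_weights(df: pd.DataFrame, covariates: list,
                            complete_col: str = "complete",
                            n_classes: int = 5) -> pd.DataFrame:
    # Logistic regression predicting complete-case status from demographics.
    X = pd.get_dummies(df[covariates], drop_first=True)
    y = df[complete_col].astype(int)
    model = LogisticRegression(max_iter=1000).fit(X, y)

    out = df.copy()
    out["pscore"] = model.predict_proba(X)[:, 1]

    # Sort students by estimated propensity score and cut into quintiles;
    # each quintile is a weighting class.
    out["pclass"] = pd.qcut(out["pscore"], q=n_classes, labels=False, duplicates="drop")

    # Weight for complete cases = reciprocal of the proportion of complete
    # cases in their weighting class; non-complete cases get no analysis weight.
    class_rates = out.groupby("pclass")[complete_col].mean()
    out["weight"] = np.where(out[complete_col] == 1,
                             1.0 / out["pclass"].map(class_rates),
                             0.0)
    return out
```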


3. Student Non-Response Study. Given the likelihood that final response rates will fall below the acceptable rate, NASA is prepared to conduct a targeted study of non-responders. The intent of the student non-response study is to select a sample of non-responding students and engage in an intensive effort to locate them and obtain responses from them on a small subset of items from the original survey. The subset of items would be those that collect information on attitudes toward and participation in science.


Intensive tracking methods for the non-response study include:

  • Mailing a first-class notification letter to the parents selected for the non-response study at their home addresses on file in December 2013. This letter will notify them of their inclusion in the sub-study, which will include participation in the shortened student survey during the spring of 2014, and will explain to the parent the importance of their child’s response to the shortened follow-up student survey.

  • The first-class letter will contain a pre-paid postcard addressed to the external evaluator requesting any updated contact information.

  • In addition to any postcards returned to the external evaluator, updated addresses will be obtained through letters returned by the U.S. Postal Service with forwarding addresses.

  • Mailing a second first-class notification letter and address-update-postcard to parents in February 2014.

  • Using the Lexis-Nexis database, which provides access to public records to verify information, to update home addresses prior to each mailing (advance notification letters and the survey).

  • For awardees/partners that maintain year-round contact with students (e.g., through year-round programs such as 21st Century Community Learning Center), updated addresses for non-responding students will be requested.


The non-response study would employ administration via paper and phone to maximize responses. The survey will be mailed, and non-respondents will be called, in March 2014. Further, follow-up efforts to increase response rates would include up to three phone calls to administer the survey over the phone and a second mailing of the survey to non-responders.

Sample Size Determination

The substudy is designed to estimate the average score of attitudes toward science (µ) of non-responding students in the non-response survey, with a given precision (d). In the previous SoI study, the mean and standard deviation of the attitudes-toward-science score at follow-up were M = 3.47 and σ = 0.82. Assuming a simple random sample and a 95% confidence level, the margin of error of the sample mean is d = z(σ/√n), where z is the critical value associated with a 95% confidence level. The plus-or-minus quantity .06 is the margin of error of the sample mean associated with a 95% confidence level (i.e., the confidence that µ is within .06 of the sample mean 3.47); d denotes the desired margin of error for the substudy. Solving d = z(σ/√n) for n gives n = (zσ/d)², which is used to calculate the sample size.

Because of the resources necessary to conduct the non-responder substudy, the margin of error for the study will be +/- .10. To estimate the mean with a margin of error of +/- .10, we need an analysis sample of 269. Although rigorous efforts will be undertaken to obtain responses on the selected items from this group, the response rate may be low. Thus, designing for a response rate of 50 percent, the initial target sample for the substudy would be 538.
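A worked check of this calculation follows as a sketch; the exact critical value used in the original computation is an assumption, and the comparison with z = 1.96 is shown only for reference.

```python
# Illustrative check of n = (z * sigma / d)^2 for the non-response substudy.
import math

sigma = 0.82   # SD of the attitudes-toward-science score from the previous SoI study
d = 0.10       # desired margin of error for the substudy

n_z196 = math.ceil((1.96 * sigma / d) ** 2)  # about 259 with the exact 95% critical value
n_z2 = math.ceil((2.0 * sigma / d) ** 2)     # 269 with z rounded to 2, matching the text
target = math.ceil(n_z2 / 0.50)              # 538 when designing for a 50% response rate
print(n_z196, n_z2, target)
```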


B. 4 DESCRIBE ANY TESTS OF PROCEDURES OR METHODS TO BE UNDERTAKEN. TESTING IS ENCOURAGED AS AN EFFECTIVE MEANS OF REFINING COLLECTIONS OF INFORMATION TO MINIMIZE BURDEN AND IMPROVE UTILITY. TESTS MUST BE APPROVED IF THEY CALL FOR ANSWERS TO IDENTICAL QUESTIONS FROM 10 OR MORE RESPONDENTS. A PROPOSED TEST OR SET OF TESTS MAY BE SUBMITTED FOR APPROVAL SEPARATELY OR IN COMBINATION WITH THE MAIN COLLECTION OF INFORMATION.


Survey development and procedures were tested and refined as follows. The 2010 pilot surveys were fielded in summer 2010, revised in fall 2010, and updated in winter 2010 to measure outcomes of interest in FY2011. Because the SoI project has administered parent and student surveys for two consecutive years, no further field testing was conducted.


For all surveys included in this clearance package, existing question items or validated item scales were selected after an extensive literature review and consultation with experts. Given that the student and parent surveys were developed using validated question items and scales that have been utilized in other national studies, no cognitive testing was completed. The parent surveys were piloted during October 2012 with 6 adults and the student surveys with 6 middle school students to assess any adapted question items for comprehensibility and to estimate time for completion. Estimated times for completion were adjusted based on these tests.


B.5 PROVIDE THE NAME AND TELEPHONE NUMBER OF INDIVIDUALS CONSULTED ON STATISTICAL ASPECTS OF THE DESIGN AND THE NAME OF THE AGENCY UNIT, CONTRACTOR(S), GRANTEE(S), OR OTHER PERSON(S) WHO WILL ACTUALLY COLLECT AND/OR ANALYZE THE INFORMATION FOR THE AGENCY.


The plans for statistical analyses for this study were primarily developed by NASA staff. Laura LoGerfo, the Project Officer for the High School Longitudinal Study of 2009 at the U.S. Department of Education’s National Center for Education Statistics, reviewed Part B of this application. Alina Martinez of Abt Associates provided information for the sub-sections on response rates and the non-response bias analysis.


The Office of Education Infrastructure Division (OEID) was responsible for developing this clearance package in consultation with Abt Associates of Cambridge, Massachusetts. OEID will provide oversight of the evaluation study. The data collection and analysis will be conducted by Abt Associates.



REFERENCES


Bulunuz, M. & Jarrett, O. (2010). Developing an interest in science: Background experiences of preservice elementary teachers. International Journal of Environmental and Science Education, 5(1), 65-84.


Cabrera, A. F., & La Nasa, S. M. (2000). Understanding the college-choice process. New Directions for Institutional Research, 5-22.

Choy, S. P. (2002). Access & persistence: Findings from 10 years of longitudinal research on students. Washington, DC: American Council on Education Center for Policy Analysis.


Cohen, J. (1969). Statistical power analysis for the behavioral sciences. Hillsdale, NJ: Lawrence Erlbaum Associates.

Dillman, D. A. (2000). Mail and Internet surveys: The tailored design method (2nd ed.). New York: John Wiley.


Ferry, N. (2006). Factors influencing career choices of adolescents and young adults in rural Pennsylvania. Journal of Extension 44, 3 (June 2006). Retrieved January 3, 2013, from: http://www.joe.org/joe/2006june/rb7.php.


Hossler, D., Schmidt, J. & Vesper, N. (1999). Going to college: How social, economic, and educational factors influence the decisions students make. Baltimore: The Johns Hopkins University Press.


Jarrett, O.S., & Burnley, P.C. (2007). The role of fun, playfulness, and creativity in science: Lessons from geoscientists. In D. Sluss & O. Jarrett (Eds.) Investigating play in the 21st Century: Play and Culture Studies, Vol. 7 (pp. 188-202). Lanham, MD: University Press of America.


Kegan, T. R. (1989). How Charles Darwin became a psychologist. In D. B. Wallace & H. E. Gruber (Eds.), Creative people at work: Twelve cognitive case studies (pp. 107-126). New York: Ox-ford University Press.


L’Engle, K.L., Pardun, C., & Brown, J. (2004). Accessing adolescents: A school-recruited, home-based approach to conducting media and health research. Journal of Early Adolescence, 24, 2, 144-158. Retrieved January 2, 2013, from: http://www.unc.edu/depts/jomc/teenmedia/pdf/Accessing.pdf.


McDonough, P. M. (1997). Choosing colleges: How social class and schools structure opportunity. Albany: State University of New York Press.


Rowsey, R.E. (1997). The effects of teachers and schooling on the vocational choice of university research scientists. School Science and Mathematics, 97(1), 20-27.


Shepard, R. (1988). The imagination of the scientist. In K. Egan & D. Nadaner (Eds.), Imagination across the curriculum (pp.153-185). New York: Teachers College Press.


Swail, W. S. & Hosford, S. (2007). Missouri students and the pathway to college. Virginia Beach, VA: Educational Policy Institute.


Trice, A. D., Hughes, M. A., Odom, C., Woods, K., & McClellan, N. C. (1995). The origins of children's career aspirations: IV. Testing hypotheses from four theories. The Career Development Quarterly, 43(4), 307-322.


Tripney, J., Newman, M., Bangpan, M., Niza, C., MacKintosh, M., & Sinclair, J. (2010). Factors influencing young people (aged 14-19) in education about STEM subject choices: A systematic review of the UK literature. London: EPPI-Centre, University of London.

1 The evaluation will focus on awardees utilizing NASA’s preferred stand-alone camp program model, which provides a minimum dosage of 30 hours of SoI content during a one-week period. OMB also requested that the evaluation focus on a middle school audience (6th through 8th grade), although SoI is offered to youth ranging from 4th through 9th grade.

2 A minimum dosage of 40 hours had been tentatively recommended by experts participating in the SoI Program Design Forum, convened by the NASA Office of Education on June 18-20, 2012. However, Forum participant and RAND researcher Dr. Jennifer McCombs pointed out that while research shows a link between dosage and achievement outcomes, it does not clearly specify the appropriate duration for summer programs. In NASA’s final recommendations on SoI program design, submitted to OMB on August 31, a dosage of 30 hours over a one-week period was proposed, since 40 hours of content (an average of 8 hours of instruction per day) is too much for the average middle school student, who is accustomed to an instructional day averaging 6.8 hours during the academic year, or about 34 hours per week. NASA also recognizes that summer programs typically include other program content, including physical exercise. Source: The Center for Public Education (2006), Making time: Q&A. Retrieved October 28, 2012, from: http://www.centerforpubliceducation.org/Main-Menu/Organizing-a-school/Copy-of-Making-time-At-a-glance/Making-time-QA-.html.

3 SoI awardees and NASA centers typically implemented three basic approaches for engaging students in the summer. They:


1. Created new programs or substantially bolstered their own preexisting program: this is commonly referred to as the stand-alone SoI model;


2. Embedded the SoI content in partner programs or included SoI in their own program; or


3. Used a dual approach, embedding SoI into their partners’ pre-existing programs while also holding some stand-alone summer camps.


Overall, the embedded approach is more strongly used in Center activities and the stand-alone model is more strongly used in Awardee activities.



4 The FY11 follow-up survey was administered on the last day of camp, while the FY13 follow-up surveys will be administered about 3 months following involvement in SoI programming.

5 B, the proportion of variance explained by camp-level covariates, was estimated using data from the FY2011 evaluation. The following formula was used: B = σ²_B / (σ²_B + σ²_W), where σ²_B represents the variance between camps and σ²_W represents the variance within camps.

6 R², the proportion of variance explained by the individual-level predictors, was calculated using data from the FY2011 evaluation. The following formula was used: R² = (τ²_unconditional - τ²_full) / τ²_unconditional, where τ²_full represents the camp-level variance in the full model (i.e., the model that includes individual characteristics) and τ²_unconditional represents the camp-level variance in the unconditional model (i.e., the model that does not include individual characteristics).

7 As compared to powering the study based on a simple paired t-test design, which does not take into account student subgroups.

8 Although there is a sizeable literature on conducting mail surveys of adults (see, for example, Dillman, 2000), few previous studies have attempted to gather data from adolescents through the mail (L’Engle, Pardun & Brown, 2004). In one of the few studies conducted on this topic, the L’Engle, Pardun & Brown study (2004), the initial mailed survey generated a response rate of 40%. Additional contact similar to what is proposed for SoI raised the final response rate to 65% (i.e., 35% attrition).

9 Evaluation points of contact are awardee and camp staff designated with responsibility for coordinating the NASA data collection requirements and administering the parent survey and baseline student survey.

10 Martinez, A., Cosentino, C., Smith, W.C., Maree, K., Parsad, A., Shlager, C., Cook, D., Tsui, D., & Levy, A.J. (2010). The national evaluation of NASA’s Science, Engineering, Mathematics and Aerospace Academy (SEMAA) Program. Prepared for the National Aeronautics and Space Administration (NASA). Cambridge, MA: Abt Associates.


