Abt Associates Inc. 55 Wheeler Street Cambridge, MA 02138
Evaluation of NASA’s Science, Engineering, Mathematics, and Aerospace Academy (SEMAA)
January 5, 2009
Prepared for
National Aeronautics and Space Administration
300 E Street SW
Washington, DC, 20546
Project Officer:
Bernice Alston
Prepared by
Alina Martinez, Carter Smith
Abt Associates Inc.
55 Wheeler St.
Cambridge, MA 02138
Clemencia Cosentino de Cohen
The Urban Institute
2100 M Street, NW
Washington, DC 20037
Project Director:
Alina Martinez
Supporting Statement B for
Evaluation of the NASA SEMAA Project
Name: Walter Kit, NASA Clearance Officer,
Address: NASA Headquarters
300 E Street, SW., JE0000
Washington, DC 20546
Telephone: (202) 358–1350
Fax:
Email: [email protected]
Table of Contents

B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
B.1 Respondent Universe and Sampling Methods
B.2 Procedures for the Collection of Information
B.3 Methods to Maximize Response Rates and Deal with Nonresponse
B.4 Test of Procedures or Methods to be Undertaken
B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
List of Attachments:
Attachment 1. Site Staff Interview Protocol
Attachment 2. Student Survey
Attachment 3. Parent Survey
Attachment 4. Parent Permission and Consent Form
Attachment 5. Child Assent Form
B.1 Respondent Universe and Sampling Methods

This evaluation of the NASA SEMAA project has two components: an impact evaluation and an implementation study. The impact evaluation is a randomized controlled trial (RCT) designed to determine the impact of SEMAA on student and parent participants at sites utilizing a Saturday model. The implementation study is designed to examine how program characteristics are related to cross-site variation in such factors as student and parent attendance, program experiences, and enthusiasm for STEM learning. Below, we discuss the impact module and the implementation module separately.
B.1.1 Impact module
The universe of SEMAA sites for the impact evaluation is the census of 8 SEMAA sites that will offer an eight-week Saturday model in the 2009-2010 academic year. In a Saturday model (versus an in-school or after-school model), students and parents attend a three-hour morning session each Saturday for eight consecutive weekends in classrooms located at a partnering university or other locations that the local SEMAA site has obtained in the community. During the academic year, SEMAA offers three 8-week sessions in the Fall, Winter, and Spring. These sessions are identical in content: for example, the Fall eighth-grade SEMAA session is identical to the Winter eighth-grade SEMAA session for a given academic year. All such sites will be included in the study (of the 14 SEMAA sites, 8 expect to offer a Saturday model in the 2009-2010 academic year).
The impact module of the study focuses on the 8 SEMAA sites using the Saturday model for several reasons. First, the SEMAA sites offering Saturday sessions are implementing the SEMAA program in the manner that NASA intended and prefers. Other models of implementation (after-school, or in-school models) were adopted for largely idiosyncratic, local reasons on a trial basis. For example, some sites are experimenting with an after-school model to test the sustainability of this approach or in rural sites where parents and students live many miles from the actual SEMAA site. Implementing SEMAA as an after-school program can ameliorate transportation burdens on such families. The after-school sites differ significantly from the Saturday model sites and are not representative of SEMAA as a whole. Second, sites using in-school models of implementation have expanded or altered the SEMAA content materials for adoption as units in a school’s existing science courses. At these sites, random assignment of students to treatment or control is not possible.
The universe of respondents is the pool of 4th to 8th grade student applicants (and their parents) to the Fall 2009 SEMAA session at each site participating in the impact module of the study. Typically, SEMAA sites are oversubscribed for the Fall session: more students apply than can be accommodated in the Fall 8-week session. Assignment to the Fall session (rather than the Winter or Spring session) is determined on a "first-come, first-served" basis: those families whose applications are received first are more likely to receive their first choice of sessions for the academic year. In many sites, prior participants receive information about upcoming enrollment directly from the site; new participants hear about SEMAA via advertisements, brochures, and SEMAA staff presentations at community or school events.
All eligible students (that is, those students in the 4th through 8th grade levels in school) who apply for the Fall 2009 session will be recruited, along with one parent, to participate in the study and will be randomly assigned to either a treatment or control group. The unit of assignment is the household. Each household (i.e., a student-parent pair) will be assigned either to a treatment condition (enrollment in the Fall 2009 SEMAA session at their site) or a control condition (embargoed enrollment until the Winter or Spring 2010 SEMAA session). Both prior SEMAA participants and new SEMAA applicants will be subject to random assignment. Those assigned to the control group will be free to enroll in other (i.e., non-SEMAA) science and/or mathematics enrichment activities.
Because some households may include more than one eligible student applicant to the Fall 2009 session, all student applicants from the same household will be assigned to the same experimental condition as follows:
Prior to random assignment, one sibling will be selected at random to be the “target” student for purposes of the study;
The target student will be randomly assigned to treatment or control status;
Any siblings of the target child who are also applying to SEMAA will be assigned by default to this same condition.
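As a minimal sketch of the three-step procedure above (the function name, household-to-siblings mapping, and seed are our own illustrative choices, and the study's actual assignment system may differ):

```python
import random

def assign_households(households, p_treat=0.67, seed=2009):
    """Assign every applicant in a household to the same condition.

    households: dict mapping a household ID to its list of applicant students.
    Returns (assignments, targets): a dict mapping each student to
    "treatment"/"control", and a dict of each household's target student.
    """
    rng = random.Random(seed)
    assignments, targets = {}, {}
    for hid, siblings in households.items():
        # Step 1: randomly select the "target" student for the study.
        targets[hid] = rng.choice(siblings)
        # Step 2: randomly assign the target student to a condition
        # (a 2:1 treatment:control ratio here, per the sampling plan).
        condition = "treatment" if rng.random() < p_treat else "control"
        # Step 3: siblings default to the target student's condition.
        for student in siblings:
            assignments[student] = condition
    return assignments, targets
```

Because the unit of assignment is the household, every sibling shares the target student's condition, which avoids the burden and contamination problems described above.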
The purposes of this random assignment procedure are twofold:
to reduce burden on families;
to reduce the likelihood of contamination of the control group.
Burden: If two children in the same household were assigned to different conditions, many parents might incur extra transportation or childcare costs: while the parent accompanied one child to the Saturday SEMAA session, another adult could be needed to care for the non-SEMAA sibling.
Contamination: If one sibling were assigned to treatment and the other to control, it is possible—perhaps even likely—that a parent would have to bring the control child along to the SEMAA site during Saturday sessions. This control child could thus be directly exposed to SEMAA. Alternatively, indirect exposure could occur even if a parent were able to provide separate care during SEMAA sessions for each sibling. For example, if the SEMAA-enrolled (treatment group) child brought home projects, activities, or materials, the control child conceivably would have access to these materials. This access would interfere with the intended assignment of the control child (embargoed exposure to SEMAA) during the Fall 2009 session.
Any student or parent who declines to participate in the study will still be subject to random assignment for enrollment in the Fall 2009 SEMAA session; however, no survey data will be collected from these families.
Sampling: The number of anticipated applicants for the SEMAA project at sites that offer Saturday sessions was calculated based on estimates provided by the local sites. Students in grades 4, 5, 6, 7, and 8 will be selected for the evaluation. The actual number of applicants will be identified at the time that sites receive applications.
The proposed number of students to be included in the evaluation and surveyed is 990. Of these, 660 will be assigned to the treatment group and 330 to the control group. This sample size was calculated based on the following assumptions:
Desired minimum detectable effect (MDE) size is .21;
Participants will be assigned to treatment with probability .67 (and to the control group with probability = .33);1
Baseline survey will account for 50 percent or more of the variation in responses on the post-test survey;2
We will calculate estimates separately for the prior participants and the new applicants; and
We anticipate response rates of at least 75 percent from treatment and control groups.
To achieve the desired MDE with 80 percent power and a 95 percent confidence level, we need a final analytic sample size of 742 students. Assuming a 75 percent response rate, we must recruit a total of 990 students for the study to achieve this analytic sample size (990 × .75 ≈ 743).
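The arithmetic above can be sketched as follows. This is a simplified individual-level MDE formula (it ignores site clustering and the separate subgroup estimates listed in the assumptions, so it will not reproduce the 742 figure exactly); the function and variable names are ours:

```python
from math import ceil, sqrt
from statistics import NormalDist

def mde(n, p_treat=0.67, r2=0.50, alpha=0.05, power=0.80):
    """Minimum detectable effect size (in standard-deviation units) for a
    simple individual-level RCT with unequal assignment probabilities and
    a baseline covariate explaining r2 of the outcome variance."""
    z = NormalDist().inv_cdf(1 - alpha / 2) + NormalDist().inv_cdf(power)
    return z * sqrt((1 - r2) / (p_treat * (1 - p_treat) * n))

# Inflate the analytic sample (742) for an expected 75% response rate.
analytic_n = 742
recruit_n = ceil(analytic_n / 0.75)  # 742 / .75 ≈ 989.3, so recruit 990
```

Note how halving the analytic sample (as when estimates are computed separately for prior participants and new applicants) inflates the MDE by a factor of √2, which is why the study's full calculation requires more cases than the naive formula alone would suggest.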
We anticipate that the total number of parents recruited for the study will equal the number of students recruited. Because some parents will have multiple students participating in the study, the total parent sample may be smaller than this estimate. The actual number of parents will be identified at the time that sites receive applications.
B.1.2 Implementation module
The universe of respondents for the implementation study is key staff (mostly site directors) knowledgeable about detailed characteristics of implementation at each site. All sites will be included: those running Saturday, in-school, and after-school programs.
Directors at each of these SEMAA sites should be able to provide the information needed. In some instances, however, directors may request that we contact a member of their staff to obtain answers to some of the questions. If so, then partial interviews will be held with other staff to elicit answers only to a subset of questions. The calculated burden includes an estimate of these potential partial interviews. Across the sites, up to 50 staff informants will be interviewed.
B.2 Procedures for the Collection of Information

This request for clearance includes two types of data collection—interviews (for the Implementation Module) and surveys (for the Impact Module). The interview protocols have been developed to gather information that complements extant data on the implementation of the SEMAA project at the local sites.
Surveys gather information on outcomes for students and parents that have been identified in the program theory. The surveys have been constructed using existing instruments that measure the constructs of interest and have been field tested and modified to apply to the project under study.
Interviews. Interviews will be conducted by telephone. Detailed information about the sites will be collected in advance to the extent possible using extant data sources (e.g., site annual reports, site monitoring data) and verified with respondents during the interview. The interview protocol is included as Attachment 1.
Surveys of Students and Parents. A student and parent survey will be administered twice to both a treatment and control group. The first round of survey data collection (the pretest round) will occur prior to the beginning of the Fall 2009 SEMAA session. The second round (the post-test round) will occur after the conclusion of the eight week Fall session at each site.
Pre-test Survey: Procedures for Data Collection
Each SEMAA site will mail a SEMAA application packet to each household that requests an application. Along with the SEMAA application, this packet will include:
Study information sheet
Parent consent and permission form (Attachment 4)
Child assent form (Attachment 5)
Student survey (pretest; see Attachment 2) w/ manila privacy envelope
Parent survey (pretest; see Attachment 3) w/ manila privacy envelope
Pre-paid Business Reply Envelope (BRE)
Whether or not they agree to participate in the study, parents will return the SEMAA application. Parents who agree to participate and grant permission for their child to participate will also return the following materials in the Study BRE:
Signed Parent Permission and Consent form
Child assent form with child’s name signed or printed;
Completed student survey (pretest) sealed in manila privacy envelope
Completed parent survey (pretest) sealed in manila privacy envelope
Parents who decline to participate in the study will return the SEMAA application. These non-participants will still be subject to random assignment but the study will not collect survey data from them. Such parents may dispose of the pre-test surveys; they will not receive post-test surveys.
Participating and non-participating households will be randomly assigned to treatment or control status for the Fall 2009 SEMAA session at each site. During the eight-week SEMAA session, no data collection from treatment or control students or parents will take place.
At the conclusion of the eight-week SEMAA session (attended by students and parents assigned to the treatment group), all study participants (that is, both those in the treatment and those in the control group) will receive the post-test survey by mail.
Post-test Survey: Procedures for Data Collection
The purpose of the post-test survey is to collect data, at a second point in time, on the same interests and attitudes about science surveyed at pre-test (after treatment receipt for those in the treatment group, and eight weeks later for the control group). For the post-test round of data collection, we will mail families a packet containing the following:
Study information sheet
Student survey (post-test; see Attachment 2) w/ manila privacy envelope
Parent survey (post-test; see Attachment 3) w/ manila privacy envelope
Pre-paid Business Reply Envelope (BRE)
Postcards and phone calls will be used to prompt non-respondents to complete their surveys. Non-respondents will also be given the option to complete the survey via a Computer Assisted Telephone Interviewing system. As is true throughout the duration of the study, consent and parental permission to participate may be withdrawn at any time without penalty or change in assignment status.
Analysis and statistical tests
We will test the impact of SEMAA on students separately from its impact on parents. Impact will be tested by comparing the observed post- to pre-test difference in outcomes for the treatment group to the observed post- to pre-test difference in outcomes for the control group. To test these impacts, we will use regression models that account for the clustering of respondents within sites (multi-level modeling) and include key covariates. We describe these models in more detail below.
The estimation of program impacts on students will take into account the fact that students are nested within SEMAA project sites (and within families), meaning that the outcomes for students within a particular SEMAA project site will be more highly intercorrelated with each other than will outcomes for students participating in different SEMAA sites. Estimates of the program impact on student outcomes must appropriately adjust standard errors to account for this nesting. We will use multi-level modeling to parse out the variance between students and sites (Raudenbush and Bryk, 2002). To improve the precision of estimates, we will include student-level (and potentially site-level) covariates in our analyses, such as baseline outcomes from the pre-test survey, minority status, gender, and grade. Student and parent outcomes will be analyzed and interpreted separately. As described above, we will communicate regularly with SEMAA sites to monitor any attrition or contamination (that is, students assigned to SEMAA who end up not participating in SEMAA activities, or vice versa). Should we detect cross-overs, we will test the extent to which attrition (or contamination) threatens study validity and deploy appropriate weighting procedures in analyses as needed.
We will use a two-level multi-level model to investigate the overall impact of SEMAA. Exhibit 2 displays an example of the type of model we will use. Exhibit 2 uses “student level of interest in formal STEM education” as an example outcome; the control variables listed are likely covariates, and others may be included as well. In the Level-2 equation shown, the parameter γ10 represents the overall average impact of SEMAA across all sites. We will use the standard error associated with γ10 to test the null hypothesis that SEMAA has no effect; if the t-statistic is statistically significant and positive, we will reject this null hypothesis and conclude that SEMAA has a positive impact on students’ level of interest in formal STEM education. Similar models will be estimated for each student and parent outcome of interest.
Exhibit 2. Prototypical Analytic Model for Impact Analysis of SEMAA Program

Level-1: Yij = B0j + B1j(trtij) + B2j(preij) + B3j(minorityij) + B4j(femaleij) + B5j(gradeij) + εij
Level-2: B1j = γ10 + u1j

where:
Yij = the level of interest in formal STEM education for the ith student in the jth SEMAA program
trtij = a treatment indicator denoting whether the ith student in the jth program is a treatment student or a control student (trtij = 1 for treatment, trtij = 0 for control)
preij = the baseline (i.e., pre-program) level of interest in formal STEM education for the ith student in the jth SEMAA program
minorityij = 1 if the ith student in the jth program is a member of an ethnic/racial minority group traditionally underrepresented in STEM, and = 0 otherwise
femaleij = 1 if the ith student in the jth program is female, and = 0 otherwise
gradeij = the grade of the ith student in the jth program at the time of random assignment
εij = the student-level residual of the ith student in the jth program
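A model of the kind shown in Exhibit 2 can be sketched with the MixedLM routine in Python's statsmodels package. All data below are fabricated purely for illustration; for robustness this sketch fits only a random intercept per site, whereas the full Exhibit 2 specification would also let the treatment coefficient vary across sites (a random slope on trt):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n_sites, n_per_site = 8, 90
n = n_sites * n_per_site

# Fabricated data: 8 Saturday-model sites, 90 students each.
df = pd.DataFrame({
    "site": np.repeat(np.arange(n_sites), n_per_site),
    "trt": rng.binomial(1, 0.67, n),    # 2:1 treatment assignment
    "pre": rng.normal(0.0, 1.0, n),     # baseline interest (pretest)
    "minority": rng.binomial(1, 0.5, n),
    "female": rng.binomial(1, 0.5, n),
    "grade": rng.integers(4, 9, n),     # grades 4 through 8
})
# Simulate the outcome with a site random effect and a true impact of 0.2.
site_effect = rng.normal(0.0, 0.3, n_sites)[df["site"]]
df["y"] = (0.2 * df["trt"] + 0.7 * df["pre"]
           + site_effect + rng.normal(0.0, 0.7, n))

# Random-intercept multi-level model: students nested within sites.
model = smf.mixedlm("y ~ trt + pre + minority + female + grade",
                    df, groups=df["site"])
result = model.fit()
print(result.summary())
```

The coefficient on trt plays the role of γ10: it estimates the average treatment effect across sites after adjusting for the baseline covariates.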
For each outcome of interest (e.g., student interest in pursuing future science coursework at post-test), we will test the difference between the treatment group (SEMAA participation) and the control group (embargoed SEMAA participation), controlling for key covariates such as the pretest score on the outcome measure (baseline interest in pursuing future science coursework), extent of involvement in non-SEMAA informal science activities, number of years of prior participation in SEMAA, and key demographic variables such as ethnicity and parent’s highest level of education obtained. The statistical significance of impact estimates will be tested using F-tests and an alpha level of .05.
Impacts on parents will be tested separately from impacts on students.
For the implementation module, we will employ simple descriptive statistics, such as counts, ranges, and frequencies, and statistical tests, such as the χ2 test or t-test, to test for differences between groups. The analyses of the interviews will include simple frequencies as well as descriptive summaries of emergent themes.
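As an illustration of the kind of between-group test mentioned above, a χ2 test could be run with scipy; the counts below are entirely hypothetical:

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: program model by level of student enthusiasm.
#                 high  low
table = [[40, 20],   # Saturday-model sites
         [25, 35]]   # after-school / in-school sites
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}, dof = {dof}")
```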
B.3 Methods to Maximize Response Rates and Deal with Nonresponse

Several methods will be used to maximize response rates and to deal with non-response, such as:
Providing a sufficient timeframe for data collection. For example, interviews will be carried out over the course of several weeks, to make sure that busy schedules of site respondents can be accommodated;
Providing respondents with a pre-paid, pre-addressed Business Reply Envelope for ease of return mailing. Respondents can simply return the completed surveys via the U.S. Postal Service, rather than having to locate a FedEx or UPS drop-box;
Designing the interview protocol to target the specific information that each respondent group is likely to possess;
Distributing pre-test (baseline) surveys with SEMAA enrollment application materials to be returned together;
Providing a $20 cash incentive for each family that returns a completed survey packet at post-test, including a completed student survey and a completed parent survey;
Barcoding surveys for efficient tracking of survey receipt to identify respondents and non-respondents;
Following up with non-respondents via postcards and phone calls; and
Offering non-respondents the option of completing the survey by telephone using Computer Assisted Telephone Interviewing (CATI) capabilities.
We expect to achieve response rates of 75 percent or higher for both treatment and control groups.
B.4 Test of Procedures or Methods to be Undertaken

Impact Module
These procedures were tested and refined as follows. First, existing instruments with established psychometric characteristics were selected after an extensive literature review (Modified Attitudes Towards Science Inventory, Weinburgh and Steele, 2000; and the Math and Science Interest Survey, Hulett, Williams, Twitty, Turner, Salamo, and Hobson, 2004). These instruments were modified by adding items to collect additional data needed for the study that the existing instruments did not include. Experts in the field reviewed draft and final instruments for content validity and clarity. In addition, the survey instruments were pilot tested in order to identify barriers to, or difficulties in, completing the survey forms. Revisions were made to each instrument based on feedback from the pilot test, which involved eight individuals in grades 4 through 8 and their parents.
Implementation Module
The interview protocol to be administered to Project Directors (see Attachment 1) was especially designed for this study. Although based on protocols used in similar studies conducted in the past, the protocol is tailored to the SEMAA project to ensure that relevant project information is obtained. Responses and general feedback obtained during field tests (details provided below) were used to finalize the wording of questions, add probes, and exclude questions deemed unnecessary.
Field Test. The interview protocol was field tested with former Directors at two different SEMAA sites. One site serves mainly African American students through a Saturday model, while the other serves Native American students through an in-school model. As expected, both interviews lasted about one hour (one slightly less and one slightly more). After completing the interview, respondents were also asked to comment on the clarity and wording of the questions to ensure content validity. The information gathered was reviewed by, and discussed among, evaluation project staff and subsequently used to finalize the attached interview protocol.
B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The contractors for collection and analysis of data in this study are Abt Associates Inc., Cambridge, MA, and The Urban Institute, Washington, DC. Staff from these organizations have knowledge of statistical methods, experience in the evaluation of research programs, and expertise in scientific research.
Key personnel involved in the statistical aspects of the study and in collecting and analyzing data include:
Abt Associates
Alina Martinez, Project Director, Abt Associates, 617-349-2312
W. Carter Smith, Task Leader, Impact Module of SEMAA evaluation, 617-349-2543
Amanda Parsad, Director of Analysis, Impact Module, 301-634-1791
Cristopher Price, Technical Advisor, Impact Module, 301-634-1852

Urban Institute
Clemencia Cosentino, Task Leader, Implementation Module, 202-261-5409
1 Because sites may vary in the extent to which they are oversubscribed, the probability of assignment to the treatment (vs. control) condition may vary by site.
2 This estimate is based on a preliminary review of similar measures of attitudes and self-efficacy (see Cowe et al., 1991; Harty & Beall, 1991; Lewis, Cruise, McGuckin & Francis, 2006; and Tapia & Marsh, 2004).